WorldWideScience

Sample records for automated model abstraction

  1. Automated Predicate Abstraction for Real-Time Models

    Directory of Open Access Journals (Sweden)

    Bahareh Badban

    2009-11-01

Full Text Available We present a technique designed to automatically compute predicate abstractions for dense real-time models represented as networks of timed automata. We use the CIPM algorithm from our previous work, which computes new invariants for timed automata control locations and prunes the model, to compute a predicate abstraction of the model. We do so by taking into account information about control locations and their newly computed invariants.
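
To make the idea concrete, here is a minimal sketch of predicate abstraction in Python: concrete clock valuations are mapped to bitvectors of predicate truth values, collapsing the state space. The predicates and valuations are invented for illustration and do not reproduce the paper's CIPM-based construction.

```python
# A minimal predicate-abstraction sketch over integer-valued clock
# valuations; predicates and state ranges are illustrative assumptions.
from itertools import product

# Hypothetical predicates over a clock valuation {"x": ..., "y": ...}
predicates = [
    lambda v: v["x"] <= 5,           # p0: an invariant candidate on clock x
    lambda v: v["y"] - v["x"] >= 2,  # p1: a clock-difference constraint
]

def abstract_state(valuation):
    """Map a concrete valuation to a bitvector of predicate truth values."""
    return tuple(p(valuation) for p in predicates)

# Collapse a set of concrete states into abstract states.
concrete = [{"x": x, "y": y} for x, y in product(range(8), range(8))]
abstract = {abstract_state(v) for v in concrete}
print(f"{len(concrete)} concrete states -> {len(abstract)} abstract states")
```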

  2. Automated Supernova Discovery (Abstract)

    Science.gov (United States)

    Post, R. S.

    2015-12-01

(Abstract only) We are developing a system of robotic telescopes for automatic recognition of supernovae as well as other transient events, in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSNs with a partial library. Since data are taken every cloudless night, we must deal with varying atmospheric conditions and high background illumination from the moon. The software is configured to identify a PSN and reshoot for verification, with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs of magnitude 17.5 or less, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.

  3. Automated Assume-Guarantee Reasoning by Abstraction Refinement

    Science.gov (United States)

Pasareanu, Corina S.; Giannakopoulou, Dimitra

    2008-01-01

    Current automated approaches for compositional model checking in the assume-guarantee style are based on learning of assumptions as deterministic automata. We propose an alternative approach based on abstraction refinement. Our new method computes the assumptions for the assume-guarantee rules as conservative and not necessarily deterministic abstractions of some of the components, and refines those abstractions using counter-examples obtained from model checking them together with the other components. Our approach also exploits the alphabets of the interfaces between components and performs iterative refinement of those alphabets as well as of the abstractions. We show experimentally that our preliminary implementation of the proposed alternative achieves similar or better performance than a previous learning-based implementation.
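
The following toy sketch illustrates the abstraction-refinement loop described above on finite trace sets. Real assume-guarantee reasoning operates on automata and interface alphabets; the trace sets, property, and crude refinement step here are simplified stand-ins, not the authors' algorithm.

```python
# A toy, runnable sketch of counterexample-guided refinement of an
# assumption. All data below are invented for illustration.
M2 = {"ab", "ac"}                # true behaviours of the second component
BAD = {"ad", "bd"}               # traces that would violate the property

def refine(assumption, cex):
    """Remove the spurious trace from the abstraction (a crude 'refinement')."""
    return assumption - {cex}

assumption = {"ab", "ac", "ad"}  # conservative over-approximation of M2
while True:
    violations = assumption & BAD           # "model check" premise 1
    if not violations:
        print("property holds under assumption", assumption)
        break
    cex = next(iter(violations))
    if cex in M2:                           # premise 2: is it a real trace?
        print("real violation:", cex)
        break
    assumption = refine(assumption, cex)    # spurious -> refine abstraction
```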

  4. Abstract Models of Transfinite Reductions

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2010-01-01

We investigate transfinite reductions in abstract reduction systems. To this end, we study two abstract models for transfinite reductions: a metric model generalising the usual metric approach to infinitary term rewriting and a novel partial order model. For both models we distinguish between a weak and a strong variant of convergence as known from infinitary term rewriting. Furthermore, we introduce an axiomatic model of reductions that is general enough to cover all of these models of transfinite reductions as well as the ordinary model of finite reductions. It is shown that…

  5. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  6. Automated Student Model Improvement

    Science.gov (United States)

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  7. ABSTRACT MODELS FOR SYSTEM VIRTUALIZATION

    Directory of Open Access Journals (Sweden)

    M. G. Koveshnikov

    2015-05-01

Full Text Available The paper is dedicated to issues of securing system objects (system files and user, system or application configuration files) against unauthorized access, including denial of service attacks. We have suggested the method and developed abstract system virtualization models, which are used to research attack scenarios for different virtualization modes. An estimation of the effectiveness of the system tools virtualization technology is given. The suggested technology is based on redirection of access requests to system objects shared among access subjects. Whole and partial system virtualization modes have been modeled. The difference between them is the following: in the whole virtualization mode, copies of all accessed system objects are created, whereon subjects' requests are redirected, including corresponding application objects; in the partial virtualization mode, copies are created only for part of the system, for example, only system objects for applications. The effectiveness of the alternative solutions is evaluated against different attack scenarios. We consider a proprietary and approved technical solution which implements the system virtualization method for the Microsoft Windows OS family. Administrative simplicity and the capabilities of correspondingly designed system object security tools are illustrated by this example. The practical significance of the suggested security method has been confirmed.

  8. Automated spatial and thematic generalization using a context transformation model: integrating steering parameters, classification and aggregation hierarchies, reduction factors, and topological structures for multiple abstractions.

    NARCIS (Netherlands)

    Richardson, D.E.

    1993-01-01

This dissertation presents a model for spatial and thematic digital generalization. To do so, the development of digital generalization over the last thirty years is first reviewed. The approach to generalization taken in this research differs from other existing works as it tackles the task from a da…

  9. Engineering Abstractions in Model Checking and Testing

    DEFF Research Database (Denmark)

    Achenbach, Michael; Ostermann, Klaus

    2009-01-01

Abstractions are used in model checking to tackle problems like state space explosion or modeling of IO. The application of these abstractions in real software development processes, however, lacks engineering support. This is one reason why model checking is not widely used in practice yet, and testing is still state of the art in falsification. We show how user-defined abstractions can be integrated into a Java PathFinder setting with tools like AspectJ or Javassist and discuss implications of remaining weaknesses of these tools. We believe that a principled engineering approach to designing and implementing abstractions will improve the applicability of model checking in practice.

  10. An Abstraction Theory for Qualitative Models of Biological Systems

    CERN Document Server

Banks, Richard (DOI: 10.4204/EPTCS.40.3)

    2010-01-01

    Multi-valued network models are an important qualitative modelling approach used widely by the biological community. In this paper we consider developing an abstraction theory for multi-valued network models that allows the state space of a model to be reduced while preserving key properties of the model. This is important as it aids the analysis and comparison of multi-valued networks and in particular, helps address the well-known problem of state space explosion associated with such analysis. We also consider developing techniques for efficiently identifying abstractions and so provide a basis for the automation of this task. We illustrate the theory and techniques developed by investigating the identification of abstractions for two published MVN models of the lysis-lysogeny switch in the bacteriophage lambda.

  11. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Discussion on the model acceptance test of main hydraulic performance parameters of a mixed-flow pump turbine. YU Ji-xing, LI Jin-wei, CHEN Liu, REN Shao-cheng, JIANG Ming-li, LI Hai-ling (China Institute of Water Resources and Hydropower Research, Beijing 100038, China) Abstract: The content of the model acceptance test and the acceptance mode of the main performance parameters of a mixed-flow pump turbine are introduced. Main hydraulic performance parameters, such as the hump in the pump high-lift region and the S-shaped unstable region during turbine start-up, are discussed; after that, the frequency characteristics of pressure fluctuation near the guide vane area are briefly presented. Key words: pump turbine; model acceptance test; hump region; "S" region; pressure fluctuation

  12. SATURATED ZONE FLOW AND TRANSPORT MODEL ABSTRACTION

    Energy Technology Data Exchange (ETDEWEB)

    B.W. ARNOLD

    2004-10-27

The purpose of the saturated zone (SZ) flow and transport model abstraction task is to provide radionuclide-transport simulation results for use in the total system performance assessment (TSPA) for license application (LA) calculations. This task includes assessment of uncertainty in parameters that pertain to both groundwater flow and radionuclide transport in the models used for this purpose. This model report documents the following: (1) The SZ transport abstraction model, which consists of a set of radionuclide breakthrough curves at the accessible environment for use in the TSPA-LA simulations of radionuclide releases into the biosphere. These radionuclide breakthrough curves contain information on radionuclide-transport times through the SZ. (2) The SZ one-dimensional (1-D) transport model, which is incorporated in the TSPA-LA model to simulate the transport, decay, and ingrowth of radionuclide decay chains in the SZ. (3) The analysis of uncertainty in groundwater-flow and radionuclide-transport input parameters for the SZ transport abstraction model and the SZ 1-D transport model. (4) The analysis of the background concentration of alpha-emitting species in the groundwater of the SZ.

  13. An Abstract Model of Historical Processes

    CERN Document Server

    Poulshock, Michael

    2016-01-01

    A game theoretic model is presented which attempts to simulate, at a very abstract level, power struggles in the social world. In the model, agents can benefit or harm each other, to varying degrees and with differing levels of influence. The agents play a dynamic, noncooperative, perfect information game where the goal is to maximize payoffs based on positional utility and intertemporal preference, while being constrained by social inertia. Agents use the power they have in order to get more of it, both in an absolute and relative sense. More research is needed to assess the model's empirical validity.

  14. Solicited abstract: Global hydrological modeling and models

    Science.gov (United States)

    Xu, Chong-Yu

    2010-05-01

The origins of rainfall-runoff modeling in the broad sense can be found in the middle of the 19th century, arising in response to three types of engineering problems: (1) urban sewer design, (2) land reclamation drainage systems design, and (3) reservoir spillway design. Since then numerous empirical, conceptual and physically-based models have been developed, including event-based models using the unit hydrograph concept, Nash's linear reservoir models, the HBV model, TOPMODEL, the SHE model, etc. From the late 1980s, the evolution of global and continental-scale hydrology placed new demands on hydrologic modellers. Macro-scale (global and regional scale) hydrological models were developed on the basis of the following motivations (Arnell, 1999). First, for a variety of operational and planning purposes, water resource managers responsible for large regions need to estimate the spatial variability of resources over large areas, at a spatial resolution finer than can be provided by observed data alone. Second, hydrologists and water managers are interested in the effects of land-use and climate variability and change over a large geographic domain. Third, there is an increasing need to use hydrologic models as a base to estimate point and non-point sources of pollution loading to streams. Fourth, hydrologists and atmospheric modellers have perceived weaknesses in the representation of hydrological processes in regional and global climate models, and developed global hydrological models to overcome them. Considerable progress in the development and application of global hydrological models has been achieved to date; however, large uncertainties still exist in the model structure, including large-scale flow routing, parameterization, input data, etc. This presentation will focus on global hydrological models, and the discussion includes (1) types of global hydrological models, (2) the procedure of global hydrological model development…

  15. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

The Western Theories of War Ethics and Contemporary Controversies. Li Xiaodong, U Ruijing (4) [Abstract] In the field of international relations, war ethics is a concept with a distinct Western ideological color. Due to factors of history and reality, the in…

  16. Abstract

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Cognitive Structure of Scientific Theory in the Scientist-Philosopher's Eyes; Two Theories of Scientific Abstraction Centered on Practices; Many-worlds Interpretation in Quantum Measurement and Its Meaning; Scientific Instrument: Paradigm Shift from Instrumentalism to Realism and Phenomenology

  17. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

[Abstract] Interaction between China and the international system has been a highlighted issue, drawing a great deal of attention all over the world. It has been approached from a structural point of view and in a way of a conflicting pair of self and the other, which is the prevailing ontological perspective of IR studies. Contrary to it, processual…

  18. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Abstract: The ethylene plant at SINOPEC Shanghai Petrochemical Company Limited ranked in the middle among SINOPEC subsidiaries in terms of ethylene and propylene yields, technical and economic indicators, and so on, and its performance ranking went no further in the chemical sector. By means of feedstock optimization, steam optimization, and energy saving and consumption reduction, the company enhanced its competitiveness in the market and improved its efficiency. In addition, some ideas were put forward on performance improvement of the ethylene plant in the future.

  19. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

[Abstract] The global resurgence of religion and the return of religion from the so-called "Westphalian Exile" to the central stage of international relations have significantly transformed the viewpoints of both media and academia toward the role of religion in IR, and the challenges posed by religion to contemporary international relations are often described as entirely subversive. The author argues that as a second-tier factor in most countries' foreign policies and international affairs,…

  20. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

China's Dual-identity Dilemma and Its Countermeasures. Li Shaojun (4) [Abstract] The international system, as the overall structure for interactions among actors, is the environment and stage for the implementation of China's foreign policy. In this system, identity is a fundamental factor determining China's international position and interests, and how to achieve them. China has long stressed that it is a "developing country,"…

  1. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

The Research of the Subsistent States of the Artists Living in Guilin During the Anti-Japanese Period; A Brief History of Calligraphy Art of Chongqing as Provisional Capital during the Anti-Japanese Period; The Endless Stream and Terraced Mountains: on the Analysis of Space in the Chinese Landscape Painting of Gongxian; Political Vanguard in the Perspective View of the Formalism Aesthetic: 1980s Abstract Painting in China; Comparison of the Early-Stage Spreading of Western Pre-Modern Style Painting and Its Influence in China and Japan.

  2. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

(1) Lenin's "Unity of Three Dialectics": Notes on Philosophy in the Dual Contexts of Science of Logic and The Capital. Sun Zhengyu (4) Lenin's dialectics in Notes on Philosophy is essentially a unity of materialistic logic, dialectics and epistemology that has arisen from interactions between Hegel's Science of Logic and Marx's The Capital. Due to a lack of understanding of Lenin's "unity of three dialectics," people tend to misunderstand his dialectics as the meeting of two extremes: the "sum total of living instances" and "abstract methods."

  3. ABSTRACT

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

--Based on Marx's Economic and Philosophical Manuscripts of 1844. HE Jian-jin (Philosophy Department, Fujian Provincial Committee Party School, Fuzhou, Fujian 350012, China) Abstract: Socialism with Chinese characteristics has a close relationship with the return and growth of capital in China. To implement the scientific concept of development, we must confront the problem of scientifically controlling capital. In the Economic and Philosophical Manuscripts of 1844, Marx criticized three old ways of philosophical thinking about capital: object-oriented thinking, intuitive thinking, and purely spiritual abstract thinking, and he established his own unique understanding of capital, namely, understanding capital from human perceptual and practical activities. Contemporary Chinese society faces the problems of underdevelopment and abnormal development, and the three concurrent heterogeneous problems of the pre-modern, modern and postmodern. In order to implement the scientific concept of development, we must reject any abstract affirmation or negation of the basic principles of modernity under the guidance of capital, and we must oppose the thinking that capital is either eternal or evil. Key words: socialism with Chinese characteristics; capital; national economics; scientific concept of development

  4. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

The Shift of World Center and the Change of the International System. Yan Xuetong. Abstract: The power transition caused by China's rise will not only change the international configuration, but could also result in the world center shifting from Europe to Asia. However, neither a change in the international configuration nor the shift of the world center implies a change of the type of the international system. The international system is composed of three elements: international actors, international configuration and international norms. It will be possible neither to distinguish the international system from its components nor to clarify relations between components and system if changes in one element are treated as a change of type of the international system.

  5. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

On Rousseau's Equality and Freedom. GONG Qun. Abstract: Equality and freedom are the two core concepts of political philosophy, and Rousseau's political philosophy is no exception. Freedom and equality in Rousseau include two levels: the natural state and the social state under the social contract, and between them there is a state of inequality. The relationship between the two concepts here is that equality is a necessary precondition of freedom: where there is no equality, there is no freedom. Rousseau's equality is achieved by one contractual act in which all the members transfer their rights, especially property rights, and form the Community. Freedom in Rousseau's mind is achieved through the people's sovereignty in the Community.

  6. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

On the Construction of Water Conservancy during the 1930s in Hubei. Yu Tao (1) Abstract: Two extra-large floods in the 1930s drew the National Government's attention and led it to introspect. After those disasters, the government made some progress through a series of measures, like repairing the dikes, completing water conservancy institutions and enacting regulations, in order to strengthen water conservancy construction in Hubei. The government took water conservancy construction as a complex system project, and so gave it a relatively comprehensive consideration. It reflects the advancement of a modern government to mobilize local people and put such social power into unified planning. However, the limitations of the government's policy implementation weakened the effect of water conservancy construction. Keywords: flood; government; water conservancy construction; effect

  7. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Spiritual Construction and Its Ideological Sources in Contemporary China. The spiritual construction in contemporary China is an important ideological task proposed by the historical practice of China. Modernized development often entails the meaning of entering into "modern civilization." Nevertheless, an abstract understanding of this civilization has covered up its essential stipulation and historical nature. China has pursued its development on a different historical prerequisite from the West, and therefore only partially belongs to modern capitalist modernization. The practical prospects of Chinese development imply a transformation and remodeling of the general lifestyle, life attitudes and values, which inevitably calls for a new form of philosophy. The ideological sources for this new philosophy are Chinese philosophy, Western philosophy and Marxist philosophy. A creative integration of them may point to a potentially new type of civilization.

  8. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

(1) ASEAN Security Community: Power Containment and Norm Construction --From the Perspective of Realist-Constructivism. Zheng Ying-qin (11) Abstract: Combining realism and constructivism, realist-constructivism emphasizes the interactions between power and identity through norms and their influence on international relations. From the perspective of realist-constructivism, this article analyses the construction of the ASEAN security community and identifies two main threads in the construction of ASEAN: the containment of power, which facilitates the strategic cooperation of ASEAN, and the norms which promote that cooperation. ASEAN has made achievements in security cooperation but is not yet a security community. There are still lots of problems ASEAN needs to tackle, for example, to consolidate institutions as well as to strengthen regional economic interdependence. The possible way to promote the ASEAN security community may start from the area of non-traditional security cooperation. Key words: ASEAN security community; realist-constructivism; power; norm

  9. Scalable Automated Model Search

    Science.gov (United States)

    2014-05-20

…nonlinearity into the models. A prototype built to run these experiments was constructed in Python, using the scikit-learn library and NumPy.

  10. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

Kong Jie, Li Shurong and Wang Weibo. ARMA model of a MEMS acceleration detector and its characteristic analysis. PI, 2011, 25(5): 1-3. Impulse response data of a MEMS acceleration detector are obtained from a vibration-bench experiment. The ARMA model of this detector is built from the impulse response data through the Steiglitz-McBride iteration method, and the orders of the ARMA model are estimated using the rank of a Hankel matrix. The simulation shows that the ARMA model built is very accurate. Finally, the amplitude-frequency characteristics of this detector are analyzed according to the ARMA model.
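
As a hedged illustration of ARMA fitting, the sketch below fits an ARMA(2,1) model to a synthetic damped oscillation using statsmodels. The paper uses the Steiglitz-McBride iteration and Hankel-rank order estimation, which are not reproduced here; the data and model orders are invented.

```python
# Fit a standard ARMA(2,1) model to synthetic "impulse response" data.
# This substitutes a generic estimator for the paper's method.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
t = np.arange(200)
# Synthetic measurement: a damped oscillation plus sensor noise.
y = np.exp(-0.02 * t) * np.cos(0.3 * t) + 0.05 * rng.standard_normal(t.size)

model = ARIMA(y, order=(2, 0, 1))  # ARMA(2,1): d=0 means no differencing
result = model.fit()
print(result.params)               # AR and MA coefficients, noise variance
```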

  11. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Discussion of Design Highlights for Tailgas Treatment in a Sulphuric Acid Plant Using a New Technology for Flue Gas Desulfurization Through Catalytic Reduction. LI Xin, CAO Long-wen, YIN Hua-qiang, El Yue-li, LI Jian-jun (1. College of Architecture and Environment, Sichuan University, Chengdu 610065, China; 2. Daye Nonferrous Metals Co., Ltd., Huangshi 435000, China; 3. The Sixth Construction Company Ltd. of China National Chemical Engineering Corp., Xiangfan 441021, China) Abstract: Considering the present situation of tailgas treatment in current sulphuric acid plants and the problems with commonly used technologies, the fundamental working principle, process flow and a reference project for a new technology for flue gas desulfurization through catalytic reduction, used for tailgas treatment in a sulphuric acid plant and recovery of sulphur resources, are outlined. The design highlights of this technology are analyzed and corresponding recommendations are proposed. Compared to conventional technologies, this new technology offers high desulfurization efficiency and a unique process, which can effectively tackle the difficulties of tailgas treatment in sulphuric acid plants after enforcement of the new standard. This new technology is thought to offer significant economic and environmental benefits, as well as a promising future of application.

  12. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Strategic Realism: An Option for China's Grand Strategy. Song Dexing (4) [Abstract] As a non-Western emerging power, China should positively adapt its grand strategy to the strategic-psychological traits of the 21st century, maintain a realist tone consistent with the national conditions of China, and avoid adventurist policies while being aware of both its strategic strengths and weaknesses. In the 21st century, China's grand strategy should be based on such core values as security, development, peace and justice, focusing on development in particular; we name this "strategic realism". Given the profound changes in China and the world, strategic realism encourages an active foreign policy to safeguard the long-term national interests of China. Following the self-help logic and the fundamental values of security and prosperity, strategic realism takes national interests as its top priority. It advocates the smart use of power, and aims to achieve its objectives by optimizing both domestic and international conditions. From the perspective of diplomatic philosophy, strategic realism is not a summarization of concrete policies but a description of the orientations of China's grand strategy in the new century. [Key Words] China, grand strategy, strategic realism [Author] Song Dexing, Professor, Ph.D. Supervisor, and Director of the Center for International Strategic Studies, University of International Studies of PLA.

  13. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

The Western Characteristics of the Paradigms of International Studies in America: With the Huaxia System as a Counterexample. Ye Zicheng (4) [Abstract] Three flaws are obvious in the three paradigms of international studies in America. Specifically, their arguments are based on the assumption that the world is anarchic; they go too far in employing scientific and rational methodology; and they pay little attention to humans. Hence, the three paradigms of international studies in America aren't necessarily useful for explaining China's history and culture or its relations with the outside world. The Huaxia system, for example, is anarchic but also apparently hierarchical; the approach of pursuing security in understanding the rise of Western powers may be meaningless, for the hegemon in the Huaxia system needn't worry about its security; and the theory of power-balancing seemingly couldn't explain why Qin ended up defeating the alliance of the other six states in the Warring States period. The Huaxia system was quite open, with free movement of people, goods, and ideas. Some interstate regimes and institutions were formed through Huimeng (alliance-making) among states. However, this kind of limited and fragile interdependence and cooperation soon came to an end after the hegemonies of Qi, Jin and Wei. There does exist an identity problem among states in the Huaxia system, but this problem doesn't play as great a role as constructivists would expect.

  14. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

[Abstract] The essay analyzes the action logic of hegemons with a power approach. Hegemony can be classified as benign or malignant. A benign hegemon should be productive and inclusive, and maintain procedural justice when it uses its power. The power of a hegemon can be categorized into two types: hard power, which is the use of coercion and payment and can be measured by public products, and soft power, which shows the ability to attract and co-opt and can be measured by relationship-specific investments. The relationship between the input of public products and relationship-specific investments is not positively correlated. Confusing public products with soft power might lead to strategic misleading. A country rich in power resources should comply with the following principles if it wants to improve its hard power and soft power: first, analyze the scope of the existing hegemon's soft power and avoid investing public products in that scope; second, maintain honesty in the long term and continue to increase others' benefits following the rule of neutral Pareto improvement; third, provide both public goods and public bads; fourth, be more patient in obtaining soft power. [Key Words] hegemon, soft power, relationship-specific investment, strategic misleading [Authors] Feng Weijiang, Ph.D., Associate Professor, Institute of World Economics and Politics, Chinese Academy of Social Sciences; Yu Jieya, Master, PBC Shanghai Headquarters.

  15. Abstract

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

Models of Reduction and Functionalism: Comments on Kim's Reductive Physicalism. CHEN Xiaoping (Research Center of System Science and System Management, School of Public Administration, South China Normal University, Guangzhou, Guangdong, 510006) Abstract…

  16. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

Research of Theory & Method. Miscible flooding well test model with carbon dioxide injection and its pressure analysis. 2011, 20(4): 1-4. Zhu Jianwei, Shao Changjin, Liao Xinwei, Yang Zhenqing (China University of Petroleum, Beijing). Based on miscible flooding well test analysis theory with carbon dioxide injection, the diffusion model of mixture components when carbon dioxide is miscible with the oil and the variation law of the temperature and viscosity are analyzed,…

  17. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Post-Western International System and the Rise of the East; Hegemonic Dependence and the Logic in the Declining Ascendance of Leading Powers; Constructive Leadership and China's Diplomatic Transformation; The Bargaining Model of International Mediation Onset: A Quantitative Test; The Impact of Gender Differences on National Military Expenditure

  18. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

Based on engineering practice, this paper applies the finite element method to model the SAC test results of bolted end-plate connections. The ductility capacity of a new extended bolted end-plate connection for industrial buildings and structures is analyzed, and the analysis results can be used in engineering design and the development of relevant specifications.

  19. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Ma Xiwu's trial mode is a model of adjudication in the Shaanxi-Gansu-Ningxia Border Region, resulting from the joint forces of the border region's specific wartime environment, local environment, the border region's social transformation and judicial reform, as well as many other factors.

  20. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Since labor resources are a key factor in economic development, differences in their allocation efficiency have aggravated the imbalance of China's regional economic development. Based on the formula of per capita GDP, this paper constructs an interpretation model of the demographic bonus containing influence factors of human capital and material capital accumulation. Subsequently, using a data envelopment analysis model we measure and decompose the relative efficiency of the demographic bonus in each region of China during 2006-2010. We also account for technical efficiency and scale efficiency by using a Tobit model. The empirical analysis shows that the relative efficiency of the demographic bonus in the Eastern region is higher than in the central and western areas. Technical efficiency is mainly affected by education level, the aggregation effect of the labor force and its participation rate, with influence ratios of 0.0102 and 0.0149. Meanwhile, scale efficiency is mainly affected by education level, the aggregation effect of the labor force, the industrial environment of capital accumulation and the generation effect of material capital, with influence ratios of 0.1549, 0.1234 and 0.0371. Finally, this paper puts forward suggestions related to the findings.
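
For readers unfamiliar with data envelopment analysis, the sketch below computes input-oriented CCR efficiency scores with scipy's linear programming. The input/output data are made up; this shows the generic DEA computation only, not the authors' demographic-bonus model or their Tobit regression.

```python
# A minimal input-oriented CCR DEA sketch; all data are invented.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0]])   # inputs, shape (n_inputs, n_units)
Y = np.array([[1.0, 2.0, 2.5, 2.0]])   # outputs, shape (n_outputs, n_units)
n = X.shape[1]

def ccr_efficiency(j):
    # Variables v = [theta, lambda_1 .. lambda_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_k lambda_k * x_ik - theta * x_ij <= 0
    A_in = np.c_[-X[:, [j]], X]
    # Outputs: -sum_k lambda_k * y_rk <= -y_rj (outputs at least y_rj)
    A_out = np.c_[np.zeros((Y.shape[0], 1)), -Y]
    A_ub = np.r_[A_in, A_out]
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, j]]
    bounds = [(None, None)] + [(0, None)] * n   # theta free, lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for j in range(n):
    print(f"unit {j}: efficiency = {ccr_efficiency(j):.3f}")
```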

  1. Injecting Abstract Interpretations into Linear Cost Models

    Directory of Open Access Journals (Sweden)

    David Cachera

    2010-06-01

Full Text Available We present a semantics-based framework for analysing the quantitative behaviour of programs with regard to resource usage. We start from an operational semantics equipped with costs. The dioid structure of the set of costs allows for defining the quantitative semantics as a linear operator. We then present an abstraction technique inspired by abstract interpretation in order to effectively compute global cost information from the program. Abstraction has to take two distinct notions of order into account: the order on costs and the order on states. We show that our abstraction technique provides a correct approximation of the concrete cost computations.
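
For reference, the dioid structure mentioned above is the standard idempotent-semiring notion; a generic statement (not necessarily the paper's exact axiomatisation) is:

```latex
% A standard definition of a dioid (idempotent semiring); generic notation.
\begin{align*}
&\text{A dioid } (D, \oplus, \otimes, 0, 1) \text{ satisfies:}\\
&\quad (D, \oplus, 0) \text{ is a commutative monoid with } a \oplus a = a,\\
&\quad (D, \otimes, 1) \text{ is a monoid and } \otimes \text{ distributes over } \oplus,\\
&\quad 0 \otimes a = a \otimes 0 = 0.\\
&\text{Costs of alternative paths combine with } \oplus,\ \text{sequencing with } \otimes.
\end{align*}
```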

  2. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

A Representative Work that Vainly Attempts to Westernize China: On Yang Ji-sheng's Paper "My View of the Chinese Pattern". XU Chong-wen. Abstract: Mr. Yang Ji-sheng calls the economic connotation of the Chinese pattern a "market economy of power" with all sorts of drawbacks; this takes the very problems that the Chinese model deliberately struggles with, and even the objects that must be resolutely eliminated, as parts of the Chinese pattern, and thus it is absolute nonsense. He boils down the political connotation of the Chinese pattern to an "authority politics" that "thoroughly denies the modern democratic system",…

  3. ABSTRACT

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

Discussion on the Training Model of Developing Nursing Students' Humanistic Quality through College English Teaching. Li Hongfeng (Nursing College of Zhengzhou University, Zhengzhou, Henan, 450052). Read and Write Periodical, vol. 8, No. 11, 27, 2011 (ISSN 1672-1578, in Chinese) Abstract: College English, as one of the most important public humanities courses in the curriculum system of collegiate nursing education, has a positive orientation toward human values and a strong function in humanistic quality education.

  4. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

STUDY ON NATURAL GAS ACCUMULATION IN LOW-RELIEF STRUCTURE RESERVOIRS AT THE MIDDLE PORTION OF THE CHARDJOU STEPS IN THE AMUDARYA BASIN. Abstract: This paper focuses on hydrocarbon accumulation in low-relief structures based on comprehensive studies of seismic data, regional stratigraphy, sedimentary facies, abundance of hydrocarbons and the petroleum geology of the Karakul territory in the Chardjou step of the Amudarya Basin. A reservoir geological model was established and potential zones of oil and gas accumulation were recommended by means of analysis of well data, seismic data and three existing low-relief gas deposits in the Karakul.

  5. Abstract

    Directory of Open Access Journals (Sweden)

    Maria Jose Carvalho de Souza Domingues

    2003-01-01

Full Text Available The practice of teaching today shows the necessity of teachers and students coming together to form a behavior that is different from the traditional model of teaching. The unity formed from various types of knowledge and the relation between theory and practice show themselves to be fundamental. Starting in 2002, and in search of this unity, a project that aimed to unify the disciplines taught in the second semester of the course in Administration was implemented. During the semester, a single work sought to relate the theories studied with the reality of an organization. Each professor evaluated the work from the point of view of his discipline, as well as the presentation, in general, of the group. It can be affirmed that seeking to bring together various types of knowledge necessarily requires rethinking the postures of teachers and students.

  6. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

STUDY ON THE FEATURES OF DIFFERENT REFLECTIONS IN THE TAHE OILFIELD PALEOCAVE RESERVOIR / Jianfeng Wang, Pei Jin, Xinhua Li et al. Northwest Oilfield Company of SINOPEC, Urumqi, Xinjiang, 830011 / Xinjiang ShiYou TianRan Qi, 2011, 7(3): 1-5. Abstract: In this paper, in the light of seismic migration sections and tectonic analysis data, the seismic echo styles of the Tahe oilfield paleocave reservoir in time migration are summed up and different reflection features are classified. In the meantime, classification criteria for quantized identification with a reservoir seismic echo model are established. With the methods above, the paleocave reservoir forecasting degree and well placement ratio are improved, and the risk of development drilling is reduced. High development efficiency in the Ordovician paleocave reservoir is achieved. Key Words: Tahe Oilfield; Paleocave Reservoir; Reflection features; Quantized identification; Reservoir forecasting

  7. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Research on the Theory and Standard of Peasants' Life-Cycle Pension Compensation. Mu Huaizhong, Shen Yi (2) The difficulty of achieving full coverage in the pension system lies with rural farmers. In this paper, we put forward a "dual agricultural welfare difference" theory and apply it to issues regarding peasants' life-cycle pension compensation. Taking the differential between equilibrium and biased agricultural incomes as the key indicator, we build mathematical models of the "dual agricultural welfare balance" and measure its size from 1953 to 2009. Our finding shows that China's "dual agricultural welfare difference" has fluctuated between 0.4 and 0.6. Based on life-cycle characteristics, such as the natural life cycle and the policy and institutional life cycle, our suggestion is to compensate peasants' primary pension with a balance of the "dual agriculture welfare difference" and other countermeasures.

  8. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Analysis of the Importance of the Spirit of Quest in the Three Gorges (TG) Water Culture. Yang Bin, Fan Lingling (1) Abstract: The awareness of quest constitutes the essence and demonstrates the basic characteristic of the water culture in the Three Gorges area. The legend of King Yu controlling floods by dredging the Three Gorges marked the budding of this spirit. Qu Yuan fully demonstrated the essence of the spirit in the frequently quoted sayings "yet high and low I will search with my will unbending" and "I will not regret a thousand deaths to die". The inscription of White Crane Ridge displayed the ancient fashion of the spirit. The construction of the Three Gorges Project interpreted the exploring spirit of the Three Gorges water culture. The local higher education institution, with its motto "quest", can be taken as a model in carrying forward the spirit of "constant exploration". Key words: awareness of quest; the Three Gorges; water culture

  9. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Fold distribution, offset distribution and azimuth distribution in a bin have direct effects on geometry attributes in 3D seismic survey design. If two adjacent bins have the same fold but different offsets, this non-uniform offset distribution introduces some stack amplitude diversity between the adjacent bins. At present, the 3D geometry attribute uniformity of most analytical methods is expressed by qualitative analysis charts. We introduce in this paper a quantitative uniformity analysis method for offset distribution based on the average value, the square deviation and a weighting factor. The paper analyses the effects of different 3D geometry parameters on offset distribution uniformity using the proposed quantitative analysis method. Furthermore, the paper analyses the effects of different 3D geometry parameters on seismic stack amplitude or frequency uniformity by seismic wave modeling. The results show that offset distribution uniformity is in good agreement with seismic stack amplitude or frequency uniformity. Therefore this improved method can be considered a useful tool for analyzing and evaluating 3D geometry attribute uniformity.
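
A minimal sketch of the kind of quantitative uniformity measure described (average value plus square deviation over offset classes) is given below; the RMS-deviation scoring and the sample bins are illustrative guesses, not the paper's exact formula or weighting factors.

```python
# Score a bin's offset distribution by its deviation from uniform spacing.
import numpy as np

def offset_uniformity(offsets, max_offset, n_classes=10):
    """Lower score = more uniform offset distribution within a bin."""
    counts, _ = np.histogram(offsets, bins=n_classes, range=(0, max_offset))
    ideal = len(offsets) / n_classes                 # perfectly uniform count
    return np.sqrt(np.mean((counts - ideal) ** 2))   # RMS deviation

# Two hypothetical bins with equal fold but different offset spreads.
bin_a = np.array([100, 600, 1100, 1700, 2300, 2900, 3400, 3900, 4600, 5100])
bin_b = np.array([100, 150, 300, 400, 2800, 2900, 4700, 4800, 4900, 5100])
print(offset_uniformity(bin_a, 5200), offset_uniformity(bin_b, 5200))
```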

  10. Performance modeling of automated manufacturing systems

    Science.gov (United States)

    Viswanadham, N.; Narahari, Y.

    A unified and systematic treatment is presented of modeling methodologies and analysis techniques for performance evaluation of automated manufacturing systems. The book is the first treatment of the mathematical modeling of manufacturing systems. Automated manufacturing systems are surveyed and three principal analytical modeling paradigms are discussed: Markov chains, queues and queueing networks, and Petri nets.

  11. Hidden Markov Model Based Automated Fault Localization for Integration Testing

    OpenAIRE

    Ge, Ning; NAKAJIMA, SHIN; Pantel, Marc

    2013-01-01

Integration testing is an expensive activity in software testing, especially for fault localization in complex systems. Model-based diagnosis (MBD) provides various benefits in terms of scalability and robustness. In this work, we propose a novel MBD approach for automated fault localization in integration testing. Our method is based on the Hidden Markov Model (HMM), which serves as an abstraction of a system's components to simulate component behaviour. The core of this metho...
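
To illustrate how an HMM abstraction can score component behaviour, here is a self-contained forward-algorithm sketch. The transition and emission matrices, the normal/faulty state labels, and the use of log-likelihood as a suspicion score are illustrative assumptions, not the authors' models.

```python
# Forward algorithm: likelihood of an observed interaction trace under a
# two-state component model. All numbers are invented for illustration.
import numpy as np

states = ["normal", "faulty"]
A = np.array([[0.95, 0.05],    # state transition probabilities
              [0.10, 0.90]])
B = np.array([[0.9, 0.1],      # emission: P(ok), P(error) per state
              [0.3, 0.7]])
pi = np.array([0.99, 0.01])    # initial state distribution

def log_likelihood(obs):
    """obs: sequence of 0 (ok) / 1 (error) observations."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return np.log(alpha.sum())

trace = [0, 0, 1, 1, 1, 0, 1]
# A low likelihood under the "mostly normal" model would flag this
# component as a fault-localization suspect.
print(log_likelihood(trace))
```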

  12. AN ABSTRACT RELATIONAL MODEL AND NATURAL JOIN FUNCTORS

    OpenAIRE

    Kato, Akihiko

    1983-01-01

A meta-model for database models, called an abstract relational model, which is obtained by a categorical abstraction of the relational model, is proposed. This meta-model represents various database models, e.g. relational, network and hierarchical models, as special cases. It is proved that a natural join is the right adjoint of a decomposition in the relational model. On the other hand, in our abstract relational model a natural join is defined as the right adjoint of a decomposition. A sufficient ...

  13. A Model-Driven Parser Generator, from Abstract Syntax Trees to Abstract Syntax Graphs

    CERN Document Server

    Quesada, Luis; Cubero, Juan-Carlos

    2012-01-01

Model-based parser generators decouple language specification from language processing. The model-driven approach avoids the limitations that conventional parser generators impose on the language designer. Conventional tools require the designed language grammar to conform to the specific kind of grammar supported by the particular parser generator (LL and LR parser generators being the most common). Model-driven parser generators, like ModelCC, do not require a grammar specification, since the grammar can be automatically derived from the language model and, if needed, adapted to conform to the requirements of the given kind of parser, all of this without interfering with the conceptual design of the language and its associated applications. Moreover, model-driven tools such as ModelCC are able to automatically resolve references between language elements, hence producing abstract syntax graphs instead of abstract syntax trees as the result of the parsing process. Such graphs are not confined to directed ac...

  14. Abstract Stobjs and Their Application to ISA Modeling

    Directory of Open Access Journals (Sweden)

    Shilpi Goel

    2013-04-01

Full Text Available We introduce a new ACL2 feature, the abstract stobj, and show how to apply it to modeling the instruction set architecture of a microprocessor. Benefits of abstract stobjs over traditional ("concrete") stobjs can include faster execution, support for symbolic simulation, more efficient reasoning, and resilience of proof developments under modeling optimization.

  15. An Abstraction-Based Data Model for Information Retrieval

    Science.gov (United States)

    McAllister, Richard A.; Angryk, Rafal A.

Language ontologies provide an avenue for automated lexical analysis that may be used to supplement existing information retrieval methods. This paper presents a method of information retrieval that takes advantage of WordNet, a lexical database, to generate paths of abstraction, and uses them as the basis for an inverted index structure to be used in the retrieval of documents from an indexed corpus. We present this method as an entree to a line of research on using ontologies to perform word-sense disambiguation and improve the precision of existing information retrieval techniques.
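
A small sketch of extracting "paths of abstraction" from WordNet via NLTK's interface follows (requires nltk and a downloaded WordNet corpus). Building the inverted index over these paths, which is the paper's contribution, is omitted here.

```python
# Generate hypernym chains ("paths of abstraction") for a word via WordNet.
# Requires: pip install nltk; then nltk.download("wordnet") once.
from nltk.corpus import wordnet as wn

def abstraction_paths(word):
    """Return hypernym chains, root to leaf, for each noun sense of `word`."""
    paths = []
    for synset in wn.synsets(word, pos=wn.NOUN):
        for path in synset.hypernym_paths():
            paths.append([s.name() for s in path])
    return paths

for p in abstraction_paths("telescope")[:1]:
    print(" -> ".join(p))
# e.g. entity.n.01 -> physical_entity.n.01 -> ... -> telescope.n.01
```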

  16. Demo abstract: Flexhouse-2-an open source building automation platform with a focus on flexible control

    DEFF Research Database (Denmark)

    Gehrke, Oliver; Kosek, Anna Magdalena; Svendsen, Mathias

    2014-01-01

…in or on buildings, and most of these resources will not be communicating directly with the smart grid; in order to allow internal coordination and optimization of resource use at building level, a building automation platform will act as an intermediary. Such a platform must be easy to adapt to the multitude… We present Flexhouse-2, an open-source implementation of a building automation system which has been designed with a strong focus on enabling the integration of the building into a smart power system and dedicated support for the requirements of an R&D environment. We will demonstrate the need for such a platform and discuss…

  17. Formal Modeling for Information Appliance Using Abstract MVC Architecture

    OpenAIRE

    Arichika, Yuji; Araki, Keijiro

    2004-01-01

In information appliance development, it is important to separate core functions from display functions, because information appliances have various user interfaces and their display functions change frequently. Using the MVC architecture is one way to divide display functions and core functions. But MVC is an implementation architecture, and there are some gaps in obtaining an abstract model from it. On the other hand, it is known that formal methods are useful for constructing abstract models. Therefore we intend t...

  18. Coupling Radar Rainfall to Hydrological Models for Water Abstraction Management

    Science.gov (United States)

    Asfaw, Alemayehu; Shucksmith, James; Smith, Andrea; MacDonald, Ken

    2015-04-01

The impacts of climate change and growing water use are likely to put considerable pressure on water resources and the environment. In the UK, a reform to surface water abstraction policy has recently been proposed which aims to increase the efficiency of using available water resources whilst minimising impacts on the aquatic environment. Key aspects of this reform include the consideration of dynamic rather than static abstraction licensing, as well as the introduction of water trading concepts. Dynamic licensing will permit varying levels of abstraction dependent on environmental conditions (i.e. river flow and quality). The practical implementation of an effective dynamic abstraction strategy requires suitable flow forecasting techniques to inform abstraction asset management. Potentially, the predicted availability of water resources within a catchment can be coupled to predicted demand and current storage to inform a cost-effective water resource management strategy which minimises environmental impacts. The aim of this work is to use a historical analysis of a UK case study catchment to compare potential water resource availability under a modelled dynamic abstraction scenario informed by a flow forecasting model against observed abstraction under a conventional abstraction regime. The work also demonstrates the impacts of modelling uncertainties on the accuracy of predicted water availability over a range of forecast lead times. The study utilised a conceptual rainfall-runoff model, PDM (Probability-Distributed Model, developed by the Centre for Ecology & Hydrology), set up in the Dove River catchment (UK) using 1 km2 resolution radar rainfall as input and 15-minute resolution gauged flow data for calibration and validation. Data assimilation procedures are implemented to improve flow predictions using observed flow data. Uncertainties in the radar rainfall data used in the model are quantified using an artificial statistical error model described by a Gaussian distribution and…
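
As a toy illustration of the conceptual rainfall-runoff modelling involved, the sketch below steps a single linear reservoir driven by rainfall. PDM itself uses a probability-distributed soil-moisture store and routing, which this one-bucket model does not reproduce; all parameters and data are invented.

```python
# A toy linear-reservoir rainfall-runoff step: rainfall fills a store, and a
# fixed fraction of storage is released as flow each step.
def simulate(rainfall_mm, k=0.2, storage=10.0):
    """k: fraction of storage released per step; returns flow per step."""
    flows = []
    for r in rainfall_mm:
        storage += r          # rainfall fills the store
        q = k * storage       # linear release
        storage -= q
        flows.append(q)
    return flows

print(simulate([0, 5, 12, 3, 0, 0, 8]))
```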

  19. Task-focused modeling in automated agriculture

    Science.gov (United States)

    Vriesenga, Mark R.; Peleg, K.; Sklansky, Jack

    1993-01-01

    Machine vision systems analyze image data to carry out automation tasks. Our interest is in machine vision systems that rely on models to achieve their designed task. When the model is interrogated from an a priori menu of questions, the model need not be complete. Instead, the machine vision system can use a partial model that contains a large amount of information in regions of interest and less information elsewhere. We propose an adaptive modeling scheme for machine vision, called task-focused modeling, which constructs a model having just sufficient detail to carry out the specified task. The model is detailed in regions of interest to the task and is less detailed elsewhere. This focusing effect saves time and reduces the computational effort expended by the machine vision system. We illustrate task-focused modeling by an example involving real-time micropropagation of plants in automated agriculture.

  20. Efficient family-based model checking via variability abstractions

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Al-Sibahi, Ahmad Salim; Brabrand, Claus

    2016-01-01

Many software systems are variational: they can be configured to meet diverse sets of requirements. They can produce a (potentially huge) number of related systems, known as products or variants, by systematically reusing common parts. For variational models (variational systems or families of related systems), specialized family-based model checking algorithms allow efficient verification of multiple variants, simultaneously, in a single run. These algorithms, implemented in a tool Snip, scale much better than "the brute force" approach, where all individual systems are verified, one by one, using the standard version of (single-system) Spin. The variability abstractions are first defined as Galois connections on semantic domains. We then show how to use them for defining abstract family-based model checking, where a variability model is replaced with an abstract version of it…
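
For reference, a variability abstraction defined as a Galois connection satisfies the standard condition below (generic notation, not necessarily the paper's):

```latex
% The standard Galois-connection condition between concrete and abstract
% ordered domains; generic notation.
\[
\alpha : (C, \sqsubseteq_C) \to (A, \sqsubseteq_A), \quad
\gamma : (A, \sqsubseteq_A) \to (C, \sqsubseteq_C), \qquad
\alpha(c) \sqsubseteq_A a \;\Longleftrightarrow\; c \sqsubseteq_C \gamma(a).
\]
```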

  1. Automated 3D model generation for urban environments [online

    OpenAIRE

    Frueh, Christian

    2007-01-01

Abstract In this thesis, we present a fast approach to the automated generation of textured 3D city models with both high detail at ground level and complete coverage for a bird's-eye view. A ground-based facade model is acquired by driving a vehicle equipped with two 2D laser scanners and a digital camera under normal traffic conditions on public roads. One scanner is mounted horizontally and is used to determine the approximate component of relative motion along the move...

  2. Deterministic binary vectors for efficient automated indexing of MEDLINE/PubMed abstracts.

    Science.gov (United States)

    Wahle, Manuel; Widdows, Dominic; Herskovic, Jorge R; Bernstam, Elmer V; Cohen, Trevor

    2012-01-01

    The need to maintain accessibility of the biomedical literature has led to development of methods to assist human indexers by recommending index terms for newly encountered articles. Given the rapid expansion of this literature, it is essential that these methods be scalable. Document vector representations are commonly used for automated indexing, and Random Indexing (RI) provides the means to generate them efficiently. However, RI is difficult to implement in real-world indexing systems, as (1) efficient nearest-neighbor search requires retaining all document vectors in RAM, and (2) it is necessary to maintain a store of randomly generated term vectors to index future documents. Motivated by these concerns, this paper documents the development and evaluation of a deterministic binary variant of RI. The increased capacity demonstrated by binary vectors has implications for information retrieval, and the elimination of the need to retain term vectors facilitates distributed implementations, enhancing the scalability of RI.
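
The sketch below shows one way such deterministic binary vectors could be generated: each term's sparse binary vector is derived from a hash of the term itself, so it can be regenerated on demand instead of being stored. The dimensions, bit count, and hashing scheme are illustrative assumptions, not the paper's exact construction.

```python
# Deterministic binary term vectors: seed an RNG from a hash of the term,
# so the same vector is regenerated on every run with no term-vector store.
import hashlib
import numpy as np

DIM, NBITS = 4096, 48   # vector length and number of set bits (assumed)

def term_vector(term):
    """Derive a sparse binary vector deterministically from the term."""
    seed = int.from_bytes(hashlib.sha256(term.encode()).digest()[:8], "big")
    rng = np.random.default_rng(seed)
    vec = np.zeros(DIM, dtype=np.uint8)
    vec[rng.choice(DIM, size=NBITS, replace=False)] = 1
    return vec

# Identical on every run and machine -- no stored term vectors needed.
assert (term_vector("lymphoma") == term_vector("lymphoma")).all()
doc_vector = term_vector("lymphoma") | term_vector("immunohistochemistry")
```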

  3. Model checking abstract state machines with answer set programming

    OpenAIRE

    2006-01-01

    Answer Set Programming (ASP) is a logic programming paradigm that has been shown as a useful tool in various application areas due to its expressive modelling language. These application areas include Bourided Model Checking (BMC). BMC is a verification technique that is recognized for its strong ability of finding errors in computer systems. To apply BMC, a system needs to be modelled in a formal specification language, such as the widely used formalism of Abstract State Machines (ASMs). In ...

  4. Automated extraction of precise protein expression patterns in lymphoma by text mining abstracts of immunohistochemical studies

    Directory of Open Access Journals (Sweden)

    Jia-Fu Chang

    2013-01-01

    Full Text Available Background: In general, surgical pathology reviews report protein expression by tumors in a semi-quantitative manner, that is, -, -/+, +/-, +. At the same time, the experimental pathology literature provides multiple examples of precise expression levels determined by immunohistochemical (IHC) tissue examination of populations of tumors. Natural language processing (NLP) techniques enable the automated extraction of such information through text mining. We propose establishing a database linking quantitative protein expression levels with specific tumor classifications through NLP. Materials and Methods: Our method takes advantage of typical forms of representing experimental findings in terms of percentages of protein expression manifest by the tumor population under study. Characteristically, percentages are represented straightforwardly with the % symbol or as the number of positive findings of the total population. Such text is readily recognized using regular expressions and templates permitting extraction of sentences containing these forms for further analysis using grammatical structures and rule-based algorithms. Results: Our pilot study is limited to the extraction of such information related to lymphomas. We achieved a satisfactory level of retrieval as reflected in scores of 69.91% precision and 57.25% recall with an F-score of 62.95%. In addition, we demonstrate the utility of a web-based curation tool for confirming and correcting our findings. Conclusions: The experimental pathology literature represents a rich source of pathobiological information, which has been relatively underutilized. There has been a combinatorial explosion of knowledge within the pathology domain as represented by increasing numbers of immunophenotypes and disease subclassifications. NLP techniques support practical text mining techniques for extracting this knowledge and organizing it in forms appropriate for pathology decision support systems.
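
    The two surface forms the abstract singles out, a bare percentage and an "n of m" count, are easy to capture with regular expressions. A minimal sketch of that first extraction stage (patterns and sentences are invented; the paper's grammatical analysis and rule-based post-processing are not reproduced):

      import re

      PERCENT = re.compile(r"(\d{1,3}(?:\.\d+)?)\s*%")
      RATIO = re.compile(r"(\d+)\s+of\s+(\d+)")

      def extract_levels(sentence):
          """Return expression levels (in %) found in one sentence."""
          hits = [float(m.group(1)) for m in PERCENT.finditer(sentence)]
          hits += [100.0 * int(m.group(1)) / int(m.group(2))
                   for m in RATIO.finditer(sentence) if int(m.group(2)) > 0]
          return hits

      print(extract_levels("CD30 was positive in 85% of cases."))          # [85.0]
      print(extract_levels("Expression was seen in 17 of 24 lymphomas."))  # [70.83...]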

  5. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased number of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  6. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.

  7. Particle Tracking Model and Abstraction of Transport Processes

    Energy Technology Data Exchange (ETDEWEB)

    B. Robinson

    2004-10-21

    The purpose of this report is to document the abstraction model being used in total system performance assessment (TSPA) model calculations for radionuclide transport in the unsaturated zone (UZ). The UZ transport abstraction model uses the particle-tracking method that is incorporated into the finite element heat and mass model (FEHM) computer code (Zyvoloski et al. 1997 [DIRS 100615]) to simulate radionuclide transport in the UZ. This report outlines the assumptions, design, and testing of a model for calculating radionuclide transport in the UZ at Yucca Mountain. In addition, methods for determining and inputting transport parameters are outlined for use in the TSPA for license application (LA) analyses. Process-level transport model calculations are documented in another report for the UZ (BSC 2004 [DIRS 164500]). Three-dimensional, dual-permeability flow fields generated to characterize UZ flow (documented by BSC 2004 [DIRS 169861]; DTN: LB03023DSSCP9I.001 [DIRS 163044]) are converted to make them compatible with the FEHM code for use in this abstraction model. This report establishes the numerical method and demonstrates the use of the model that is intended to represent UZ transport in the TSPA-LA. Capability of the UZ barrier for retarding the transport is demonstrated in this report, and by the underlying process model (BSC 2004 [DIRS 164500]). The technical scope, content, and management of this report are described in the planning document "Technical Work Plan for: Unsaturated Zone Transport Model Report Integration" (BSC 2004 [DIRS 171282]). Deviations from the technical work plan (TWP) are noted within the text of this report, as appropriate. The latest version of this document is being prepared principally to correct parameter values found to be in error due to transcription errors, changes in source data that were not captured in the report, calculation errors, and errors in interpretation of source data.
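
    FEHM's particle-tracking machinery (dual permeability, sorption, matrix diffusion) is far richer than can be shown here, but the underlying random-walk idea is simple: advect each particle and add a dispersive step each time increment. A one-dimensional toy sketch with invented parameters, not the TSPA model itself:

      import numpy as np

      # x <- x + v*dt + sqrt(2*D*dt)*xi, xi ~ N(0,1): advection + dispersion
      rng = np.random.default_rng(0)
      v, D, dt = 1.0, 0.1, 0.01       # velocity, dispersion coeff., time step
      L, n_particles = 5.0, 10_000    # breakthrough plane, ensemble size

      x = np.zeros(n_particles)
      arrival = np.full(n_particles, np.nan)
      t = 0.0
      while np.isnan(arrival).any() and t < 50.0:
          t += dt
          x += v * dt + np.sqrt(2 * D * dt) * rng.standard_normal(n_particles)
          newly = np.isnan(arrival) & (x >= L)
          arrival[newly] = t          # record first-passage (breakthrough) times

      print("median travel time:", np.nanmedian(arrival))  # ~ L/v = 5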

  8. Situation models, mental simulations, and abstract concepts in discourse comprehension.

    Science.gov (United States)

    Zwaan, Rolf A

    2016-08-01

    This article sets out to examine the role of symbolic and sensorimotor representations in discourse comprehension. It starts out with a review of the literature on situation models, showing how mental representations are constrained by linguistic and situational factors. These ideas are then extended to more explicitly include sensorimotor representations. Following Zwaan and Madden (2005), the author argues that sensorimotor and symbolic representations mutually constrain each other in discourse comprehension. These ideas are then developed further to propose two roles for abstract concepts in discourse comprehension. It is argued that they serve as pointers in memory, used (1) cataphorically to integrate upcoming information into a sensorimotor simulation, or (2) anaphorically to integrate previously presented information into a sensorimotor simulation. In either case, the sensorimotor representation is a specific instantiation of the abstract concept.

  9. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    Science.gov (United States)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

    A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two person flight crew and the ATC operators generating and responding to clearances aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management and to predict required changes in procedure both air and ground in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  10. Automated Environment Generation for Software Model Checking

    Science.gov (United States)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.

  11. ALC: automated reduction of rule-based models

    Directory of Open Access Journals (Sweden)

    Gilles Ernst

    2008-10-01

    Full Text Available Abstract Background Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction since the association of a few proteins can give rise to an enormous amount of feasible protein complexes. The layer-based approach is an approximative, but accurate method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is highly reduced and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems not accessible previously. Results ALC (Automated Layer Construction) is a computer program that highly simplifies the building of reduced modular models, according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML) as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website. Conclusion ALC allows for a simple rule-based generation of layer-based reduced models. The model files are given in different formats as ready-to-run simulation files.

  12. The Conceptual Integration Modeling Framework: Abstracting from the Multidimensional Model

    CERN Document Server

    Rizzolo, Flavio; Pottinger, Rachel; Wong, Kwok

    2010-01-01

    Data warehouses are overwhelmingly built through a bottom-up process, which starts with the identification of sources, continues with the extraction and transformation of data from these sources, and then loads the data into a set of data marts according to desired multidimensional relational schemas. End user business intelligence tools are added on top of the materialized multidimensional schemas to drive decision making in an organization. Unfortunately, this bottom-up approach is costly both in terms of the skilled users needed and the sheer size of the warehouses. This paper proposes a top-down framework in which data warehousing is driven by a conceptual model. The framework offers both design time and run time environments. At design time, a business user first uses the conceptual modeling language as a multidimensional object model to specify what business information is needed; then she maps the conceptual model to a pre-existing logical multidimensional representation. At run time, a system will tra...

  13. Automated quantitative gait analysis in animal models of movement disorders

    Directory of Open Access Journals (Sweden)

    Vandeputte Caroline

    2010-08-01

    Full Text Available Abstract Background Accurate and reproducible behavioral tests in animal models are of major importance in the development and evaluation of new therapies for central nervous system disease. In this study we investigated for the first time gait parameters of rat models for Parkinson's disease (PD), Huntington's disease (HD), and stroke using the Catwalk method, a novel automated gait analysis test. Static and dynamic gait parameters were measured in all animal models, and these data were compared to readouts of established behavioral tests, such as the cylinder test in the PD and stroke rats and the rotarod tests for the HD group. Results Hemiparkinsonian rats were generated by unilateral injection of the neurotoxin 6-hydroxydopamine in the striatum or in the medial forebrain bundle. For Huntington's disease, a transgenic rat model expressing a truncated huntingtin fragment with multiple CAG repeats was used. Thirdly, a stroke model was generated by a photothrombotic induced infarct in the right sensorimotor cortex. We found that multiple gait parameters were significantly altered in all three disease models compared to their respective controls. Behavioural deficits could be efficiently measured using the cylinder test in the PD and stroke animals, and in the case of the PD model, the deficits in gait essentially confirmed results obtained by the cylinder test. However, in the HD model and the stroke model the Catwalk analysis proved more sensitive than the rotarod test and also added new and more detailed information on specific gait parameters. Conclusion The automated quantitative gait analysis test may be a useful tool to study both motor impairment and recovery associated with various neurological motor disorders.

  14. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    Usage of automatic systems in airliners has increased fuel efficiency, added extra capabilities, and enhanced safety and reliability, as well as providing improved passenger comfort, since their introduction in the late 80s. However, original automation benefits, including reduced flight crew workload, human errors, or training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed, and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprised of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand and training requirements along with their interactions. Besides flight crew deficiencies, automation system
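
    The FLAP network itself was built in Hugin from SME-elicited probabilities, which are not reproduced here. To show only the mechanics of a BBN query, here is a toy two-parent fragment with made-up numbers (node names and probabilities are illustrative assumptions, not FLAP data):

      from itertools import product

      # Complacency (C) and SkillDegradation (S) -> AutomationAnomaly (A)
      P_C = {True: 0.2, False: 0.8}
      P_S = {True: 0.3, False: 0.7}
      P_A = {  # P(A=True | C, S), hypothetical values
          (True, True): 0.40, (True, False): 0.15,
          (False, True): 0.10, (False, False): 0.02,
      }

      def p_anomaly():
          """Marginal P(A=True) by enumerating parent states."""
          return sum(P_C[c] * P_S[s] * P_A[(c, s)]
                     for c, s in product((True, False), repeat=2))

      def p_complacency_given_anomaly():
          """Diagnostic query P(C=True | A=True) via Bayes' rule."""
          joint = sum(P_C[True] * P_S[s] * P_A[(True, s)] for s in (True, False))
          return joint / p_anomaly()

      print(round(p_anomaly(), 4))                    # 0.0802
      print(round(p_complacency_given_anomaly(), 4))  # 0.5611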

  15. Automated smoother for the numerical decoupling of dynamics models

    Directory of Open Access Journals (Sweden)

    Santos Helena

    2007-08-01

    Full Text Available Abstract Background Structure identification of dynamic models for complex biological systems is the cornerstone of their reverse engineering. Biochemical Systems Theory (BST) offers a particularly convenient solution because its parameters are kinetic-order coefficients which directly identify the topology of the underlying network of processes. We have previously proposed a numerical decoupling procedure that allows the identification of multivariate dynamic models of complex biological processes. While described here within the context of BST, this procedure has a general applicability to signal extraction. Our original implementation relied on artificial neural networks (ANN), which caused slight, undesirable bias during the smoothing of the time courses. As an alternative, we propose here an adaptation of the Whittaker's smoother and demonstrate its role within a robust, fully automated structure identification procedure. Results In this report we propose a robust, fully automated solution for signal extraction from time series, which is the prerequisite for the efficient reverse engineering of biological systems models. The Whittaker's smoother is reformulated within the context of information theory and extended by the development of adaptive signal segmentation to account for heterogeneous noise structures. The resulting procedure can be used on arbitrary time series with a nonstationary noise process; it is illustrated here with metabolic profiles obtained from in-vivo NMR experiments. The smoothed solution that is free of parametric bias permits differentiation, which is crucial for the numerical decoupling of systems of differential equations. Conclusion The method is applicable in signal extraction from time series with nonstationary noise structure and can be applied in the numerical decoupling of systems of differential equations into algebraic equations, and thus constitutes a rather general tool for the reverse engineering of
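
    The classical Whittaker smoother at the heart of this approach is a penalized least-squares fit and takes only a few lines; the information-theoretic reformulation and adaptive segmentation described above are not reproduced. A minimal sketch (the penalty weight and test signal are invented):

      import numpy as np

      def whittaker_smooth(y, lam, d=2):
          """Minimize |y - z|^2 + lam * |D z|^2, D = d-th order differences."""
          n = len(y)
          D = np.diff(np.eye(n), n=d, axis=0)   # (n-d) x n difference matrix
          return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

      t = np.linspace(0.0, 1.0, 200)
      clean = np.exp(-3.0 * t)                  # stand-in "metabolic profile"
      noisy = clean + 0.05 * np.random.default_rng(1).standard_normal(t.size)
      z = whittaker_smooth(noisy, lam=100.0)
      print(float(np.abs(z - clean).mean()))    # well below the noise level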

  16. Local Martingale and Pathwise Solutions for an Abstract Fluids Model

    OpenAIRE

    Debussche, Arnaud; Glatt-Holtz, Nathan; Temam, Roger

    2010-01-01

    We establish the existence and uniqueness of both local martingale and local pathwise solutions of an abstract nonlinear stochastic evolution system. The primary application of this abstract framework is to infer the local existence of strong, pathwise solutions to the 3D primitive equations of the oceans and atmosphere forced by a nonlinear multiplicative white noise. Instead of developing our results specifically for the 3D primitive equations we choose to develop them in a slightly abstrac...

  17. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model-based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  18. Modeling, Instrumentation, Automation, and Optimization of Water Resource Recovery Facilities.

    Science.gov (United States)

    Sweeney, Michael W; Kabouris, John C

    2016-10-01

    A review of the literature published in 2015 on topics relating to water resource recovery facilities (WRRF) in the areas of modeling, automation, measurement and sensors, and optimization of wastewater treatment (or water resource reclamation) is presented.

  19. Automated modelling of signal transduction networks

    Directory of Open Access Journals (Sweden)

    Aach John

    2002-11-01

    Full Text Available Abstract Background Intracellular signal transduction is achieved by networks of proteins and small molecules that transmit information from the cell surface to the nucleus, where they ultimately effect transcriptional changes. Understanding the mechanisms cells use to accomplish this important process requires a detailed molecular description of the networks involved. Results We have developed a computational approach for generating static models of signal transduction networks which utilizes protein-interaction maps generated from large-scale two-hybrid screens and expression profiles from DNA microarrays. Networks are determined entirely by integrating protein-protein interaction data with microarray expression data, without prior knowledge of any pathway intermediates. In effect, this is equivalent to extracting subnetworks of the protein interaction dataset whose members have the most correlated expression profiles. Conclusion We show that our technique accurately reconstructs MAP Kinase signaling networks in Saccharomyces cerevisiae. This approach should enhance our ability to model signaling networks and to discover new components of known networks. More generally, it provides a method for synthesizing molecular data, either individual transcript abundance measurements or pairwise protein interactions, into higher level structures, such as pathways and networks.
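
    The essence of the method, keeping protein-interaction edges whose endpoints show correlated expression, fits in a few lines. A toy sketch with synthetic data (gene names, profiles, and the correlation threshold are invented for illustration, not the paper's data):

      import numpy as np

      edges = [("STE7", "FUS3"), ("STE7", "KSS1"), ("FUS3", "DIG1"),
               ("ACT1", "COF1")]
      rng = np.random.default_rng(2)
      base = rng.standard_normal(20)            # shared expression pattern
      profiles = {g: base + 0.1 * rng.standard_normal(20)
                  for g in ("STE7", "FUS3", "KSS1", "DIG1")}  # co-regulated
      profiles["ACT1"] = rng.standard_normal(20)              # unrelated pair
      profiles["COF1"] = rng.standard_normal(20)

      def correlated_subnetwork(edges, profiles, r_min=0.7):
          """Keep interaction edges whose endpoints co-express."""
          return [(a, b) for a, b in edges
                  if np.corrcoef(profiles[a], profiles[b])[0, 1] >= r_min]

      # -> the three co-regulated edges survive; ACT1-COF1 is dropped
      print(correlated_subnetwork(edges, profiles))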

  20. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM, and the concept of unifying an abstract syntax tree with the ability for isolated extensions, is described. The tool support includes a connection to UML and a test automation principle based on traces written as a kind of regular expressions.

  1. Automation Marketplace 2010: New Models, Core Systems

    Science.gov (United States)

    Breeding, Marshall

    2010-01-01

    In a year when a difficult economy presented fewer opportunities for immediate gains, the major industry players have defined their business strategies with fundamentally different concepts of library automation. This is no longer an industry where companies compete on the basis of the best or the most features in similar products but one where…

  2. Abstraction and Model Checking in the PEPA Plug-in for Eclipse

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew

    2010-01-01

    lead to very large Markov chains. One way of analysing such models is to use abstraction - constructing a smaller model that bounds the properties of the original. We present an extension to the PEPA plug-in for Eclipse that enables abstracting and model checking of PEPA models. This implements two new...

  3. Mathematical models in marketing a collection of abstracts

    CERN Document Server

    Funke, Ursula H

    1976-01-01

    Mathematical models can be classified in a number of ways, e.g., static and dynamic; deterministic and stochastic; linear and nonlinear; individual and aggregate; descriptive, predictive, and normative; according to the mathematical technique applied or according to the problem area in which they are used. In marketing, the level of sophistication of the mathematical models varies considerably, so that a number of models will be meaningful to a marketing specialist without an extensive mathematical background. To make it easier for the nontechnical user we have chosen to classify the models included in this collection according to the major marketing problem areas in which they are applied. Since the emphasis lies on mathematical models, we shall not as a rule present statistical models, flow chart models, computer models, or the empirical testing aspects of these theories. We have also excluded competitive bidding, inventory and transportation models since these areas do not form the core of the market...

  4. Using Abstraction and Model Checking to Detect Safety Violations in Requirements Specifications

    Science.gov (United States)

    1998-11-01

    into a corresponding scenario in the original state machine model. [Section 6.4, Combining Automatic Abstraction Methods with Other Methods:] After automatic abstraction methods, such as those described above, are used to produce a reduced state machine model, other abstraction methods, such as those described … generation of invariants [46], may further reduce the state space. Moreover, once the specification of the state machine model has been reduced

  5. Compositional Abstraction of PEPA Models for Transient Analysis

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew

    2010-01-01

    - or interval - Markov chains allow us to aggregate states in such a way as to safely bound transient probabilities of the original Markov chain. Whilst we can apply this technique directly to a PEPA model, it requires us to obtain the CTMC of the model, whose state space may be too large to construct...
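
    To make the bounding idea concrete, here is a discrete-time toy (not the CTMC/PEPA construction above): aggregating states means a trajectory's one-step absorption probability is only known to lie in an interval, and iterating the interval endpoints safely brackets the exact transient probability. All numbers are invented:

      import numpy as np

      # States 0,1 transient; state 2 absorbing ("goal").
      P = np.array([[0.70, 0.20, 0.10],
                    [0.30, 0.40, 0.30],
                    [0.00, 0.00, 1.00]])

      # Abstract block {0,1}: conditioned on not yet being absorbed, the
      # chain is some mixture over the block, so its one-step absorption
      # probability lies between the min and max of the concrete rows.
      p_lo, p_hi = P[:2, 2].min(), P[:2, 2].max()

      def absorbed_by(n_steps, p_step):
          q = 0.0
          for _ in range(n_steps):
              q += (1 - q) * p_step
          return q

      n = 10
      exact = (np.array([1.0, 0.0, 0.0]) @ np.linalg.matrix_power(P, n))[2]
      lo, hi = absorbed_by(n, p_lo), absorbed_by(n, p_hi)
      assert lo <= exact <= hi
      print(f"P(absorbed by step {n}) in [{lo:.3f}, {hi:.3f}], exact {exact:.3f}")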

  6. Modelling and simulation of superalloys. Book of abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Rogal, Jutta; Hammerschmidt, Thomas; Drautz, Ralf (eds.)

    2014-07-01

    Superalloys are multi-component materials with complex microstructures that offer unique properties for high-temperature applications. The complexity of the superalloy materials makes it particularly challenging to obtain fundamental insight into their behaviour from the atomic structure to turbine blades. Recent advances in modelling and simulation of superalloys contribute to a better understanding and prediction of materials properties and therefore offer guidance for the development of new alloys. This workshop will give an overview of recent progress in modelling and simulation of materials for superalloys, with a focus on single crystal Ni-base and Co-base alloys. Topics will include electronic structure methods, atomistic simulations, microstructure modelling and modelling of microstructural evolution, solidification and process simulation as well as the modelling of phase stability and thermodynamics.

  7. Particle Tracking Model and Abstraction of Transport Processes

    Energy Technology Data Exchange (ETDEWEB)

    B. Robinson

    2000-04-07

    The purpose of the transport methodology and component analysis is to provide the numerical methods for simulating radionuclide transport and model setup for transport in the unsaturated zone (UZ) site-scale model. The particle-tracking method of simulating radionuclide transport is incorporated into the FEHM computer code and the resulting changes in the FEHM code are to be submitted to the software configuration management system. This Analysis and Model Report (AMR) outlines the assumptions, design, and testing of a model for calculating radionuclide transport in the unsaturated zone at Yucca Mountain. In addition, methods for determining colloid-facilitated transport parameters are outlined for use in the Total System Performance Assessment (TSPA) analyses. Concurrently, process-level flow model calculations are being carried out in a PMR for the unsaturated zone. The computer code TOUGH2 is being used to generate three-dimensional, dual-permeability flow fields, which are supplied to the Performance Assessment group for subsequent transport simulations. These flow fields are converted to input files compatible with the FEHM code, which for this application simulates radionuclide transport using the particle-tracking algorithm outlined in this AMR. Therefore, this AMR establishes the numerical method and demonstrates the use of the model, but the specific breakthrough curves presented do not necessarily represent the behavior of the Yucca Mountain unsaturated zone.

  8. Automated Model Fit Method for Diesel Engine Control Development

    NARCIS (Netherlands)

    Seykens, X.; Willems, F.P.T.; Kuijpers, B.; Rietjens, C.

    2014-01-01

    This paper presents an automated fit for a control-oriented physics-based diesel engine combustion model. This method is based on the combination of a dedicated measurement procedure and structured approach to fit the required combustion model parameters. Only a data set is required that is consider

  9. Model-driven design, refinement and transformation of abstract interactions

    NARCIS (Netherlands)

    Almeida, João Paolo A.; Dijkman, Remco; Ferreira Pires, Luis; Quartel, Dick; Sinderen, van Marten

    2006-01-01

    In a model-driven design process the interaction between application parts can be described at various levels of platform-independence. At the lowest level of platform-independence, interaction is realized by interaction mechanisms provided by specific middleware platforms. At higher levels of platf

  10. Parmodel: a web server for automated comparative modeling of proteins.

    Science.gov (United States)

    Uchôa, Hugo Brandão; Jorge, Guilherme Eberhart; Freitas Da Silveira, Nelson José; Camera, João Carlos; Canduri, Fernanda; De Azevedo, Walter Filgueira

    2004-12-24

    Parmodel is a web server for automated comparative modeling and evaluation of protein structures. The aim of this tool is to help inexperienced users to perform modeling, assessment, visualization, and optimization of protein models as well as crystallographers to evaluate structures solved experimentally. It is subdivided in four modules: Parmodel Modeling, Parmodel Assessment, Parmodel Visualization, and Parmodel Optimization. The main module is the Parmodel Modeling that allows the building of several models for a same protein in a reduced time, through the distribution of modeling processes on a Beowulf cluster. Parmodel automates and integrates the main software used in comparative modeling, such as MODELLER, Whatcheck, Procheck, Raster3D, Molscript, and Gromacs. This web server is freely accessible at .

  11. Dynamics Model Abstraction Scheme Using Radial Basis Functions

    Directory of Open Access Journals (Sweden)

    Silvia Tolu

    2012-01-01

    Full Text Available This paper presents a control model for object manipulation. Properties of objects and environmental conditions influence the motor control and learning. System dynamics depend on an unobserved external context, for example, work load of a robot manipulator. The dynamics of a robot arm change as it manipulates objects with different physical properties, for example, the mass, shape, or mass distribution. We address active sensing strategies to acquire object dynamical models with a radial basis function neural network (RBF). Experiments are done using a real robot’s arm, and trajectory data are gathered during various trials manipulating different objects. Biped robots do not have high-force joint servos, and the control system can hardly compensate for all the inertia variation of the adjacent joints and disturbance torque in dynamic gait control. In order to achieve smoother control and lead to more reliable sensorimotor complexes, we evaluate and compare a sparse velocity-driven versus a dense position-driven control scheme.
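
    An RBF approximator of the kind used for these dynamics models is just Gaussian basis functions with a linear readout. A minimal one-dimensional sketch (the target function, centers, and width are invented; the paper's sensorimotor setup is not reproduced):

      import numpy as np

      def rbf_features(x, centers, width):
          """Gaussian radial basis features for 1-D inputs."""
          return np.exp(-((x[:, None] - centers[None, :]) ** 2)
                        / (2.0 * width ** 2))

      rng = np.random.default_rng(3)
      x = np.sort(rng.uniform(-np.pi, np.pi, 200))
      y = np.sin(x) + 0.05 * rng.standard_normal(x.size)  # stand-in dynamics

      centers = np.linspace(-np.pi, np.pi, 15)
      Phi = rbf_features(x, centers, width=0.5)
      w, *_ = np.linalg.lstsq(Phi, y, rcond=None)          # linear readout

      x_test = np.linspace(-np.pi, np.pi, 50)
      pred = rbf_features(x_test, centers, 0.5) @ w
      print(float(np.abs(pred - np.sin(x_test)).max()))    # small error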

  12. Automated particulate sampler field test model operations guide

    Energy Technology Data Exchange (ETDEWEB)

    Bowyer, S.M.; Miley, H.S.

    1996-10-01

    The Automated Particulate Sampler Field Test Model Operations Guide is a collection of documents which provides a complete picture of the Automated Particulate Sampler (APS) and the Field Test in which it was evaluated. The Pacific Northwest National Laboratory (PNNL) Automated Particulate Sampler was developed for the purpose of radionuclide particulate monitoring for use under the Comprehensive Test Ban Treaty (CTBT). Its design was directed by anticipated requirements of small size, low power consumption, low noise level, fully automatic operation, and most predominantly the sensitivity requirements of the Conference on Disarmament Working Paper 224 (CDWP224). This guide is intended to serve as both a reference document for the APS and to provide detailed instructions on how to operate the sampler. This document provides a complete description of the APS Field Test Model and all the activity related to its evaluation and progression.

  13. Hierarchical Model Predictive Control for Sustainable Building Automation

    Directory of Open Access Journals (Sweden)

    Barbara Mayer

    2017-02-01

    Full Text Available A hierarchical model predictive controller (HMPC) is proposed for flexible and sustainable building automation. The implications of a building automation system for sustainability are defined, and model predictive control is introduced as an ideal tool to cover all requirements. The HMPC is presented as a development suitable for the optimization of modern buildings, as well as retrofitting. The performance and flexibility of the HMPC are demonstrated by simulation studies of a modern office building, and the perfect interaction with future smart grids is shown.

  14. Improving automation standards via semantic modelling: Application to ISA88.

    Science.gov (United States)

    Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès

    2017-03-01

    Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software that rely on the efficient modelling of the addressed systems. The work presented here is part of the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check and improve the consistency of technical documents. The formalization of conceptual models and the subsequent writing of technical standards are simultaneously analyzed, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency checking method.

  15. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    Science.gov (United States)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC) models require manual selection of endmember (i.e. hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time efficient way of identifying endmember pixels for use in these models. The fully automated models were applied on over 100 cloud-free Landsat images with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92, Nash-Sutcliffe efficiency, NSE, ≥ 0.92, and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce time demands for applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.
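
    The paper's machine-learning and search procedure is not spelled out in the abstract, so the sketch below uses only a generic percentile heuristic to show the shape of the task: a cold pixel is well vegetated and cool, a hot pixel is bare and warm. All data and thresholds are synthetic stand-ins, not the authors' algorithm:

      import numpy as np

      def select_endmembers(lst, ndvi):
          """Pick hot/cold candidate pixels by NDVI/temperature percentiles."""
          cold_mask = ((ndvi >= np.percentile(ndvi, 95))
                       & (lst <= np.percentile(lst, 5)))
          hot_mask = ((ndvi <= np.percentile(ndvi, 5))
                      & (lst >= np.percentile(lst, 95)))
          cold = np.argwhere(cold_mask)
          hot = np.argwhere(hot_mask)
          cold_px = cold[np.argmin(lst[cold_mask])] if cold.size else None
          hot_px = hot[np.argmax(lst[hot_mask])] if hot.size else None
          return cold_px, hot_px

      rng = np.random.default_rng(4)
      ndvi = rng.random((100, 100))                                 # synthetic NDVI
      lst = 320.0 - 15.0 * ndvi + rng.standard_normal((100, 100))   # synthetic LST (K)
      print(select_endmembers(lst, ndvi))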

  16. Semi-Automated Design Space Exploration for Formal Modelling

    OpenAIRE

    Grov, Gudmund; Ireland, Andrew; Llano, Maria Teresa; Kovacs, Peter; Colton, Simon; Gow, Jeremy

    2016-01-01

    Refinement based formal methods allow the modelling of systems through incremental steps via abstraction. Discovering the right levels of abstraction, formulating correct and meaningful invariants, and analysing faulty models are some of the challenges faced when using this technique. Here, we propose Design Space Exploration, an approach that aims to assist a designer by automatically providing high-level modelling guidance in real-time. More specifically, through the combination of common p...

  17. An automated in vitro model for the evaluation of ultrasound modalities measuring myocardial deformation

    Directory of Open Access Journals (Sweden)

    Stigö Albin

    2010-09-01

    Full Text Available Abstract Background Echocardiography is the method of choice when one wishes to examine myocardial function. Qualitative assessment of the 2D grey scale images obtained is subjective, and objective methods are required. Speckle Tracking Ultrasound is an emerging technology, offering an objective means of quantifying left ventricular wall motion. However, before a new ultrasound technology can be adopted in the clinic, accuracy and reproducibility need to be investigated. Aim It was hypothesized that the collection of ultrasound sample data from an in vitro model could be automated. The aim was to optimize an in vitro model to allow for efficient collection of sample data. Material & Methods A tissue-mimicking phantom was made from water, gelatin powder, psyllium fibers and a preservative. Sonomicrometry crystals were molded into the phantom. The solid phantom was mounted in a stable stand and cyclically compressed. Peak strain was then measured by Speckle Tracking Ultrasound and sonomicrometry. Results We succeeded in automating the acquisition and analysis of sample data. Sample data was collected at a rate of 200 measurement pairs in 30 minutes. We found good agreement between Speckle Tracking Ultrasound and sonomicrometry in the in vitro model. Best agreement was 0.83 ± 0.70%. Worst agreement was -1.13 ± 6.46%. Conclusions It has been shown possible to automate a model that can be used for evaluating the in vitro accuracy and precision of ultrasound modalities measuring deformation. Sonomicrometry and Speckle Tracking Ultrasound had acceptable agreement.
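
    The "bias ± spread" agreement figures quoted above are the usual way paired method-comparison data are summarized; a generic Bland-Altman-style computation (not necessarily the authors' exact statistic, and with synthetic stand-in data) looks like this:

      import numpy as np

      def bland_altman(a, b):
          """Mean bias, SD of differences, and 95% limits of agreement."""
          diff = np.asarray(a) - np.asarray(b)
          bias, sd = diff.mean(), diff.std(ddof=1)
          return bias, sd, (bias - 1.96 * sd, bias + 1.96 * sd)

      # Synthetic paired peak-strain measurements (%), standing in for the
      # 200 measurement pairs collected from the phantom.
      rng = np.random.default_rng(5)
      truth = rng.uniform(5.0, 25.0, 200)
      speckle = truth + rng.normal(0.8, 1.5, 200)  # hypothetical offset + noise
      sono = truth + rng.normal(0.0, 0.5, 200)
      bias, sd, loa = bland_altman(speckle, sono)
      print(f"agreement: {bias:.2f} +/- {sd:.2f} %, LoA ({loa[0]:.2f}, {loa[1]:.2f})")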

  18. Automation of electroweak NLO corrections in general models

    Energy Technology Data Exchange (ETDEWEB)

    Lang, Jean-Nicolas [Universitaet Wuerzburg (Germany)

    2016-07-01

    I discuss the automated generation of scattering amplitudes in general quantum field theories at next-to-leading order in perturbation theory. The work is based on Recola, a highly efficient one-loop amplitude generator for the Standard Model, which I have extended so that it can deal with general quantum field theories. Internally, Recola computes off-shell currents, and for new models new rules for off-shell currents emerge, which are derived from the Feynman rules. My work relies on the UFO format, which can be obtained by a suitable model builder, e.g. FeynRules. I have developed tools to derive the necessary counterterm structures and to perform the renormalization within Recola in an automated way. I describe the procedure using the example of the two-Higgs-doublet model.

  19. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    [Report front matter and table-of-contents residue omitted. Authors: Leelinda P Dawson, John W Raby, and Jeffrey A Smith. Recoverable fragments: "Fig. 6: An example PSA log file, ps_auto_log, using DDA, one case-study date, 3 domains, 3 model runs"; "…a case-study date could be set for each run. This process was time-consuming when multiple configurations were required by the user. Also, each run…"]

  20. Model-Based approaches to Human-Automation Systems Design

    DEFF Research Database (Denmark)

    Jamieson, Greg A.; Andersson, Jonas; Bisantz, Ann

    2012-01-01

    Human-automation interaction in complex systems is common, yet design for this interaction is often conducted without explicit consideration of the role of the human operator. Fortunately, there are a number of modeling frameworks proposed for supporting this design activity. However, the frameworks are often adapted from other purposes, usually applied to a limited range of problems, sometimes not fully described in the open literature, and rarely critically reviewed in a manner acceptable to proponents and critics alike. The present paper introduces a panel session wherein these proponents…

  1. A Process Model of Trust in Automation: A Signal Detection Theory Based Approach

    Science.gov (United States)

    2014-01-01

    lead to trust in automation. We also discuss a simple process model, which helps us understand the results. Our experimental paradigm suggests that participants are agnostic to the automation's behavior; instead, they merely focus on alarm rate. A process model suggests this is the result of a simple reward structure and a non-explicit cost of trusting the automation.

  2. Automated photogrammetry for three-dimensional models of urban spaces

    Science.gov (United States)

    Leberl, Franz; Meixner, Philipp; Wendel, Andreas; Irschara, Arnold

    2012-02-01

    The location-aware Internet is inspiring intensive work addressing the automated assembly of three-dimensional models of urban spaces with their buildings, circulation spaces, vegetation, signs, even their above-ground and underground utility lines. Two-dimensional geographic information systems (GISs) and municipal utility information exist and can serve to guide the creation of models being built with aerial, sometimes satellite imagery, streetside images, indoor imaging, and alternatively with light detection and ranging systems (LiDARs) carried on airplanes, cars, or mounted on tripods. We review the results of current research to automate the information extraction from sensor data. We show that aerial photography at ground sampling distances (GSD) of 1 to 10 cm is well suited to provide geometry data about building facades and roofs, that streetside imagery at 0.5 to 2 cm is particularly interesting when it is collected within community photo collections (CPCs) by the general public, and that the transition to digital imaging has opened the no-cost option of highly overlapping images in support of a more complete and thus more economical automation. LiDAR-systems are a widely used source of three-dimensional data, but they deliver information not really superior to digital photography.

  3. Abstracting and reasoning over ship trajectories and web data with the Simple Event Model (SEM)

    NARCIS (Netherlands)

    W.R. van Hage; V. Malaisé; G.K.D. de Vries; A.Th. Schreiber; M.W. van Someren

    2012-01-01

    Bridging the gap between low-level features and semantics is a problem commonly acknowledged in the Multimedia community. Event modeling can fill this gap by representing knowledge about the data at different level of abstraction. In this paper we present the Simple Event Model (SEM) and its applica

  4. Abstract behavior types : a foundation model for components and their composition

    NARCIS (Netherlands)

    Arbab, F.

    2003-01-01

    The notion of Abstract Data Type (ADT) has served as a foundation model for structured and object oriented programming for some thirty years. The current trend in software engineering toward component based systems requires a foundation model as well. The most basic inherent property of an ADT, i.e.

  5. Mathematical models of magnetite desliming for automated quality control systems

    Science.gov (United States)

    Olevska, Yu.; Mishchenko, V.; Olevskyi, V.

    2016-10-01

    The aim of the study is to provide multifactor mathematical models suitable for use in automatic control systems for the desliming process. For this purpose we described the motion of a two-phase medium with regard to the shape of the desliming machine and the technological parameters of the enrichment process. We created a method for deriving the dependence of enrichment quality on the technological and design parameters. To automate the process, we constructed mathematical models to justify intensive technological modes and optimal design parameters for the desliming machine.

  6. Selected Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    The Three-Stage Interaction Model of Lexicon-Syntax Interface Liu Yuhong (80) The Three-Stage Interaction Model maintains that the interactions at the lexicon-syntax interface are divisible into three levels, namely, the interaction between lexical meaning and lexical grammar that determines syntactic items, the interaction among lexical items that determines syntactic structure, and the interaction between abstract syntactic structure (i.e., construction) and temporary syntactic combinations that determines and coerces grammaticality of the latter. The Three-Stage Interaction Model is hierarchical, complete, and bidirectional in language comprehension. It also testifies to the varying abstractness between grammar (syntax) and semantics, and between the five grammatical cases. According to this model, temporary syntactic combination is sanctioned by abstract syntactic structure; therefore the conventional linguistic significance of P600 is maintained without coining contradictory new terms.

  7. Abstraction for Epistemic Model Checking of Dining Cryptographers-based Protocols

    CERN Document Server

    Al-Bataineh, Omar I

    2010-01-01

    The paper describes an abstraction for protocols that are based on multiple rounds of Chaum's Dining Cryptographers protocol. It is proved that the abstraction preserves a rich class of specifications in the logic of knowledge, including specifications describing what an agent knows about other agents' knowledge. This result can be used to optimize model checking of Dining Cryptographers-based protocols, and applied within a methodology for knowledge-based program implementation and verification. Some case studies of such an application are given, for a protocol that uses the Dining Cryptographers protocol as a primitive in an anonymous broadcast system. Performance results are given for model checking knowledge-based specifications in the concrete and abstract models of this protocol, and some new conclusions about the protocol are derived.

  8. A model based message passing approach for flexible and scalable home automation controllers

    Energy Technology Data Exchange (ETDEWEB)

    Bienhaus, D. [INNIAS GmbH und Co. KG, Frankenberg (Germany); David, K.; Klein, N.; Kroll, D. [ComTec Kassel Univ., SE Kassel Univ. (Germany); Heerdegen, F.; Jubeh, R.; Zuendorf, A. [Kassel Univ. (Germany). FG Software Engineering; Hofmann, J. [BSC Computer GmbH, Allendorf (Germany)

    2012-07-01

    There is a large variety of home automation systems, most of which are proprietary systems from different vendors. In addition, the configuration and administration of home automation systems is frequently a very complex task, especially if more complex functionality is to be achieved. Therefore, an open model for home automation was developed that is especially designed for easy integration of various home automation systems. This solution also provides a simple modeling approach that is inspired by typical home automation components like switches, timers, etc. In addition, a model-based technology to achieve rich functionality and usability was implemented. (orig.)

  9. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology.

  10. Combining search space partition and abstraction for LTL model checking

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The state space explosion problem is still the key obstacle for applying model checking to systems of industrial size. Abstraction-based methods have been particularly successful in this regard. This paper presents an approach based on refinement of search space partition and abstraction, which combines these two techniques for reducing the complexity of model checking. The refinement depends on the representation of each portion of the search space. In particular, the search space can be refined stepwise to obtain a better reduction. As reported in the case study, the integration of search space partition and abstraction improves the efficiency of verification with respect to the requirement of memory and obtains a significant advantage over the use of each of them in isolation.

  11. First Look at Photometric Reduction via Mixed-Model Regression (Poster abstract)

    Science.gov (United States)

    Dose, E.

    2016-12-01

    (Abstract only) Mixed-model regression is proposed as a new approach to photometric reduction, especially for variable-star photometry in several filters. Mixed-model regression adds to normal multivariate regression certain "random effects": categorical-variable terms that model and extract specific systematic errors such as image-to-image zero-point fluctuations (cirrus effect) or even errors in comp-star catalog magnitudes.
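
    The "random effects" idea can be shown with a standard mixed-model fit: a per-image random intercept absorbs zero-point fluctuations while fixed effects recover systematic terms such as extinction. A sketch with synthetic data using statsmodels' MixedLM (the column names and the 0.15 mag/airmass extinction coefficient are invented; this is not the author's pipeline):

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(6)
      rows = []
      for img in range(12):                       # 12 images, 20 stars each
          zp = rng.normal(0.0, 0.03)              # cirrus-like zero-point shift
          airmass = rng.uniform(1.0, 2.0)
          for _ in range(20):
              cat_mag = rng.uniform(10.0, 14.0)   # comp-star catalog magnitude
              inst = cat_mag + 0.15 * airmass + zp + rng.normal(0.0, 0.01)
              rows.append(dict(image=f"img{img}", airmass=airmass,
                               resid=inst - cat_mag))
      df = pd.DataFrame(rows)

      # Random intercept per image models the zero-point term; the fixed
      # airmass coefficient recovers extinction (~0.15 mag per airmass).
      fit = smf.mixedlm("resid ~ airmass", df, groups=df["image"]).fit()
      print(fit.params["airmass"])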

  12. Geochemistry Model Abstraction and Sensitivity Studies for the 21 PWR CSNF Waste Package

    Energy Technology Data Exchange (ETDEWEB)

    P. Bernot; S. LeStrange; E. Thomas; K. Zarrabi; S. Arthur

    2002-10-29

    The CSNF geochemistry model abstraction, as directed by the TWP (BSC 2002b), was developed to provide regression analysis of EQ6 cases to obtain abstracted values of pH (and in some cases HCO₃⁻ concentration) for use in the Configuration Generator Model. The pH of the system is the controlling factor over U mineralization, CSNF degradation rate, and HCO₃⁻ concentration in solution. The abstraction encompasses a large variety of combinations for the degradation rates of materials. The "base case" used EQ6 simulations looking at differing steel/alloy corrosion rates, drip rates, and percent fuel exposure. Other values such as the pH/HCO₃⁻-dependent fuel corrosion rate and the corrosion rate of A516 were kept constant. Relationships were developed for pH as a function of these differing rates to be used in the calculation of total C and, subsequently, the fuel rate. An additional refinement to the abstraction was the addition of abstracted pH values for cases where there was limited O₂ for waste package corrosion and a flushing fluid other than J-13, which has been used in all EQ6 calculations up to this point. These abstractions also used EQ6 simulations with varying combinations of corrosion rates of materials to abstract the pH (and HCO₃⁻ in the case of the limited-O₂ cases) as a function of WP materials corrosion rates. The goodness of fit for most of the abstracted values was above an R² of 0.9. Those below this value occurred during the time at the very beginning of WP corrosion, when large variations in the system pH are observed. However, the significance of the F-statistic for all the abstractions showed that the variable relationships are significant. For the abstraction, an analysis of the minerals that may form the "sludge" in the waste package was also presented. This analysis indicates that a number of different iron and aluminum minerals may form in

  13. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes.The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  14. Reliability of travel times to groundwater abstraction wells: Application of the Netherlands Groundwater Model - LGM

    NARCIS (Netherlands)

    Kovar K; Leijnse A; Uffink G; Pastoors MJH; Mulschlegel JHC; Zaadnoordijk WJ; LDL; IMD; TNO/NITG; Haskoning

    2005-01-01

    A modelling approach was developed, incorporated in the finite-element method based program LGMLUC, making it possible to determine the reliability of travel times of groundwater flowing to groundwater abstraction sites. The reliability is seen here as a band (zone) around the expected travel-time i

  15. Complex Automated Negotiations Theories, Models, and Software Competitions

    CERN Document Server

    Zhang, Minjie; Robu, Valentin; Matsuo, Tokuro

    2013-01-01

    Complex Automated Negotiations are a widely studied, emerging area in the field of Autonomous Agents and Multi-Agent Systems. In general, automated negotiations can be complex, since there are many factors that characterize such negotiations. For this book, we solicited papers on all aspects of such complex automated negotiations, which are studied in the field of Autonomous Agents and Multi-Agent Systems. This book includes two parts: Part I: Agent-based Complex Automated Negotiations and Part II: Automated Negotiation Agents Competition. Each chapter in Part I is an extended version of an ACAN 2011 paper after peer review by three PC members. Part II covers ANAC 2011 (The Second Automated Negotiating Agents Competition), in which automated agents with different negotiation strategies, implemented by different developers, negotiate automatically in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of...

  16. Random many-particle systems: applications from biology, and propagation of chaos in abstract models

    CERN Document Server

    Wennberg, Bernt

    2011-01-01

    The paper discusses a family of Markov processes that represent many-particle systems, and their limiting behaviour when the number of particles goes to infinity. The first part concerns models of biological systems: a model for sympatric speciation, i.e. the process in which a genetically homogeneous population is split into two or more different species sharing the same habitat, and models for swarming animals. The second part of the paper deals with abstract many-particle systems, and methods for rigorously deriving mean field models.

  17. Software abstractions logic, language, and analysis

    CERN Document Server

    Jackson, Daniel

    2011-01-01

    In Software Abstractions Daniel Jackson introduces an approach to software design that draws on traditional formal methods but exploits automated tools to find flaws as early as possible. This approach--which Jackson calls "lightweight formal methods" or "agile modeling"--takes from formal specification the idea of a precise and expressive notation based on a tiny core of simple and robust concepts but replaces conventional analysis based on theorem proving with a fully automated analysis that gives designers immediate feedback. Jackson has developed Alloy, a language that captures the essence of software abstractions simply and succinctly, using a minimal toolkit of mathematical notions. This revised edition updates the text, examples, and appendixes to be fully compatible with the latest version of Alloy (Alloy 4). The designer can use automated analysis not only to correct errors but also to make models that are more precise and elegant. This approach, Jackson says, can rescue designers from "the tarpit of...

  18. Development of an automated core model for nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Mosteller, R.D.

    1998-12-31

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of this project was to develop an automated package of computer codes that can model the steady-state behavior of nuclear-reactor cores of various designs. As an added benefit, data produced for steady-state analysis also can be used as input to the TRAC transient-analysis code for subsequent safety analysis of the reactor at any point in its operating lifetime. The basic capability to perform steady-state reactor-core analysis already existed in the combination of the HELIOS lattice-physics code and the NESTLE advanced nodal code. In this project, the automated package was completed by (1) obtaining cross-section libraries for HELIOS, (2) validating HELIOS by comparing its predictions to results from critical experiments and from the MCNP Monte Carlo code, (3) validating NESTLE by comparing its predictions to results from numerical benchmarks and to measured data from operating reactors, and (4) developing a linkage code to transform HELIOS output into NESTLE input.

  19. Exploiting mid-range DNA patterns for sequence classification: binary abstraction Markov models

    Science.gov (United States)

    Shepard, Samuel S.; McSweeny, Andrew; Serpen, Gursel; Fedorov, Alexei

    2012-01-01

    Messenger RNA sequences possess specific nucleotide patterns distinguishing them from non-coding genomic sequences. In this study, we explore the utilization of modified Markov models to analyze sequences up to 44 bp, far beyond the 8-bp limit of conventional Markov models, for exon/intron discrimination. In order to analyze nucleotide sequences of this length, their information content is first reduced by conversion into shorter binary patterns via the application of numerous abstraction schemes. After the conversion of genomic sequences to binary strings, homogeneous Markov models trained on the binary sequences are used to discriminate between exons and introns. We term this approach the Binary Abstraction Markov Model (BAMM). High-quality abstraction schemes for exon/intron discrimination are selected using optimization algorithms on supercomputers. The best MM classifiers are then combined using support vector machines into a single classifier. With this approach, over 95% classification accuracy is achieved without taking reading frame into account. With further development, the BAMM approach can be applied to sequences lacking the genetic code such as ncRNAs and 5′-untranslated regions. PMID:22344692
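
    The following minimal sketch illustrates the BAMM idea under stated assumptions: a single purine/pyrimidine abstraction scheme (the paper optimizes over many schemes on supercomputers) and toy training sequences in place of real exon/intron sets.

```python
from collections import defaultdict
import math

# One possible abstraction scheme: purine (A/G) -> '1', pyrimidine (C/T) -> '0'.
# BAMM searches over many such schemes; this one is purely illustrative.
ABSTRACTION = {"A": "1", "G": "1", "C": "0", "T": "0"}

def to_binary(seq):
    return "".join(ABSTRACTION[b] for b in seq.upper() if b in ABSTRACTION)

class BinaryMarkov:
    """Homogeneous order-k Markov model trained on binary strings."""
    def __init__(self, k):
        self.k = k
        self.counts = defaultdict(lambda: [1, 1])  # Laplace smoothing

    def train(self, sequences):
        for s in map(to_binary, sequences):
            for i in range(self.k, len(s)):
                self.counts[s[i - self.k:i]][int(s[i])] += 1

    def log_prob(self, seq):
        s, lp = to_binary(seq), 0.0
        for i in range(self.k, len(s)):
            n0, n1 = self.counts[s[i - self.k:i]]
            lp += math.log((n1 if s[i] == "1" else n0) / (n0 + n1))
        return lp

# Toy training data; real BAMM models are trained on large exon/intron sets.
exon_mm, intron_mm = BinaryMarkov(k=5), BinaryMarkov(k=5)
exon_mm.train(["ATGGCGGCGGTGAAGGCGGCG"])
intron_mm.train(["GTAAGTATTTTTTAATTTTAG"])

query = "ATGGCGGTGGCGAAGGTG"
label = "exon" if exon_mm.log_prob(query) > intron_mm.log_prob(query) else "intron"
print(label)
```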

  20. Programming with models: modularity and abstraction provide powerful capabilities for systems biology.

    Science.gov (United States)

    Mallavarapu, Aneil; Thomson, Matthew; Ullian, Benjamin; Gunawardena, Jeremy

    2009-03-06

    Mathematical models are increasingly used to understand how phenotypes emerge from systems of molecular interactions. However, their current construction as monolithic sets of equations presents a fundamental barrier to progress. Overcoming this requires modularity, enabling sub-systems to be specified independently and combined incrementally, and abstraction, enabling generic properties of biological processes to be specified independently of specific instances. These, in turn, require models to be represented as programs rather than as datatypes. Programmable modularity and abstraction enables libraries of modules to be created, which can be instantiated and reused repeatedly in different contexts with different components. We have developed a computational infrastructure that accomplishes this. We show here why such capabilities are needed, what is required to implement them and what can be accomplished with them that could not be done previously.

  1. Technical Work Plan for: Near Field Environment: Engineered System: Radionuclide Transport Abstraction Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2006-12-08

    This technical work plan (TWP) describes work activities to be performed by the Near-Field Environment Team. The objective of the work scope covered by this TWP is to generate Revision 03 of EBS Radionuclide Transport Abstraction, referred to herein as the radionuclide transport abstraction (RTA) report. The RTA report is being revised primarily to address condition reports (CRs), to address issues identified by the Independent Validation Review Team (IVRT), to address the potential impact of transport, aging, and disposal (TAD) canister design on transport models, and to ensure integration with other models that are closely associated with the RTA report and being developed or revised in other analysis/model reports in response to IVRT comments. The RTA report will be developed in accordance with the most current version of LP-SIII.10Q-BSC and will reflect current administrative procedures (LP-3.15Q-BSC, ''Managing Technical Product Inputs''; LP-SIII.2Q-BSC, ''Qualification of Unqualified Data''; etc.), and will develop related Document Input Reference System (DIRS) reports and data qualifications as applicable in accordance with prevailing procedures. The RTA report consists of three models: the engineered barrier system (EBS) flow model, the EBS transport model, and the EBS-unsaturated zone (UZ) interface model. The flux-splitting submodel in the EBS flow model will change, so the EBS flow model will be validated again. The EBS transport model and validation of the model will be substantially revised in Revision 03 of the RTA report, which is the main subject of this TWP. The EBS-UZ interface model may be changed in Revision 03 of the RTA report due to changes in the conceptualization of the UZ transport abstraction model (a particle tracker transport model based on the discrete fracture transfer function will be used instead of the dual-continuum transport model previously used). Validation of the EBS-UZ interface model

  2. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  3. An initial-abstraction, constant-loss model for unit hydrograph modeling for applicable watersheds in Texas

    Science.gov (United States)

    Asquith, William H.; Roussel, Meghan C.

    2007-01-01

    Estimation of representative hydrographs from design storms, which are known as design hydrographs, provides for cost-effective, risk-mitigated design of drainage structures such as bridges, culverts, roadways, and other infrastructure. During 2001-07, the U.S. Geological Survey (USGS), in cooperation with the Texas Department of Transportation, investigated runoff hydrographs, design storms, unit hydrographs, and watershed-loss models to enhance design hydrograph estimation in Texas. Design hydrographs ideally should mimic the general volume, peak, and shape of observed runoff hydrographs. Design hydrographs commonly are estimated in part by unit hydrographs. A unit hydrograph is defined as the runoff hydrograph that results from a unit pulse of excess rainfall uniformly distributed over the watershed at a constant rate for a specific duration. A time-distributed, watershed-loss model is required for modeling by unit hydrographs. This report develops a specific time-distributed, watershed-loss model known as an initial-abstraction, constant-loss model. For this watershed-loss model, a watershed is conceptualized to have the capacity to store or abstract an absolute depth of rainfall at and near the beginning of a storm. Depths of total rainfall less than this initial abstraction do not produce runoff. The watershed also is conceptualized to have the capacity to remove rainfall at a constant rate (loss) after the initial abstraction is satisfied. Additional rainfall inputs after the initial abstraction is satisfied contribute to runoff if the rainfall rate (intensity) is larger than the constant loss. The initial-abstraction, constant-loss model thus is a two-parameter model. The initial-abstraction, constant-loss model is investigated through detailed computational and statistical analysis of observed rainfall and runoff data for 92 USGS streamflow-gaging stations (watersheds) in Texas with contributing drainage areas from 0.26 to 166 square miles. The analysis is
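
    A minimal sketch of the two-parameter watershed-loss model as described: rainfall first fills the initial abstraction, after which only intensity in excess of the constant loss becomes runoff-producing excess rainfall. Parameter values below are illustrative, not the Texas regression estimates.

```python
def excess_rainfall(rain, ia, cl):
    """Initial-abstraction (ia, inches), constant-loss (cl, inches per interval)
    model: rainfall first fills the initial abstraction; afterwards, only the
    depth in excess of the constant loss becomes excess rainfall (runoff)."""
    stored, excess = 0.0, []
    for p in rain:
        if stored < ia:                        # still filling the abstraction
            absorbed = min(p, ia - stored)
            stored += absorbed
            p -= absorbed
        excess.append(max(p - cl, 0.0) if stored >= ia else 0.0)
    return excess

# 15-minute rainfall pulses (inches); ia and cl are illustrative values only.
hyetograph = [0.10, 0.30, 0.55, 0.40, 0.15, 0.05]
print([round(e, 3) for e in excess_rainfall(hyetograph, ia=0.75, cl=0.10)])
```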

  4. Virtual Machine Support for Many-Core Architectures: Decoupling Abstract from Concrete Concurrency Models

    Directory of Open Access Journals (Sweden)

    Stefan Marr

    2010-02-01

    The upcoming many-core architectures require software developers to exploit concurrency to utilize available computational power. Today's high-level language virtual machines (VMs), which are a cornerstone of software development, do not provide sufficient abstraction for concurrency concepts. We analyze concrete and abstract concurrency models and identify the challenges they impose for VMs. To provide sufficient concurrency support in VMs, we propose to integrate concurrency operations into VM instruction sets. Since there will always be VMs optimized for special purposes, our goal is to develop a methodology to design instruction sets with concurrency support. Therefore, we also propose a list of trade-offs that have to be investigated to advise the design of such instruction sets. As a first experiment, we implemented one instruction set extension for shared memory and one for non-shared memory concurrency. From our experimental results, we derived a list of requirements for a full-grown experimental environment for further research.

  5. HLA-Modeler: Automated Homology Modeling of Human Leukocyte Antigens

    Directory of Open Access Journals (Sweden)

    Shinji Amari

    2013-01-01

    The three-dimensional (3D) structures of human leukocyte antigen (HLA) molecules are indispensable for the studies on the functions at molecular level. We have developed a homology modeling system named HLA-modeler specialized in the HLA molecules. Segment matching algorithm is employed for modeling and the optimization of the model is carried out by use of the PFROSST force field considering the implicit solvent model. In order to efficiently construct the homology models, HLA-modeler uses a local database of the 3D structures of HLA molecules. The structure of the antigenic peptide-binding site is important for the function and the 3D structure is highly conserved between various alleles. HLA-modeler optimizes the use of this structural motif. The leave-one-out cross-validation using the crystal structures of class I and class II HLA molecules has demonstrated that the rmsds of nonhydrogen atoms of the sites between homology models and crystal structures are less than 1.0 Å in most cases. The results have indicated that the 3D structures of the antigenic peptide-binding sites can be reproduced by HLA-modeler at the level almost corresponding to the crystal structures.

  6. HLA-Modeler: Automated Homology Modeling of Human Leukocyte Antigens.

    Science.gov (United States)

    Amari, Shinji; Kataoka, Ryoichi; Ikegami, Takashi; Hirayama, Noriaki

    2013-01-01

    The three-dimensional (3D) structures of human leukocyte antigen (HLA) molecules are indispensable for the studies on the functions at molecular level. We have developed a homology modeling system named HLA-modeler specialized in the HLA molecules. Segment matching algorithm is employed for modeling and the optimization of the model is carried out by use of the PFROSST force field considering the implicit solvent model. In order to efficiently construct the homology models, HLA-modeler uses a local database of the 3D structures of HLA molecules. The structure of the antigenic peptide-binding site is important for the function and the 3D structure is highly conserved between various alleles. HLA-modeler optimizes the use of this structural motif. The leave-one-out cross-validation using the crystal structures of class I and class II HLA molecules has demonstrated that the rmsds of nonhydrogen atoms of the sites between homology models and crystal structures are less than 1.0 Å in most cases. The results have indicated that the 3D structures of the antigenic peptide-binding sites can be reproduced by HLA-modeler at the level almost corresponding to the crystal structures.

  7. Model-Based Control for Postal Automation and Baggage Handling

    NARCIS (Netherlands)

    Tarau, A.N.

    2010-01-01

    In this thesis we focus on two specific transportation systems, namely postal automation and baggage handling. Postal automation: During the last decades the volume of magazines, catalogs, and other plastic wrapped mail items that have to be processed by post sorting centers has increased consider

  8. Applications of Bayesian temperature profile reconstruction to automated comparison with heat transport models and uncertainty quantification of current diffusion

    Energy Technology Data Exchange (ETDEWEB)

    Irishkin, M. [CEA, IRFM, F-13108 Saint-Paul-Lez-Durance (France); Imbeaux, F., E-mail: frederic.imbeaux@cea.fr [CEA, IRFM, F-13108 Saint-Paul-Lez-Durance (France); Aniel, T.; Artaud, J.F. [CEA, IRFM, F-13108 Saint-Paul-Lez-Durance (France)

    2015-11-15

    Highlights: • We developed a method for automated comparison of experimental data with models. • A unique platform implements Bayesian analysis and integrated modelling tools. • The method is tokamak-generic and is applied to Tore Supra and JET pulses. • Validation of a heat transport model is carried out. • We quantified the uncertainties due to Te profiles in current diffusion simulations. - Abstract: In the context of present and future long pulse tokamak experiments yielding a growing size of measured data per pulse, automating data consistency analysis and comparisons of measurements with models is a critical matter. To address these issues, the present work describes an expert system that carries out in an integrated and fully automated way (i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis, (ii) a prediction of the reconstructed quantities according to some models, and (iii) a comparison of the first two steps. The first application shown is devoted to the development of an automated comparison method between the experimental plasma profiles reconstructed using Bayesian methods and time dependent solutions of the transport equations. The method was applied to model validation of a simple heat transport model with three radial shape options. It has been tested on a database of 21 Tore Supra and 14 JET shots. The second application aims at quantifying uncertainties due to the electron temperature profile in current diffusion simulations. A systematic reconstruction of the Ne, Te, Ti profiles was first carried out for all time slices of the pulse. The Bayesian 95% highest probability intervals on the Te profile reconstruction were then used for (i) data consistency check of the flux consumption and (ii) defining a confidence interval for the current profile simulation. The method has been applied to one Tore Supra pulse and one JET pulse.

  9. Individual Differences in Response to Automation: The Five Factor Model of Personality

    Science.gov (United States)

    Szalma, James L.; Taylor, Grant S.

    2011-01-01

    This study examined the relationship of operator personality (Five Factor Model) and characteristics of the task and of adaptive automation (reliability and adaptiveness--whether the automation was well-matched to changes in task demand) to operator performance, workload, stress, and coping. This represents the first investigation of how the Five…

  10. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    Science.gov (United States)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factor studies and simulation-based techniques will fall short in front of the ATS complexity, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Ueberlingen accident since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  11. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  12. INVENTORY ABSTRACTION

    Energy Technology Data Exchange (ETDEWEB)

    G. Ragan

    2001-12-19

    The purpose of the inventory abstraction, which has been prepared in accordance with a technical work plan (CRWMS M&O 2000e for ICN 02 of the present analysis, and BSC 2001e for ICN 03 of the present analysis), is to: (1) Interpret the results of a series of relative dose calculations (CRWMS M&O 2000c, 2000f). (2) Recommend, including a basis thereof, a set of radionuclides that should be modeled in the Total System Performance Assessment in Support of the Site Recommendation (TSPA-SR) and the Total System Performance Assessment in Support of the Final Environmental Impact Statement (TSPA-FEIS). (3) Provide initial radionuclide inventories for the TSPA-SR and TSPA-FEIS models. (4) Answer the U.S. Nuclear Regulatory Commission (NRC)'s Issue Resolution Status Report ''Key Technical Issue: Container Life and Source Term'' (CLST IRSR) key technical issue (KTI): ''The rate at which radionuclides in SNF [spent nuclear fuel] are released from the EBS [engineered barrier system] through the oxidation and dissolution of spent fuel'' (NRC 1999, Subissue 3). The scope of the radionuclide screening analysis encompasses the period from 100 years to 10,000 years after the potential repository at Yucca Mountain is sealed for scenarios involving the breach of a waste package and subsequent degradation of the waste form as required for the TSPA-SR calculations. By extending the time period considered to one million years after repository closure, recommendations are made for the TSPA-FEIS. The waste forms included in the inventory abstraction are Commercial Spent Nuclear Fuel (CSNF), DOE Spent Nuclear Fuel (DSNF), High-Level Waste (HLW), naval Spent Nuclear Fuel (SNF), and U.S. Department of Energy (DOE) plutonium waste. The intended use of this analysis is in TSPA-SR and TSPA-FEIS. Based on the recommendations made here, models for release, transport, and possibly exposure will be developed for the isotopes that would be the highest

  13. Inventory Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    C. Leigh

    2000-11-03

    The purpose of the inventory abstraction as directed by the development plan (CRWMS M&O 1999b) is to: (1) Interpret the results of a series of relative dose calculations (CRWMS M&O 1999c, 1999d). (2) Recommend, including a basis thereof, a set of radionuclides that should be modeled in the Total System Performance Assessment in Support of the Site Recommendation (TSPA-SR) and the Total System Performance Assessment in Support of the Final Environmental Impact Statement (TSPA-FEIS). (3) Provide initial radionuclide inventories for the TSPA-SR and TSPA-FEIS models. (4) Answer the U.S. Nuclear Regulatory Commission (NRC)'s Issue Resolution Status Report ''Key Technical Issue: Container Life and Source Term'' (CLST IRSR) (NRC 1999) key technical issue (KTI): ''The rate at which radionuclides in SNF [Spent Nuclear Fuel] are released from the EBS [Engineered Barrier System] through the oxidation and dissolution of spent fuel'' (Subissue 3). The scope of the radionuclide screening analysis encompasses the period from 100 years to 10,000 years after the potential repository at Yucca Mountain is sealed for scenarios involving the breach of a waste package and subsequent degradation of the waste form as required for the TSPA-SR calculations. By extending the time period considered to one million years after repository closure, recommendations are made for the TSPA-FEIS. The waste forms included in the inventory abstraction are Commercial Spent Nuclear Fuel (CSNF), DOE Spent Nuclear Fuel (DSNF), High-Level Waste (HLW), naval Spent Nuclear Fuel (SNF), and U.S. Department of Energy (DOE) plutonium waste. The intended use of this analysis is in TSPA-SR and TSPA-FEIS. Based on the recommendations made here, models for release, transport, and possibly exposure will be developed for the isotopes that would be the highest contributors to the dose given a release to the accessible environment. The inventory abstraction is important in

  14. Automated forward mechanical modeling of wrinkle ridges on Mars

    Science.gov (United States)

    Nahm, Amanda; Peterson, Samuel

    2016-04-01

    One of the main goals of the InSight mission to Mars is to understand the internal structure of Mars [1], in part through passive seismology. Understanding the shallow surface structure of the landing site is critical to the robust interpretation of recorded seismic signals. Faults, such as the wrinkle ridges abundant in the proposed landing site in Elysium Planitia, can be used to determine the subsurface structure of the regions they deform. Here, we test a new automated method for modeling of the topography of a wrinkle ridge (WR) in Elysium Planitia, allowing for faster and more robust determination of subsurface fault geometry for interpretation of the local subsurface structure. We perform forward mechanical modeling of fault-related topography [e.g., 2, 3], utilizing the modeling program Coulomb [4, 5] to model surface displacements induced by blind thrust faulting. Fault lengths are difficult to determine for WR; we initially assume a fault length of 30 km, but also test the effects of different fault lengths on model results. At present, we model the wrinkle ridge as a single blind thrust fault with a constant fault dip, though WR are likely to have more complicated fault geometry [e.g., 6-8]. Typically, the modeling is performed using the Coulomb GUI. This approach can be time-consuming, requiring user inputs to change model parameters and to calculate the associated displacements for each model, which limits the number of models and parameter space that can be tested. To reduce active user computation time, we have developed a method in which the Coulomb GUI is bypassed. The general modeling procedure remains unchanged, and a set of input files is generated before modeling with ranges of pre-defined parameter values. The displacement calculations are divided into two suites. For Suite 1, a total of 3770 input files were generated in which the fault displacement (D), dip angle (δ), depth to upper fault tip (t), and depth to lower fault tip (B
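
    The batch set-up described can be approximated as a simple parameter sweep. The sketch below enumerates combinations of D, dip, t, and B and writes one input file per model run; the file format here is a made-up placeholder, not the actual Coulomb input format.

```python
import itertools
import pathlib

# Illustrative parameter grids for displacement D (m), dip angle (deg), and
# upper/lower fault tip depths t and B (km). Values are assumptions only.
D_vals   = [50, 100, 200]
dip_vals = [20, 30, 40]
t_vals   = [0.5, 1.0]
B_vals   = [3.0, 4.5]

outdir = pathlib.Path("coulomb_inputs")
outdir.mkdir(exist_ok=True)

count = 0
for D, dip, t, B in itertools.product(D_vals, dip_vals, t_vals, B_vals):
    if t >= B:                 # skip geometrically impossible faults
        continue
    path = outdir / f"wr_D{D}_dip{dip}_t{t}_B{B}.inp"
    # Placeholder key=value format standing in for a real model input file.
    path.write_text(f"displacement={D}\ndip={dip}\ntop={t}\nbottom={B}\n")
    count += 1
print(f"generated {count} input files")
```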

  15. Automated Segmentation of Cardiac Magnetic Resonance Images

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Nilsson, Jens Chr.; Grønning, Bjørn A.

    2001-01-01

    is based on determination of the left-ventricular endocardial and epicardial borders. Since manual border detection is laborious, automated segmentation is highly desirable as a fast, objective and reproducible alternative. Automated segmentation will thus enhance comparability between and within cardiac...... studies and increase accuracy by allowing acquisition of thinner MRI-slices. This abstract demonstrates that statistical models of shape and appearance, namely the deformable models: Active Appearance Models, can successfully segment cardiac MRIs....

  16. Automated 4D analysis of dendritic spine morphology: applications to stimulus-induced spine remodeling and pharmacological rescue in a disease model

    Directory of Open Access Journals (Sweden)

    Swanger Sharon A

    2011-10-01

    Uncovering the mechanisms that regulate dendritic spine morphology has been limited, in part, by the lack of efficient and unbiased methods for analyzing spines. Here, we describe an automated 3D spine morphometry method and its application to spine remodeling in live neurons and spine abnormalities in a disease model. We anticipate that this approach will advance studies of synapse structure and function in brain development, plasticity, and disease.

  17. Automated 4D analysis of dendritic spine morphology: applications to stimulus-induced spine remodeling and pharmacological rescue in a disease model

    OpenAIRE

    2011-01-01

    Uncovering the mechanisms that regulate dendritic spine morphology has been limited, in part, by the lack of efficient and unbiased methods for analyzing spines. Here, we describe an automated 3D spine morphometry method and its application to spine remodeling in live neurons and spine abnormalities in a disease model. We anticipate that this approach will advance studies of synapse structure and function in brain development, plasticity, and disease.

  18. Automated alignment-based curation of gene models in filamentous fungi

    OpenAIRE

    2014-01-01

    Background Automated gene-calling is still an error-prone process, particularly for the highly plastic genomes of fungal species. Improvement through quality control and manual curation of gene models is a time-consuming process that requires skilled biologists and is only marginally performed. The wealth of available fungal genomes has not yet been exploited by an automated method that applies quality control of gene models in order to obtain more accurate genome annotations. Results We prov...

  19. Design And Modeling An Automated Digsilent Power System For Optimal New Load Locations

    Directory of Open Access Journals (Sweden)

    Mohamed Saad

    2015-08-01

    The electric power utilities seek to take advantage of novel approaches to meet growing energy demand. Utilities are under pressure to evolve their classical topologies to increase the usage of distributed generation. Currently, electrical power engineers in many regions of the world implement manual methods to measure power consumption for further assessment of voltage violation. Such a process has proved to be time-consuming, costly, and inaccurate. Also, demand response is a grid management technique where retail or wholesale customers are requested, either electronically or manually, to reduce their load. Therefore this paper aims to design and model an automated power system for optimal new load locations using DPL (DIgSILENT Programming Language). This study is a diagnostic approach that informs the system operator about any voltage violation cases that would occur when adding a new load to the grid. The process of identifying the optimal bus bar location involves a complicated calculation of the power consumption at each load bus. As a result, the DPL program considers all the IEEE 30-bus internal network data and then executes a load flow simulation to add the new load to the first bus in the network. The developed model therefore simulates the new load at each available bus bar in the network and generates three analytical reports for each case, capturing the over/under voltage and the loading of elements across the grid.

  20. Bottom-Up Abstract Modelling of Optical Networks-on-Chip: From Physical to Architectural Layer

    Directory of Open Access Journals (Sweden)

    Alberto Parini

    2012-01-01

    This work presents a bottom-up abstraction procedure based on the design-flow FDTD + SystemC suitable for the modelling of optical Networks-on-Chip. In this procedure, a complex network is decomposed into elementary switching elements whose input-output behavior is described by means of scattering parameters models. The parameters of each elementary block are then determined through 2D-FDTD simulation, and the resulting analytical models are exported within functional blocks in SystemC environment. The inherent modularity and scalability of the S-matrix formalism are preserved inside SystemC, thus allowing the incremental composition and successive characterization of complex topologies typically out of reach for full-vectorial electromagnetic simulators. The consistency of the outlined approach is verified, in the first instance, by performing a SystemC analysis of a four-input, four-output ports switch and making a comparison with the results of 2D-FDTD simulations of the same device. Finally, a further complex network encompassing 160 microrings is investigated, the losses over each routing path are calculated, and the minimum amount of power needed to guarantee an assigned BER is determined. This work is a basic step in the direction of an automatic technology-aware network-level simulation framework capable of assembling complex optical switching fabrics, while at the same time assessing the practical feasibility and effectiveness at the physical/technological level.
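
    As a hedged sketch of the network-level use of per-element models: if each elementary block is reduced to a scalar transmission coefficient extracted from electromagnetic simulation (the values below are invented, not 2D-FDTD results), the loss over a routing path and the minimum launch power for a receiver-sensitivity target can be accumulated as follows.

```python
import math

# Toy per-element power transmission |S21|^2 for a few element types, standing
# in for the FDTD-derived scattering models. All values are illustrative.
TRANSMISSION = {"ring_drop": 0.89, "ring_through": 0.97, "waveguide": 0.995}

def path_loss_db(path):
    """Total insertion loss of a routing path, assuming incoherent cascading
    of independently characterized elementary blocks."""
    t = 1.0
    for element in path:
        t *= TRANSMISSION[element]
    return -10.0 * math.log10(t)

route = ["waveguide", "ring_through", "ring_through", "ring_drop", "waveguide"]
loss = path_loss_db(route)

# Minimum launch power (dBm) to keep the receiver above an assumed
# sensitivity floor, i.e. the power needed to guarantee an assigned BER.
receiver_sensitivity_dbm = -20.0
print(f"path loss = {loss:.2f} dB, "
      f"min launch power = {receiver_sensitivity_dbm + loss:.2f} dBm")
```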

  1. Automated MRI segmentation for individualized modeling of current flow in the human head

    Science.gov (United States)

    Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.

    2013-12-01

    Objective. High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Significance. Fully

  2. Context based mixture model for cell phase identification in automated fluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Zhou Xiaobo

    2007-01-01

    Background Automated identification of cell cycle phases of individual live cells in a large population captured via automated fluorescence microscopy technique is important for cancer drug discovery and cell cycle studies. Time-lapse fluorescence microscopy images provide an important method to study the cell cycle process under different conditions of perturbation. Existing methods are limited in dealing with such time-lapse data sets while manual analysis is not feasible. This paper presents statistical data analysis and statistical pattern recognition to perform this task. Results The data is generated from Hela H2B GFP cells imaged during a 2-day period with images acquired 15 minutes apart using an automated time-lapse fluorescence microscopy. The patterns are described with four kinds of features, including twelve general features, Haralick texture features, Zernike moment features, and wavelet features. To generate a new set of features with more discriminative power, the commonly used feature reduction techniques are used, which include Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Maximum Margin Criterion (MMC), Stepwise Discriminant Analysis based Feature Selection (SDAFS), and Genetic Algorithm based Feature Selection (GAFS). Then, we propose a Context Based Mixture Model (CBMM) for dealing with the time-series cell sequence information and compare it to other traditional classifiers: Support Vector Machine (SVM), Neural Network (NN), and K-Nearest Neighbor (KNN). Being a standard practice in machine learning, we systematically compare the performance of a number of common feature reduction techniques and classifiers to select an optimal combination of a feature reduction technique and a classifier. A cellular database containing 100 manually labelled subsequences is built for evaluating the performance of the classifiers. The generalization error is estimated using the cross validation technique. The
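
    A minimal sketch of the systematic feature-reduction/classifier comparison on synthetic stand-in data (the Hela H2B GFP feature database and the proposed CBMM are not reproduced), assuming scikit-learn is available.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in for the cell-image feature vectors (general, texture,
# Zernike, and wavelet features); 4 classes standing in for cell cycle phases.
X, y = make_classification(n_samples=400, n_features=60, n_informative=12,
                           n_classes=4, random_state=0)

reducers = {"PCA": PCA(n_components=10),
            "LDA": LinearDiscriminantAnalysis(n_components=3)}
classifiers = {"SVM": SVC(), "KNN": KNeighborsClassifier(5)}

# Cross-validate every reducer/classifier combination, mirroring the paper's
# systematic selection of an optimal combination.
for rname, reducer in reducers.items():
    for cname, clf in classifiers.items():
        scores = cross_val_score(make_pipeline(reducer, clf), X, y, cv=5)
        print(f"{rname} + {cname}: mean accuracy = {scores.mean():.3f}")
```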

  3. Modelling and experimental study for automated congestion driving

    NARCIS (Netherlands)

    Urhahne, J.A.; Piastowski, P.; Voort, van der M.C.; Bebis, G; Boyle, R.; Parvin, B.; Koracin, D.; Pavlidis, I.; Feris, R.; McGraw, T.; Elendt, M.; Kopper, R.; Ragan, E.; Ye, Z.; Weber, G.

    2015-01-01

    Taking a collaborative approach in automated congestion driving with a Traffic Jam Assist system requires the driver to take over control in certain traffic situations. In order to warn the driver appropriately, warnings are issued (“pay attention” vs. “take action”) due to a control transition stra

  4. New E-Commerce Model Based on Multi-Agent Automated Negotiation

    Institute of Scientific and Technical Information of China (English)

    向传杰; 贾云得

    2003-01-01

    A new multi-agent automated negotiation model is developed and evaluated, in which two competitive agents, such as the buyer and seller, have firm deadlines and incomplete information about each other. The negotiation is multi-dimensional in different cases. The model is discussed in 6 kinds of cases with different price strategies, warrantee strategies and time strategies. The model improves the models of Wooldridge and Sycara to a certain extent. In all possible situations, the optimal negotiation strategy is analyzed and presented, and an e-commerce model based on the multi-agent automated negotiation model is also illustrated for future e-commerce applications.
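
    The sketch below illustrates a generic time-dependent concession tactic for two agents with firm deadlines, in the spirit of the model; the paper's specific price, warrantee, and time strategies are not reproduced, and all numbers are illustrative.

```python
def offer(p_reserve, p_target, t, deadline, beta):
    """Time-dependent concession tactic for an agent with a firm deadline:
    beta < 1 concedes late (Boulware), beta > 1 concedes early (Conceder).
    This is a generic tactic, not the paper's exact strategy set."""
    alpha = (t / deadline) ** (1.0 / beta)
    return p_target + alpha * (p_reserve - p_target)

deadline = 10
# Buyer: opens at 60, can pay up to 100; seller: opens at 120, accepts >= 70.
buyer_bids  = [offer(100.0, 60.0, t, deadline, beta=0.5) for t in range(deadline + 1)]
seller_asks = [offer(70.0, 120.0, t, deadline, beta=2.0) for t in range(deadline + 1)]

# The deal closes at the first step where the buyer's bid meets the seller's ask.
for t, (bid, ask) in enumerate(zip(buyer_bids, seller_asks)):
    if bid >= ask:
        print(f"agreement at t={t}: price = {(bid + ask) / 2:.2f}")
        break
else:
    print("no agreement before the deadline")
```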

  5. glmulti: An R Package for Easy Automated Model Selection with (Generalized) Linear Models

    Directory of Open Access Journals (Sweden)

    Vincent Calcagno

    2010-10-01

    We introduce glmulti, an R package for automated model selection and multi-model inference with glm and related functions. From a list of explanatory variables, the provided function glmulti builds all possible unique models involving these variables and, optionally, their pairwise interactions. Restrictions can be specified for candidate models, by excluding specific terms, enforcing marginality, or controlling model complexity. Models are fitted with standard R functions like glm. The n best models and their support (e.g., (Q)AIC, (Q)AICc, or BIC) are returned, allowing model selection and multi-model inference through standard R functions. The package is optimized for large candidate sets by avoiding memory limitation, facilitating parallelization and providing, in addition to exhaustive screening, a compiled genetic algorithm method. This article briefly presents the statistical framework and introduces the package, with applications to simulated and real data.
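
    glmulti itself is an R package; the following Python sketch mimics its exhaustive screening on synthetic data by fitting every subset of candidate terms and ranking by AIC, assuming statsmodels is available.

```python
import itertools
import numpy as np
import statsmodels.api as sm

# Synthetic data: y depends on x1 and x3 only (illustrative stand-in for a
# real candidate set).
rng = np.random.default_rng(1)
n = 200
X = rng.normal(size=(n, 4))
y = 2.0 + 1.5 * X[:, 0] - 0.8 * X[:, 2] + rng.normal(size=n)
names = ["x1", "x2", "x3", "x4"]

# Fit every non-empty subset of explanatory variables and rank by AIC,
# mirroring glmulti's exhaustive screening of candidate models.
results = []
for k in range(1, len(names) + 1):
    for subset in itertools.combinations(range(len(names)), k):
        design = sm.add_constant(X[:, subset])
        fit = sm.OLS(y, design).fit()
        results.append((fit.aic, [names[i] for i in subset]))

for aic, terms in sorted(results)[:3]:        # the 3 best-supported models
    print(f"AIC={aic:8.2f}  terms={terms}")
```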

  6. Advances in automated valuation modeling AVM after the non-agency mortgage crisis

    CERN Document Server

    Kauko, Tom

    2017-01-01

    This book addresses several problems related to automated valuation methodologies (AVM). Following the non-agency mortgage crisis, it offers a variety of approaches to improve the efficiency and quality of an automated valuation methodology (AVM) dealing with emerging problems and different contexts. Spatial issues, the evolution of AVM standards, multilevel models, fuzzy and rough set applications, and quantitative methods to define comparables are just some of the topics discussed.

  7. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    Energy Technology Data Exchange (ETDEWEB)

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.

  8. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  9. Cellular Automation Model of Traffic Flow Based on the Car-Following Model

    Institute of Scientific and Technical Information of China (English)

    LI Ke-Ping; GAO Zi-You

    2004-01-01

    We propose a new cellular automaton (CA) traffic model that is based on the car-following model. A class of driving strategies is used in the car-following model instead of the acceleration in the NaSch traffic model. In our model, some realistic driver behaviour and detailed vehicle characteristics have been taken into account, such as distance-headway and safe distance, etc. The simulation results show that our model can exhibit some traffic flow states that have been observed in real traffic, and both the maximum flux and the critical density are very close to the real measurements.
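
    For orientation, the sketch below implements the plain NaSch update rules with distance-headway and safe-distance constraints on a ring road; the paper's car-following driving strategies, which replace the acceleration step, are not reproduced, and the parameters are illustrative.

```python
import random

# Minimal NaSch-style cellular automaton on a ring of L cells with N cars.
L, N, VMAX, P_SLOW, STEPS = 100, 20, 5, 0.3, 50
random.seed(0)

pos = sorted(random.sample(range(L), N))   # cell index of each car
vel = [0] * N

for _ in range(STEPS):
    for i in range(N):
        gap = (pos[(i + 1) % N] - pos[i] - 1) % L   # distance-headway
        v = min(vel[i] + 1, VMAX)                    # accelerate
        v = min(v, gap)                              # keep a safe distance
        if v > 0 and random.random() < P_SLOW:       # random slowdown
            v -= 1
        vel[i] = v
    pos = [(p + v) % L for p, v in zip(pos, vel)]    # parallel position update

flux = sum(vel) / L          # cars passing a fixed cell per time step
print(f"density={N / L:.2f}, flux={flux:.3f}")
```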

  10. Model-driven design using IEC 61499 a synchronous approach for embedded and automation systems

    CERN Document Server

    Yoong, Li Hsien; Bhatti, Zeeshan E; Kuo, Matthew M Y

    2015-01-01

    This book describes a novel approach for the design of embedded systems and industrial automation systems, using a unified model-driven approach that is applicable in both domains.  The authors illustrate their methodology, using the IEC 61499 standard as the main vehicle for specification, verification, static timing analysis and automated code synthesis.  The well-known synchronous approach is used as the main vehicle for defining an unambiguous semantics that ensures determinism and deadlock freedom. The proposed approach also ensures very efficient implementations either on small-scale embedded devices or on industry-scale programmable automation controllers (PACs). It can be used for both centralized and distributed implementations. Significantly, the proposed approach can be used without the need for any run-time support. This approach, for the first time, blurs the gap between embedded systems and automation systems and can be applied in wide-ranging applications in automotive, robotics, and industri...

  11. Modeling the Energy Use of a Connected and Automated Transportation System (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, J.; Brown, A.

    2014-07-01

    Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.

  12. Automated model-based calibration of imaging spectrographs

    Science.gov (United States)

    Kosec, Matjaž; Bürmen, Miran; Tomaževič, Dejan; Pernuš, Franjo; Likar, Boštjan

    2012-03-01

    Hyper-spectral imaging has gained recognition as an important non-invasive research tool in the field of biomedicine. Among the variety of available hyperspectral imaging systems, systems comprising an imaging spectrograph, lens, wideband illumination source and a corresponding camera stand out for the short acquisition time and good signal to noise ratio. The individual images acquired by imaging spectrograph-based systems contain full spectral information along one spatial dimension. Due to the imperfections in the camera lens and in particular the optical components of the imaging spectrograph, the acquired images are subjected to spatial and spectral distortions, resulting in scene dependent nonlinear spectral degradations and spatial misalignments which need to be corrected. However, the existing correction methods require complex calibration setups and a tedious manual involvement, therefore, the correction of the distortions is often neglected. Such simplified approach can lead to significant errors in the analysis of the acquired hyperspectral images. In this paper, we present a novel fully automated method for correction of the geometric and spectral distortions in the acquired images. The method is based on automated non-rigid registration of the reference and acquired images corresponding to the proposed calibration object incorporating standardized spatial and spectral information. The obtained transformation was successfully used for sub-pixel correction of various hyperspectral images, resulting in significant improvement of the spectral and spatial alignment. It was found that the proposed calibration is highly accurate and suitable for routine use in applications involving either diffuse reflectance or transmittance measurement setups.

  13. MATHEMATICAL MODEL OF REFERRING AUTOMATED SYSTEM DOCUMENTS TO INFORMATION SPHERES OF USERS RESPONSIBILITY

    Directory of Open Access Journals (Sweden)

    Nosenko S. V.

    2013-10-01

    In this article, we present a mathematical model for referring documents entering an automated system to the spheres of responsibility of its users. The feasibility of applying the mathematical apparatus of the algebra of finite predicates as the basic means of describing the model is proved.

  14. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...

  15. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language...

  16. Data for Environmental Modeling (D4EM): Background and Applications of Data Automation

    Science.gov (United States)

    The Data for Environmental Modeling (D4EM) project demonstrates the development of a comprehensive set of open source software tools that overcome obstacles to accessing data needed by automating the process of populating model input data sets with environmental data available fr...

  17. Parameters Investigation of Mathematical Model of Productivity for Automated Line with Availability by DMAIC Methodology

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-01-01

    Automated lines are widely applied in industry, especially for mass production with low product variety. Productivity is one of the important criteria for an automated line, as for industry in general, since it directly represents outputs and profits. Productivity in industry must be forecast accurately in order to meet customer demand, and the forecast result is calculated using a mathematical model. A mathematical model of productivity with availability for automated lines has been introduced to express productivity in terms of a single level of reliability for stations and mechanisms. Since this mathematical model of productivity with availability cannot get close enough to the actual productivity due to a lack of parameters considered, the model requires enhancement to incorporate the loss parameters that are not considered in the current model. This paper presents the investigation of productivity-loss parameters using the DMAIC (Define, Measure, Analyze, Improve, and Control) concept and the PACE Prioritization Matrix (Priority, Action, Consider, and Eliminate). The investigated parameters are important for further improvement of the mathematical model of productivity with availability, in order to develop a robust mathematical model of productivity for automated lines.
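
    A hedged sketch of the kind of productivity-with-availability calculation discussed, using the common form Q = A / (t_machining + t_auxiliary); the loss terms and all parameter values below are illustrative assumptions, not the paper's model.

```python
def productivity(t_machining, t_auxiliary, mtbf, mttr, other_loss=0.0):
    """Productivity of a serial automated line (parts per minute):
    ideal cycle rate scaled by availability and an aggregate loss factor."""
    cycle = t_machining + t_auxiliary          # ideal cycle time (min/part)
    availability = mtbf / (mtbf + mttr)        # single-level reliability
    ideal_rate = 1.0 / cycle                   # parts per minute, no losses
    return ideal_rate * availability * (1.0 - other_loss)

# Illustrative parameters; other_loss stands in for the additional loss
# parameters (e.g. defects, setup) that a DMAIC study might surface.
q = productivity(t_machining=0.8, t_auxiliary=0.2,
                 mtbf=120.0, mttr=8.0, other_loss=0.05)
print(f"{q * 60:.1f} parts per hour")
```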

  18. The Model and Control Methods of Access to Information and Technology Resources of Automated Control Systems in Water Supply Industry

    Science.gov (United States)

    Rytov, M. Yu; Spichyack, S. A.; Fedorov, V. P.; Petreshin, D. I.

    2017-01-01

    The paper describes a formalized control model of access to information and technological resources of automated control systems at water supply enterprises. The model considers the availability of various communication links with information systems and technological equipment. Control methods of access to information and technological resources of automated control systems at water supply enterprises are also studied. On the basis of the formalized control model and appropriate methods, a software-hardware complex for rapid access to information and technological resources of automated control systems was developed, which comprises an administrator's automated workstation and workstations for end users.

  19. Abstract algebra

    CERN Document Server

    Garrett, Paul B

    2007-01-01

    Designed for an advanced undergraduate- or graduate-level course, Abstract Algebra provides an example-oriented, less heavily symbolic approach to abstract algebra. The text emphasizes specifics such as basic number theory, polynomials, finite fields, as well as linear and multilinear algebra. This classroom-tested, how-to manual takes a more narrative approach than the stiff formalism of many other textbooks, presenting coherent storylines to convey crucial ideas in a student-friendly, accessible manner. An unusual feature of the text is the systematic characterization of objects by universal

  20. Automated upscaling of river networks for macroscale hydrological modeling

    Science.gov (United States)

    Wu, Huan; Kimball, John S.; Mantua, Nate; Stanford, Jack

    2011-03-01

    We developed a hierarchical dominant river tracing (DRT) algorithm for automated extraction and spatial upscaling of basin flow directions and river networks using fine-scale hydrography inputs (e.g., flow direction, river networks, and flow accumulation). In contrast with previous upscaling methods, the DRT algorithm utilizes information on global and local drainage patterns from baseline fine-scale hydrography to determine upscaled flow directions and other critical variables including upscaled basin area, basin shape, and river lengths. The DRT algorithm preserves the original baseline hierarchical drainage structure by tracing each entire flow path from headwater to river mouth at fine scale while prioritizing successively higher order basins and rivers for tracing. We applied the algorithm to produce a series of global hydrography data sets from 1/16° to 2° spatial scales in two geographic projections (WGS84 and Lambert azimuthal equal area). The DRT results were evaluated against other alternative upscaling methods and hydrography data sets for continental U.S. and global domains. These results show favorable DRT upscaling performance in preserving baseline fine-scale river network information including: (1) improved, automated extraction of flow directions and river networks at any spatial scale without the need for manual correction; (2) consistency of river network, basin shape, basin area, river length, and basin internal drainage structure between upscaled and baseline fine-scale hydrography; and (3) performance largely independent of spatial scale, geographic region, and projection. The results of this study include an initial set of DRT upscaled global hydrography maps derived from HYDRO1K baseline fine-scale hydrography inputs; these digital data are available online for public access at ftp://ftp.ntsg.umt.edu/pub/data/DRT/.
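
    The toy sketch below illustrates one DRT ingredient under simplifying assumptions: when coarsening a grid, each coarse cell's outlet is chosen as the boundary cell with the highest fine-scale flow accumulation (the dominant river), rather than by averaging flow directions. Full hierarchical path tracing from headwater to river mouth is not reproduced.

```python
import numpy as np

# Synthetic fine-scale flow accumulation grid (stand-in for HYDRO1K-like data).
F = 4                                                  # coarsening factor
acc = np.random.default_rng(2).integers(1, 1000, size=(16, 16))

outlets = {}
for ci in range(acc.shape[0] // F):
    for cj in range(acc.shape[1] // F):
        block = acc[ci*F:(ci+1)*F, cj*F:(cj+1)*F]
        # Only boundary cells of the coarse cell can be outlets.
        mask = np.zeros_like(block, dtype=bool)
        mask[0, :] = mask[-1, :] = mask[:, 0] = mask[:, -1] = True
        flat = np.where(mask.ravel(), block.ravel(), -1)
        k = int(flat.argmax())                         # dominant-river cell
        outlets[(ci, cj)] = (ci*F + k // F, cj*F + k % F)

print("outlet of coarse cell (0, 0):", outlets[(0, 0)])
```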

  1. Conceptual model of an automated information system of marketing at the enterprise

    Directory of Open Access Journals (Sweden)

    D.V. Raiko

    2014-09-01

    The aim of the article. The purpose of this paper is to create a conceptual model of an automated marketing information system that has a certain theoretical and practical value. The results of the analysis. The main advantage of this model is a comprehensive disclosure of the relationships between concepts such as automated information technology, marketing information system, and automated information system, which solve the problem of processing large volumes of data in a short period of time, provide continuous communication with partners and customers, and make it possible to react quickly to market changes; this in turn contributes to competitiveness in domestic and foreign markets. The scientific novelty of this model is, firstly, the assertion that an information system based on automated information technology constitutes an automated information system. Secondly, the marketing information system is an integral part of the information system, whose structural elements are responsible for the transformation of data from internal and external sources into the information necessary for managers and specialists of marketing services. Thirdly, the most important component ensuring the functioning of the marketing information system and the information system is automated information technology. Because these systems involve human resources, work within them is organized with the help of workstations. Conclusions and directions of further research. It is determined that this conceptual model provides multi-variant calculations for rational decision-making, including real-time organization of complex accounting and economic analysis, and ensures the reliability and efficiency of the information obtained and used in management. The results of this model, tested on the example of several industries, confirm its practical significance.

  2. Learning with Technology: Video Modeling with Concrete-Representational-Abstract Sequencing for Students with Autism Spectrum Disorder

    Science.gov (United States)

    Yakubova, Gulnoza; Hughes, Elizabeth M.; Shinaberry, Megan

    2016-01-01

    The purpose of this study was to determine the effectiveness of a video modeling intervention with concrete-representational-abstract instructional sequence in teaching mathematics concepts to students with autism spectrum disorder (ASD). A multiple baseline across skills design of single-case experimental methodology was used to determine the…

  3. Adaptive Software Development supported by an Automated Process: a Reference Model

    Directory of Open Access Journals (Sweden)

    AFFONSO, F. J.

    2013-12-01

    Full Text Available This paper presents a reference model, realized as an automated process, to assist adaptive software development at runtime, also known as Self-adaptive Systems (SaS) at runtime. This type of software has specific characteristics compared to traditional software, since it allows changes (structural or behavioral) to be incorporated at runtime. Automated processes have been used as a feasible solution to conduct software adaptation at runtime, minimizing human involvement (developers) and speeding up the execution of tasks. In parallel, reference models have been used to aggregate knowledge and architectural artifacts, since they capture the essence of systems in specific domains. However, at present there is no reference model based on reflection for the automation of software adaptation at runtime. In this scenario, this paper presents a reference model based on reflection, as an automated process, for the development of software systems that require adaptation at runtime. To show the applicability of the model, a case study was conducted, and a good perspective to efficiently contribute to the area of SaS has been obtained.
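
    The paper's reference model itself is not reproduced here, but the reflection mechanism it builds on is easy to illustrate. The sketch below (hypothetical class and method names) uses Python's built-in reflection (`hasattr`/`setattr`) to rebind a component's behaviour at runtime without restarting it:

```python
class PaymentService:
    """A running component whose behaviour can be adapted without a restart."""
    def process(self, amount):
        return f"standard processing of {amount}"

def discounted_process(self, amount):
    # Replacement behaviour introduced while the system is running.
    return f"discounted processing of {amount * 0.9}"

service = PaymentService()
print(service.process(100))          # standard processing of 100

# Reflective adaptation at runtime: inspect the class, then rebind the method.
if hasattr(PaymentService, "process"):
    setattr(PaymentService, "process", discounted_process)

print(service.process(100))          # discounted processing of 90.0
```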

  4. Automated Generation of Digital Terrain Model using Point Clouds of Digital Surface Model in Forest Area

    Directory of Open Access Journals (Sweden)

    Yoshikazu Kamiya

    2011-04-01

    Full Text Available At present, most digital data acquisition methods generate a Digital Surface Model (DSM) and not a Digital Elevation Model (DEM). Conversion from DSM to DEM still has some drawbacks, especially the removal of off-terrain point clouds and the subsequent generation of the DEM within these spaces, even when the methods are automated. In this paper we attempt to overcome this issue by projecting off-terrain point clouds onto the terrain in forest areas using Artificial Neural Networks (ANN), instead of removing them and then filling the gaps by interpolation. Five sites were tested and accuracies assessed; they all give almost the same results. In conclusion, the ANN is able to derive the DEM by projecting the DSM point clouds, and greater DEM accuracies were obtained. If the hollow areas resulting from the removal of DSM point clouds are larger, the accuracies are reduced.

  5. MAIN ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Ideological Critique of Marx and Philosophical Transformation This paper addresses a tendency today that tries to reinterpret the Marxist transformation in philosophy with a "non-ideological" perspective. It explicates the leap that Marx achieved from abstract human spirit and state reason to the proletariat world view, as well as the overall withdrawal from the abstract universalism. Such a leap and withdrawal is not only the key to the materialistic inversion of the Hegelian dialectics, but also the foundation for sticking to and enriching the Marxist philosophy. Without sticking to this direction, we will not be able to clearly understand Marxism as the successor and promoter of the magnificent cultural achievements of mankind, including the enlightenment thoughts and classical German philosophy, nor will we fully understand the value and strength of Marxist philosophy today.

  6. Automating Routine Tasks in AmI Systems by Using Models at Runtime

    Science.gov (United States)

    Serral, Estefanía; Valderas, Pedro; Pelechano, Vicente

    One of the most important challenges to be confronted in Ambient Intelligence (AmI) systems is to automate routine tasks on behalf of users. In this work, we confront this challenge by presenting a novel approach based on models at runtime. The approach proposes a context-adaptive task model that allows routine tasks to be specified in a way that is understandable to users, facilitating their participation in the specification. These tasks are described according to context, which is specified in an ontology-based context model. Both the context model and the task model are also used at runtime. The approach provides a software infrastructure capable of automating the routine tasks as specified in these models by interpreting them at runtime.

  7. Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest

    Science.gov (United States)

    Rohloff, Kurt

    2010-01-01

    The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts, which are limited by the availability of subject-matter experts (SMEs), automated data processing technologies have enabled the development of innovative automated complex-system modeling and predictive analysis technologies. We introduce one such emerging modeling technology: the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex-system model discovery because it generates easily interpretable patterns based on direct observations of sampled factor data, yielding a deeper understanding of societal behaviors, and it is tolerant of observation noise and missing data. The discovered patterns are simple to interpret and mimic a human analyst's identification of observed trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.
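
    The paper's methodology is not specified in detail here, but the flavour of sequential pattern discovery can be sketched as follows: mine ordered event subsequences from observation windows and score how strongly each precedes the event of interest. All event names, windows, and the scoring rule below are hypothetical stand-ins.

```python
from itertools import combinations
from collections import Counter

# Hypothetical observation streams: ordered factor events seen in a
# country-window, labelled True if violence broke out afterwards.
windows = [
    (["food_price_spike", "protest", "curfew"], True),
    (["protest", "curfew", "strike"], True),
    (["election", "food_price_spike"], False),
    (["food_price_spike", "protest", "strike"], True),
]

def ordered_subpatterns(events, length=2):
    # All ordered (not necessarily contiguous) sub-sequences of a given length.
    return combinations(events, length)

pos, neg = Counter(), Counter()
for events, violent in windows:
    target = pos if violent else neg
    target.update(ordered_subpatterns(events))

# Rank patterns by how strongly they precede outbreaks (simple precision score).
for pattern, n in pos.most_common(3):
    score = n / (n + neg[pattern])
    print(pattern, f"support={n}", f"precision={score:.2f}")
```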

  8. Automated alignment-based curation of gene models in filamentous fungi

    NARCIS (Netherlands)

    Burgt, van der A.; Severing, E.I.; Collemare, J.A.R.; Wit, de P.J.G.M.

    2014-01-01

    Background Automated gene-calling is still an error-prone process, particularly for the highly plastic genomes of fungal species. Improvement through quality control and manual curation of gene models is a time-consuming process that requires skilled biologists and is only marginally performed. The

  9. Automated Test Assembly for Cognitive Diagnosis Models Using a Genetic Algorithm

    Science.gov (United States)

    Finkelman, Matthew; Kim, Wonsuk; Roussos, Louis A.

    2009-01-01

    Much recent psychometric literature has focused on cognitive diagnosis models (CDMs), a promising class of instruments used to measure the strengths and weaknesses of examinees. This article introduces a genetic algorithm to perform automated test assembly alongside CDMs. The algorithm is flexible in that it can be applied whether the goal is to…
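
    The article's exact objective function is not given in the excerpt, so the sketch below substitutes a generic one: a genetic algorithm evolves binary item-selection masks toward a fixed-length test that maximizes a stand-in per-item information score. All numbers are hypothetical.

```python
import random

random.seed(1)
N_ITEMS, TEST_LEN, POP, GENS = 30, 10, 40, 60
# Hypothetical per-item diagnostic information indices.
info = [random.uniform(0.1, 1.0) for _ in range(N_ITEMS)]

def fitness(mask):
    # Reward total information; heavily penalise deviating from the test length.
    return sum(i for i, m in zip(info, mask) if m) - 5 * abs(sum(mask) - TEST_LEN)

def crossover(a, b):
    cut = random.randrange(1, N_ITEMS)
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.05):
    return [1 - m if random.random() < rate else m for m in mask]

pop = [[random.randint(0, 1) for _ in range(N_ITEMS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[: POP // 4]                      # keep the best quarter
    children = [mutate(crossover(*random.sample(elite, 2)))
                for _ in range(POP - len(elite))]
    pop = elite + children

best = max(pop, key=fitness)
print("items:", [i for i, m in enumerate(best) if m],
      "fitness:", round(fitness(best), 2))
```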

  10. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes, at a high level, the network-form game framework (based on Bayes nets and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  11. Spatial coincidence modelling, automated database updating and data consistency in vector GIS.

    NARCIS (Netherlands)

    Kufoniyi, O.

    1995-01-01

    This thesis presents formal approaches for automated database updating and consistency control in vector-structured spatial databases. To serve as a framework, a conceptual data model is formalized for the representation of geo-data from multiple map layers in which a map layer denotes a set of ter

  12. Development and Evaluation of a Model for Modular Automation in Plant Manufacturing

    Directory of Open Access Journals (Sweden)

    Uwe Katzke

    2005-08-01

    Full Text Available The benefit of modular concepts in plant automation is viewed ambivalently: on the one hand modularity offers advantages, on the other hand it places requirements on the system structure as well as on the discipline of the designer. The main reasons to use modularity in systems design for automation applications in industry are reusability and reduction of complexity, but up to now modular concepts are rare in plant automation. This paper analyses the reasons and proposes measures and solution concepts. An analysis of the workflow and working results of several companies across different branches shows differing notions of modularity. These different notions in production and process engineering are integrated into one model and represent different perspectives on an integrated system.

  13. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

    Full Text Available Background: A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results: In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions: Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and
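
    As a minimal illustration of Bayesian frequency detection, the sketch below implements the textbook single-frequency result this line of work builds on (Bretthorst's Student-t posterior for one stationary sinusoid with the noise level marginalised out), applied to a short synthetic series. It is a sketch under those assumptions, not the authors' full model-comparison machinery.

```python
import numpy as np

# Synthetic short, noisy series: one oscillation plus noise (hypothetical data).
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 120)
d = np.cos(2 * np.pi * 0.7 * t + 0.4) + 0.5 * rng.standard_normal(t.size)
d -= d.mean()
N = d.size

freqs = np.linspace(0.05, 2.0, 400)
omega = 2 * np.pi * freqs
# Schuster periodogram C(w) = (R^2 + I^2) / N
R = (np.cos(np.outer(omega, t)) * d).sum(axis=1)
I = (np.sin(np.outer(omega, t)) * d).sum(axis=1)
C = (R**2 + I**2) / N

# Single-sinusoid "Student-t" posterior, noise variance marginalised out:
#   p(f | D)  proportional to  [1 - 2 C(f) / (N <d^2>)] ** ((2 - N) / 2)
d2 = (d**2).mean()
arg = np.clip(1 - 2 * C / (N * d2), 1e-12, None)   # guard against rounding
log_post = ((2 - N) / 2) * np.log(arg)
post = np.exp(log_post - log_post.max())
post /= post.sum() * (freqs[1] - freqs[0])          # normalise the density

print("MAP frequency:", freqs[post.argmax()])       # should be close to 0.7
```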

  14. A numerical modelling and neural network approach to estimate the impact of groundwater abstractions on river flows

    Science.gov (United States)

    Parkin, G.; Birkinshaw, S. J.; Younger, P. L.; Rao, Z.; Kirk, S.

    2007-06-01

    Summary: Evaluation of the impacts of groundwater abstractions on surface water systems is a necessary task in integrated water resources management. A range of hydrological, hydrogeological, and geomorphological factors influence the complex processes of interaction between groundwater and rivers. This paper presents an approach which uses numerical modeling of generic river-aquifer systems to represent the interaction processes, and neural networks to capture the impacts of the different controlling factors. The generic models describe hydrogeological settings representing most river-aquifer systems in England and Wales: high diffusivity (e.g. Chalk) and low diffusivity (e.g. Triassic Sandstone) aquifers with flow to rivers mediated by alluvial gravels; the same aquifers where they are in direct connection with the river; and shallow alluvial aquifers which are disconnected from regional aquifers. Numerical model simulations using the SHETRAN integrated catchment modeling system provided outputs including time-series and spatial variations in river flow depletion, and spatially distributed groundwater levels. Artificial neural network models were trained using input parameters describing the controlling factors and the outputs from the numerical model simulations, providing an efficient tool for representing the impacts of groundwater abstractions across a wide range of conditions. There are very few field data sets of accurately quantified river flow depletion as a result of groundwater abstraction under controlled conditions. One such data set from an experimental study carried out in 1967 on the Winterbourne stream in the Lambourne catchment over a Chalk aquifer was used successfully to test the modeling tool. This modeling approach provides a general methodology for rapid simulations of complex hydrogeological systems which preserves the physical consistency between multiple and diverse model outputs.
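
    A minimal sketch of the simulate-then-emulate idea: here the analytical Glover-Balmer stream-depletion solution stands in for the SHETRAN simulations, and a small neural network is trained as a surrogate over the controlling factors. All parameter ranges are hypothetical.

```python
import numpy as np
from scipy.special import erfc
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
n = 4000
T = rng.uniform(50, 2000, n)      # transmissivity [m^2/day]
S = rng.uniform(1e-4, 0.2, n)     # storativity [-]
d = rng.uniform(50, 2000, n)      # well-to-river distance [m]
t = rng.uniform(1, 365, n)        # time since pumping started [days]

# Glover-Balmer analytical stream-depletion fraction, used here as a
# stand-in "simulator" for the numerical model runs described in the paper.
y = erfc(np.sqrt(S * d**2 / (4 * T * t)))

X = np.column_stack([np.log(T), np.log(S), np.log(d), np.log(t)])
net = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
net.fit(X[:3000], y[:3000])
print("surrogate R^2 on held-out cases:", round(net.score(X[3000:], y[3000:]), 3))
```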

  15. The importance of information goods abstraction levels for information commerce process models

    NARCIS (Netherlands)

    Wijnhoven, Fons

    2002-01-01

    A process model, in the context of e-commerce, is an organized set of activities for the creation, (re-)production, trade and delivery of goods. Electronic commerce studies have created important process models for the trade of physical goods via Internet. These models are not easily suitable for th

  16. Nonparametric Bayesian Modeling for Automated Database Schema Matching

    Energy Technology Data Exchange (ETDEWEB)

    Ferragut, Erik M [ORNL; Laska, Jason A [ORNL

    2015-01-01

    The problem of merging databases arises in many government and commercial applications. Schema matching, a common first step, identifies equivalent fields between databases. We introduce a schema matching framework that builds nonparametric Bayesian models for each field and compares them by computing the probability that a single model could have generated both fields. Our experiments show that our method is more accurate and faster than the existing instance-based matching algorithms in part because of the use of nonparametric Bayesian models.
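
    The paper's models are not specified in the excerpt, so the sketch below illustrates the stated idea with a simple Dirichlet-multinomial stand-in for the nonparametric field models: score a field pair by whether a single model explains their pooled values better than two separate models. Field contents are hypothetical.

```python
from collections import Counter
from scipy.special import gammaln

def log_marginal(counts, vocab, alpha=1.0):
    """Log marginal likelihood of categorical counts under a symmetric
    Dirichlet(alpha) prior (Dirichlet-multinomial)."""
    n = sum(counts.values())
    K = len(vocab)
    out = gammaln(K * alpha) - gammaln(K * alpha + n)
    for v in vocab:
        out += gammaln(alpha + counts.get(v, 0)) - gammaln(alpha)
    return out

def match_score(field_a, field_b):
    """Bayes-factor-style score: could one model have generated both fields?"""
    ca, cb = Counter(field_a), Counter(field_b)
    vocab = set(ca) | set(cb)
    return (log_marginal(ca + cb, vocab)
            - log_marginal(ca, vocab) - log_marginal(cb, vocab))

# Hypothetical fields from two databases.
us_states = ["TX", "CA", "TX", "NY", "CA", "WA"]
state_col = ["CA", "TX", "WA", "TX", "NY", "NY"]
zip_codes = ["78701", "94103", "10001", "98101", "73301", "94105"]

print("state vs state:", round(match_score(us_states, state_col), 2))  # higher
print("state vs zip:  ", round(match_score(us_states, zip_codes), 2))  # lower
```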

  17. Efficient parallel Levenberg-Marquardt model fitting towards real-time automated parametric imaging microscopy.

    Science.gov (United States)

    Zhu, Xiang; Zhang, Dianwen

    2013-01-01

    We present a fast, accurate and robust parallel Levenberg-Marquardt minimization optimizer, GPU-LMFit, which is implemented on a graphics processing unit (GPU) for high-performance, scalable parallel model fitting. GPU-LMFit can provide a dramatic speed-up in massive model fitting analyses to enable real-time automated pixel-wise parametric imaging microscopy. We demonstrate the performance of GPU-LMFit for applications in superresolution localization microscopy and fluorescence lifetime imaging microscopy.
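
    GPU-LMFit itself is not shown here, but the underlying Levenberg-Marquardt iteration is compact enough to sketch on the CPU for a single-exponential decay, the kind of per-pixel model used in lifetime imaging. The data are synthetic and the damping schedule is simplified.

```python
import numpy as np

def model(p, t):
    A, tau = p
    return A * np.exp(-t / tau)

def lm_fit(t, y, p0, n_iter=50, lam=1e-2):
    """Minimal Levenberg-Marquardt loop with a forward-difference Jacobian."""
    p = np.array(p0, float)
    for _ in range(n_iter):
        r = y - model(p, t)
        J = np.empty((t.size, p.size))          # Jacobian of residuals w.r.t. p
        for j in range(p.size):
            dp = np.zeros_like(p)
            dp[j] = 1e-6 * max(abs(p[j]), 1.0)
            J[:, j] = -(model(p + dp, t) - model(p, t)) / dp[j]
        A = J.T @ J + lam * np.eye(p.size)      # damped normal equations
        step = np.linalg.solve(A, -J.T @ r)
        if np.sum((y - model(p + step, t))**2) < np.sum(r**2):
            p, lam = p + step, lam * 0.5        # accept: more Gauss-Newton-like
        else:
            lam *= 10.0                         # reject: more gradient-descent-like
    return p

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
y = model([5.0, 2.5], t) + 0.05 * rng.standard_normal(t.size)
print(lm_fit(t, y, p0=[1.0, 1.0]))   # should approach [5.0, 2.5]
```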

  18. Efficient Parallel Levenberg-Marquardt Model Fitting towards Real-Time Automated Parametric Imaging Microscopy

    OpenAIRE

    Xiang Zhu; Dianwen Zhang

    2013-01-01

    We present a fast, accurate and robust parallel Levenberg-Marquardt minimization optimizer, GPU-LMFit, which is implemented on graphics processing unit for high performance scalable parallel model fitting processing. GPU-LMFit can provide a dramatic speed-up in massive model fitting analyses to enable real-time automated pixel-wise parametric imaging microscopy. We demonstrate the performance of GPU-LMFit for the applications in superresolution localization microscopy and fluorescence lifetim...

  19. Driver-centred vehicle automation: using network analysis for agent-based modelling of the driver in highly automated driving systems.

    Science.gov (United States)

    Banks, Victoria A; Stanton, Neville A

    2016-11-01

    To the average driver, the concept of automation in driving implies that they can become completely 'hands and feet free'. This is a common misconception, however, as has been shown through the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation, which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and quantitative network metrics, this agent-based modelling paper shows how the role of the driver remains an integral part of the driving system, implying the need for designers to ensure drivers are provided with the tools necessary to remain actively in-the-loop despite being given increasing opportunities to delegate control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command-and-control system of automation, using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.

  20. Petri net-based modelling of human-automation conflicts in aviation.

    Science.gov (United States)

    Pizziol, Sergio; Tessier, Catherine; Dehais, Frédéric

    2014-01-01

    Analyses of aviation safety reports reveal that human-machine conflicts induced by poor automation design are remarkable precursors of accidents. A review of different crew-automation conflict scenarios shows that they have a common denominator: the autopilot behaviour interferes with the pilot's goal regarding the flight guidance via 'hidden' mode transitions. Considering both the human operator and the machine (i.e. the autopilot or the decision functions) as agents, we propose a Petri net model of those conflicting interactions, which allows them to be detected as deadlocks in the Petri net. In order to test our Petri net model, we designed an autoflight system that was formally analysed to detect conflicting situations. We identified three conflicting situations that were integrated in an experimental scenario in a flight simulator with 10 general aviation pilots. The results showed that the conflicts that we had a priori identified as critical impacted the pilots' performance. Indeed, the first conflict remained unnoticed by eight participants and led to a potential collision with another aircraft. The second conflict was detected by all the participants, but three of them did not manage the situation correctly. The last conflict was also detected by all the participants but provoked a typical automation surprise situation, as only one declared that he had understood the autopilot behaviour. These behavioural results are discussed in terms of workload and number of fired 'hidden' transitions. Eventually, this study reveals that both formal and experimental approaches are complementary for identifying and assessing the criticality of human-automation conflicts. Practitioner Summary: We propose a Petri net model of human-automation conflicts. An experiment was conducted with general aviation pilots performing a scenario involving three conflicting situations to test the soundness of our formal approach. This study reveals that both formal and experimental approaches
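
    A minimal sketch of the formal side of this approach: represent a net as pre/post maps, enumerate the reachability graph, and report markings with no enabled transition as deadlocks. The toy "cross-grab" net below is a hypothetical illustration, far simpler than the authors' autoflight model.

```python
def enabled(m, pre):
    return [t for t, needs in pre.items()
            if all(m.get(p, 0) >= n for p, n in needs.items())]

def fire(m, t, pre, post):
    nxt = dict(m)
    for p, n in pre[t].items():
        nxt[p] = nxt.get(p, 0) - n
    for p, n in post[t].items():
        nxt[p] = nxt.get(p, 0) + n
    return nxt

def deadlocks(m0, pre, post):
    """Depth-first exploration of the reachability graph for dead markings."""
    seen, stack, dead = set(), [m0], []
    while stack:
        m = stack.pop()
        key = tuple(sorted((p, n) for p, n in m.items() if n))  # nonzero places
        if key in seen:
            continue
        seen.add(key)
        ts = enabled(m, pre)
        if not ts:
            dead.append(key)
        stack.extend(fire(m, t, pre, post) for t in ts)
    return dead

# Toy net: pilot and autopilot each need both "stick" and "mode" authority,
# but acquire them in opposite orders -- the classic cross-grab conflict.
pre = {"pilot_takes_stick": {"stick": 1, "pilot_idle": 1},
       "pilot_takes_mode":  {"mode": 1, "pilot_has_stick": 1},
       "pilot_releases":    {"pilot_flying": 1},
       "ap_takes_mode":     {"mode": 1, "ap_idle": 1},
       "ap_takes_stick":    {"stick": 1, "ap_has_mode": 1},
       "ap_releases":       {"ap_flying": 1}}
post = {"pilot_takes_stick": {"pilot_has_stick": 1},
        "pilot_takes_mode":  {"pilot_flying": 1},
        "pilot_releases":    {"stick": 1, "mode": 1, "pilot_idle": 1},
        "ap_takes_mode":     {"ap_has_mode": 1},
        "ap_takes_stick":    {"ap_flying": 1},
        "ap_releases":       {"stick": 1, "mode": 1, "ap_idle": 1}}
m0 = {"stick": 1, "mode": 1, "pilot_idle": 1, "ap_idle": 1}

print(deadlocks(m0, pre, post))  # the cross-grab: pilot_has_stick + ap_has_mode
```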

  1. MLP based Reusability Assessment Automation Model for Java based Software Systems

    Directory of Open Access Journals (Sweden)

    Surbhi Maggo

    2014-08-01

    Full Text Available Reuse refers to the common principle of using existing resources repeatedly, which is pervasively applicable. In software engineering, reuse refers to the development of software systems using already available artifacts or assets, partially or completely, with or without modifications. Software reuse not only promises significant improvements in productivity and quality but also provides for the development of more reliable, cost-effective, dependable and less buggy software (considering that prior use and testing have removed errors), with reduced time and effort. In this paper we present an efficient and reliable automation model for reusability evaluation of procedure-based object-oriented software, predicting the reusability level of a component as low, medium or high. The presented model follows a reusability metric framework that targets the requisite reusability attributes, including maintainability (using the Maintainability Index), for functional analysis of the components. Further, a multilayer perceptron (a back-propagation-based neural network) is applied to establish significant relationships among these attributes for reusability prediction. The proposed approach provides support for reusability evaluation at the functional level rather than at the structural level. Automation support for this approach is provided in the form of a tool named JRA2M2 (Java-based Reusability Assessment Automation Model using Multilayer Perceptron (MLP)), implemented in Java. The performance of JRA2M2 is recorded using parameters like accuracy, classification error, precision and recall. The results generated using JRA2M2 indicate that the proposed automation tool can be effectively used as a reliable and efficient solution for automated evaluation of reusability.
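
    JRA2M2 itself is not available in the excerpt; the sketch below shows the general shape of the approach with scikit-learn: per-component metric vectors go through a multilayer perceptron that predicts a low/medium/high reusability level. The metrics, the labelling rule, and the data are entirely synthetic stand-ins.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 600
# Hypothetical per-component metrics: maintainability index, cyclomatic
# complexity, coupling, and comment ratio.
mi = rng.uniform(0, 100, n)
cc = rng.uniform(1, 40, n)
coupling = rng.uniform(0, 20, n)
comments = rng.uniform(0, 1, n)
X = np.column_stack([mi, cc, coupling, comments])

# Toy labelling rule standing in for expert-assigned reusability levels.
score = mi / 100 - cc / 40 - coupling / 20 + comments
y = np.digitize(score, [-0.3, 0.5])          # 0 = low, 1 = medium, 2 = high

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                  random_state=0))
clf.fit(Xtr, ytr)
print("accuracy:", round(clf.score(Xte, yte), 3))
```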

  2. AISIM (Automated Interactive Simulation Model) Training Examples Manual.

    Science.gov (United States)

    1982-02-26

    [OCR-garbled report fragments; recoverable table-of-contents entries: 7 Message Traffic Characteristics (p. 21); 8 Message Traffic Matrices (p. 22); 9 Example 1 Model Structure.] Message destinations are selected according to the traffic matrices presented in Figure 8. Data messages each have three destinations which … destination, processed and eliminated. Each of these events can be modeled by procedural operations. … Many slots circulate on …

  3. Automated parametrical antenna modelling for ambient assisted living applications

    Science.gov (United States)

    Kazemzadeh, R.; John, W.; Mathis, W.

    2012-09-01

    In this paper a parametric modeling technique for a fast polynomial extraction of the physically relevant parameters of inductively coupled RFID/NFC (radio frequency identification/near field communication) antennas is presented. The polynomial model equations are obtained by means of a three-step procedure: first, full Partial Element Equivalent Circuit (PEEC) antenna models are determined by means of a number of parametric simulations within the input parameter range of a certain antenna class. Based on these models, the RLC antenna parameters are extracted in a subsequent model reduction step. Employing these parameters, polynomial equations describing the antenna parameter with respect to (w.r.t.) the overall antenna input parameter range are extracted by means of polynomial interpolation and approximation of the change of the polynomials' coefficients. The described approach is compared to the results of a reference PEEC solver with regard to accuracy and computation effort.

  4. A Local Search Modeling for Constrained Optimum Paths Problems (Extended Abstract

    Directory of Open Access Journals (Sweden)

    Quang Dung Pham

    2009-10-01

    Full Text Available Constrained Optimum Path (COP) problems appear in many real-life applications, especially on communication networks. Some of these problems have been considered and solved by specific techniques, which are usually difficult to extend. In this paper, we introduce a novel modeling for solving some COPs by local search. The modeling features compositionality, modularity and reuse, and strengthens the benefits of Constraint-Based Local Search. We also apply the modeling to the edge-disjoint paths problem (EDP). We show that side constraints can easily be added in the model. Computational results show the significance of the approach.

  5. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  6. Fast, Automated, 3D Modeling of Building Interiors

    Science.gov (United States)

    2012-10-30

    Cheng, M. Anderson, S. He, A. Zakhor, "Texture Mapping 3D Planar Models of Indoor Environments with Noisy Camera Poses," SPIE Electronic Imaging… successfully process noisy scans with non-zero registration error. Most of the processing is performed after a dramatic dimensionality reduction, yielding a… lobby and hallways of a hotel. Applying textures to these models is an important step in generating photorealistic visualizations of data.

  7. Automatic Model-Based Generation of Parameterized Test Cases Using Data Abstraction

    NARCIS (Netherlands)

    Calamé, Jens R.; Ioustinova, Natalia; Pol, van de Jaco; Romijn, J.M.T.; Smith, G.; Pol, van de J.C.

    2007-01-01

    Developing test suites is a costly and error-prone process. Model-based test generation tools facilitate this process by automatically generating test cases from system models. The applicability of these tools, however, depends on the size of the target systems. Here, we propose an approach to gener

  8. A "Brutus" model checking of a spi-calculus dialect (Extended Abstract)

    NARCIS (Netherlands)

    Gnesi, S.; Latella, D.; Lenzini, G.

    2000-01-01

    This paper proposes a preliminary framework in which protocols, expressed in a dialect of the spi-calculus, can be verified using model checking algorithms. In particular we define a formal semantics for a dialect of the spi-calculus based on labeled transition systems in such a way that the model c

  9. A Voyage to Arcturus: A model for automated management of a WLCG Tier-2 facility

    Science.gov (United States)

    Roy, Gareth; Crooks, David; Mertens, Lena; Mitchell, Mark; Purdie, Stuart; Cadellin Skipsey, Samuel; Britton, David

    2014-06-01

    With the current trend towards "On Demand Computing" in big-data environments, it is crucial that the deployment of services and resources becomes increasingly automated. Deployment based on cloud platforms is available for large-scale data centre environments, but these solutions can be too complex and heavyweight for smaller, resource-constrained WLCG Tier-2 sites. Given the desire for bespoke monitoring and collection of Grid-related metrics, a more lightweight and modular approach is needed. In this paper we present a model for a lightweight automated framework which can be used to build WLCG grid sites, based on "off the shelf" software components. The research into an automation framework includes the use of both IPMI and SNMP for physical device management, as well as the use of SNMP as a monitoring/data-sampling layer, such that more comprehensive decision making can take place and potentially be automated. This could lead to reduced downtimes and better performance, as services are recognised to be in a non-functional state by autonomous systems.

  10. Selected Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    The Elasticity of Vague Language Abstract: This paper develops an overarching theoretical framework for dealing with the strategic elasticity of vague language. Employing the metaphor of a slingshot, it describes how vague language (VL) is stretched to meet the needs of communication. Drawing attention to the positive and effective role played by VL, the study further looks into how the VL theory is derived from real-life data of tension-prone encounters. The empirical evidence validates the theory's main maxim and its four specific maxims, and lends support to the following findings: 1) the pragmatic functions which VL performs, their linguistic realizations, and the pragmatic maxims they conform to are interconnected; 2) the dominant factor in the functioning of VL is the communicative goal; and 3) stretching on a continuum of polarities, between soft and tough, firm and flexible, cooperative and uncooperative, shows the versatility and elasticity of VL. An important implication of this study is that while VL's vagueness is context-governed and culture-dependent, its …

  11. Towards automated model calibration and validation in rail transit simulation

    NARCIS (Netherlands)

    Huang, Y.; Seck, M.D.; Verbraeck, A.

    2012-01-01

    The benefit of modeling and simulation in rail transit operations has been demonstrated in various studies. However, the complex dynamics involved and the ever-changing environment in which rail systems evolve expose the limits of classical simulation. Changing environmental conditions and second or

  12. A simplified cellular automaton model for city traffic

    Energy Technology Data Exchange (ETDEWEB)

    Simon, P.M.; Nagel, K. [Los Alamos National Lab., NM (United States)]|[Santa Fe Inst., NM (United States)

    1997-12-31

    The authors systematically investigate the effect of blockage sites in a cellular automata model for traffic flow. Different scheduling schemes for the blockage sites are considered. None of them returns a linear relationship between the fraction of green time and the throughput. The authors use this information for a fast implementation of traffic in Dallas.
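
    A minimal sketch in the spirit of the study: a Nagel-Schreckenberg cellular automaton on a ring road with one signalized blockage site, measuring throughput for a given green-time fraction. All parameters are hypothetical, and the original paper's scheduling schemes are richer than this single fixed-cycle signal.

```python
import random

random.seed(0)
L, VMAX, P_SLOW, STEPS = 100, 5, 0.2, 200
CYCLE, GREEN = 20, 10          # signal cycle length and green time at site 50
cars = {i: 0 for i in range(0, L, 4)}   # position -> speed; density 0.25

def gap(pos, occupied, light_red):
    # Free cells ahead, treating the blockage site as occupied when red.
    d = 1
    while (pos + d) % L not in occupied and not (light_red and (pos + d) % L == 50):
        d += 1
    return d - 1

flow = 0
for step in range(STEPS):
    red = (step % CYCLE) >= GREEN       # blockage site 50 closed when red
    occupied = set(cars)
    nxt = {}
    for pos, v in cars.items():
        v = min(v + 1, VMAX)                   # accelerate
        v = min(v, gap(pos, occupied, red))    # brake for car ahead / red site
        if v > 0 and random.random() < P_SLOW:
            v -= 1                             # random slowdown
        nxt[(pos + v) % L] = v
        flow += v
    cars = nxt

print("mean flow per site and step:", flow / (STEPS * L))
```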

  13. Automated volumetric breast density derived by shape and appearance modeling

    Science.gov (United States)

    Malkov, Serghei; Kerlikowske, Karla; Shepherd, John

    2014-03-01

    The image shape and texture (appearance) estimation designed for facial recognition is a novel and promising approach for application in breast imaging. The purpose of this study was to apply a shape and appearance model to automatically estimate percent breast fibroglandular volume (%FGV) using digital mammograms. We built a shape and appearance model using 2000 full-field digital mammograms from the San Francisco Mammography Registry with known %FGV measured by single energy absorptiometry method. An affine transformation was used to remove rotation, translation and scale. Principal Component Analysis (PCA) was applied to extract significant and uncorrelated components of %FGV. To build an appearance model, we transformed the breast images into the mean texture image by piecewise linear image transformation. Using PCA the image pixels grey-scale values were converted into a reduced set of the shape and texture features. The stepwise regression with forward selection and backward elimination was used to estimate the outcome %FGV with shape and appearance features and other system parameters. The shape and appearance scores were found to correlate moderately to breast %FGV, dense tissue volume and actual breast volume, body mass index (BMI) and age. The highest Pearson correlation coefficient was equal 0.77 for the first shape PCA component and actual breast volume. The stepwise regression method with ten-fold cross-validation to predict %FGV from shape and appearance variables and other system outcome parameters generated a model with a correlation of r2 = 0.8. In conclusion, a shape and appearance model demonstrated excellent feasibility to extract variables useful for automatic %FGV estimation. Further exploring and testing of this approach is warranted.

  14. Drivers' communicative interactions: on-road observations and modelling for integration in future automation systems.

    Science.gov (United States)

    Portouli, Evangelia; Nathanael, Dimitris; Marmaras, Nicolas

    2014-01-01

    Social interactions with other road users are an essential component of the driving activity and may prove critical in view of future automation systems; still up to now they have received only limited attention in the scientific literature. In this paper, it is argued that drivers base their anticipations about the traffic scene to a large extent on observations of social behaviour of other 'animate human-vehicles'. It is further argued that in cases of uncertainty, drivers seek to establish a mutual situational awareness through deliberate communicative interactions. A linguistic model is proposed for modelling these communicative interactions. Empirical evidence from on-road observations and analysis of concurrent running commentary by 25 experienced drivers support the proposed model. It is suggested that the integration of a social interactions layer based on illocutionary acts in future driving support and automation systems will improve their performance towards matching human driver's expectations. Practitioner Summary: Interactions between drivers on the road may play a significant role in traffic coordination. On-road observations and running commentaries are presented as empirical evidence to support a model of such interactions; incorporation of drivers' interactions in future driving support and automation systems may improve their performance towards matching driver's expectations.

  15. Using Feature Modelling and Automations to Select among Cloud Solutions

    OpenAIRE

    Quinton, Clément; Duchien, Laurence; Heymans, patrick; Mouton, Stéphane; Charlier, Etienne

    2012-01-01

    International audience; Cloud computing is a major trend in distributed computing environments. Resources are accessed on demand by customers and are delivered as services by cloud providers in a pay-per-use model. Companies provide their applications as services and rely on cloud providers to provision, host and manage such applications on top of their infrastructure. However, the wide range of cloud solutions and the lack of knowledge in this domain is a real problem for companies when faci...

  16. Automated soil resources mapping based on decision tree and Bayesian predictive modeling

    Institute of Scientific and Technical Information of China (English)

    周斌; 张新刚; 王人潮

    2004-01-01

    This article presents two approaches for the automated building of knowledge bases for soil resources mapping. These methods used decision tree and Bayesian predictive modeling, respectively, to generate knowledge from training data. With these methods, building a knowledge base for automated soil mapping is easier than using the conventional knowledge acquisition approach. The knowledge bases built by these two methods were used by the knowledge classifier for soil type classification of the Longyou area, Zhejiang Province, China, using TM bi-temporal imagery and GIS data. To evaluate the performance of the resultant knowledge bases, the classification results were compared to an existing soil map based on field survey. The accuracy assessment and analysis of the resultant soil maps suggested that the knowledge bases built by these two methods were of good quality for mapping the distribution of soil classes over the study area.

  17. Modelling Venting and Pressure Build-up in a 18650 LCO Cell during Thermal Runaway (ABSTRACT)

    DEFF Research Database (Denmark)

    Coman, Paul Tiberiu; Veje, Christian; White, Ralph

    reactions in the anode, cathode and SEI, but also in electrochemical reactions and boiling of the electrolyte is developed for a cylindrical 18650 LCO cell (Lithium Cobalt Oxide). For determining the pressure and the temperature after venting, the isentropic flow equations are included in the model...

  18. AN ACCURACY ASSESSMENT OF AUTOMATED PHOTOGRAMMETRIC TECHNIQUES FOR 3D MODELING OF COMPLEX INTERIORS

    OpenAIRE

    Georgantas, A.; M. Brédif; Pierrot-Desseilligny, M.

    2012-01-01

    This paper presents a comparison of automatic photogrammetric techniques to terrestrial laser scanning for 3D modelling of complex interior spaces. We try to evaluate the automated photogrammetric techniques not only in terms of their geometric quality compared to laser scanning but also in terms of cost in money, acquisition and computational time. To this purpose we chose as test site a modern building’s stairway. APERO/MICMAC (©IGN), which is an Open Source photogrammetric software…

  19. Automated EEG monitoring in defining a chronic epilepsy model.

    Science.gov (United States)

    Mascott, C R; Gotman, J; Beaudet, A

    1994-01-01

    There has been a recent surge of interest in chronic animal models of epilepsy. Proper assessment of these models requires documentation of spontaneous seizures by EEG, observation, or both in each individual animal to confirm the presumed epileptic condition. We used the same automatic seizure detection system as that currently used for patients in our institution and many others. Electrodes were implanted in 43 rats before intraamygdalar administration of kainic acid (KA). Animals were monitored intermittently for 3 months. Nine of the rats were protected by anticonvulsants [pentobarbital (PB) and diazepam (DZP)] at the time of KA injection. Between 1 and 3 months after KA injection, spontaneous seizures were detected in 20 of the 34 unprotected animals (59%). Surprisingly, spontaneous seizures were also detected during the same period in 2 of the 9 protected animals that were intended to serve as nonepileptic controls. Although the absence of confirmed spontaneous seizures in the remaining animals cannot exclude their occurrence, it indicates that, if present, they are at least rare. On the other hand, definitive proof of epilepsy is invaluable in the attempt to interpret pathologic data from experimental brains.

  1. Emergence of Consensus in a Multi-Robot Network: from Abstract Models to Empirical Validation

    CERN Document Server

    Trianni, Vito; Reina, Andreagiovanni; Baronchelli, Andrea

    2016-01-01

    Consensus dynamics in decentralised multiagent systems are subject to intense studies, and several different models have been proposed and analysed. Among these, the naming game stands out for its simplicity and applicability to a wide range of phenomena and applications, from semiotics to engineering. Despite the wide range of studies available, the implementation of theoretical models in real distributed systems is not always straightforward, as the physical platform imposes several constraints that may have a bearing on the consensus dynamics. In this paper, we investigate the effects of an implementation of the naming game for the kilobot robotic platform, in which we consider concurrent execution of games and physical interferences. Consensus dynamics are analysed in the light of the continuously evolving communication network created by the robots, highlighting how the different regimes crucially depend on the robot density and on their ability to spread widely in the experimental arena. We find that ph...
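
    The naming game at the core of the study is simple to state in code. The mean-field sketch below (no spatial network or physical interference, unlike the kilobot experiments) shows how a population of agents converges on a single shared word:

```python
import random

random.seed(3)
N, GAMES = 50, 20000
vocab = [set() for _ in range(N)]   # each agent's word inventory
next_word = 0

for _ in range(GAMES):
    speaker, hearer = random.sample(range(N), 2)
    if not vocab[speaker]:               # invent a word if the inventory is empty
        vocab[speaker].add(next_word)
        next_word += 1
    word = random.choice(tuple(vocab[speaker]))
    if word in vocab[hearer]:            # success: both collapse to that word
        vocab[speaker] = {word}
        vocab[hearer] = {word}
    else:                                # failure: the hearer learns the word
        vocab[hearer].add(word)

words_in_play = {w for v in vocab for w in v}
print("distinct words after", GAMES, "games:", len(words_in_play))  # 1 at consensus
```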

  2. An Abstract Data Model for the IDEF0 Graphical Analysis Language

    Science.gov (United States)

    1990-01-11

    …models to be constructed. One approach to reducing such ambiguity is to replace or augment free-form text with a more syntactically defined data… whatever level was necessary to ensure an unambiguous interpretation of the system requirements. Marca and McGowan have written an excellent book which… Manufacturing (ICAM), which is directed towards increasing manufacturing productivity via computer technology, defined a subset of Ross’ Structured Analysis

  3. Models for identification of erroneous atom-to-atom mapping of reactions performed by automated algorithms.

    Science.gov (United States)

    Muller, Christophe; Marcou, Gilles; Horvath, Dragos; Aires-de-Sousa, João; Varnek, Alexandre

    2012-12-21

    Machine learning (SVM and JRip rule learner) methods have been used in conjunction with the Condensed Graph of Reaction (CGR) approach to identify errors in the atom-to-atom mapping of chemical reactions produced by an automated mapping tool by ChemAxon. The modeling has been performed on the three first enzymatic classes of metabolic reactions from the KEGG database. Each reaction has been converted into a CGR representing a pseudomolecule with conventional (single, double, aromatic, etc.) bonds and dynamic bonds characterizing chemical transformations. The ChemAxon tool was used to automatically detect the matching atom pairs in reagents and products. These automated mappings were analyzed by the human expert and classified as "correct" or "wrong". ISIDA fragment descriptors generated for CGRs for both correct and wrong mappings were used as attributes in machine learning. The learned models have been validated in n-fold cross-validation on the training set followed by a challenge to detect correct and wrong mappings within an external test set of reactions, never used for learning. Results show that both SVM and JRip models detect most of the wrongly mapped reactions. We believe that this approach could be used to identify erroneous atom-to-atom mapping performed by any automated algorithm.

  4. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large-scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. The requirements on software quality are therefore very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates, and thus simplifies, the use of third-party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change in the code can produce side effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  5. Poster Abstract: A Practical Model for Human-Smart Appliances Interaction

    DEFF Research Database (Denmark)

    Fürst, Jonathan; Fruergaard, Andreas; Johannesen, Marco Høvinghof

    2016-01-01

    …interaction with smart appliances. Two issues stand out: (1) How to impose logical locality when interacting with a smart appliance? (2) How to mediate conflicts between several persons in a room, or between building-wide policies and user preferences? We approach both problems by defining a general model for human-smart appliance interaction. We present a prototype implementation with an off-the-shelf smart lighting and heating system in a shared office space. Our approach minimizes the need for location metadata. It relies on a human-feedback loop (both sensor-based and manual) to identify the optimal...

  6. Simulation modeling of an automated material storage/retrieval system. [GPSS

    Energy Technology Data Exchange (ETDEWEB)

    Gallegos, J.

    1978-03-01

    A computer simulation model is presented for an automated material storage-and-retrieval system that utilizes bin loading. The model is written in General Purpose Simulation System (GPSS) language, and it functionally simulates incoming job-arrival rates, operator task times, and the travel times of the storage/retrieval vehicle. The model is utilized to evaluate the impact of employing different types of storage policies, the duration of wait times encountered in a priority batch queue, and the expected throughput capacity of the system under either maximum-input or peak-period conditions.
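
    The original model is written in GPSS; a present-day equivalent of its core (Poisson job arrivals queueing for a single storage/retrieval vehicle over one shift) can be sketched with SimPy. All rates below are hypothetical, and the priority batch queue of the original is omitted for brevity.

```python
import random
import simpy

random.seed(1)
ARRIVAL_MEAN, TASK_MEAN, SHIFT = 6.0, 5.0, 480.0   # minutes (hypothetical)
waits = []

def job(env, crane):
    arrived = env.now
    with crane.request() as req:       # queue for the storage/retrieval vehicle
        yield req
        waits.append(env.now - arrived)
        yield env.timeout(random.expovariate(1 / TASK_MEAN))  # travel + handling

def source(env, crane):
    while True:
        yield env.timeout(random.expovariate(1 / ARRIVAL_MEAN))
        env.process(job(env, crane))

env = simpy.Environment()
crane = simpy.Resource(env, capacity=1)
env.process(source(env, crane))
env.run(until=SHIFT)
print(f"jobs served: {len(waits)}, mean wait: {sum(waits) / len(waits):.1f} min")
```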

  7. Tool-driven Design and Automated Parameterization for Real-time Generic Drivetrain Models

    Directory of Open Access Journals (Sweden)

    Schwarz Christina

    2015-01-01

    Full Text Available Real-time dynamic drivetrain modeling approaches have great potential for development cost reduction in the automotive industry. Even though real-time drivetrain models are available, these solutions are specific to single transmission topologies. In this paper an environment for parameterization of a solution is proposed, based on a generic method applicable to all types of gear transmission topologies. This enables tool-guided modeling by non-experts in the fields of mechanical engineering and control theory, leading to reduced development and testing efforts. The approach is demonstrated for an exemplary automatic transmission using the environment for automated parameterization. Finally, the parameterization is validated via vehicle measurement data.

  8. Modeling Physical Processes at the Nanoscale—Insight into Self-Organization of Small Systems (abstract)

    Science.gov (United States)

    Proykova, Ana

    2009-04-01

    Essential contributions have been made in the field of finite-size systems of ingredients interacting with potentials of various ranges. Theoretical simulations have revealed peculiar size effects on stability, ground state structure, phases, and phase transformation of systems confined in space and time. Models developed in the field of pure physics (atomic and molecular clusters) have been extended and successfully transferred to finite-size systems that seem very different—small-scale financial markets, autoimmune reactions, and social group reactions to advertisements. The models show that small-scale markets diverge unexpectedly fast as a result of small fluctuations; autoimmune reactions are sequences of two discontinuous phase transitions; and social groups possess critical behavior (social percolation) under the influence of an external field (advertisement). Some predicted size-dependent properties have been experimentally observed. These findings lead to the hypothesis that restrictions on an object's size determine the object's total internal (configuration) and external (environmental) interactions. Since phases are emergent phenomena produced by self-organization of a large number of particles, the occurrence of a phase in a system containing a small number of ingredients is remarkable.

  9. Modelling of series of types of automated trenchless works tunneling

    Science.gov (United States)

    Gendarz, P.; Rzasinski, R.

    2016-08-01

    Microtunneling is the newest method for making underground installations. The method is the result of experience with, and techniques from, earlier trenchless underground-works methods. To develop this particular earthworks method further, it is considered reasonable to elaborate a series of types of tunneling-machine constructions. There are many machine design solutions, but the current goal is to develop a non-excavation robotized machine. Erosion machines for tunnels with main dimensions of 1600, 2000, 2500 and 3150 are designed with the use of computer-aided methods. The creation of the series of types of tunneling-machine constructions was preceded by an analysis of the current state of the art. The verification of the practical methodology for creating the systematic part series was based on the designed series of erosion machines. The following were developed: a method of construction similarity for the erosion machines; algorithmic methods for variant analyses of quantitative construction attributes in the I-DEAS advanced graphical program; and relational and program parameterization. The manufacturing process of the parts will be created, which allows the technological process to be verified on CNC machines. The models will be modified and the designs consulted with erosion-machine users and manufacturers such as Tauber Rohrbau GmbH & Co. KG from Münster and OHL ZS a.s. from Brno. The companies' acceptance will result in practical verification by the JUMARPOL company.

  10. An Automated Planning Model for RoF Heterogeneous Wireless Networks

    DEFF Research Database (Denmark)

    Shawky, Ahmed; Bergheim, Hans; Ragnarsson, Ólafur;

    2010-01-01

    The number of users in wireless WANs is increasing like never before, while at the same time users' bandwidth demands grow. The structure of third-generation wireless WANs makes it expensive for wireless ISPs to meet these demands. The FUTON architecture is a RoF heterogeneous wireless network architecture under development that will be cheaper to deploy and operate. This paper shows a method for planning an implementation of this architecture. The planning is done as automatically as possible, covering radio planning, fiber planning and network dimensioning. The outcome of the paper is a planning process that uses GIS data to automate planning for the entire architecture. The automated model uses a collection of scripts that can easily be modified for planning a FUTON architecture anywhere. The scripts are built from functions for the different tasks, in order to make them easy to extend and modify.

  11. Equational Abstractions

    Science.gov (United States)

    2007-01-01

  12. Automated classification of atherosclerotic plaque from magnetic resonance images using predictive models.

    Science.gov (United States)

    Anderson, Russell W; Stomberg, Christopher; Hahm, Charles W; Mani, Venkatesh; Samber, Daniel D; Itskovich, Vitalii V; Valera-Guallar, Laura; Fallon, John T; Nedanov, Pavel B; Huizenga, Joel; Fayad, Zahi A

    2007-01-01

    The information contained within multicontrast magnetic resonance images (MRI) promises to improve tissue classification accuracy, once appropriately analyzed. Predictive models capture relationships empirically, from known outcomes, thereby combining pattern classification with experience. In this study, we examine the applicability of predictive modeling for atherosclerotic plaque component classification of multicontrast ex vivo MR images using stained, histopathological sections as ground truth. Ten multicontrast images from seven human coronary artery specimens were obtained on a 9.4 T imaging system using multicontrast-weighted fast spin-echo (T1-, proton density-, and T2-weighted) imaging with 39-μm isotropic voxel size. Following initial data transformations, predictive modeling focused on automating the identification of each specimen's plaque, lipid, and media. The outputs of these three models were used to calculate statistics such as total plaque burden and the ratio of hard plaque (fibrous tissue) to lipid. Both logistic regression and an artificial neural network model (Relevant Input Processor Network, RIPNet) were used for predictive modeling. When compared against segmentation resulting from cluster analysis, the RIPNet models performed between 25 and 30% better in absolute terms. This translates to a 50% higher true positive rate over given levels of false positives. This work indicates that it is feasible to build an automated system of plaque detection using MRI and data mining.

  13. MAIN ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Reflection on Some Issues Regarding the System of Socialism with Chinese Characteristics Zhang Xingmao The establishment of the system of socialism with Chinese characteristics, as the symbol of China's entry into the socialist society with Chinese characteristics, is a significant development of the Marxist theory of social formation. The Chinese model is framed and defined by the socialist system with Chinese characteristics; therefore the study of different levels and aspects of the Chinese model should be related to the relevant Chinese system to guarantee a scientific interpretation. Under the fundamental system of socialism, the historical and logical starting point of the formation of socialism with Chinese characteristics lies in eliminating private ownership first and then allowing the existence and rapid development of the non-public sectors of the economy. With the gradual establishment of, and on the basis of, the basic economic system in the preliminary stage of socialism, and with the adaptive adjustments in the economic, political, cultural, and social systems, the socialist system with Chinese characteristics is gradually formed.

  14. An Ontology-based Model to Determine the Automation Level of an Automated Vehicle for Co-Driving

    OpenAIRE

    POLLARD, Evangeline; Morignot, Philippe; Fawzi NASHASHIBI

    2013-01-01

    International audience; Full autonomy of ground vehicles is a major goal of the ITS (Intelligent Transportation Systems) community. However, reaching the highest autonomy level in all situations (weather, traffic, ...) may seem difficult in practice, despite recent results regarding driverless cars (e.g., Google Cars). In addition, an automated vehicle should also self-assess its own perception abilities, and not only perceive its environment. In this paper, we propose an intermediate app...

  15. A Model-Based Analysis of Semi-Automated Data Discovery and Entry Using Automated Content Extraction

    Science.gov (United States)

    2011-02-01

    [Report documentation form residue removed (distribution statement: approved for public release, distribution unlimited). Recoverable notation fragments: S = number of sentences across all documents; WSa = words per sentence containing a relation …; WPa = words per paragraph …]

  16. Towards automated software model checking using graph transformation systems and Bogor

    Institute of Scientific and Technical Information of China (English)

    Vahid RAFE; Adel T.RAHMANI

    2009-01-01

    Graph transformation systems have become a general formal modeling language for describing many models in the software development process. Behavioral modeling of dynamic systems and model-to-model transformations are only a few examples in which graphs have been used in software development. But even a perfect graph transformation system must be equipped with automated analysis capabilities to let users understand whether such a formal specification fulfills their requirements. In this paper, we present a new solution for verifying graph transformation systems using the Bogor model checker. Attributed graph grammar (AGG)-like graph transformation systems are translated to the Bandera intermediate representation (BIR), the input language of Bogor, and Bogor verifies the model against interesting properties defined by combining linear temporal logic (LTL) and special-purpose graph rules. Experimental results are encouraging, showing that in most cases our solution improves existing approaches in terms of both performance and expressiveness.

  17. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Science.gov (United States)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided with CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools which allow building original software responsible for aiding different engineering activities. In this paper the original software worked out in order to automate engineering tasks at the stage of a product's geometrical shape design is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn with the standard tools of specialized CAD systems. This comes from the fact that in CAD systems an involute curve is usually drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear and the base circle respectively. In the Generator module the involute curve is drawn through 11 points located on and above the base and the addendum circles, so the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, reducing the gear wheel modelling time to several seconds. During the conducted research an analysis of the differences between standard 3-point and 11-point involutes was made. The results and the conclusions drawn from the analysis are shown in detail.
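
    As a rough illustration of the 11-point sampling described above, the sketch below computes points on a circle involute between the base and addendum circles; the gear parameters (module, tooth count, pressure angle) are invented for the example and are not taken from the paper.

        import numpy as np

        def involute_points(base_radius, addendum_radius, n_points=11):
            """Sample an involute of a circle of radius r_b:
                x(t) = r_b * (cos t + t * sin t)
                y(t) = r_b * (sin t - t * cos t)
            The radius at parameter t is r_b * sqrt(1 + t^2), so the
            parameter at the addendum circle is sqrt((r_a / r_b)^2 - 1).
            """
            t_max = np.sqrt((addendum_radius / base_radius) ** 2 - 1.0)
            t = np.linspace(0.0, t_max, n_points)
            x = base_radius * (np.cos(t) + t * np.sin(t))
            y = base_radius * (np.sin(t) - t * np.cos(t))
            return np.column_stack([x, y])

        # Illustrative gear: module 2 mm, 20 teeth, 20-degree pressure angle
        m, z, alpha = 2.0, 20, np.radians(20.0)
        r_pitch = m * z / 2.0
        r_base = r_pitch * np.cos(alpha)
        r_addendum = r_pitch + m
        print(involute_points(r_base, r_addendum).round(3))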

  18. Unifying Abstractions

    DEFF Research Database (Denmark)

    Torgersen, Mads

    This thesis presents the RUNE language, a semantic construction of related and tightly coupled programming constructs presented in the shape of a programming language. The major contribution is the successful design of a highly unified and general programming model, capable of expressing some of the most complex type relations put forth in type systems research, without compromising such fundamental qualities as conceptuality, modularity and static typing. While many new constructs and unifications are put forth to substantiate their conceptual validity, type rules are given to support their typeability and examples are described to demonstrate their use. Novel constructs include a parallel approach to object generation, and a blend of structural and nominal subtyping, while a very general class construct integrates the notions of procedures, parameterisation and genericity, and provides...

  19. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    Science.gov (United States)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

    Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment if they were not designed for it. An example is implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise, and it may take a developer months to learn LIS and the model software structure. Debugging and testing of the model implementation are also time-consuming when neither LIS nor the model is fully understood. This time spent is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. A general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model, and to provide state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development efforts need only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models. Code templates defined for this general model interface can be re-used with any specific model, so the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It accepts model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80-90% of the development load is reduced. In this presentation, the automated model implementation approach is described along with LIS programming
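
    The template-driven wrapper generation can be pictured with a few lines of Python (the actual toolkit is Excel/VBA emitting FORTRAN 90); the subroutine layout and the lis_state/noah_state names below are invented stand-ins, not the real LIS interface.

        from string import Template

        # Hypothetical template fragment for a LIS-style model wrapper;
        # field names below are illustrative, not the actual LIS interface.
        WRAPPER_TEMPLATE = Template("""\
        subroutine ${model}_wrapper(n)
          use ${model}_module
          implicit none
          integer, intent(in) :: n
          ! transfer forcing inputs, parameters, and state variables
        ${assignments}
          call ${model}_run(n)
        end subroutine ${model}_wrapper
        """)

        def generate_wrapper(model_name, state_vars):
            # One assignment line per state variable listed in the spec.
            assignments = "\n".join(
                f"  lis_state%{v} = {model_name}_state%{v}" for v in state_vars
            )
            return WRAPPER_TEMPLATE.substitute(model=model_name,
                                               assignments=assignments)

        print(generate_wrapper("noah", ["soil_moisture", "soil_temp", "swe"]))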

  20. An advanced distributed automated extraction of drainage network model on high-resolution DEM

    Science.gov (United States)

    Mao, Y.; Ye, A.; Xu, J.; Ma, F.; Deng, X.; Miao, C.; Gong, W.; Di, Z.

    2014-07-01

    A high-resolution, high-accuracy drainage network map is a prerequisite for simulating the water cycle in land surface hydrological models. The objective of this study was to develop a new automated drainage network extraction model that can produce a high-precision, continuous drainage network from a high-resolution DEM (Digital Elevation Model). Extracting a drainage network from a high-resolution DEM demands large computing resources, and the conventional GIS method often cannot complete the calculation on high-resolution DEMs of large basins because the number of grid cells is too large. In order to decrease the computation time, an advanced distributed automated extraction of drainage network model (Adam) was proposed in this study. The Adam model has two features: (1) searching upward from the outlet of the basin instead of sink filling, and (2) dividing sub-basins on a low-resolution DEM, then extracting the drainage network on sub-basins of the high-resolution DEM. The case study used elevation data of the Shuttle Radar Topography Mission (SRTM) at 3 arc-second resolution in the Zhujiang River basin, China. The results show that the Adam model can dramatically reduce the computation time. The extracted drainage network was continuous and more accurate than HydroSHEDS (Hydrological data and maps based on Shuttle Elevation Derivatives at multiple Scales).
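
    The first feature, searching upward from the outlet rather than sink-filling the whole DEM, amounts to a breadth-first traversal of the inverse flow-direction graph; the toy sketch below assumes a precomputed D8 drain-to mapping and is only a schematic of the idea, not the Adam implementation.

        from collections import deque

        # D8 flow directions: offsets of the 8 neighbours of a cell.
        OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                   (0, 1), (1, -1), (1, 0), (1, 1)]

        def upstream_cells(flow_dir_target, outlet):
            """Breadth-first search upward from the basin outlet.

            `flow_dir_target` maps each cell (i, j) to the cell it drains to;
            searching upstream visits exactly the cells draining to `outlet`,
            avoiding a global sink-filling pass over the whole DEM.
            """
            visited = {outlet}
            queue = deque([outlet])
            while queue:
                cell = queue.popleft()
                i, j = cell
                for di, dj in OFFSETS:
                    nb = (i + di, j + dj)
                    if nb not in visited and flow_dir_target.get(nb) == cell:
                        visited.add(nb)
                        queue.append(nb)
            return visited

        # Toy 3-cell drainage: (0,0) -> (1,1) -> (2,2), with (2,2) the outlet.
        flow = {(0, 0): (1, 1), (1, 1): (2, 2)}
        print(sorted(upstream_cells(flow, (2, 2))))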

  1. Automated modelling of spatially-distributed glacier ice thickness and volume

    Science.gov (United States)

    James, William H. M.; Carrivick, Jonathan L.

    2016-07-01

    Ice thickness distribution and volume are both key parameters for glaciological and hydrological applications. This study presents VOLTA (Volume and Topography Automation), a Python script tool for ArcGIS™ that requires just a digital elevation model (DEM) and glacier outline(s) to model distributed ice thickness, volume and bed topography. Ice thickness is initially estimated at points along an automatically generated centreline network based on the perfect-plasticity rheology assumption, taking into account a valley side drag component of the force balance equation. Distributed ice thickness is subsequently interpolated using a glaciologically correct algorithm. For five glaciers with independent field-measured bed topography, VOLTA modelled volumes ranged from a 26.5% underestimate to a 16.6% overestimate of the field-derived volumes. The greatest differences occurred where the valley cross section was asymmetric or where significant valley infill had occurred. Compared with other methods of modelling ice thickness and volume, key advantages of VOLTA are: a fully automated approach and a user-friendly graphical user interface (GUI), GIS-consistent geometry, fully automated centreline generation, inclusion of a side drag component in the force balance equation, estimation of basal shear stress for each individual glacier, fully distributed ice thickness output and the ability to process multiple glaciers rapidly. VOLTA is capable of regional-scale ice volume assessment, a key parameter for exploring glacier response to climate change. VOLTA also permits subtraction of modelled ice thickness from the input surface elevation to produce an ice-free DEM, which is a key input for reconstruction of former glaciers. VOLTA could assist with prediction of future glacier geometry changes and hence with projection of future meltwater fluxes.
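
    At each centreline point the perfect-plasticity assumption reduces to h = tau_b / (f * rho * g * sin(alpha)), with f a shape factor representing valley side drag. A minimal sketch follows, with illustrative defaults rather than VOLTA's calibrated settings.

        import math

        RHO_ICE = 917.0   # kg m^-3
        G = 9.81          # m s^-2

        def ice_thickness(tau_b, slope_deg, shape_factor=0.8, min_slope_deg=1.5):
            """Perfect-plasticity thickness h = tau_b / (f * rho * g * sin(alpha)).

            `shape_factor` (f) accounts for valley side drag; the minimum slope
            guards against the singularity as sin(alpha) -> 0. Values here are
            illustrative assumptions, not VOLTA's calibrated settings.
            """
            alpha = math.radians(max(slope_deg, min_slope_deg))
            return tau_b / (shape_factor * RHO_ICE * G * math.sin(alpha))

        # A basal shear stress of 100 kPa on a 5-degree surface slope:
        print(round(ice_thickness(100e3, 5.0), 1), "m")  # ~159.4 m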

  2. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke, E-mail: ksheng@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California 90024 (United States)

    2015-11-15

    Purpose: Significant dosimetric benefits have previously been demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the physical distance measurements to their virtual counterparts. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery, and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was

  3. An approach for model-based energy cost analysis of industrial automation systems

    Energy Technology Data Exchange (ETDEWEB)

    Beck, A.; Goehner, P. [Institute of Industrial Automation and Software Engineering, University of Stuttgart, Pfaffenwaldring 47, 70550 Stuttgart (Germany)

    2012-08-15

    Current energy reports confirm the steadily widening gap between available conventional energy resources and future energy demand. This gap results in increasing energy costs and has become a determining factor in economies. Hence, politics, industry, and research focus either on regenerative energy resources or on energy-efficient concepts, methods, and technologies for energy-consuming devices. A remaining challenge is the energy optimization of complex systems during their operation time. In addition to optimization measures that can be applied in development and engineering, the generation of optimization measures that are customized to the specific dynamic operational situation promises high cost-saving potential. During operation time, the systems are located in unique situations and environments and are operated according to the individual requirements of their users. Hence, in addition to the complexity of the systems, the individuality and dynamic variability of their surroundings during operation time complicate the identification of goal-oriented optimization measures. This contribution introduces a model-based approach for user-centric energy cost analysis of industrial automation systems. The approach allows automated generation and application of individual optimization proposals. The focus of this paper is a basic variant for a single industrial automation system and its operational parameters.

  4. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.
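
    As a toy illustration of such a translation (the paper's tool targets an automata-based intermediate model and several back-ends; the fragment below maps a single ST set/reset pattern straight to NuSMV syntax and is not the actual transformation):

        def st_bool_to_nusmv(var, set_cond, reset_cond):
            """Emit a NuSMV next-state relation for an ST-style set/reset latch:

                IF set_cond THEN var := TRUE;
                ELSIF reset_cond THEN var := FALSE;
                END_IF

            Only a boolean variable is handled; a real tool covers the full
            ST language via an intermediate model, as the paper describes.
            """
            return (
                f"next({var}) := case\n"
                f"  {set_cond} : TRUE;\n"
                f"  {reset_cond} : FALSE;\n"
                f"  TRUE : {var};\n"
                f"esac;"
            )

        print(st_bool_to_nusmv("motor_on", "start & !fault", "stop | fault"))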

  5. A Framework for Automated Spine and Vertebrae Interpolation-Based Detection and Model-Based Segmentation.

    Science.gov (United States)

    Korez, Robert; Ibragimov, Bulat; Likar, Boštjan; Pernuš, Franjo; Vrtovec, Tomaž

    2015-08-01

    Automated and semi-automated detection and segmentation of spinal and vertebral structures from computed tomography (CT) images is a challenging task due to a relatively high degree of anatomical complexity, presence of unclear boundaries and articulation of vertebrae with each other, as well as due to insufficient image spatial resolution, partial volume effects, presence of image artifacts, intensity variations and low signal-to-noise ratio. In this paper, we describe a novel framework for automated spine and vertebrae detection and segmentation from 3-D CT images. A novel optimization technique based on interpolation theory is applied to detect the location of the whole spine in the 3-D image and, using the obtained location of the whole spine, to further detect the location of individual vertebrae within the spinal column. The obtained vertebra detection results represent a robust and accurate initialization for the subsequent segmentation of individual vertebrae, which is performed by an improved shape-constrained deformable model approach. The framework was evaluated on two publicly available CT spine image databases of 50 lumbar and 170 thoracolumbar vertebrae. Quantitative comparison against corresponding reference vertebra segmentations yielded an overall mean centroid-to-centroid distance of 1.1 mm and Dice coefficient of 83.6% for vertebra detection, and an overall mean symmetric surface distance of 0.3 mm and Dice coefficient of 94.6% for vertebra segmentation. The results indicate that by applying the proposed automated detection and segmentation framework, vertebrae can be successfully detected and accurately segmented in 3-D from CT spine images.
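
    The Dice coefficient used in the evaluation is easy to state precisely, DSC = 2|A intersect B| / (|A| + |B|); a minimal numpy sketch on synthetic binary masks (not the paper's data):

        import numpy as np

        def dice_coefficient(seg_a, seg_b):
            """Dice similarity between two binary masks:
            DSC = 2 * |A & B| / (|A| + |B|)."""
            a = np.asarray(seg_a, dtype=bool)
            b = np.asarray(seg_b, dtype=bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        # Two overlapping toy segmentations on a 10 x 10 grid.
        auto = np.zeros((10, 10), dtype=bool); auto[2:8, 2:8] = True
        ref = np.zeros((10, 10), dtype=bool); ref[3:8, 2:8] = True
        print(f"Dice: {dice_coefficient(auto, ref):.3f}")  # 0.909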

  6. Abstraction and Learning for Infinite-State Compositional Verification

    Directory of Open Access Journals (Sweden)

    Dimitra Giannakopoulou

    2013-09-01

    Full Text Available Despite many advances that enable the application of model checking techniques to the verification of large systems, the state-explosion problem remains the main challenge for scalability. Compositional verification addresses this challenge by decomposing the verification of a large system into the verification of its components. Recent techniques use learning-based approaches to automate compositional verification based on the assume-guarantee style reasoning. However, these techniques are only applicable to finite-state systems. In this work, we propose a new framework that interleaves abstraction and learning to perform automated compositional verification of infinite-state systems. We also discuss the role of learning and abstraction in the related context of interface generation for infinite-state components.

  7. Intelligent sensor-model automated control of PMR-15 autoclave processing

    Science.gov (United States)

    Hart, S.; Kranbuehl, D.; Loos, A.; Hinds, B.; Koury, J.

    1992-01-01

    An intelligent sensor-model system has been built and used for automated control of the PMR-15 cure process in the autoclave. The system uses frequency-dependent electromagnetic sensing (FDEMS), the Loos processing model, and the Air Force QPAL intelligent software shell. The Loos model is used to predict and optimize the cure process, including the time-temperature dependence of the extent of reaction, flow, and part consolidation. The FDEMS sensing system in turn monitors, in situ, the removal of solvent, changes in the viscosity, reaction advancement and cure completion in the mold continuously throughout the processing cycle. The sensor information is compared with the optimum processing conditions from the model. The QPAL composite cure control system allows the comparison of the sensor monitoring with the model predictions to be broken down into a series of discrete steps and provides a language for making decisions on what to do next regarding time-temperature and pressure.

  8. Electronic design automation of analog ICs combining gradient models with multi-objective evolutionary algorithms

    CERN Document Server

    Rocha, Frederico AE; Lourenço, Nuno CC; Horta, Nuno CG

    2013-01-01

    This book applies to the scientific area of electronic design automation (EDA) and addresses the automatic sizing of analog integrated circuits (ICs). In particular, this book presents an approach to enhance a state-of-the-art layout-aware circuit-level optimizer (GENOM-POF) by embedding statistical knowledge from an automatically generated gradient model into the multi-objective multi-constraint optimization kernel based on the NSGA-II algorithm. The results shown allow the designer to explore the different trade-offs of the solution space, both through the achieved device sizes, or the resp

  9. A Computational Approach for Automated Posturing of a Human Finite Element Model

    Science.gov (United States)

    2016-07-01

    [Abstract not recovered: the record consists of report documentation page residue. Recoverable details: work performed under contract/instrument W911QX-14-C-0016 with the US Army Research Laboratory; July 2016 Memorandum Report, A Computational Approach for Automated Posturing of a Human Finite Element Model, by Justin McKee and Adam ...]

  10. Three Applications of Automated Test Assembly within a User-Friendly Modeling Environment

    Directory of Open Access Journals (Sweden)

    Ken Cor

    2009-06-01

    Full Text Available While linear programming is a common tool in business and industry, there have not been many applications in educational assessment and only a handful of individuals have been actively involved in conducting psychometric research in this area. Perhaps this is due, at least in part, to the complexity of existing software packages. This article presents three applications of linear programming to automate test assembly using an add-in to Microsoft Excel 2007. These increasingly complex examples permit the reader to readily see and manipulate the programming objectives and constraints within a familiar modeling environment. A spreadsheet used in this demonstration is available for downloading.
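
    A miniature version of such a test-assembly program, written here with the PuLP modeling package rather than the Excel add-in the article uses; the item pool, information values, and constraints are invented for illustration:

        import pulp

        # item id: (information at the target ability level, content area)
        items = {
            "i1": (0.9, "algebra"), "i2": (0.7, "algebra"),
            "i3": (0.8, "geometry"), "i4": (0.5, "geometry"),
            "i5": (0.6, "algebra"),
        }

        prob = pulp.LpProblem("test_assembly", pulp.LpMaximize)
        x = {i: pulp.LpVariable(i, cat="Binary") for i in items}

        # Objective: maximize total information of the selected items.
        prob += pulp.lpSum(items[i][0] * x[i] for i in items)
        # Constraints: fixed test length, minimum geometry coverage.
        prob += pulp.lpSum(x.values()) == 3
        prob += pulp.lpSum(x[i] for i in items if items[i][1] == "geometry") >= 1

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        print([i for i in items if x[i].value() == 1])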

  11. An automated model-based aim point distribution system for solar towers

    Science.gov (United States)

    Schwarzbözl, Peter; Rong, Amadeus; Macke, Ansgar; Säck, Jan-Peter; Ulmer, Steffen

    2016-05-01

    Distribution of heliostat aim points is a major task during central receiver operation, as the flux distribution produced by the heliostats varies continuously with time. Known methods for aim point distribution are mostly based on simple aim point patterns and focus on control strategies to meet local temperature and flux limits of the receiver. Lowering the peak flux on the receiver to avoid hot spots and maximizing thermal output are obviously competing targets that call for a comprehensive optimization process. This paper presents a model-based method for online aim point optimization that includes the current heliostat field mirror quality derived through an automated deflectometric measurement process.

  12. Fully Automated Non-Native Speech Recognition Using Confusion-Based Acoustic Model Integration

    OpenAIRE

    Bouselmi, Ghazi; Fohr, Dominique; Illina, Irina; Haton, Jean-Paul

    2005-01-01

    This paper presents a fully automated approach for the recognition of non-native speech based on acoustic model modification. For a native language (L1) and a spoken language (L2), pronunciation variants of the phones of L2 are automatically extracted from an existing non-native database as a confusion matrix with sequences of phones of L1. This is done using L1's and L2's ASR systems. This confusion concept deals with the problem that some L2 phones have no direct match among the L1 phones. The c...

  13. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    ... with pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol software implementing the IETF WebSocket protocol. This demonstrates the scalability of our approach to real protocols. Furthermore, we perform formal verification of the CPN model prior to code generation, and test the implementation for interoperability against the Autobahn WebSocket test-suite, resulting in 97% and 99% success rates for the client and server implementation, respectively. The tests show...

  14. Automated Translation and Thermal Zoning of Digital Building Models for Energy Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Nathaniel L. [Cornell University; McCrone, Colin J. [Cornell University; Walter, Bruce J. [Cornell University; Pratt, Kevin B. [Cornell University; Greenberg, Donald P. [Cornell University

    2013-08-26

    Building energy simulation is valuable during the early stages of design, when decisions can have the greatest impact on energy performance. However, preparing digital design models for building energy simulation typically requires tedious manual alteration. This paper describes a series of five automated steps to translate geometric data from an unzoned CAD model into a multi-zone building energy model. First, CAD input is interpreted as geometric surfaces with materials. Second, surface pairs defining walls of various thicknesses are identified. Third, normal directions of unpaired surfaces are determined. Fourth, space boundaries are defined. Fifth, optionally, settings from previous simulations are applied, and spaces are aggregated into a smaller number of thermal zones. Building energy models created quickly using this method can offer guidance throughout the design process.

  15. Automated brain structure segmentation based on atlas registration and appearance models

    DEFF Research Database (Denmark)

    van der Lijn, Fedde; de Bruijne, Marleen; Klein, Stefan;

    2012-01-01

    Accurate automated brain structure segmentation methods facilitate the analysis of large-scale neuroimaging studies. This work describes a novel method for brain structure segmentation in magnetic resonance images that combines information about a structure's location and appearance. The spatial model is implemented by registering multiple atlas images to the target image and creating a spatial probability map. The structure's appearance is modeled by a classifier based on Gaussian scale-space features. These components are combined with a regularization term in a Bayesian framework that is globally optimized using graph cuts. The incorporation of the appearance model enables the method to segment structures with complex intensity distributions and increases its robustness against errors in the spatial model. The method is tested in cross-validation experiments on two datasets acquired...

  16. Automated Behavioral Phenotyping Reveals Presymptomatic Alterations in a SCA3 Genetrap Mouse Model

    Institute of Scientific and Technical Information of China (English)

    Jeannette Hübener; Nicolas Casadei; Peter Teismann; Mathias W. Seeliger; Maria Björkqvist; Stephan von Hörsten; Olaf Riess; Huu Phuc Nguyen

    2012-01-01

    Characterization of disease models of neurodegenerative disorders requires systematic and comprehensive phenotyping in a highly standardized manner. Therefore, automated high-resolution behavior test systems such as the homecage-based LabMaster system are of particular interest. We demonstrate the power of the automated LabMaster system by discovering previously unrecognized features of a recently characterized atxn3 mutant mouse model. This model presented neurological symptoms including gait ataxia, tremor, weight loss and premature death at the age of 12 months, usually detectable just 2 weeks before the mice died. Moreover, using the LabMaster system we were able to detect hypoactivity in presymptomatic mutant mice in the dark as well as in the light phase. Additionally, we analyzed inflammation, immunological and hematological parameters, which indicated a reduced immune defense in phenotypic mice. Here we demonstrate that a detailed characterization even of organ systems that are usually not affected in SCA3 is important for further studies of pathogenesis and is required for preclinical therapeutic studies.

  17. On XML and Automation APIs-Based CAD Model Integration and Exchange

    Institute of Scientific and Technical Information of China (English)

    Chen Manhua

    2011-01-01

    The exchange and sharing of data among heterogeneous sources (such as different CAD systems) is at the core of information integration in virtual product development. The design-history-based parametric modelling approach is generally adopted to exchange models; however, the exchange of feature and geometric information tends to make each translator large, and because the translators use a procedural model as the neutral file, the geometric modelling kernel must generate an internal explicit model. To overcome these problems, this study presents a sharing and integration platform called MidCAD. The platform separates the translator from the neutral file, so that the translators of different commercial CAD systems interact only with MidCAD. Automation APIs developed by Microsoft are adopted to implement interactive communication between MidCAD and commercial CAD systems. Using the Automation APIs of MidCAD, a translator converts a source CAD model into an XML macro file, or converts an XML macro file into a model of the target CAD system. The MidCAD integration platform has been verified with a simple example.

  18. A conceptual model of the automated credibility assessment of the volunteered geographic information

    Science.gov (United States)

    Idris, N. H.; Jackson, M. J.; Ishak, M. H. I.

    2014-02-01

    The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to support the development of Digital Earth. The main barrier to the use of these data in supporting this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assessing these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches to automatically assess the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. Two main components are proposed to be assessed in the conceptual model: metadata and data. The metadata component comprises indicators for the hosting websites and the sources of the data/information. The data component comprises indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess both components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess the correctness and consistency in the data component, we suggest a matching validation approach using emerging technologies from Linked Data infrastructures together with third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality issues of data contributed by citizen web providers.

  19. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    R. Schreiner

    2001-06-27

    The purpose of this work is to develop the Engineered Barrier System (EBS) radionuclide transport abstraction model, as directed by a written development plan (CRWMS M&O 1999a). This abstraction is the conceptual model that will be used to determine the rate of release of radionuclides from the EBS to the unsaturated zone (UZ) in the total system performance assessment-license application (TSPA-LA). In particular, this model will be used to quantify the time-dependent radionuclide releases from a failed waste package (WP) and their subsequent transport through the EBS to the emplacement drift wall/UZ interface. The development of this conceptual model will allow Performance Assessment Operations (PAO) and its Engineered Barrier Performance Department to provide a more detailed and complete EBS flow and transport abstraction. The results from this conceptual model will allow PAO to address portions of the key technical issues (KTIs) presented in three NRC Issue Resolution Status Reports (IRSRs): (1) the Evolution of the Near-Field Environment (ENFE), Revision 2 (NRC 1999a), (2) the Container Life and Source Term (CLST), Revision 2 (NRC 1999b), and (3) the Thermal Effects on Flow (TEF), Revision 1 (NRC 1998). The conceptual model for flow and transport in the EBS will be referred to as the "EBS RT Abstraction" in this analysis/modeling report (AMR). The scope of this abstraction and report is limited to flow and transport processes. More specifically, this AMR does not discuss elements of the TSPA-SR and TSPA-LA that relate to the EBS but are discussed in other AMRs. These elements include corrosion processes, radionuclide solubility limits, waste form dissolution rates, and concentrations of colloidal particles that are generally represented as boundary conditions or input parameters for the EBS RT Abstraction. In effect, this AMR provides the algorithms for transporting radionuclides using the flow geometry and radionuclide concentrations

  20. Towards Composable Concurrency Abstractions

    Directory of Open Access Journals (Sweden)

    Janwillem Swalens

    2014-06-01

    Full Text Available In the past decades, many different programming models for managing concurrency in applications have been proposed, such as the actor model, Communicating Sequential Processes, and Software Transactional Memory. The ubiquity of multi-core processors has made harnessing concurrency even more important. We observe that modern languages, such as Scala, Clojure, or F#, provide not one, but multiple concurrency models that help developers manage concurrency. Large end-user applications are rarely built using just a single concurrency model. Programmers need to manage a responsive UI, deal with file or network I/O, asynchronous workflows, and shared resources. Different concurrency models facilitate different requirements. This raises the issue of how these concurrency models interact, and whether they are composable. After all, combining different concurrency models may lead to subtle bugs or inconsistencies. In this paper, we perform an in-depth study of the concurrency abstractions provided by the Clojure language. We study all pairwise combinations of the abstractions, noting which ones compose without issues, and which do not. We make an attempt to abstract from the specifics of Clojure, identifying the general properties of concurrency models that facilitate or hinder composition.

  1. Abstracts of SIG Sessions.

    Science.gov (United States)

    Proceedings of the ASIS Annual Meeting, 1995

    1995-01-01

    Presents abstracts of 15 special interest group (SIG) sessions. Topics include navigation and information utilization in the Internet, natural language processing, automatic indexing, image indexing, classification, users' models of database searching, online public access catalogs, education for information professions, information services,…

  2. An Interactive Tool For Semi-automated Statistical Prediction Using Earth Observations and Models

    Science.gov (United States)

    Zaitchik, B. F.; Berhane, F.; Tadesse, T.

    2015-12-01

    We developed a semi-automated statistical prediction tool applicable to concurrent analysis or seasonal prediction of any time series variable in any geographic location. The tool was developed using Shiny, JavaScript, HTML and CSS. A user can extract a predictand by drawing a polygon over a region of interest on the provided user interface (global map). The user can select the Climatic Research Unit (CRU) precipitation or Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS) as predictand. They can also upload their own predictand time series. Predictors can be extracted from sea surface temperature, sea level pressure, winds at different pressure levels, air temperature at various pressure levels, and geopotential height at different pressure levels. By default, reanalysis fields are applied as predictors, but the user can also upload their own predictors, including a wide range of compatible satellite-derived datasets. The package generates correlations of the variables selected with the predictand. The user also has the option to generate composites of the variables based on the predictand. Next, the user can extract predictors by drawing polygons over the regions that show strong correlations (composites). Then, the user can select some or all of the statistical prediction models provided. Provided models include Linear Regression models (GLM, SGLM), Tree-based models (bagging, random forest, boosting), Artificial Neural Network, and other non-linear models such as Generalized Additive Model (GAM) and Multivariate Adaptive Regression Splines (MARS). Finally, the user can download the analysis steps they used, such as the region they selected, the time period they specified, the predictand and predictors they chose and preprocessing options they used, and the model results in PDF or HTML format. Key words: Semi-automated prediction, Shiny, R, GLM, ANN, RF, GAM, MARS
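
    One building block of such a tool, the predictand-versus-field correlation map, is compact enough to sketch; the series and field below are synthetic stand-ins for, e.g., a seasonal CHIRPS rainfall series and a reanalysis SST field.

        import numpy as np

        def correlation_map(predictand, field):
            """Pearson correlation between a 1-D predictand series (time,)
            and every grid cell of a 3-D field (time, lat, lon)."""
            y = (predictand - predictand.mean()) / predictand.std()
            f = (field - field.mean(axis=0)) / field.std(axis=0)
            return np.einsum("t,tij->ij", y, f) / len(y)

        rng = np.random.default_rng(0)
        rain = rng.normal(size=30)            # synthetic predictand series
        sst = rng.normal(size=(30, 4, 5))     # synthetic predictor field
        sst[:, 1, 2] += 0.8 * rain            # plant a teleconnection
        print(correlation_map(rain, sst)[1, 2])  # clearly elevated correlation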

  3. An automated method to build groundwater model hydrostratigraphy from airborne electromagnetic data and lithological borehole logs

    Directory of Open Access Journals (Sweden)

    P. A. Marker

    2015-02-01

    Full Text Available Large-scale integrated hydrological models are important decision support tools in water resources management. The largest source of uncertainty in such models is the hydrostratigraphic model. Geometry and configuration of hydrogeological units are often poorly determined from hydrogeological data alone. Due to sparse sampling in space, lithological borehole logs may overlook structures that are important for groundwater flow at larger scales. Good spatial coverage along with high spatial resolution makes airborne time-domain electromagnetic (AEM data valuable for the structural input to large-scale groundwater models. We present a novel method to automatically integrate large AEM data-sets and lithological information into large-scale hydrological models. Clay-fraction maps are produced by translating geophysical resistivity into clay-fraction values using lithological borehole information. Voxel models of electrical resistivity and clay fraction are classified into hydrostratigraphic zones using k-means clustering. Hydraulic conductivity values of the zones are estimated by hydrological calibration using hydraulic head and stream discharge observations. The method is applied to a Danish case study. Benchmarking hydrological performance by comparison of simulated hydrological state variables, the cluster model performed competitively. Calibrations of 11 hydrostratigraphic cluster models with 1–11 hydraulic conductivity zones showed improved hydrological performance with increasing number of clusters. Beyond the 5-cluster model hydrological performance did not improve. Due to reproducibility and possibility of method standardization and automation, we believe that hydrostratigraphic model generation with the proposed method has important prospects for groundwater models used in water resources management.
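
    The clustering step can be pictured with scikit-learn's k-means on synthetic voxel features (the log-resistivity and clay-fraction values are invented for the example; the study's resistivity-to-clay-fraction translation and hydrological calibration are not shown):

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Illustrative voxel features: [log-resistivity, clay fraction].
        rng = np.random.default_rng(1)
        voxels = np.vstack([
            rng.normal([1.0, 0.7], 0.1, size=(500, 2)),   # clay-rich unit
            rng.normal([2.2, 0.1], 0.1, size=(500, 2)),   # sandy unit
        ])

        # Standardize, then classify voxels into hydrostratigraphic zones.
        features = StandardScaler().fit_transform(voxels)
        zones = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
        print(np.bincount(zones))  # voxel counts per zone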

  4. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    J. Prouty

    2006-07-14

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment (TSPA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport model considers advective transport and diffusive transport

  5. Effective automated prediction of vertebral column pathologies based on logistic model tree with SMOTE preprocessing.

    Science.gov (United States)

    Karabulut, Esra Mahsereci; Ibrikci, Turgay

    2014-05-01

    This study develops a logistic model tree based automated system for accurate recognition of the types of vertebral column pathologies. Six biomechanical measures are used for this purpose: pelvic incidence, pelvic tilt, lumbar lordosis angle, sacral slope, pelvic radius and grade of spondylolisthesis. A two-phase classification model is employed in which the first step is preprocessing the data by use of the Synthetic Minority Over-sampling Technique (SMOTE), and the second is feeding the classifier Logistic Model Tree (LMT) with the preprocessed data. We have achieved an accuracy of 89.73% and an Area Under Curve (AUC) of 0.964 in computer-based automatic detection of the pathology. This was validated via a 10-fold cross-validation experiment conducted on the clinical records of 310 patients. The study also presents a comparative analysis of the vertebral column data with the use of several machine learning algorithms.
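
    The two-phase scheme is straightforward to reproduce in outline with imbalanced-learn and scikit-learn; scikit-learn has no Logistic Model Tree, so a plain decision tree stands in here, and the dataset is synthetic rather than the paper's clinical records.

        from imblearn.pipeline import Pipeline
        from imblearn.over_sampling import SMOTE
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.datasets import make_classification

        # Synthetic imbalanced data: 310 records, 6 biomechanical-style features.
        X, y = make_classification(n_samples=310, n_features=6,
                                   weights=[0.8], random_state=0)

        # Phase 1: SMOTE oversampling; phase 2: tree classifier (LMT stand-in).
        model = Pipeline([
            ("smote", SMOTE(random_state=0)),
            ("tree", DecisionTreeClassifier(max_depth=4, random_state=0)),
        ])
        print(cross_val_score(model, X, y, cv=10).mean())  # 10-fold CV accuracy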

  6. Policy-Based Automation of Dynamic and Multipoint Virtual Private Network Simulation on OPNET Modeler

    Directory of Open Access Journals (Sweden)

    Ayoub BAHNASSE

    2014-12-01

    Full Text Available The simulation of large-scale networks is a challenging task, especially if the network to simulate is a Dynamic Multipoint Virtual Private Network, which requires expert knowledge to properly configure its component technologies. The study of these network architectures in a real environment is almost impossible because it requires a very large amount of equipment; however, this task is feasible in a simulation environment like OPNET Modeler, provided that both the tool and the different architectures of the Dynamic Multipoint Virtual Private Network are mastered. Several research studies have been conducted to automate the generation and simulation of complex networks under various simulators; according to our research, no work has dealt with the Dynamic Multipoint Virtual Private Network. In this paper we present a simulation model of the Dynamic Multipoint Virtual Private Network in OPNET Modeler, and a Web-based tool for project management on the same network.

  7. An advanced distributed automated extraction of drainage network model on high-resolution DEM

    Directory of Open Access Journals (Sweden)

    Y. Mao

    2014-07-01

    distributed automated extraction of drainage network model (Adam) was proposed in the study. The Adam model has two features: (1) searching upward from the outlet of the basin instead of sink filling, (2) dividing sub-basins on a low-resolution DEM, and then extracting the drainage network on sub-basins of the high-resolution DEM. The case study used elevation data of the Shuttle Radar Topography Mission (SRTM) at 3 arc-second resolution in the Zhujiang River basin, China. The results show the Adam model can dramatically reduce the computation time. The extracted drainage network was continuous and more accurate than HydroSHEDS (Hydrological data and maps based on Shuttle Elevation Derivatives at multiple Scales).

  8. Improved automated diagnosis of misfire in internal combustion engines based on simulation models

    Science.gov (United States)

    Chen, Jian; Bond Randall, Robert

    2015-12-01

    In this paper, a new advance in the application of Artificial Neural Networks (ANNs) to the automated diagnosis of misfires in Internal Combustion engines(IC engines) is detailed. The automated diagnostic system comprises three stages: fault detection, fault localization and fault severity identification. Particularly, in the severity identification stage, separate Multi-Layer Perceptron networks (MLPs) with saturating linear transfer functions were designed for individual speed conditions, so they could achieve finer classification. In order to obtain sufficient data for the network training, numerical simulation was used to simulate different ranges of misfires in the engine. The simulation models need to be updated and evaluated using experimental data, so a series of experiments were first carried out on the engine test rig to capture the vibration signals for both normal condition and with a range of misfires. Two methods were used for the misfire diagnosis: one is based on the torsional vibration signals of the crankshaft and the other on the angular acceleration signals (rotational motion) of the engine block. Following the signal processing of the experimental and simulation signals, the best features were selected as the inputs to ANN networks. The ANN systems were trained using only the simulated data and tested using real experimental cases, indicating that the simulation model can be used for a wider range of faults for which it can still be considered valid. The final results have shown that the diagnostic system based on simulation can efficiently diagnose misfire, including location and severity.

  9. Semi-automated calibration method for modelling of mountain permafrost evolution in Switzerland

    Science.gov (United States)

    Marmy, Antoine; Rajczak, Jan; Delaloye, Reynald; Hilbich, Christin; Hoelzle, Martin; Kotlarski, Sven; Lambiel, Christophe; Noetzli, Jeannette; Phillips, Marcia; Salzmann, Nadine; Staub, Benno; Hauck, Christian

    2016-11-01

    Permafrost is a widespread phenomenon in mountainous regions of the world such as the European Alps. Many important topics, such as the future evolution of permafrost under climate change and the detection of permafrost at potential natural hazard sites, are of major concern to our society. Numerical permafrost models are the only tools which allow for the projection of the future evolution of permafrost. Due to the complexity of the processes involved and the heterogeneity of Alpine terrain, models must be carefully calibrated, and results should be compared with observations at the site (borehole) scale. However, for large-scale applications, a site-specific model calibration for a multitude of grid points would be very time-consuming. To tackle this issue, this study presents a semi-automated calibration method using Generalized Likelihood Uncertainty Estimation (GLUE) as implemented in a 1-D soil model (CoupModel) and applies it to six permafrost sites in the Swiss Alps. We show that this semi-automated calibration method is able to accurately reproduce the main thermal characteristics, with some limitations at sites with unique conditions such as 3-D air or water circulation, which have to be calibrated manually. The calibration obtained was used for global and regional climate model (GCM/RCM)-based long-term climate projections under the A1B climate scenario (EU-ENSEMBLES project), specifically downscaled at each borehole site. The projection shows general permafrost degradation with thawing at 10 m, even partially reaching 20 m depth by the end of the century, but with different timing among the sites and with partly considerable uncertainties due to the spread of the applied climatic forcing.

  10. Seismic Consequence Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    M. Gross

    2004-10-25

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in "Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences" (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the "Yucca Mountain Review Plan, Final Report" (NRC 2003 [DIRS 163274]).

  11. A new automated method for analysis of gated-SPECT images based on a three-dimensional heart shaped model

    DEFF Research Database (Denmark)

    Lomsky, Milan; Richter, Jens; Johansson, Lena

    2005-01-01

    A new automated method for quantification of left ventricular function from gated single-photon emission computed tomography (SPECT) images has been developed. The method for quantification of cardiac function (CAFU) is based on a heart-shaped model and the active shape algorithm. The model...

  12. A Collaborative System Software Solution for Modeling Business Flows Based on Automated Semantic Web Service Composition

    Directory of Open Access Journals (Sweden)

    Ion SMEUREANU

    2009-01-01

    Full Text Available Nowadays, business interoperability is one of the key factors for assuring competitive advantage for the participating business partners. In order to implement business cooperation, scalable, distributed and portable collaborative systems have to be implemented. This article presents some of the most widely used technologies in this field. Furthermore, it presents a software application architecture based on the Business Process Modeling Notation standard and automated semantic web service coupling for modeling business flow in a collaborative manner. The main business processes are represented in a single, hierarchic flow diagram. Each element of the diagram represents calls to semantic web services. The business logic (the business rules and constraints) is structured with the help of OWL (Web Ontology Language). Moreover, OWL is also used to create the semantic web service specifications.

  13. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    JUDI, DAVID [Los Alamos National Laboratory; KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY [Los Alamos National Laboratory; BERSCHEID, ALAN [Los Alamos National Laboratory

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in the ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with rapid cross-section generation, the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels, using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.
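
    The core station-point operation, sampling DEM elevations along a line perpendicular to the centerline, can be sketched in a few lines of numpy; the nearest-neighbour sampling and the synthetic valley DEM below are simplifications of what a GIS tool would do, not the CMT itself.

        import numpy as np

        def cross_section_stations(center, direction, half_width, spacing, dem, cell):
            """Station points with DEM elevations along a cross-section
            perpendicular to the stream centerline at `center`.

            `direction` is the unit tangent of the centerline; `cell` is the
            DEM cell size. Elevations come from nearest-neighbour lookup.
            """
            normal = np.array([-direction[1], direction[0]])
            offsets = np.arange(-half_width, half_width + spacing, spacing)
            xy = np.asarray(center) + offsets[:, None] * normal
            rows = (xy[:, 1] / cell).round().astype(int)
            cols = (xy[:, 0] / cell).round().astype(int)
            return np.column_stack([offsets, dem[rows, cols]])

        # Synthetic valley DEM: downstream slope plus a V-shaped cross slope.
        dem = np.add.outer(np.linspace(5, 0, 50), np.abs(np.linspace(-2, 2, 50)))
        section = cross_section_stations((25.0, 25.0), (0.0, 1.0), 10.0, 2.5, dem, 1.0)
        print(section)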

  14. An Accuracy Assessment of Automated Photogrammetric Techniques for 3d Modeling of Complex Interiors

    Science.gov (United States)

    Georgantas, A.; Brédif, M.; Pierrot-Desseilligny, M.

    2012-07-01

    This paper presents a comparison of automatic photogrammetric techniques to terrestrial laser scanning for 3D modelling of complex interior spaces. We try to evaluate the automated photogrammetric techniques not only in terms of their geometric quality compared to laser scanning but also in terms of monetary cost and acquisition and computational time. For this purpose we chose a modern building's stairway as the test site. APERO/MICMAC (©IGN), an open-source photogrammetric software suite, was used for the production of the 3D photogrammetric point cloud, which was compared to the one acquired by a Leica ScanStation 2 laser scanner. After performing various qualitative and quantitative controls, we present the advantages and disadvantages of each 3D modelling method applied in a complex interior of a modern building.

  15. Chemical Kinetics of Hydrogen Atom Abstraction from Allylic Sites by (3)O2; Implications for Combustion Modeling and Simulation.

    Science.gov (United States)

    Zhou, Chong-Wen; Simmie, John M; Somers, Kieran P; Goldsmith, C Franklin; Curran, Henry J

    2017-03-09

    Hydrogen atom abstraction from allylic C-H bonds by molecular oxygen plays a very important role in determining the reactivity of fuel molecules having allylic hydrogen atoms. Rate constants for hydrogen atom abstraction by molecular oxygen from molecules with allylic sites have been calculated. A series of molecules with primary, secondary, tertiary, and super secondary allylic hydrogen atoms of the alkene, furan, and alkylbenzene families is taken into consideration. These molecules include propene, 2-butene, isobutene, 2-methylfuran, and toluene, containing primary allylic hydrogen atoms; 1-butene, 1-pentene, 2-ethylfuran, ethylbenzene, and n-propylbenzene, containing secondary allylic hydrogen atoms; 3-methyl-1-butene, 2-isopropylfuran, and isopropylbenzene, containing tertiary allylic hydrogen atoms; and 1,4-pentadiene, containing super secondary allylic hydrogen atoms. The M06-2X/6-311++G(d,p) level of theory was used to optimize the geometries of all of the reactants, transition states, and products, and also for the hindered-rotation treatments of the lower-frequency modes. The G4 level of theory was used to calculate the electronic single-point energies of those species to determine the 0 K barriers to reaction. Conventional transition state theory with Eckart tunnelling corrections was used to calculate the rate constants. Comparison of our calculated rate constants with the available experimental results from the literature shows good agreement for the reactions of propene and isobutene with molecular oxygen. The rate constant for toluene with O2 is about an order of magnitude slower than that experimentally derived from a comprehensive model proposed by Oehlschlaeger and coauthors. The results clearly indicate the need for a more detailed investigation of the combustion kinetics of toluene oxidation and its key pyrolysis and oxidation intermediates. Despite this, our computed barriers and rate constants retain an important internal consistency. Rate constants
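
    The conventional TST expression the study applies has a compact form, k(T) = kappa(T) * (k_B T / h) * exp(-DeltaG‡ / (R T)); the sketch below evaluates it in its unimolecular DeltaG‡ form (a bimolecular abstraction adds a standard-state concentration factor), with a barrier and tunnelling factor that are arbitrary placeholders, not values from the paper.

        import numpy as np

        KB = 1.380649e-23   # Boltzmann constant, J/K
        H = 6.62607015e-34  # Planck constant, J s
        R = 8.314462618     # gas constant, J/(mol K)

        def tst_rate(T, dG_ddag, kappa=1.0):
            """Conventional transition state theory:
                k(T) = kappa * (k_B T / h) * exp(-dG_ddag / (R T))
            with `kappa` standing in for an Eckart-style tunnelling correction.
            """
            return kappa * (KB * T / H) * np.exp(-dG_ddag / (R * T))

        # Placeholder barrier of 170 kJ/mol over a range of temperatures.
        for T in (800.0, 1200.0, 1600.0):
            print(T, f"{tst_rate(T, 170e3, kappa=1.2):.3e}")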

  16. 2nd International Workshop on Physics-Based Modelling of Material Properties and Experimental Observations with special focus on Fracture and Damage Mechanics: Book of Abstracts

    OpenAIRE

    Nilsson, Karl-Fredrik; YALCINKAYA Tuncay; Oren, Ersin Emre; Tekoğlu, Cihan

    2013-01-01

    This report covers the book of abstracts of the 2nd International Workshop on Physics Based Modelling of Material Properties and Experimental Observations, with special focus on Fracture and Damage Mechanics. The workshop is organized in the context of European Commission’s Enlargement and Integration Action, by the Joint Research Centre in collaboration with the TOBB University of Economics and Technology (TOBB ETU) on 15th-17th May 2013 in Antalya, Turkey. The abstracts of the keynote le...

  17. Parallelization and High-Performance Computing Enables Automated Statistical Inference of Multi-scale Models.

    Science.gov (United States)

    Jagiella, Nick; Rickert, Dennis; Theis, Fabian J; Hasenauer, Jan

    2017-02-22

    Mechanistic understanding of multi-scale biological processes, such as cell proliferation in a changing biological tissue, is readily facilitated by computational models. While tools exist to construct and simulate multi-scale models, the statistical inference of the unknown model parameters remains an open problem. Here, we present and benchmark a parallel approximate Bayesian computation sequential Monte Carlo (pABC SMC) algorithm, tailored for high-performance computing clusters. pABC SMC is fully automated and returns reliable parameter estimates and confidence intervals. By running the pABC SMC algorithm for ∼10⁶ hr, we parameterize multi-scale models that accurately describe quantitative growth curves and histological data obtained in vivo from individual tumor spheroid growth in media droplets. The models capture the hybrid deterministic-stochastic behaviors of 10⁵-10⁶ cells growing in a 3D dynamically changing nutrient environment. The pABC SMC algorithm reliably converges to a consistent set of parameters. Our study demonstrates a proof of principle for robust, data-driven modeling of multi-scale biological systems and the feasibility of multi-scale model parameterization through statistical inference.
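    A minimal sketch of the ABC SMC idea described above, assuming user-supplied `simulate`, `distance`, `prior_sample`, and `perturb` functions (all hypothetical). Importance weights are omitted for brevity, which is only exact for a flat prior and a symmetric perturbation kernel; each generation's simulations parallelize trivially across a cluster, which is where the "pABC" part comes in.

```python
import numpy as np

def abc_smc(simulate, distance, prior_sample, perturb, data,
            eps_schedule, n_particles=200, rng=None):
    """Minimal ABC-SMC: evolve a particle population through a decreasing
    tolerance schedule. Each generation is embarrassingly parallel
    (one independent model simulation per proposed particle)."""
    rng = rng or np.random.default_rng(0)
    # Generation 0: plain rejection sampling from the prior
    particles = []
    while len(particles) < n_particles:
        theta = prior_sample(rng)
        if distance(simulate(theta, rng), data) < eps_schedule[0]:
            particles.append(theta)
    # Later generations: resample survivors and perturb them locally
    for eps in eps_schedule[1:]:
        new = []
        while len(new) < n_particles:
            theta = perturb(particles[rng.integers(n_particles)], rng)
            if distance(simulate(theta, rng), data) < eps:
                new.append(theta)
        particles = new
    return np.array(particles)  # approximate posterior sample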

  18. Mountain Waves in High Resolution Forecast Models: Automated Diagnostics of Wave Severity and Impact on Surface Winds

    Directory of Open Access Journals (Sweden)

    Peter Sheridan

    2017-01-01

    Full Text Available An automated method producing a diagnostic of the severity of lee waves and their impacts on surface winds, as represented in output from a high-resolution linear numerical model (3D Velocities Over Mountains, 3DVOM) covering several areas of the U.K., is discussed. Lee waves involving turbulent rotor activity or downslope windstorms represent a hazard to aviation and ground transport, and summary information of this kind is highly valuable as an efficient ‘heads-up’ for forecasters, for automated products or to feed into impact models. Automated diagnosis of lee wave surface effects presents a particular challenge due to the complexity of turbulent zones in the lee of irregular terrain. The method proposed quantifies modelled wind perturbations relative to those that would occur in the absence of lee waves for a given background wind, and diagnoses using it are found to be quite consistent between cases and for different ranges of U.K. hills. A recent upgrade of the operational U.K. limited area model, the U.K. Variable Resolution Model (UKV), used for general forecasting at the Met Office, means that it now resolves lee waves, and its performance is here demonstrated using comparisons with aircraft- and surface-based observations and the linear model. In the future, automated diagnostics may be adapted to use its output to routinely produce contiguous mesoscale maps of lee wave activity and surface impacts over the whole U.K.

  19. Automated Generation of Fault Management Artifacts from a Simple System Model

    Science.gov (United States)

    Kennedy, Andrew K.; Day, John C.

    2013-01-01

    Our understanding of off-nominal behavior - failure modes and fault propagation - in complex systems is often based purely on engineering intuition; specific cases are assessed in an ad hoc fashion as a (fallible) fault management engineer sees fit. This work is an attempt to provide a more rigorous approach to this understanding and assessment by automating the creation of a fault management artifact, the Failure Modes and Effects Analysis (FMEA), by querying a representation of the system in a SysML model. This work builds on the previous development of an off-nominal behavior model for the upcoming Soil Moisture Active-Passive (SMAP) mission at the Jet Propulsion Laboratory. We further developed the previous system model to more fully incorporate the ideas of State Analysis, and restructured it into an organizational hierarchy that models the system as layers of control systems while also incorporating the concept of "design authority". We present software that was developed to traverse the elements and relationships in this model to automatically construct an FMEA spreadsheet. We further discuss extending this model to automatically generate other typical fault management artifacts, such as fault trees, to efficiently portray system behavior and depend less on the intuition of fault management engineers to ensure complete examination of off-nominal behavior.
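    The traversal idea can be illustrated with a toy stand-in for the SysML model: a dictionary records each component's failure modes and the components it feeds, and a reachability walk produces the system-level effects column of an FMEA table. All component names and failure modes below are hypothetical, not SMAP data.

```python
import csv

# Hypothetical miniature system model: each component lists its failure
# modes and the components it feeds, standing in for a SysML model query.
MODEL = {
    "battery": {"modes": ["cell short", "depletion"], "feeds": ["power_bus"]},
    "power_bus": {"modes": ["open circuit"], "feeds": ["flight_computer"]},
    "flight_computer": {"modes": ["processor halt"], "feeds": []},
}

def downstream(component, model):
    """Collect every component reachable from `component` (fault propagation)."""
    seen, stack = [], list(model[component]["feeds"])
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.append(c)
            stack.extend(model[c]["feeds"])
    return seen

def write_fmea(model, path="fmea.csv"):
    """Emit one FMEA row per (component, failure mode) pair."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["Component", "Failure mode", "Local effect", "System-level effect"])
        for comp, info in model.items():
            effects = downstream(comp, model)
            for mode in info["modes"]:
                w.writerow([comp, mode, f"loss of {comp}",
                            "degrades: " + ", ".join(effects) if effects else "none"])

write_fmea(MODEL)
```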

  20. GAUSSIAN MIXTURE MODEL BASED LEVEL SET TECHNIQUE FOR AUTOMATED SEGMENTATION OF CARDIAC MR IMAGES

    Directory of Open Access Journals (Sweden)

    G. Dharanibai

    2011-04-01

    Full Text Available In this paper we propose a Gaussian Mixture Model (GMM) integrated level set method for the automated segmentation of the left ventricle (LV), right ventricle (RV), and myocardium from short-axis views of cardiac magnetic resonance images. By fitting a GMM to the image histogram, the global pixel intensity characteristics of the blood pool, myocardium, and background are estimated. The GMM provides the initial segmentation, and the segmentation solution is regularized using a level set. The parameters controlling the level set evolution are automatically estimated from the Bayesian inference classification of pixels. We propose a new speed function combining edge and region information that stops the evolving level set at the myocardial boundary. Segmentation efficacy is analyzed qualitatively via visual inspection. The results show the improved performance of our proposed speed function over the conventional Bayesian-driven adaptive speed function in the automatic segmentation of the myocardium.
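    A sketch of the histogram-fitting step, assuming scikit-learn is available: the GMM gives both an initial labelling and per-class posterior probabilities that a region-based speed function of the kind described could consume. The image is synthetic and the class count is a hypothetical choice.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def initial_tissue_labels(image, n_classes=3, seed=0):
    """Fit a GMM to pixel intensities to get a coarse blood-pool /
    myocardium / background labelling for level-set initialisation."""
    x = image.reshape(-1, 1).astype(float)
    gmm = GaussianMixture(n_components=n_classes, random_state=seed).fit(x)
    labels = gmm.predict(x).reshape(image.shape)
    # Posterior responsibilities can drive a region-based speed term
    posteriors = gmm.predict_proba(x).reshape(*image.shape, n_classes)
    return labels, posteriors

# Example with a synthetic 2-D "image" made of three intensity classes
img = np.concatenate([np.random.normal(m, 5.0, (64, 64)) for m in (30, 90, 160)], axis=1)
labels, post = initial_tissue_labels(img)
```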

  1. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    Directory of Open Access Journals (Sweden)

    J. F. Wellmann

    2015-11-01

    Full Text Available We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilise the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential-fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on GitHub, including documentation and tutorial examples, and we encourage the contribution to this project.

  2. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    Science.gov (United States)

    Florian Wellmann, J.; Thiele, Sam T.; Lindsay, Mark D.; Jessell, Mark W.

    2016-03-01

    We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilize the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on github, including documentation and tutorial examples, and we encourage the contribution to this project.

  3. Automated As-Built Model Generation of Subway Tunnels from Mobile LiDAR Data.

    Science.gov (United States)

    Arastounia, Mostafa

    2016-09-13

    This study proposes fully-automated methods for as-built model generation of subway tunnels employing mobile Light Detection and Ranging (LiDAR) data. The employed dataset is acquired by a Velodyne HDL 32E and covers 155 m of a subway tunnel containing six million points. First, the tunnel's main axis and cross sections are extracted. Next, a preliminary model is created by fitting an ellipse to each extracted cross section. The model is refined by employing residual analysis and Baarda's data snooping method to eliminate outliers. The final model is then generated by applying least squares adjustment to outlier-free data. The obtained results indicate that the tunnel's main axis and 1551 cross sections at 0.1 m intervals are successfully extracted. Cross sections have an average semi-major axis of 7.8508 m with a standard deviation of 0.2 mm and semi-minor axis of 7.7509 m with a standard deviation of 0.1 mm. The average normal distance of points from the constructed model (average absolute error) is also 0.012 m. The developed algorithm is applicable to tunnels with any horizontal orientation and degree of curvature since it makes no assumptions, nor does it use any a priori knowledge regarding the tunnel's curvature and horizontal orientation.
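    The fit-and-snoop loop might look like the following sketch: an algebraic conic fit by SVD, with iterative rejection of points whose residuals exceed a threshold as a simplified stand-in for Baarda's data snooping. Thresholds and iteration counts are illustrative, not the paper's values.

```python
import numpy as np

def fit_ellipse(x, y):
    """Least-squares conic fit a*x^2 + b*xy + c*y^2 + d*x + e*y + f = 0
    via SVD; returns the coefficient vector with unit norm."""
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(D, full_matrices=False)
    return vt[-1]  # right singular vector of the smallest singular value

def fit_with_snooping(x, y, n_sigma=3.0, max_iter=10):
    """Iteratively re-fit, discarding points whose algebraic residual
    exceeds n_sigma standard deviations (simplified data snooping)."""
    keep = np.ones_like(x, dtype=bool)
    for _ in range(max_iter):
        p = fit_ellipse(x[keep], y[keep])
        r = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)]) @ p
        bad = np.abs(r) > n_sigma * r[keep].std()
        if not (bad & keep).any():
            break  # no remaining outliers among kept points
        keep &= ~bad
    return p, keep
```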

  4. Using Visual Specifications in Verification of Industrial Automation Controllers

    Directory of Open Access Journals (Sweden)

    Bouzon Gustavo

    2008-01-01

    Full Text Available This paper deals with further development of a graphical specification language resembling timing-diagrams and allowing specification of partially ordered events in input and output signals. The language specifically aims at application in modular modelling of industrial automation systems and their formal verification via model-checking. The graphical specifications are translated into a model which is connected with the original model under study.

  5. A logical partial equivalence relation model of abstract interpretation

    Institute of Scientific and Technical Information of China (English)

    王蓁蓁

    2015-01-01

    Abstract interpretation was introduced by P. Cousot and R. Cousot in 1977, and many authors have since built on it. Generally speaking, the classic abstract interpretation theory has developed within the two equivalent formal frameworks of Galois connections and closure operators. This paper constructs, from a different perspective, a model based on partial equivalence relations and logical partial equivalence relations. It treats the abstract domain as a collection of "partial equivalence relations" and the semantic operators as a collection of "logical partial equivalence relations", so the model is fundamentally different from traditional abstract interpretation models. Beyond requiring that the concrete and abstract semantic operators stand in certain logical relations, the model does not require special properties such as monotonicity, and in this respect it differs from the classic model. Moreover, it is not an abstraction of a concrete system in an "approximate" sense, but rather an abstraction of all relations (including logical relations) on the original system. Thus the model is not a "simplification" of the original system but rather a "complication" of it; in some sense, it disjoins the "quality" unit from the original system and may be more complex than the original system. Problems posed on this model therefore have different characteristics, such as complexity and polymorphism. The advantage of this model is that it embodies logical relations concerned with the system functions ...

  6. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2005-08-25

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in "Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration" (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment for the license application (TSPA-LA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA-LA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux-splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux-splitting algorithms are developed and validated using experimental data. The transport ...

  7. Semi-automated calibration method for modelling of mountain permafrost evolution in Switzerland

    Directory of Open Access Journals (Sweden)

    A. Marmy

    2015-09-01

    Full Text Available Permafrost is a widespread phenomenon in the European Alps. Many important topics, such as the future evolution of permafrost related to climate change and the detection of permafrost at potential natural hazard sites, are of major concern to our society. Numerical permafrost models are the only tools that facilitate the projection of the future evolution of permafrost. Due to the complexity of the processes involved and the heterogeneity of Alpine terrain, models must be carefully calibrated, and results should be compared with observations at the site (borehole) scale. However, a large number of local point data are necessary to obtain a broad overview of the thermal evolution of mountain permafrost over a larger area, such as the Swiss Alps, and site-specific model calibration of each point would be time-consuming. To address this issue, this paper presents a semi-automated calibration method using Generalized Likelihood Uncertainty Estimation (GLUE) as implemented in a 1-D soil model (CoupModel) and applies it to six permafrost sites in the Swiss Alps prior to long-term permafrost evolution simulations. We show that this automated calibration method is able to accurately reproduce the main thermal condition characteristics, with some limitations at sites with unique conditions such as 3-D air or water circulation, which have to be calibrated manually. The calibration obtained was used for RCM-based long-term simulations under the A1B climate scenario, specifically downscaled at each borehole site. The projection shows general permafrost degradation with thawing at 10 m, even partially reaching 20 m depth, until the end of the century, but with different timing among the sites. The degradation is more rapid at bedrock sites, whereas ice-rich sites with a blocky surface cover showed a reduced sensitivity to climate change. The snow cover duration is expected to be reduced drastically (by between −20 and −37 %), impacting the ground thermal ...
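    A minimal GLUE loop consistent with the description above, assuming a user-supplied model runner and prior sampler (both hypothetical). Runs scoring above a behavioural threshold on a Nash-Sutcliffe criterion are retained as the calibrated ensemble.

```python
import numpy as np

def glue_calibrate(run_model, observed, sample_prior, n_runs=5000,
                   threshold=0.5, rng=None):
    """Monte Carlo GLUE: sample parameter sets, score each run with a
    Nash-Sutcliffe efficiency (NSE), and keep the 'behavioural' sets."""
    rng = rng or np.random.default_rng(1)
    obs_var = np.var(observed)
    behavioural = []
    for _ in range(n_runs):
        theta = sample_prior(rng)          # draw one candidate parameter set
        sim = run_model(theta)             # simulate e.g. borehole temperatures
        nse = 1.0 - np.mean((sim - observed)**2) / obs_var
        if nse >= threshold:               # retain only behavioural runs
            behavioural.append((theta, nse))
    return behavioural  # likelihood-weighted ensemble for projections
```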

  8. Automated model selection in covariance estimation and spatial whitening of MEG and EEG signals.

    Science.gov (United States)

    Engemann, Denis A; Gramfort, Alexandre

    2015-03-01

    Magnetoencephalography and electroencephalography (M/EEG) measure non-invasively the weak electromagnetic fields induced by post-synaptic neural currents. The estimation of the spatial covariance of the signals recorded on M/EEG sensors is a building block of modern data analysis pipelines. Such covariance estimates are used in brain-computer interface (BCI) systems, in nearly all source localization methods for spatial whitening, as well as for data covariance estimation in beamformers. The rationale for such models is that the signals can be modeled by a zero-mean Gaussian distribution. While maximizing the Gaussian likelihood seems natural, it leads to a covariance estimate known as the empirical covariance (EC). It turns out that the EC is a poor estimate of the true covariance when the number of samples is small. To address this issue the estimation needs to be regularized. The most common approach downweights off-diagonal coefficients, while more advanced regularization methods are based on shrinkage techniques or generative models with low-rank assumptions: probabilistic PCA (PPCA) and factor analysis (FA). Using cross-validation, all of these models can be tuned and compared based on the Gaussian likelihood computed on unseen data. We investigated these models on simulations, one electroencephalography (EEG) dataset, as well as magnetoencephalography (MEG) datasets from the most common MEG systems. First, our results demonstrate that different models can be the best, depending on the number of samples, heterogeneity of sensor types and noise properties. Second, we show that the models tuned by cross-validation are superior to models with hand-selected regularization. Hence, we propose an automated solution to the often overlooked problem of covariance estimation of M/EEG signals. The relevance of the procedure is demonstrated here for spatial whitening and source localization of MEG signals.
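    The cross-validated model selection described above can be sketched with scikit-learn estimators standing in for the M/EEG-specific implementation: `X` is a hypothetical (samples × channels) data matrix, and the winner is the estimator with the highest held-out Gaussian log-likelihood.

```python
import numpy as np
from sklearn.covariance import EmpiricalCovariance, LedoitWolf, OAS
from sklearn.decomposition import FactorAnalysis
from sklearn.model_selection import KFold

def best_covariance_model(X, n_factors=10):
    """Score covariance models by Gaussian log-likelihood on held-out
    folds and return the name of the best-scoring estimator."""
    models = {
        "empirical": EmpiricalCovariance(),   # plain maximum likelihood (EC)
        "ledoit-wolf": LedoitWolf(),          # shrinkage regularization
        "oas": OAS(),                         # alternative shrinkage
        "factor-analysis": FactorAnalysis(n_components=n_factors),  # low rank
    }
    scores = {name: [] for name in models}
    for train, test in KFold(n_splits=3).split(X):
        for name, est in models.items():
            est.fit(X[train])
            scores[name].append(est.score(X[test]))  # held-out log-likelihood
    return max(scores, key=lambda n: np.mean(scores[n]))
```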

  9. Building Safe Concurrency Abstractions

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann

    2014-01-01

    Concurrent object-oriented programming in Beta is based on semaphores, coroutines, and the ability to define high-level concurrency abstractions, such as monitors and rendezvous-based communication, together with their associated schedulers. The coroutine mechanism of SIMULA has been generalized into the notions of concurrent and alternating objects. Alternating objects may be used to start a cooperative thread for each possible blocking communication and are thus an alternative to asynchronous messages and guarded commands. Beta, like SIMULA, the first OO language, was designed as a language for modeling as well as programming, and we describe how this has had an impact on the design of the language. Although Beta supports the definition of high-level concurrency abstractions, their use relies on the discipline of the programmer, as is the case for Java and other mainstream OO languages. We introduce ...

  10. World-wide distribution automation systems

    Energy Technology Data Exchange (ETDEWEB)

    Devaney, T.M.

    1994-12-31

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; computer modeling of the system; and distribution management systems.

  11. From Abstract Art to Abstracted Artists

    Directory of Open Access Journals (Sweden)

    Romi Mikulinsky

    2016-11-01

    Full Text Available What lineage connects early abstract films and machine-generated YouTube videos? Hans Richter’s famous piece Rhythmus 21 is considered to be the first abstract film in the experimental tradition. The Webdriver Torso YouTube channel is composed of hundreds of thousands of machine-generated test patterns designed to check frequency signals on YouTube. This article discusses geometric abstraction vis-à-vis new vision, conceptual art and algorithmic art. It argues that the Webdriver Torso is an artistic marvel indicative of a form we call mathematical abstraction, which is art performed by computers and, quite possibly, for computers.

  12. The modeling of transfer of steering between automated vehicle and human driver using hybrid control framework

    NARCIS (Netherlands)

    Kaustubh, M.; Willemsen, D.M.C.; Mazo, M.

    2016-01-01

    Proponents of autonomous driving pursue driverless technologies, whereas others foresee a gradual transition where there will be automated driving systems that share the control of the vehicle with the driver. With such advances it becomes pertinent that the developed automated systems need to be safe ...

  13. Modelling and interpreting biologically crusted dryland soil sub-surface structure using automated micropenetrometry

    Science.gov (United States)

    Hoon, Stephen R.; Felde, Vincent J. M. N. L.; Drahorad, Sylvie L.; Felix-Henningsen, Peter

    2015-04-01

    Soil penetrometers are used routinely to determine the shear strength of soils and deformable sediments, both at the surface and throughout a depth profile, in disciplines as diverse as soil science, agriculture, geoengineering and alpine avalanche-safety (e.g. Grunwald et al. 2001, Van Herwijnen et al. 2009). Generically, penetrometers comprise two principal components: an advancing probe, and a transducer; the latter to measure the pressure or force required to cause the probe to penetrate or advance through the soil or sediment. The force transducer employed to determine the pressure can range, for example, from a simple mechanical spring gauge to an automatically data-logged electronic transducer. Automated computer control of the penetrometer step size and probe advance rate enables precise measurements to be made down to a resolution of tens of microns (e.g. the automated electronic micropenetrometer (EMP) described by Drahorad 2012). Here we discuss the determination, modelling and interpretation of biologically crusted dryland soil sub-surface structures using automated micropenetrometry. We outline a model enabling the interpretation of depth-dependent penetration resistance (PR) profiles and their spatial differentials using the model equations $\sigma(z) = \sigma_{c0} + \sum_{n=1}^{N}\left[\sigma_n(z) + a_n z + b_n z^2\right]$ and $\mathrm{d}\sigma/\mathrm{d}z = \sum_{n=1}^{N}\left[\mathrm{d}\sigma_n(z)/\mathrm{d}z + Fr_n(z)\right]$, where $\sigma_{c0}$ and $\sigma_n$ are the plastic deformation stresses for the surface and the $n$th soil structure (e.g. soil crust, layer, horizon or void), respectively, and $Fr_n(z)\,\mathrm{d}z$ is the frictional work done per unit volume by sliding the penetrometer rod an incremental distance $\mathrm{d}z$ through the $n$th layer. Both $\sigma_n(z)$ and $Fr_n(z)$ are related to soil structure. They determine the form of $\sigma(z)$ measured by the EMP transducer. The model enables pores (regions of zero deformation stress) to be distinguished from changes in layer structure or probe friction. We have applied this method to both artificial calibration soils in the ...
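    A direct transcription of the profile model into code, with hypothetical two-layer inputs and units; a pore would show up as a span where the depth derivative of the modelled resistance falls to zero.

```python
import numpy as np

def pr_profile(z, sigma_c0, layers):
    """Evaluate sigma(z) = sigma_c0 + sum_n [sigma_n(z) + a_n*z + b_n*z^2]
    for a stack of layers, each given as (sigma_n callable, a_n, b_n)."""
    sigma = np.full_like(z, sigma_c0, dtype=float)
    for sigma_n, a_n, b_n in layers:
        sigma += sigma_n(z) + a_n * z + b_n * z**2
    return sigma

# Two-layer example: a stiff crust over a softer substrate (hypothetical values)
crust = (lambda z: 0.8 * (z < 2.0), 0.05, 0.0)
substrate = (lambda z: 0.3 * (z >= 2.0), 0.01, 0.001)
z = np.linspace(0.0, 10.0, 200)            # depth
sigma = pr_profile(z, sigma_c0=0.2, layers=[crust, substrate])
dsigma_dz = np.gradient(sigma, z)          # pores: spans where this drops to zero
```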

  14. Warehouse automation

    OpenAIRE

    Pogačnik, Jure

    2017-01-01

    An automated high-bay warehouse is commonly used for storing a large number of materials with a high throughput. In an automated warehouse, pallet movements are mainly performed by a number of automated devices such as conveyor systems, trolleys, and stacker cranes. From the introduction of the material into the automated warehouse system to its dispatch, the system requires no operator input or intervention, since all material movements are done automatically. This allows the automated warehouse to op...

  15. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  16. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Joseph [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Polly, Ben [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Collis, Jon [Colorado School of Mines, Golden, CO (United States)]

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  17. AUTOMATED FORMATION OF CALCULATION MODELS OF TURBOGENERATORS FOR SOFTWARE ENVIRONMENT FEMM

    Directory of Open Access Journals (Sweden)

    V.I. Milykh

    2015-08-01

    Full Text Available Attention is paid to the popular FEMM (Finite Element Method Magnetics) program, which is effective in the numerical calculation of the magnetic fields of electrical machines. The main problem of using it, namely the high time cost of forming a graphical model representing the design and a physical model representing the material properties and the winding currents of the machine, is solved. For this purpose, principles for the automated formation of such models are developed and presented using a turbogenerator as an example. The task is performed by a program written in the algorithmic language Lua, integrated into the FEMM package. The program is universal in terms of varying the geometry and dimensions of the designed turbogenerators. It uses a minimum of input information in digital form representing the design of the whole turbogenerator and its fragments. The general structure of the Lua script is provided, along with significant parts of its text, graphical results of the working phases, explanations of the program, and instructions for its use. The capabilities of the compiled Lua script are shown on the example of a real 340 MW turbogenerator.
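    The same scripted-geometry principle can be hinted at in Python via the pyFEMM bindings, which expose the Lua console's mi_* primitives; this is a sketch only, the geometry below is a crude hypothetical quarter-annulus rather than the paper's turbogenerator parameterization, and it assumes FEMM and pyFEMM are installed.

```python
import femm  # pyFEMM bindings; requires a local FEMM installation

def build_quarter_annulus(outer_r=650.0, inner_r=400.0, fname="sketch.fem"):
    """Parameter-driven construction of a simple magnetics model,
    illustrating how scripting removes manual geometry drawing."""
    femm.openfemm()
    femm.newdocument(0)  # 0 = magnetics problem
    femm.mi_probdef(0, "millimeters", "planar", 1e-8)
    # Two concentric arcs plus two radial lines bound a quarter-annulus
    femm.mi_drawarc(outer_r, 0, 0, outer_r, 90, 2)
    femm.mi_drawarc(inner_r, 0, 0, inner_r, 90, 2)
    femm.mi_drawline(inner_r, 0, outer_r, 0)
    femm.mi_drawline(0, inner_r, 0, outer_r)
    # Assign a library material to the enclosed region via a block label
    femm.mi_getmaterial("Pure Iron")
    r_mid = 0.5 * (inner_r + outer_r)
    femm.mi_addblocklabel(r_mid, 0.3 * r_mid)
    femm.mi_selectlabel(r_mid, 0.3 * r_mid)
    femm.mi_setblockprop("Pure Iron", 1, 0, "<None>", 0, 0, 0)
    femm.mi_clearselected()
    femm.mi_saveas(fname)

build_quarter_annulus()
```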

  18. Automated Decisional Model for Optimum Economic Order Quantity Determination Using Price Regressive Rates

    Science.gov (United States)

    Roşu, M. M.; Tarbă, C. I.; Neagu, C.

    2016-11-01

    The current models for inventory management are complementary, but together they offer a large palette of elements for solving the complex problems companies face when establishing the optimum economic order quantity for unfinished products, raw materials, goods, etc. The main objective of this paper is to elaborate an automated decisional model for calculating the economic order quantity, taking into account price-regressive rates for the total order quantity. This model has two main objectives: first, to determine the ordering periodicity n and the order quantity q; second, to determine the stock levels: the alert (reorder) level, the safety stock, etc. In this way we can answer two fundamental questions: How much must be ordered? When must it be ordered? In current practice, a company's relationships with its suppliers are based on price-regressive rates, meaning that suppliers may grant discounts above certain ordered quantities. Thus, the unit price of the products is a variable that depends on the order size, and the most important element in choosing the optimum economic order quantity is the total ordering cost, which depends on the following elements: the mean unit price, the stockholding cost, the ordering cost, etc.
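    The classical all-units quantity-discount EOQ procedure that such a decisional model builds on can be sketched as follows (all numbers illustrative): compute the unconstrained EOQ within each price band, clamp it into the band, and keep the quantity minimizing total annual cost.

```python
import math

def eoq_with_price_breaks(demand, order_cost, holding_rate, breaks):
    """All-units quantity-discount EOQ.
    breaks: list of (min_qty, unit_price) pairs, ascending by min_qty.
    Returns (best quantity, total annual cost, unit price)."""
    best = None
    for i, (qmin, price) in enumerate(breaks):
        qmax = breaks[i + 1][0] - 1 if i + 1 < len(breaks) else math.inf
        h = holding_rate * price                    # holding cost per unit-year
        q = math.sqrt(2 * demand * order_cost / h)  # classical EOQ formula
        q = min(max(q, qmin), qmax)                 # clamp into the price band
        total = (demand * price                     # purchase cost
                 + demand / q * order_cost          # ordering cost
                 + q / 2 * h)                       # holding cost
        if best is None or total < best[1]:
            best = (q, total, price)
    return best

# Hypothetical data: 1200 units/yr demand, $50 per order, 20% holding rate
print(eoq_with_price_breaks(1200, 50, 0.2, [(1, 10.0), (300, 9.5), (700, 9.0)]))
```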

  19. Modal abstractions of concurrent behavior

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nanz, Sebastian; Nielson, Hanne Riis

    2011-01-01

    We present an effective algorithm for the automatic construction of finite modal transition systems as abstractions of potentially infinite concurrent processes. Modal transition systems are recognized as valuable abstractions for model checking because they allow for the validation as well as re...

  20. Application of the Netherlands Groundwater Model, LGM, for calculating concentration of nitrate and pesticides at abstraction wells in sandy soil areas of the Netherlands

    NARCIS (Netherlands)

    Kovar K; Pastoors MJH; Tiktak A; Gaalen FW van; LBG, LWD

    1998-01-01

    In a study aimed at assessing the impact of historical and future solute leaching into saturated groundwater, the quasi-three-dimensional RIVM groundwater model, LGM (version 2), was used for calculating pathlines, travel times and concentration breakthrough curves at 165 groundwater abstraction locations ...

  1. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    Science.gov (United States)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  2. An Automated BIM Model to Conceptually Design, Analyze, Simulate, and Assess Sustainable Building Projects

    Directory of Open Access Journals (Sweden)

    Farzad Jalaei

    2014-01-01

    Full Text Available Quantifying the environmental impacts and simulating the energy consumption of a building's components at the conceptual design stage are very helpful for designers needing to make decisions related to the selection of the best design alternative, one that would lead to a more energy-efficient building. Building Information Modeling (BIM) offers designers the ability to assess different design alternatives at the conceptual stage of the project so that energy and life cycle assessment (LCA) strategies and systems are attained. This paper proposes an automated model that links BIM, LCA, energy analysis, and lighting simulation tools with green building certification systems. The implementation consists of developing plug-ins for a BIM tool capable of measuring the environmental impacts (EI) and embodied energy of building components. Using this method, designers will be provided with a new way to visualize and identify the potential gain or loss of energy for the building as a whole and for each of its associated components. Furthermore, designers will be able to detect and evaluate the sustainability of the proposed buildings based on the Leadership in Energy and Environmental Design (LEED) rating system. An actual building project is used to illustrate the workability of the proposed methodology.

  3. PRISMA for Abstracts

    DEFF Research Database (Denmark)

    Beller, Elaine M; Glasziou, Paul P; Altman, Douglas G;

    2013-01-01

    Elaine Beller and colleagues from the PRISMA for Abstracts group provide a reporting guideline for abstracts of systematic reviews in journals and at conferences.

  4. Beyond captions: linking figures with abstract sentences in biomedical articles.

    Science.gov (United States)

    Bockhorst, Joseph P; Conroy, John M; Agarwal, Shashank; O'Leary, Dianne P; Yu, Hong

    2012-01-01

    Although figures in scientific articles have high information content and concisely communicate many key research findings, they are currently underutilized by literature search and retrieval systems. Many systems ignore figures, and those that do not typically only consider caption text. This study describes and evaluates a fully automated approach for associating figures in the body of a biomedical article with sentences in its abstract. We use supervised methods to learn probabilistic language models, hidden Markov models, and conditional random fields for predicting associations between abstract sentences and figures. Three kinds of evidence are used: text in abstract sentences and figures, relative positions of sentences and figures, and the patterns of sentence/figure associations across an article. Each information source is shown to have predictive value, and models that use all kinds of evidence are more accurate than models that do not. Our most accurate method has an F1-score of 69% on a cross-validation experiment, is competitive with the accuracy of human experts, has significantly better predictive accuracy than state-of-the-art methods, and enables users to access figures associated with an abstract sentence with an average of 1.82 fewer mouse clicks. A user evaluation shows that human users find our system beneficial. The system is available at http://FigureItOut.askHERMES.org.

  5. Beyond captions: linking figures with abstract sentences in biomedical articles.

    Directory of Open Access Journals (Sweden)

    Joseph P Bockhorst

    Full Text Available Although figures in scientific articles have high information content and concisely communicate many key research findings, they are currently underutilized by literature search and retrieval systems. Many systems ignore figures, and those that do not typically only consider caption text. This study describes and evaluates a fully automated approach for associating figures in the body of a biomedical article with sentences in its abstract. We use supervised methods to learn probabilistic language models, hidden Markov models, and conditional random fields for predicting associations between abstract sentences and figures. Three kinds of evidence are used: text in abstract sentences and figures, relative positions of sentences and figures, and the patterns of sentence/figure associations across an article. Each information source is shown to have predictive value, and models that use all kinds of evidence are more accurate than models that do not. Our most accurate method has an F1-score of 69% on a cross-validation experiment, is competitive with the accuracy of human experts, has significantly better predictive accuracy than state-of-the-art methods, and enables users to access figures associated with an abstract sentence with an average of 1.82 fewer mouse clicks. A user evaluation shows that human users find our system beneficial. The system is available at http://FigureItOut.askHERMES.org.

  6. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  7. Data Structure Analysis to Represent Basic Models of Finite State Automation

    Directory of Open Access Journals (Sweden)

    V. V. Gurenko

    2015-01-01

    Full Text Available Complex system engineering based on automaton models requires a reasoned selection of the data structures used to implement them. The problem of automaton representation and of selecting the data structures used for it has been understudied. An arbitrary choice of data structure for a software implementation of an automaton model leads to unnecessary computational burden and reduces the efficiency of the developed system. This article proposes an approach to the reasoned selection of data structures to represent the basic models of finite algorithmic automata and gives practical considerations based on it. Static and dynamic data structures are proposed for the three main ways of specifying Mealy and Moore automata: a transition table, a coupling (adjacency) matrix, and a transition graph. A three-dimensional array, a rectangular matrix, and a matrix of lists are the static structures. The dynamic structures are list-oriented: two-level and three-level Iliffe vectors and a multi-linked list. These structures can store all of the required information about the components of a finite state automaton model: the cardinalities of its characteristic sets and the data of its transition and output functions. A criterion system is proposed for the comparative evaluation of the data structures, based on algorithmic features of automata-theory problems. The criteria focus on the space and time complexity of the operations performed in tasks such as equivalent automaton conversions, proofs of automaton equivalence and isomorphism, and automaton minimization. A comparative analysis based on the criterion system was done for both the static and the dynamic types. The analysis showed the advantages of the three-dimensional array, the rectangular matrix, and the two-level Iliffe vector; these are the structures that specify an automaton by its transition table. For these structures, an experiment was done to measure the execution time of the automaton operations included in the criterion system. The analysis of the experimental results showed that a dynamic structure, the two ...
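    The contrast between a static and a dynamic transition-table representation can be sketched for a hypothetical two-state Mealy machine: the dict-of-dicts form is cheap to mutate during equivalence transformations, while the dense matrix gives O(1) indexed access at fixed size.

```python
# Hypothetical two-state Mealy machine over alphabet {"0", "1"}.
states, alphabet = ["even", "odd"], ["0", "1"]

# Static representation: rows = states, columns = input symbols,
# cells = (next state, output symbol).
matrix = [
    [("even", "a"), ("odd", "b")],   # from "even" on "0", "1"
    [("odd", "a"), ("even", "b")],   # from "odd"  on "0", "1"
]

# Dynamic representation: nested mappings, easy to grow or edit when the
# automaton is transformed (e.g. during minimization).
table = {
    "even": {"0": ("even", "a"), "1": ("odd", "b")},
    "odd":  {"0": ("odd", "a"),  "1": ("even", "b")},
}

def run(table, start, word):
    """Drive the Mealy machine over `word`, collecting its outputs."""
    state, out = start, []
    for sym in word:
        state, o = table[state][sym]
        out.append(o)
    return state, "".join(out)

assert run(table, "even", "1101") == ("odd", "bbab")
```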

  8. Setup time reduction: SMED-balancing integrated model for manufacturing systems with automated transfer

    Directory of Open Access Journals (Sweden)

    Maurizio Faccio

    2013-10-01

    Full Text Available The importance of short setup times is increasing in every type of industry. It has been known how to address this problem for about 20 years. The SMED method, originally developed by the Japanese industrial engineer Shigeo Shingo for reducing the time to exchange dies, gives a really straightforward approach to improving existing setups. On the other hand, in the case of complex manufacturing systems the simple application of the SMED methodology is not enough. Manufacturing systems composed of different working machines with automated transfer facilities are a good example. Technological constraints, task precedence constraints, and synchronization between different setup tasks are just some of the influencing factors that make an improved SMED desirable. The present paper, starting from an industrial case, aims to provide a heuristic methodology that integrates the traditional SMED with the workload balancing problem that is typical of assembly systems, in order to address the setup reduction problem in the case of complex manufacturing systems. An industrial case is reported to validate the proposed model and to demonstrate its practical implications.

  9. Modelling molecule-surface interactions--an automated quantum-classical approach using a genetic algorithm.

    Science.gov (United States)

    Herbers, Claudia R; Johnston, Karen; van der Vegt, Nico F A

    2011-06-14

    We present an automated and efficient method to develop force fields for molecule-surface interactions. A genetic algorithm (GA) is used to parameterise a classical force field so that the classical adsorption energy landscape of a molecule on a surface matches the corresponding landscape from density functional theory (DFT) calculations. The procedure performs a sophisticated search in the parameter phase space and converges very quickly. The method is capable of fitting a significant number of structures and corresponding adsorption energies. Water on a ZnO(0001) surface was chosen as a benchmark system but the method is implemented in a flexible way and can be applied to any system of interest. In the present case, pairwise Lennard Jones (LJ) and Coulomb potentials are used to describe the molecule-surface interactions. In the course of the fitting procedure, the LJ parameters are refined in order to reproduce the adsorption energy landscape. The classical model is capable of describing a wide range of energies, which is essential for a realistic description of a fluid-solid interface.
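    A toy version of the GA fit, with synthetic "DFT" energies standing in for the real reference data and only the 12-6 LJ term fitted; the Coulomb part and the real water/ZnO(0001) energy landscape are omitted, and all parameters and rates are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical reference: adsorption energies (eV) at sampled separations r (Angstrom)
r_ref = np.linspace(2.0, 6.0, 15)

def lj(r, eps, sigma):
    """12-6 Lennard-Jones pair potential."""
    s6 = (sigma / r)**6
    return 4.0 * eps * (s6**2 - s6)

# Stand-in for the DFT energy landscape: known parameters plus noise
e_ref = lj(r_ref, 0.15, 2.8) + rng.normal(0, 0.003, r_ref.size)

def fitness(pop):
    """Negative mean-squared error of each (eps, sigma) pair vs the reference."""
    return -np.array([np.mean((lj(r_ref, e, s) - e_ref)**2) for e, s in pop])

# GA over the (epsilon, sigma) parameter space
pop = np.column_stack([rng.uniform(0.01, 0.5, 60), rng.uniform(2.0, 4.0, 60)])
for gen in range(200):
    f = fitness(pop)
    # Tournament selection: keep the fitter of random pairs
    i, j = rng.integers(len(pop), size=(2, len(pop)))
    parents = np.where((f[i] > f[j])[:, None], pop[i], pop[j])
    # Gaussian mutation explores the parameter phase space
    pop = parents + rng.normal(0, [0.005, 0.02], parents.shape)
    pop = np.clip(pop, [1e-3, 1.5], [1.0, 5.0])

eps_fit, sigma_fit = pop[np.argmax(fitness(pop))]
print(eps_fit, sigma_fit)  # should converge near (0.15, 2.8)
```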

  10. Modeling and control of tissue compression and temperature for automation in robot-assisted surgery.

    Science.gov (United States)

    Sinha, Utkarsh; Li, Baichun; Sankaranarayanan, Ganesh

    2014-01-01

    Robotic surgery is being used widely due to its various benefits, which include reduced patient trauma and increased dexterity and ergonomics for the operating surgeon. Making the whole or part of the surgical procedure autonomous increases patient safety and will enable the robotic surgery platform to be used in telesurgery. In this work, an electrosurgery procedure that involves tissue compression and the application of heat, such as coaptive vessel closure, has been automated. A MIMO nonlinear model characterizing the tissue stiffness and conductance under compression was feedback-linearized, and tuned PID controllers were used to control the system to satisfy both the displacement and temperature constraints. The reference inputs for both constraints were chosen as ramp-and-hold trajectories, which reflect the real constraints that exist in an actual surgical procedure. Our simulations showed that the controllers successfully tracked the reference trajectories with minimal deviation and within a finite time horizon. The MIMO system with the controllers developed in this work can be used to drive a surgical robot autonomously and perform electrosurgical procedures such as coaptive vessel closures.
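    A sketch of one control loop under stated assumptions: after feedback linearization the plant is approximated here by a simple first-order lag, and a discrete PID tracks a ramp-and-hold reference. Gains, time constant, and units are hypothetical, not the paper's tuned values.

```python
import numpy as np

def simulate_pid(kp, ki, kd, ref, dt=0.01, tau=0.5):
    """Discrete PID driving a first-order plant x' = (u - x)/tau,
    a stand-in for one loop of the feedback-linearised tissue model."""
    x, integ, prev_err = 0.0, 0.0, 0.0
    out = np.empty_like(ref)
    for k, r in enumerate(ref):
        err = r - x
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv   # PID control law
        x += dt * (u - x) / tau                  # first-order plant update
        out[k], prev_err = x, err
    return out

# Ramp-and-hold reference: ramp to 2 mm compression over 1 s, then hold
t = np.arange(0.0, 3.0, 0.01)
ref = np.clip(t / 1.0, 0.0, 1.0) * 2.0
resp = simulate_pid(kp=8.0, ki=4.0, kd=0.2, ref=ref)
```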

  11. Automated segmentation and geometrical modeling of the tricuspid aortic valve in 3D echocardiographic images.

    Science.gov (United States)

    Pouch, Alison M; Wang, Hongzhi; Takabe, Manabu; Jackson, Benjamin M; Sehgal, Chandra M; Gorman, Joseph H; Gorman, Robert C; Yushkevich, Paul A

    2013-01-01

    The aortic valve has been described with variable anatomical definitions, and the consistency of 2D manual measurement of valve dimensions in medical image data has been questionable. Given the importance of image-based morphological assessment in the diagnosis and surgical treatment of aortic valve disease, there is considerable need to develop a standardized framework for 3D valve segmentation and shape representation. Towards this goal, this work integrates template-based medial modeling and multi-atlas label fusion techniques to automatically delineate and quantitatively describe aortic leaflet geometry in 3D echocardiographic (3DE) images, a challenging task that has been explored only to a limited extent. The method makes use of expert knowledge of aortic leaflet image appearance, generates segmentations with consistent topology, and establishes a shape-based coordinate system on the aortic leaflets that enables standardized automated measurements. In this study, the algorithm is evaluated on 11 3DE images of normal human aortic leaflets acquired at mid systole. The clinical relevance of the method is its ability to capture leaflet geometry in 3DE image data with minimal user interaction while producing consistent measurements of 3D aortic leaflet geometry.

  12. Parametric surface modeling and registration for comparison of manual and automated segmentation of the hippocampus.

    Science.gov (United States)

    Shen, Li; Firpi, Hiram A; Saykin, Andrew J; West, John D

    2009-06-01

    Accurate and efficient segmentation of the hippocampus from brain images is a challenging issue. Although experienced anatomic tracers can be reliable, manual segmentation is a time consuming process and may not be feasible for large-scale neuroimaging studies. In this article, we compare an automated method, FreeSurfer (V4), with a published manual protocol on the determination of hippocampal boundaries from magnetic resonance imaging scans, using data from an existing mild cognitive impairment/Alzheimer's disease cohort. To perform the comparison, we develop an enhanced spherical harmonic processing framework to model and register these hippocampal traces. The framework treats the two hippocampi as a single geometric configuration and extracts the positional, orientation, and shape variables in a multiobject setting. We apply this framework to register manual tracing and FreeSurfer results together and the two methods show stronger agreement on position and orientation than shape measures. Work is in progress to examine a refined FreeSurfer segmentation strategy and an improved agreement on shape features is expected.

  13. Automated sugar analysis

    Directory of Open Access Journals (Sweden)

    Tadeu Alcides MARQUES

    2016-03-01

    Full Text Available Sugarcane monosaccharides are reducing sugars, and classical analytical methodologies (Lane-Eynon, Benedict, complexometric-EDTA, Luff-Schoorl, Munson-Walker, Somogyi-Nelson) are based on reducing copper ions in alkaline solutions. In Brazil, certain factories use Lane-Eynon, others use the equipment referred to as “REDUTEC”, and additional factories analyze reducing sugars based on a mathematical model. The objective of this paper is to understand the relationship between variations in millivolts, mass and tenors of reducing sugars during the analysis process. Another objective is to generate an automatic model for this process. The work herein uses the equipment referred to as “REDUTEC”, a digital balance, a peristaltic pump, a digital camcorder, math programs and graphics programs. We conclude that the millivolts, mass and tenors of reducing sugars exhibit a good mathematical correlation, and the mathematical model generated was benchmarked to low-concentration reducing sugars (<0.3%). Using the model created herein, reducing sugar analyses can be automated using the new equipment.

  14. Collaborative Model-based Systems Engineering for Cyber-Physical Systems, with a Building Automation Case Study

    DEFF Research Database (Denmark)

    Fitzgerald, John; Gamble, Carl; Payne, Richard

    2016-01-01

    We describe an approach to the model-based engineering of cyber-physical systems that permits the coupling of diverse discrete-event and continuous-time models and their simulators. A case study in the building automation domain demonstrates how such co-models and co-simulation can promote early cooperation between disciplines within a systems engineering process, before the expensive commitment is made to integration in physical prototypes. We identify areas for future advances in foundations, methods and tools to realise the potential of a co-modelling approach within established systems engineering ...

  15. Global-scale assessment of groundwater depletion and related groundwater abstractions: Combining hydrological modeling with information from well observations and GRACE satellites

    Science.gov (United States)

    Döll, Petra; Müller Schmied, Hannes; Schuh, Carina; Portmann, Felix T.; Eicker, Annette

    2014-07-01

    Groundwater depletion (GWD) compromises crop production in major global agricultural areas and has negative ecological consequences. To derive GWD at the grid cell, country, and global levels, we applied a new version of the global hydrological model WaterGAP that simulates not only net groundwater abstractions and groundwater recharge from soils but also groundwater recharge from surface water bodies in dry regions. A large number of independent estimates of GWD as well as total water storage (TWS) trends determined from GRACE satellite data by three analysis centers were compared to model results. GWD and TWS trends are simulated best assuming that farmers in GWD areas irrigate at 70% of optimal water requirement. India, United States, Iran, Saudi Arabia, and China had the highest GWD rates in the first decade of the 21st century. On the Arabian Peninsula, in Libya, Egypt, Mali, Mozambique, and Mongolia, at least 30% of the abstracted groundwater was taken from nonrenewable groundwater during this time period. The rate of global GWD has likely more than doubled since the period 1960-2000. Estimated GWD of 113 km3/yr during 2000-2009, corresponding to a sea level rise of 0.31 mm/yr, is much smaller than most previous estimates. About 15% of the globally abstracted groundwater was taken from nonrenewable groundwater during this period. To monitor recent temporal dynamics of GWD and related water abstractions, GRACE data are best evaluated with a hydrological model that, like WaterGAP, simulates the impact of abstractions on water storage, but the low spatial resolution of GRACE remains a challenge.

  16. Accounting Automation

    OpenAIRE

    Laynebaril1

    2017-01-01

    Accounting Automation. Please respond to the following: Imagine you are a consultant hired to convert a manual accounting system to an automated system. Suggest the key advantages and disadvantages of automating a manual accounting system. Identify the most important step in the conversion process. Provide a rationale for your response. ...

  17. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    Full Text Available The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.

  18. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a software- and hardware-oriented house automation research project, designed and implemented in real time, capable of automating a house's electricity and providing a security system to detect the presence of unexpected behavior.

  19. Forecasting Macroeconomic Variables using Neural Network Models and Three Automated Model Selection Techniques

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this paper we consider the forecasting performance of a well-defined class of flexible models, the so-called single hidden-layer feedforward neural network models. A major aim of our study is to find out whether they, due to their flexibility, are as useful tools in economic forecasting as some previous studies have indicated. When forecasting with neural network models one faces several problems, all of which influence the accuracy of the forecasts. First, neural networks are often hard to estimate due to their highly nonlinear structure. In fact, their parameters are not even globally ... on the linearisation idea: the Marginal Bridge Estimator and Autometrics. Second, one must decide whether forecasting should be carried out recursively or directly. Comparisons of these two methods exist for linear models, and here these comparisons are extended to neural networks. Finally, a nonlinear model ...
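    The recursive-versus-direct distinction can be illustrated with a toy single hidden-layer network from scikit-learn (not the estimators studied in the paper): the recursive forecaster feeds its own one-step predictions back into the lag window, while the direct forecaster trains a separate model per forecast horizon.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def direct_vs_recursive(y, lags=4, horizon=3):
    """Contrast direct and recursive multi-step forecasting with a
    single hidden-layer feedforward network (toy illustration)."""
    # Lag matrix: row j = [y_j, ..., y_{j+lags-1}], 1-step target y_{j+lags}
    X = np.column_stack([y[i:len(y) - lags + i] for i in range(lags)])
    t = y[lags:]
    # Recursive: one 1-step model, fed its own predictions
    m1 = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, t)
    window, rec = list(y[-lags:]), []
    for _ in range(horizon):
        yhat = m1.predict([window[-lags:]])[0]
        rec.append(yhat)
        window.append(yhat)
    # Direct: one model per horizon h, trained on the h-step-ahead target
    direct = []
    for h in range(1, horizon + 1):
        mh = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
        mh.fit(X[:len(X) - h], y[lags + h - 1:len(y) - 1])
        direct.append(mh.predict([y[-lags:]])[0])
    return rec, direct

y = np.sin(np.linspace(0, 20, 200)) + np.random.default_rng(0).normal(0, 0.1, 200)
rec, direct = direct_vs_recursive(y)
```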

  20. Model-Driven Development of Automation and Control Applications: Modeling and Simulation of Control Sequences

    Directory of Open Access Journals (Sweden)

    Timo Vepsäläinen

    2014-01-01

    Full Text Available The scope and responsibilities of control applications are increasing due to, for example, the emergence of the industrial internet. To meet the challenge, model-driven development techniques have been an active research topic in the application domain. Simulations, which have traditionally been used in the domain, have not yet been sufficiently integrated into model-driven control application development, however. In this paper, a model-driven development process that includes support for design-time simulations is complemented with support for simulating sequential control functions. The approach is implemented with open source tools and demonstrated by creating and simulating a control system model in closed-loop with a large and complex model of a paper industry process.

  1. Verification of AADL Models with Timed Abstract State Machines

    Institute of Scientific and Technical Information of China (English)

    Yang Zhibin; Hu Kai; Zhao Yongwang; Ma Dianfu; Jean-Paul BODEVEIX

    2015-01-01

    This paper presents a formal verification method for AADL (architecture analysis and design language) models by TASM (timed abstract state machine) translation. The abstract syntax of the chosen subset of AADL and of TASM are given. The translation rules are defined clearly by semantic functions expressed in an ML-like language. Furthermore, the translation is implemented in the model transformation tool AADL2TASM, which provides model checking and simulation for AADL models. Finally, a case study of a spacecraft GNC (guidance, navigation and control) system is provided.
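
    As a toy illustration of a translation rule in this spirit, the sketch below maps a periodic AADL thread onto two timed TASM rules. All class and field names are hypothetical; the actual AADL2TASM rules cover far more of both languages.

        # Sketch: one hypothetical AADL-to-TASM translation rule.
        from dataclasses import dataclass

        @dataclass
        class AadlThread:
            name: str
            period_ms: int
            deadline_ms: int

        @dataclass
        class TasmRule:
            guard: str
            update: str
            duration_ms: int

        def translate_thread(t: AadlThread) -> list:
            # A periodic thread becomes dispatch/complete rules with explicit time.
            return [TasmRule(f"{t.name}_state = waiting",
                             f"{t.name}_state := running", 0),
                    TasmRule(f"{t.name}_state = running",
                             f"{t.name}_state := waiting", t.deadline_ms)]

        print(translate_thread(AadlThread("gnc", period_ms=50, deadline_ms=10)))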

  2. An automated system to simulate the River discharge in Kyushu Island using the H08 model

    Science.gov (United States)

    Maji, A.; Jeon, J.; Seto, S.

    2015-12-01

    Kyushu Island is located in the southwestern part of Japan, and it is often affected by typhoons and a Baiu front. Severe water-related disasters have been recorded in Kyushu Island. On the other hand, because of high population density and crop growth, water resources are an important issue for Kyushu Island. The simulation of river discharge is important for water resource management and early warning of water-related disasters. This study attempts to apply the H08 model to simulate river discharge in Kyushu Island. Geospatial meteorological and topographical data were obtained from the Japanese Ministry of Land, Infrastructure, Transport and Tourism (MLIT) and the Automated Meteorological Data Acquisition System (AMeDAS) of the Japan Meteorological Agency (JMA). The number of AMeDAS observation stations is limited and not quite satisfactory for the application of water resources models in Kyushu, so it is necessary to spatially interpolate the point data to produce a gridded dataset. The meteorological grid dataset is produced by considering elevation dependence. Solar radiation is estimated from hourly sunshine duration by a conventional formula. We improved the accuracy of the interpolated data simply by considering elevation dependence and found that the bias is related to geographical location. Rain/snow classification is done by the H08 model and is validated by comparing estimated and observed snow rates; the estimates tend to be larger than the corresponding observed values. A system to automatically produce a daily meteorological grid dataset is being constructed. The geospatial river network data were produced with ArcGIS and utilized in the H08 model to simulate river discharge. First, simulated and measured specific discharge, the ratio of discharge to watershed area, were compared; significant errors were seen in some rivers. Second, the outputs by the coupled model including crop growth
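
    A minimal sketch of the elevation-dependent interpolation step, assuming inverse-distance weighting plus a standard atmospheric lapse rate; the study's actual scheme may differ in detail.

        # Sketch: elevation-aware interpolation of station temperatures to a grid.
        import numpy as np

        LAPSE = -0.0065  # K per metre, standard lapse rate (assumed)

        def interpolate(st_xy, st_z, st_t, grid_xy, grid_z, power=2.0):
            t0 = st_t - LAPSE * st_z                    # reduce to sea level
            d = np.linalg.norm(grid_xy[:, None, :] - st_xy[None, :, :], axis=2)
            w = 1.0 / np.maximum(d, 1e-6) ** power      # inverse-distance weights
            t_sea = (w * t0).sum(axis=1) / w.sum(axis=1)
            return t_sea + LAPSE * grid_z               # restore elevation signal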

  3. Introduction to abstract algebra

    CERN Document Server

    Nicholson, W Keith

    2012-01-01

    Praise for the Third Edition ". . . an expository masterpiece of the highest didactic value that has gained additional attractivity through the various improvements . . ."-Zentralblatt MATH The Fourth Edition of Introduction to Abstract Algebra continues to provide an accessible approach to the basic structures of abstract algebra: groups, rings, and fields. The book's unique presentation helps readers advance to abstract theory by presenting concrete examples of induction, number theory, integers modulo n, and permutations before the abstract structures are defined. Readers can immediately be

  4. Abstraction and Consolidation

    Science.gov (United States)

    Monaghan, John; Ozmantar, Mehmet Fatih

    2006-01-01

    The framework for this paper is a recently developed theory of abstraction in context. The paper reports on data collected from one student working on tasks concerned with absolute value functions. It examines the relationship between mathematical constructions and abstractions. It argues that an abstraction is a consolidated construction that can…

  5. Automation based on knowledge modeling theory and its applications in engine diagnostic systems using Space Shuttle Main Engine vibrational data. M.S. Thesis

    Science.gov (United States)

    Kim, Jonnathan H.

    1995-01-01

    Humans can perform many complicated tasks without explicit rules. This inherent and advantageous capability becomes a hurdle when a task is to be automated. Modern computers and numerical calculations require explicit rules and discrete numerical values. In order to bridge the gap between human knowledge and automation tools, a knowledge model is proposed. Knowledge modeling techniques are discussed and utilized to automate a labor- and time-intensive task of detecting anomalous bearing wear patterns in the Space Shuttle Main Engine (SSME) High Pressure Oxygen Turbopump (HPOTP).

  6. Solving the AI Planning Plus Scheduling Problem Using Model Checking via Automatic Translation from the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL)

    Science.gov (United States)

    Butler, Ricky W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2007-01-01

    This paper describes a translator from a new planning language named the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL) model checker. This translator has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project sponsored by the Exploration Technology Development Program, which is seeking to mature autonomy technology for the vehicles and operations centers of Project Constellation.

  7. Reification of abstract concepts to improve comprehension using interactive virtual environments and a knowledge-based design: a renal physiology model.

    Science.gov (United States)

    Alverson, Dale C; Saiki, Stanley M; Caudell, Thomas P; Goldsmith, Timothy; Stevens, Susan; Saland, Linda; Colleran, Kathleen; Brandt, John; Danielson, Lee; Cerilli, Lisa; Harris, Alexis; Gregory, Martin C; Stewart, Randall; Norenberg, Jeffery; Shuster, George; Panaoitis; Holten, James; Vergera, Victor M; Sherstyuk, Andrei; Kihmm, Kathleen; Lui, Jack; Wang, Kin Lik

    2006-01-01

    Several abstract concepts in medical education are difficult to teach and comprehend. In order to address this challenge, we have been applying the approach of reification of abstract concepts using interactive virtual environments and a knowledge-based design. Reification is the process of making abstract concepts and events, beyond the realm of direct human experience, concrete and accessible to teachers and learners. Entering virtual worlds and simulations not otherwise easily accessible provides an opportunity to create, study, and evaluate the emergence of knowledge and comprehension from the direct interaction of learners with otherwise complex abstract ideas and principles by bringing them to life. Using a knowledge-based design process and appropriate subject matter experts, knowledge structure methods are applied in order to prioritize, characterize important relationships, and create a concept map that can be integrated into the reified models that are subsequently developed. Applying these principles, our interdisciplinary team has been developing a reified model of the nephron into which important physiologic functions can be integrated and rendered into a three-dimensional virtual environment called Flatland, a virtual environments development software tool, within which learners can interact using off-the-shelf hardware. The nephron model can be driven dynamically by a rules-based artificial intelligence engine, applying the rules and concepts developed in conjunction with the subject matter experts. In the future, the nephron model can be used to interactively demonstrate a number of physiologic principles or a variety of pathological processes that may be difficult to teach and understand. In addition, this approach to reification can be applied to a host of other physiologic and pathological concepts in other systems. These methods will require further evaluation to determine their impact and role in learning.

  8. Abstraction and Problem Reformulation

    Science.gov (United States)

    Giunchiglia, Fausto

    1992-01-01

    In work done jointly with Toby Walsh, the author has provided a sound theoretical foundation to the process of reasoning with abstraction (GW90c, GW89, GW90b, GW90a). The notion of abstraction formalized in this work can be informally described as: (property 1), the process of mapping a representation of a problem, called (following historical convention (Sac74)) the 'ground' representation, onto a new representation, called the 'abstract' representation, which, (property 2) helps deal with the problem in the original search space by preserving certain desirable properties and (property 3) is simpler to handle as it is constructed from the ground representation by "throwing away details". One desirable property preserved by an abstraction is provability; often there is a relationship between provability in the ground representation and provability in the abstract representation. Another can be deduction or, possibly, inconsistency. By 'throwing away details' we usually mean that the problem is described in a language with a smaller search space (for instance a propositional language or a language without variables) in which formulae of the abstract representation are obtained from the formulae of the ground representation by the use of some terminating rewriting technique. Often we require that the use of abstraction results in more efficient reasoning. However, it might simply increase the number of facts asserted (e.g. by allowing, in practice, the exploration of deeper search spaces or by implementing some form of learning). Among all abstractions, three very important classes have been identified. They relate the set of facts provable in the ground space to those provable in the abstract space. We call: TI abstractions all those abstractions where the abstractions of all the provable facts of the ground space are provable in the abstract space; TD abstractions all those abstractions where the 'unabstractions' of all the provable facts of the abstract space are
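
    The two classes named before the text breaks off can be stated compactly. Writing \vdash_g and \vdash_a for provability in the ground and abstract representations and f for the abstraction mapping (notation assumed here):

        % TI: abstractions of ground theorems are abstract theorems.
        \vdash_{g} \varphi \;\Longrightarrow\; \vdash_{a} f(\varphi)
        % TD: abstract theorems "unabstract" to ground theorems.
        \vdash_{a} f(\varphi) \;\Longrightarrow\; \vdash_{g} \varphi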

  9. Automated identification of potential snow avalanche release areas based on digital elevation models

    Directory of Open Access Journals (Sweden)

    Y. Bühler

    2013-05-01

    Full Text Available The identification of snow avalanche release areas is a very difficult task. The release mechanism of snow avalanches depends on many different terrain, meteorological, snowpack and triggering parameters and their interactions, which are very difficult to assess. In many alpine regions such as the Indian Himalaya, nearly no information on avalanche release areas exists, mainly due to the very rough and poorly accessible terrain, the vast size of the region and the lack of avalanche records. However, avalanche release information is urgently required for numerical simulation of avalanche events to plan mitigation measures, for hazard mapping and to secure important roads. The Rohtang tunnel access road near Manali, Himachal Pradesh, India, is such an example. By far the most reliable way to identify avalanche release areas is using historic avalanche records and field investigations carried out by avalanche experts in the formation zones. But neither method is feasible for this area due to the rough terrain, its vast extent and lack of time. Therefore, we develop an operational, easy-to-use automated potential release area (PRA) detection tool in Python/ArcGIS which uses high spatial resolution digital elevation models (DEMs) and forest cover information derived from airborne remote sensing instruments as input. Such instruments can acquire spatially continuous data even over inaccessible terrain and cover large areas. We validate our tool using a database of historic avalanches acquired over 56 yr in the neighborhood of Davos, Switzerland, and apply this method for the avalanche tracks along the Rohtang tunnel access road. This tool, used by avalanche experts, delivers valuable input to identify focus areas for more-detailed investigations on avalanche release areas in remote regions such as the Indian Himalaya and is a precondition for large-scale avalanche hazard mapping.
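
    The core raster test such a tool applies can be sketched in a few lines. Thresholds below are illustrative (a 30-60 degree slope band is commonly cited for release areas), not the calibrated values of the published tool.

        # Sketch: mask of potential release cells from a DEM and a forest mask.
        import numpy as np
        from scipy.ndimage import uniform_filter

        def potential_release_area(dem, cell_size, forest_mask,
                                   slope_min=30.0, slope_max=60.0, rough_max=5.0):
            gy, gx = np.gradient(dem, cell_size)
            slope = np.degrees(np.arctan(np.hypot(gx, gy)))
            roughness = np.abs(dem - uniform_filter(dem, size=5))  # local relief
            return ((slope >= slope_min) & (slope <= slope_max)
                    & (roughness <= rough_max) & ~forest_mask)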

  10. Forecasting macroeconomic variables using neural network models and three automated model selection techniques

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    2016-01-01

    When forecasting with neural network models one faces several problems, all of which influence the accuracy of the forecasts. First, neural networks are often hard to estimate due to their highly nonlinear structure. To alleviate the problem, White (2006) presented a solution (QuickNet) that conv...

  11. Towards Automation 2.0: A Neurocognitive Model for Environment Recognition, Decision-Making, and Action Execution

    Directory of Open Access Journals (Sweden)

    Zucker Gerhard

    2011-01-01

    Full Text Available The ongoing penetration of building automation by information technology is by far not saturated. Today's systems not only need to be reliable and fault-tolerant, they also have to consider energy efficiency and flexibility in the overall consumption. Meeting the quality and comfort goals in building automation while at the same time optimizing towards energy, carbon footprint and cost-efficiency requires systems that are able to handle large amounts of information and negotiate system behaviour that resolves conflicting demands: a decision-making process. In the last years, research has started to focus on bionic principles for designing new concepts in this area. The information processing principles of the human mind have turned out to be of particular interest, as the mind is capable of processing huge amounts of sensory data and taking adequate decisions for (re-)actions based on these analysed data. In this paper, we discuss how a bionic approach can solve the upcoming problems of energy optimal systems. A recently developed model for environment recognition and decision-making processes, which is based on research findings from different disciplines of brain research, is introduced. This model is the foundation for applications in intelligent building automation that have to deal with information from home and office environments. All of these applications have in common that they consist of a combination of communicating nodes and have many, partly contradicting goals.

  12. Automated service quality and its behavioural consequences in CRM Environment: A structural equation modeling and causal loop diagramming approach

    Directory of Open Access Journals (Sweden)

    Arup Kumar Baksi

    2012-08-01

    Full Text Available Information technology induced communications (ICTs) have revolutionized the operational aspects of the service sector and have triggered a perceptual shift in service quality, as rapid dis-intermediation has changed the access mode of services on the part of consumers. ICT-enabled services further stimulated the perception of automated service quality with renewed dimensions and their subsequent significance in influencing the behavioural outcomes of consumers. Customer Relationship Management (CRM) has emerged as an offshoot of this technological breakthrough, as it ensured service encapsulation by integrating people, process and technology. This paper attempts to explore the relationship between automated service quality and its behavioural consequences in a relatively novel business philosophy, CRM. The study has been conducted on the largest public sector bank of India, State Bank of India (SBI), at Kolkata, which successfully completed its decade-long operational automation in the year 2008. The study used structural equation modeling (SEM) to justify the proposed model construct and causal loop diagramming (CLD) to depict the negative and positive linkages between the variables.

  13. Evidence evaluation in fingerprint comparison and automated fingerprint identification systems--modelling within finger variability.

    Science.gov (United States)

    Egli, Nicole M; Champod, Christophe; Margot, Pierre

    2007-04-11

    Recent challenges and errors in fingerprint identification have highlighted the need for assessing the information content of a papillary pattern in a systematic way. In particular, estimation of the statistical uncertainty associated with this type of evidence is more and more called upon. The approach used in the present study is based on the assessment of likelihood ratios (LRs). This evaluative tool weighs the likelihood of evidence given two mutually exclusive hypotheses. The computation of likelihood ratios on a database of marks of known sources (matching the unknown and non-matching the unknown mark) allows an estimation of the evidential contribution of fingerprint evidence. LRs are computed taking advantage of the scores obtained from an automated fingerprint identification system and hence are based exclusively on level II features (minutiae). The AFIS system attributes a score to any comparison (fingerprint to fingerprint, mark to mark and mark to fingerprint), used here as a proximity measure between the respective arrangements of minutiae. The numerator of the LR addresses the within-finger variability and is obtained by comparing the same configurations of minutiae coming from the same source. Only comparisons where the same minutiae are visible both on the mark and on the print are therefore taken into account. The denominator of the LR is obtained by cross-comparison with a database of prints originating from non-matching sources. The estimation of the numerator of the LR is much more complex in terms of specific data requirements than the estimation of the denominator of the LR (which requires only a large database of prints from a non-associated population). Hence this paper addresses specific issues associated with the numerator or within-finger variability. This study aims at answering the following questions: (1) how a database for modelling within-finger variability should be acquired; (2) whether or not the visualisation technique or the
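
    A score-based likelihood ratio of this kind reduces to comparing two estimated score densities. The sketch below assumes kernel density estimates and synthetic scores; real casework would use the AFIS score distributions described above.

        # Sketch: LR from AFIS comparison scores via kernel density estimates.
        import numpy as np
        from scipy.stats import gaussian_kde

        def likelihood_ratio(score, same_source_scores, diff_source_scores):
            f_same = gaussian_kde(same_source_scores)   # within-finger variability
            f_diff = gaussian_kde(diff_source_scores)   # between-finger variability
            return f_same(score)[0] / f_diff(score)[0]

        rng = np.random.default_rng(1)
        lr = likelihood_ratio(7.0, rng.normal(8, 1, 500), rng.normal(3, 1, 5000))
        print(f"LR = {lr:.1f}")  # LR > 1 supports the same-source hypothesis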

  14. MOAtox: A comprehensive mode of action and acute aquatic toxicity database for predictive model development (SETAC abstract)

    Science.gov (United States)

    The mode of toxic action (MOA) has been recognized as a key determinant of chemical toxicity and as an alternative to chemical class-based predictive toxicity modeling. However, the development of quantitative structure activity relationship (QSAR) and other models has been limit...

  15. Abstracts and program proceedings of the 1994 meeting of the International Society for Ecological Modelling North American Chapter

    Energy Technology Data Exchange (ETDEWEB)

    Kercher, J.R.

    1994-06-01

    This document contains information about the 1994 meeting of the International Society for Ecological Modelling North American Chapter. The topics discussed include: extinction risk assessment modelling, ecological risk analysis of uranium mining, impacts of pesticides, demography, habitats, atmospheric deposition, and climate change.

  16. (abstract) A Test of the Theoretical Models of Bipolar Outflows: The Bipolar Outflow in Mon R2

    Science.gov (United States)

    Xie, Taoling; Goldsmith, Paul; Patel, Nimesh

    1993-01-01

    We report some results of a study of the massive bipolar outflow in the central region of the relatively nearby giant molecular cloud Monoceros R2. We make a quantitative comparison of our results with the Shu et al. outflow model, which incorporates a radially directed wind sweeping up the ambient material into a shell. We find that this simple model naturally explains the shape of this thin shell. Although Shu's model in its simplest form predicts, with reasonable parameters, too much mass at very small polar angles, as previously pointed out by Masson and Chernin, it provides a reasonably good fit to the mass distribution at larger polar angles. It is possible that this discrepancy is due to inhomogeneities of the ambient molecular gas, which are not considered by the model. We also discuss the constraints imposed by these results on recent jet-driven outflow models.

  17. An initial abstraction and constant loss model, and methods for estimating unit hydrographs, peak streamflows, and flood volumes for urban basins in Missouri

    Science.gov (United States)

    Huizinga, Richard J.

    2014-01-01

    Streamflow data, basin characteristics, and rainfall data from 39 streamflow-gaging stations for urban areas in and adjacent to Missouri were used by the U.S. Geological Survey in cooperation with the Metropolitan Sewer District of St. Louis to develop an initial abstraction and constant loss model (a time-distributed basin-loss model) and a gamma unit hydrograph (GUH) for urban areas in Missouri. Study-specific methods to determine peak streamflow and flood volume for a given rainfall event also were developed.
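
    The two ingredients named above combine as in the sketch below: rainfall excess from an initial abstraction plus a constant loss, convolved with a gamma unit hydrograph. All parameter values are invented for illustration.

        # Sketch: initial abstraction / constant loss model with a gamma UH.
        import numpy as np
        from scipy.stats import gamma as gamma_dist

        def rainfall_excess(rain, ia=10.0, loss_rate=2.0):
            excess, remaining_ia = [], ia
            for p in rain:                       # satisfy initial abstraction first,
                absorbed = min(p, remaining_ia)  # then apply the constant loss
                remaining_ia -= absorbed
                excess.append(max(p - absorbed - loss_rate, 0.0))
            return np.array(excess)

        def gamma_unit_hydrograph(n, shape=3.0, scale=2.0):
            uh = gamma_dist.pdf(np.arange(n), a=shape, scale=scale)
            return uh / uh.sum()                 # normalise to unit volume

        rain = np.array([5.0, 20.0, 15.0, 2.0, 0.0, 0.0])
        print(np.convolve(rainfall_excess(rain), gamma_unit_hydrograph(24)).round(2))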

  18. Designing Negotiating Agent for Automated Negotiations

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Traditional research in automated negotiation is focused on negotiation protocol and strategy. This paper studies automated negotiation from a new point of view, proposes a novel concept, namely the negotiating agent, and discusses its significance in the construction of automated negotiation systems, with an abstract model formally described and the architecture designed, which supports both goal-directed reasoning and reactive response. A communication model was proposed to construct the interaction mechanism used by negotiating agents, in which the negotiation language used by agents is defined. The communication model and the language are defined in a way general enough to support a wide variety of market mechanisms, thus being particularly suitable for flexible applications such as electronic business. The design and expression of the negotiation ontology is also discussed. On the basis of the theoretical model of the negotiating agent, the negotiating agent architecture and the negotiating agent communication model (NACM) are explicit and formal specifications for agents negotiating in an E-business environment; in particular, NACM defines the negotiation language template shared among all agents formally and explicitly. The novelty of the communication model is twofold.

  19. Users’ Manual and Validation of the Automated Grading System (AGS): Improving the Quality of Intelligence Summaries Using Feedback from an Unsupervised Model of Semantics

    Science.gov (United States)

    2012-12-01

    The Automated Grading System (AGS) was... The first step for AGS is to provide it with a sample of language from which it will create semantic representations for terms. For the exercise... with 28 North Koreans on board was hijacked. In a second incident the same day, pirates attacked a Ukrainian cargo ship. Private security guards

  20. Extending and applying active appearance models for automated, high precision segmentation in different image modalities

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Fisker, Rune; Ersbøll, Bjarne Kjær

    2001-01-01

    ...... an initialization scheme is designed, thus making the usage of AAMs fully automated. Using these extensions it is demonstrated that AAMs can segment bone structures in radiographs, pork chops in perspective images and the left ventricle in cardiovascular magnetic resonance images in a robust, fast and accurate...

  1. AUTOMATED GIS WATERSHED ANALYSIS TOOLS FOR RUSLE/SEDMOD SOIL EROSION AND SEDIMENTATION MODELING

    Science.gov (United States)

    A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed using a suite of automated Arc Macro Language (AML) scripts and a pair of processing-intensive ANSI C++ executable programs operating on an ESRI ArcGIS 8.x Workstation platform...

  2. Work Practice Simulation of Complex Human-Automation Systems in Safety Critical Situations: The Brahms Generalized Ueberlingen Model

    Science.gov (United States)

    Clancey, William J.; Linde, Charlotte; Seah, Chin; Shafto, Michael

    2013-01-01

    The transition from the current air traffic system to the next generation air traffic system will require the introduction of new automated systems, including transferring some functions from air traffic controllers to on-board automation. This report describes a new design verification and validation (V&V) methodology for assessing aviation safety. The approach involves a detailed computer simulation of work practices that includes people interacting with flight-critical systems. The research is part of an effort to develop new modeling and verification methodologies that can assess the safety of flight-critical systems, system configurations, and operational concepts. The 2002 Ueberlingen mid-air collision was chosen for analysis and modeling because one of the main causes of the accident was one crew's response to a conflict between the instructions of the air traffic controller and the instructions of TCAS, an automated Traffic Alert and Collision Avoidance System on-board warning system. It thus furnishes an example of the problem of authority versus autonomy. It provides a starting point for exploring authority/autonomy conflict in the larger system of organization, tools, and practices in which the participants' moment-by-moment actions take place. We have developed a general air traffic system model (not a specific simulation of Ueberlingen events), called the Brahms Generalized Ueberlingen Model (Brahms-GUeM). Brahms is a multi-agent simulation system that models people, tools, facilities/vehicles, and geography to simulate the current air transportation system as a collection of distributed, interactive subsystems (e.g., airports, air-traffic control towers and personnel, aircraft, automated flight systems and air-traffic tools, instruments, crew). Brahms-GUeM can be configured in different ways, called scenarios, such that anomalous events that contributed to the Ueberlingen accident can be modeled as functioning according to requirements or in an

  3. Automation of measurement of heights of waves around a model ship; Mokeisen mawari no hako keisoku no jidoka

    Energy Technology Data Exchange (ETDEWEB)

    Ikehata, M.; Kato, M.; Yanagida, F. [Yokohama National University, Yokohama (Japan). Faculty of Engineering

    1997-10-01

    Trial fabrication and tests were performed on an instrument to automate the measurement of the heights of waves around a model ship. The currently used electric wave height measuring instrument takes a long time for measurement and is hence poor in efficiency. The method of processing optical images also has a problem in accuracy. Therefore, a computer-controlled system was structured by using AC servo motors to drive the X and Y axes of a traverse equipment. Equipment was fabricated to automate the wave height measurement, in which four servo-type wave height meters are installed on a rack moving in the lateral (Y-axial) direction, so that the wave heights measured by the four meters can be acquired automatically, all at once. Wave heights can be measured continuously by moving the rack at a constant speed, verifying that wave shapes in longitudinal cross sections can be acquired by only one towing. The time required for the measurements using the instrument was 40 hours net for fixed-point measurement and 12 hours for continuous measurement, or 52 hours in total. On the other hand, the time may reach 240 hours for fixed-point measurement when the conventional all-point manual traverse equipment is used. Enormous efficiency gains were thus obtained by automating the instrument. Collection of wave height data will continue, also on tankers and other types of ships. 2 refs., 8 figs., 1 tab.

  4. Abstracts--Citations

    Science.gov (United States)

    Occupational Mental Health, 1972

    1972-01-01

    Provides abstracts and citations of journal articles and reports dealing with aspects of mental health. Topics include absenteeism, alcoholism, drug abuse, leisure, disadvantaged, job satisfaction, and others. (SB)

  5. A model composition for Mars derived from the oxygen isotopic ratios of martian/SNC meteorites. [Abstract only]

    Science.gov (United States)

    Delaney, J. S.

    1994-01-01

    Oxygen is the most abundant element in most meteorites, yet the ratios of its isotopes are seldom used to constrain the compositional history of achondrites. The two major achondrite groups have O isotope signatures that differ from any plausible chondritic precursors and lie between the ordinary and carbonaceous chondrite domains. If the assumption is made that the present global sampling of chondritic meteorites reflects the variability of O reservoirs at the time of planetesimal/planet aggregation in the early nebula, then the O in these groups must reflect mixing between known chondritic reservoirs. This approach, in combination with constraints based on Fe-Mn-Mg systematics, has been used previously to model the composition of the basaltic achondrite parent body (BAP) and provides a model precursor composition that is generally consistent with previous eucrite parent body (EPB) estimates. The same approach is applied to Mars, exploiting the assumption that the SNC and related meteorites sample the martian lithosphere. Model planet and planetesimal compositions can be derived by mixing of known chondritic components using O isotope ratios as the fundamental compositional constraint. The major- and minor-element composition for Mars derived here and that derived previously for the basaltic achondrite parent body are, in many respects, compatible with model compositions generated using completely independent constraints. The role of volatile elements and alkalis in particular remains a major difficulty in applying such models.

  6. A Bayesian network model for predicting aquatic toxicity mode of action using two dimensional theoretical molecular descriptors-abstract

    Science.gov (United States)

    The mode of toxic action (MoA) has been recognized as a key determinant of chemical toxicity but MoA classification in aquatic toxicology has been limited. We developed a Bayesian network model to classify aquatic toxicity mode of action using a recently published dataset contain...

  7. Modeling nurses' attitude toward using automated unit-based medication storage and distribution systems: an extension of the technology acceptance model.

    Science.gov (United States)

    Escobar-Rodríguez, Tomás; Romero-Alonso, María Mercedes

    2013-05-01

    This article analyzes the attitude of nurses toward the use of automated unit-based medication storage and distribution systems and identifies influencing factors. Understanding these factors provides an opportunity to explore actions that might be taken to boost adoption by potential users. The theoretical grounding for this research is the Technology Acceptance Model. The Technology Acceptance Model specifies the causal relationships between perceived usefulness, perceived ease of use, attitude toward using, and actual usage behavior. The research model has six constructs, and nine hypotheses were generated from connections between these six constructs. These constructs include perceived risks, experience level, and training. The findings indicate that these three external variables are related to the perceived ease of use and perceived usefulness of automated unit-based medication storage and distribution systems, and therefore, they have a significant influence on attitude toward the use of these systems.

  8. Automata Learning through Counterexample Guided Abstraction Refinement

    DEFF Research Database (Denmark)

    Aarts, Fides; Heidarian, Faranak; Kuppens, Harco

    2012-01-01

    Abstraction is the key when learning behavioral models of realistic systems. Hence, in most practical applications where automata learning is used to construct models of software components, researchers manually define abstractions which, depending on the history, map a large set of concrete events...... are allowed. Our approach uses counterexample-guided abstraction refinement: whenever the current abstraction is too coarse and induces nondeterministic behavior, the abstraction is refined automatically. Using Tomte, a prototype tool implementing our algorithm, we have succeeded to learn – fully automatically – models of several realistic software components, including the biometric passport and the SIP protocol.
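
    The refinement loop itself is simple to express; in the sketch below the learner, teacher and abstraction objects are stubs standing in for Tomte's actual machinery.

        # Sketch: counterexample-guided abstraction refinement for learning.
        def learn_with_cegar(teacher, learner, abstraction):
            while True:
                hypothesis = learner.learn(abstraction)          # abstract model
                cex = teacher.find_counterexample(hypothesis)
                if cex is None:
                    return hypothesis                            # model accepted
                if abstraction.is_nondeterministic_on(cex):
                    abstraction.refine(cex)      # split abstract events on trace
                else:
                    learner.add_counterexample(cex)  # genuine behavioural error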

  9. Nuclear medicine. Abstracts; Nuklearmedizin 2000. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2000-07-01

    This issue of the journal contains the abstracts of the 183 conference papers as well as the 266 posters presented at the conference. Subject fields covered are: Neurology, psychology, oncology, pediatrics, radiopharmacy, endocrinology, EDP, measuring equipment and methods, radiological protection, cardiology, and therapy. (orig./CB)

  10. An abstract machine based execution model for computer architecture design and efficient implementation of logic programs in parallel.

    OpenAIRE

    Hermenegildo, Manuel V.

    1986-01-01

    The term "Logic Programming" refers to a variety of computer languages and execution models which are based on the traditional concept of Symbolic Logic. The expressive power of these languages offers promise to be of great assistance in facing the programming challenges of present and future symbolic processing applications in Artificial Intelligence, Knowledge-based systems, and many other areas of computing. The sequential execution speed of logic programs has been greatly improved sinc...

  11. Completeness of Lyapunov Abstraction

    Directory of Open Access Journals (Sweden)

    Rafael Wisniewski

    2013-08-01

    Full Text Available In this work, we continue our study on discrete abstractions of dynamical systems. To this end, we use a family of partitioning functions to generate an abstraction. The intersection of sub-level sets of the partitioning functions defines cells, which are regarded as discrete objects. The union of cells makes up the state space of the dynamical system. Our construction gives rise to a combinatorial object - a timed automaton. We examine sound and complete abstractions. An abstraction is said to be sound when the flow of the timed automaton covers the flow lines of the dynamical system. If the dynamics of the dynamical system and the timed automaton are equivalent, the abstraction is complete. The commonly accepted paradigm for partitioning functions is that they ought to be transversal to the studied vector field. We show that there is no complete partitioning with transversal functions, even for particular dynamical systems whose critical sets are isolated critical points. Therefore, we allow the directional derivative along the vector field to be non-positive in this work. This considerably complicates the abstraction technique. For understanding dynamical systems, it is vital to study stable and unstable manifolds and their intersections. These objects appear naturally in this work. Indeed, we show that for an abstraction to be complete, the set of critical points of an abstraction function shall contain either the stable or unstable manifold of the dynamical system.
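
    A minimal sketch of how cells arise from sub-level sets of partitioning functions; the two functions below are arbitrary placeholders, not functions satisfying the paper's condition on directional derivatives.

        # Sketch: a cell is the sign pattern of the partitioning functions.
        import numpy as np

        partitioning = [lambda x: x[0]**2 + x[1]**2 - 1.0,  # unit-disc boundary
                        lambda x: x[1] - x[0]]              # a half-plane split

        def cell_of(x):
            return tuple(int(f(x) <= 0.0) for f in partitioning)

        print(cell_of(np.array([0.2, 0.1])), cell_of(np.array([2.0, -1.0])))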

  12. Designing for Mathematical Abstraction

    Science.gov (United States)

    Pratt, Dave; Noss, Richard

    2010-01-01

    Our focus is on the design of systems (pedagogical, technical, social) that encourage mathematical abstraction, a process we refer to as "designing for abstraction." In this paper, we draw on detailed design experiments from our research on children's understanding about chance and distribution to re-present this work as a case study in designing…

  13. Truthful Monadic Abstractions

    DEFF Research Database (Denmark)

    Brock-Nannestad, Taus; Schürmann, Carsten

    2012-01-01

    ...... indefinitely, finding neither a proof nor a disproof of a given subgoal. In this paper we characterize a family of truth-preserving abstractions from intuitionistic first-order logic to the monadic fragment of classical first-order logic. Because they are truthful, these abstractions can be used to disprove...

  14. Completeness of Lyapunov Abstraction

    DEFF Research Database (Denmark)

    Wisniewski, Rafal; Sloth, Christoffer

    2013-01-01

    This paper addresses the generation of complete abstractions of polynomial dynamical systems by timed automata. For the proposed abstraction, the state space is divided into cells by sublevel sets of functions. We identify a relation between these functions and their directional derivatives along...

  15. The Conversion of Cardiovascular Conference Abstracts to Publications

    DEFF Research Database (Denmark)

    Fosbøl, Emil L.; Fosbøl, Philip Loldrup; Harrington, Robert A.

    2012-01-01

    a systematic and automated evaluation of rates, timing, and correlates of publication from scientific abstracts presented at 3 major cardiovascular conferences. Methods and Results—Using an automated computer algorithm, we searched the ISI Web of Science to identify peer-reviewed publications of abstracts.... From 2006 to 2008, 11 365, 5005, and 10 838 abstracts were presented at the AHA, ACC, and ESC meetings, respectively. Overall, 30.6% of presented abstracts were published within 2 years of the conference; ranging from 34.5% for AHA to 29.5% for ACC to 27.0% for ESC (P<0.0001). Five years after conference presentation in 2005, these rates had risen slightly to 49.7% for AHA, 42.6% for ACC, and 37.6% for ESC (P<0.0001). After adjustment for abstract characteristics and contributing countries, abstracts presented at the AHA meeting remained more likely for publication relative to the ESC (adjusted

  16. Automated Kinematic Modelling of Warped Galaxy Discs in Large HI Surveys: 3D Tilted Ring Fitting of HI Emission Cubes

    CERN Document Server

    Kamphuis, P; Oh, S- H; Spekkens, K; Urbancic, N; Serra, P; Koribalski, B S; Dettmar, R -J

    2015-01-01

    Kinematical parameterisations of disc galaxies, employing emission line observations, are indispensable tools for studying the formation and evolution of galaxies. Future large-scale HI surveys will resolve the discs of many thousands of galaxies, allowing a statistical analysis of their disc and halo kinematics, mass distribution and dark matter content. Here we present an automated procedure which fits tilted-ring models to HI data cubes of individual, well-resolved galaxies. The method builds on the 3D Tilted Ring Fitting Code (TiRiFiC) and is called FAT (Fully Automated TiRiFiC). To assess the accuracy of the code we apply it to a set of 52 artificial galaxies and 25 real galaxies from the Local Volume HI Survey (LVHIS). Using LVHIS data, we compare our 3D modelling to the 2D modelling methods DiskFit and rotcur. A conservative result is that FAT accurately models the kinematics and the morphologies of galaxies with an extent of eight beams across the major axis in the inclination range 20$^{\circ}$-90$^{...
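
    For orientation, the basic tilted-ring relation that such codes evaluate ring by ring is sketched below; a full fit varies systemic velocity, rotation velocity and inclination per ring and renders the result into a model cube.

        # Sketch: line-of-sight velocity along one tilted ring.
        import numpy as np

        def ring_velocity(theta, v_sys, v_rot, inclination_deg):
            # theta: azimuth in the ring plane from the receding major axis.
            i = np.radians(inclination_deg)
            return v_sys + v_rot * np.sin(i) * np.cos(theta)

        theta = np.linspace(0.0, 2.0 * np.pi, 8)
        print(ring_velocity(theta, v_sys=1200.0, v_rot=150.0, inclination_deg=60.0))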

  17. Abstraction of Drift Seepage

    Energy Technology Data Exchange (ETDEWEB)

    J.T. Birkholzer

    2004-11-01

    This model report documents the abstraction of drift seepage, conducted to provide seepage-relevant parameters and their probability distributions for use in Total System Performance Assessment for License Application (TSPA-LA). Drift seepage refers to the flow of liquid water into waste emplacement drifts. Water that seeps into drifts may contact waste packages and potentially mobilize radionuclides, and may result in advective transport of radionuclides through breached waste packages ["Risk Information to Support Prioritization of Performance Assessment Models" (BSC 2003 [DIRS 168796], Section 3.3.2)]. The unsaturated rock layers overlying and hosting the repository form a natural barrier that reduces the amount of water entering emplacement drifts by natural subsurface processes. For example, drift seepage is limited by the capillary barrier forming at the drift crown, which decreases or even eliminates water flow from the unsaturated fractured rock into the drift. During the first few hundred years after waste emplacement, when above-boiling rock temperatures will develop as a result of heat generated by the decay of the radioactive waste, vaporization of percolation water is an additional factor limiting seepage. Estimating the effectiveness of these natural barrier capabilities and predicting the amount of seepage into drifts is an important aspect of assessing the performance of the repository. The TSPA-LA therefore includes a seepage component that calculates the amount of seepage into drifts ["Total System Performance Assessment (TSPA) Model/Analysis for the License Application" (BSC 2004 [DIRS 168504], Section 6.3.3.1)]. The TSPA-LA calculation is performed with a probabilistic approach that accounts for the spatial and temporal variability and inherent uncertainty of seepage-relevant properties and processes. Results are used for subsequent TSPA-LA components that may handle, for example, waste package

  18. An automated approach for extracting Barrier Island morphology from digital elevation models

    Science.gov (United States)

    Wernette, Phillipe; Houser, Chris; Bishop, Michael P.

    2016-06-01

    The response and recovery of a barrier island to extreme storms depends on the elevation of the dune base and crest, both of which can vary considerably alongshore and through time. Quantifying the response to and recovery from storms requires that we can first identify and differentiate the dune(s) from the beach and back-barrier, which in turn depends on accurate identification and delineation of the dune toe, crest and heel. The purpose of this paper is to introduce a multi-scale automated approach for extracting beach, dune (dune toe, dune crest and dune heel), and barrier island morphology. The automated approach introduced here extracts the shoreline and back-barrier shoreline based on elevation thresholds, and extracts the dune toe, dune crest and dune heel based on the average relative relief (RR) across multiple spatial scales of analysis. The multi-scale automated RR approach to extracting dune toe, dune crest, and dune heel based upon relative relief is more objective than traditional approaches because every pixel is analyzed across multiple computational scales and the identification of features is based on the calculated RR values. The RR approach out-performed contemporary approaches and represents a fast objective means to define important beach and dune features for predicting barrier island response to storms. The RR method also does not require that the dune toe, crest, or heel are spatially continuous, which is important because dune morphology is likely naturally variable alongshore.
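
    The central quantity is easy to sketch: relative relief within a moving window, averaged over several window sizes (sizes below are assumed), which is then thresholded to delineate dune toe, crest and heel.

        # Sketch: mean relative relief (RR) across multiple window sizes.
        import numpy as np
        from scipy.ndimage import maximum_filter, minimum_filter

        def mean_relative_relief(dem, window_sizes=(3, 5, 9)):
            layers = []
            for w in window_sizes:
                zmax, zmin = maximum_filter(dem, w), minimum_filter(dem, w)
                with np.errstate(invalid="ignore", divide="ignore"):
                    rr = (dem - zmin) / (zmax - zmin)  # 0 = local low, 1 = local high
                layers.append(np.nan_to_num(rr, nan=0.5))
            return np.mean(layers, axis=0)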

  19. Dynamic detection model and its application for perimeter security, intruder detection, and automated target recognition

    Science.gov (United States)

    Koltunov, Joseph; Koltunov, Alexander

    2003-09-01

    Under unsteady weather conditions (gusty wind and partial cloudiness), the pixel intensities measured by infrared or optical imaging sensors may change considerably within even minutes. This poses a principal obstacle to automated target detection and recognition in real, outdoor settings. Currently existing automated recognition algorithms require strong similarity between the weather conditions of training and recognition. Empirical attempts to normalize image intensities do not lead to reliable detection in practice (e.g. for scenes with a complex relief). Also, if the weather is relatively stable (weak wind, rare clouds), a delay as short as 15-20 minutes between the training survey and the recognition survey may badly affect target recognition or detection, unless the targets are well separable from the background. Thermal IR technologies based on invariants such as emissivity and thermal inertia are expensive and ineffective in making the recognition automated. Our approach to overcoming the problem is to take advantage of multitemporal prior surveying. It exploits the fact that any new infrared or optical image of a scene can be accurately predicted based on sufficiently many scene images acquired previously. This removes the above severe constraints on variability of the weather conditions, whereas neither meteorological measurement nor radiometric calibration of the sensor is required. The present paper further generalizes the approach and addresses several points that are important for putting the ideas in practice. Two experimental examples, intruder detection and recognition of a suspicious target, illustrate the potential of our method.
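
    The prediction idea admits a compact linear sketch: express the new image as a least-squares combination of prior images of the same scene and flag large residuals. The thresholding rule is an assumption for illustration.

        # Sketch: multitemporal background prediction and residual detection.
        import numpy as np

        def detect_changes(prior_images, new_image, k_sigma=4.0):
            A = np.stack([img.ravel() for img in prior_images], axis=1)
            b = new_image.ravel()
            coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
            residual = b - A @ coeffs            # intensity the history can't explain
            mask = np.abs(residual) > k_sigma * residual.std()
            return mask.reshape(new_image.shape)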

  20. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V.; Waghmode, S. S.

    2010-01-01

    New technologies provide libraries with several new materials, media and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual efforts in library routines such as collection, storage, administration, processing, preservation and communication.

  1. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  2. Efficient abstraction selection in reinforcement learning

    NARCIS (Netherlands)

    Seijen, H. van; Whiteson, S.; Kester, L.

    2013-01-01

    This paper introduces a novel approach for abstraction selection in reinforcement learning problems modelled as factored Markov decision processes (MDPs), for which a state is described via a set of state components. In abstraction selection, an agent must choose an abstraction from a set of candidate abstractions.

  3. (abstract) Using TOPEX/Poseidon Sea Level Observations to Test the Sensitivity of an Ocean Model to Wind Forcing

    Science.gov (United States)

    Fu, Lee-Lueng; Chao, Yi

    1996-01-01

    It has been demonstrated that current-generation global ocean general circulation models (OGCM) are able to simulate large-scale sea level variations fairly well. In this study, a GFDL/MOM-based OGCM was used to investigate its sensitivity to different wind forcing. Simulations of global sea level using wind forcing from the ERS-1 Scatterometer and the NMC operational analysis were compared to the observations made by the TOPEX/Poseidon (T/P) radar altimeter for a two-year period. The result of the study has demonstrated the sensitivity of the OGCM to the quality of wind forcing, as well as the synergistic use of two spaceborne sensors in advancing the study of wind-driven ocean dynamics.

  4. Performance of a semi-automated approach for risk estimation using a common data model for longitudinal healthcare databases.

    Science.gov (United States)

    Van Le, Hoa; Beach, Kathleen J; Powell, Gregory; Pattishall, Ed; Ryan, Patrick; Mera, Robertino M

    2013-02-01

    Different structures and coding schemes may limit rapid evaluation of a large pool of potential drug safety signals using multiple longitudinal healthcare databases. To overcome this restriction, a semi-automated approach utilising common data model (CDM) and robust pharmacoepidemiologic methods was developed; however, its performance needed to be evaluated. Twenty-three established drug-safety associations from publications were reproduced in a healthcare claims database and four of these were also repeated in electronic health records. Concordance and discrepancy of pairwise estimates were assessed between the results derived from the publication and results from this approach. For all 27 pairs, an observed agreement between the published results and the results from the semi-automated approach was greater than 85% and Kappa coefficient was 0.61, 95% CI: 0.19-1.00. Ln(IRR) differed by less than 50% for 13/27 pairs, and the IRR varied less than 2-fold for 19/27 pairs. Reproducibility based on the intra-class correlation coefficient was 0.54. Most covariates (>90%) in the publications were available for inclusion in the models. Once the study populations and inclusion/exclusion criteria were obtained from the literature, the analysis was able to be completed in 2-8 h. The semi-automated methodology using a CDM produced consistent risk estimates compared to the published findings for most selected drug-outcome associations, regardless of original study designs, databases, medications and outcomes. Further assessment of this approach is useful to understand its roles, strengths and limitations in rapidly evaluating safety signals.
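
    Agreement statistics of the kind quoted above take only a few lines to reproduce; the paired calls below are invented for illustration.

        # Sketch: observed agreement and Cohen's kappa for paired binary calls.
        def kappa(pairs):
            n = len(pairs)
            po = sum(a == b for a, b in pairs) / n              # observed agreement
            pa = sum(a for a, _ in pairs) / n
            pb = sum(b for _, b in pairs) / n
            pe = pa * pb + (1 - pa) * (1 - pb)                  # chance agreement
            return po, (po - pe) / (1 - pe)

        calls = [(1, 1)] * 20 + [(1, 0)] * 3 + [(0, 0)] * 4     # illustrative data
        print(kappa(calls))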

  5. Computational Abstraction Steps

    DEFF Research Database (Denmark)

    Thomsen, Lone Leth; Thomsen, Bent; Nørmark, Kurt

    2010-01-01

    In this paper we discuss computational abstraction steps as a way to create class abstractions from concrete objects, and from examples. Computational abstraction steps are regarded as symmetric counterparts to computational concretisation steps, which are well-known in terms of function calls and class instantiations. Our teaching experience shows that many novice programmers find it difficult to write programs with abstractions that materialise to concrete objects later in the development process. The contribution of this paper is the idea of initiating a programming process by creating or capturing concrete values, objects, or actions. As the next step, some of these are lifted to a higher level by computational means. In the object-oriented paradigm the target of such steps is classes. We hypothesise that the proposed approach primarily will be beneficial to novice programmers or during...
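
    An assumed, minimal illustration of one such step (the paper's own examples are not reproduced here): a concrete object captured as a dict is lifted to a class.

        # Sketch: lifting a concrete object to a class abstraction.
        concrete = {"name": "p1", "x": 0.0, "y": 1.5}   # concrete starting point

        class Point:
            """Class abstraction materialised from the concrete example."""
            def __init__(self, name, x, y):
                self.name, self.x, self.y = name, x, y

            @classmethod
            def from_example(cls, example):
                return cls(**example)                   # the abstraction step

        p = Point.from_example(concrete)
        print(p.name, p.x, p.y)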

  6. Abstracts of SIG Sessions.

    Science.gov (United States)

    Proceedings of the ASIS Annual Meeting, 1997

    1997-01-01

    Presents abstracts of SIG Sessions. Highlights include digital collections; information retrieval methods; public interest/fair use; classification and indexing; electronic publication; funding; globalization; information technology projects; interface design; networking in developing countries; metadata; multilingual databases; networked…

  7. Abstract sectional category

    CERN Document Server

    Diaz, F; Garcia, P; Murillo, A; Remedios, J

    2011-01-01

    We study, in an abstract axiomatic setting, the notion of sectional category of a morphism. From this, we unify and generalize known results about this invariant in different settings and deduce new applications.

  8. Automation or De-automation

    Science.gov (United States)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  9. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    Science.gov (United States)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
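
    One of the indicators named above, the Modal Assurance Criterion, takes only a few lines to compute; the matrices below are random stand-ins for actual finite element mode shapes.

        # Sketch: MAC matrix between two sets of mode shapes (one per column).
        import numpy as np

        def mac(phi_a, phi_b):
            num = np.abs(phi_a.T @ phi_b) ** 2
            den = np.outer(np.einsum("ij,ij->j", phi_a, phi_a),
                           np.einsum("ij,ij->j", phi_b, phi_b))
            return num / den

        rng = np.random.default_rng(2)
        phi = rng.standard_normal((100, 4))
        print(np.round(mac(phi, phi), 2))  # near-identity diagonal for same model
        # Mode tracking typically pairs modes via the row/column argmax of MAC.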

  10. Abstracts of contributed papers

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    This volume contains 571 abstracts of contributed papers to be presented during the Twelfth US National Congress of Applied Mechanics. Abstracts are arranged in the order in which they fall in the program -- the main sessions are listed chronologically in the Table of Contents. The Author Index is in alphabetical order and lists each paper number (matching the schedule in the Final Program) with its corresponding page number in the book.

  11. A knowledge- and model-based system for automated weaning from mechanical ventilation: technical description and first clinical application.

    Science.gov (United States)

    Schädler, Dirk; Mersmann, Stefan; Frerichs, Inéz; Elke, Gunnar; Semmel-Griebeler, Thomas; Noll, Oliver; Pulletz, Sven; Zick, Günther; David, Matthias; Heinrichs, Wolfgang; Scholz, Jens; Weiler, Norbert

    2014-10-01

    To describe the principles and the first clinical application of a novel prototype automated weaning system called Evita Weaning System (EWS). EWS allows an automated control of all ventilator settings in pressure controlled and pressure support mode with the aim of decreasing the respiratory load of mechanical ventilation. Respiratory load takes inspired fraction of oxygen, positive end-expiratory pressure, pressure amplitude and spontaneous breathing activity into account. Spontaneous breathing activity is assessed by the number of controlled breaths needed to maintain a predefined respiratory rate. EWS was implemented as a knowledge- and model-based system that autonomously and remotely controlled a mechanical ventilator (Evita 4, Dräger Medical, Lübeck, Germany). In a selected case study (n = 19 patients), ventilator settings chosen by the responsible physician were compared with the settings 10 min after the start of EWS and at the end of the study session. Neither unsafe ventilator settings nor failure of the system occurred. All patients were successfully transferred from controlled ventilation to assisted spontaneous breathing in a mean time of 37 ± 17 min (± SD). Early settings applied by the EWS did not significantly differ from the initial settings, except for the fraction of oxygen in inspired gas. During the later course, EWS significantly modified most of the ventilator settings and reduced the imposed respiratory load. A novel prototype automated weaning system was successfully developed. The first clinical application of EWS revealed that its operation was stable, safe ventilator settings were defined and the respiratory load of mechanical ventilation was decreased.

  12. Behaviorally Modeling Games of Strategy Using Descriptive Q-learning

    Science.gov (United States)

    2013-01-01

    Behaviorally Modeling Games of Strategy Using Descriptive Q-learning. Roi Ceren, Department of Computer Science. Abstract fragment: Modeling human decision making in strategic problem domains is challenging with ... an unknown automated opponent.

  13. ModelOMatic: fast and automated model selection between RY, nucleotide, amino acid, and codon substitution models.

    Science.gov (United States)

    Whelan, Simon; Allen, James E; Blackburne, Benjamin P; Talavera, David

    2015-01-01

    Molecular phylogenetics is a powerful tool for inferring both the process and pattern of evolution from genomic sequence data. Statistical approaches, such as maximum likelihood and Bayesian inference, are now established as the preferred methods of inference. The choice of models that a researcher uses for inference is of critical importance, and there are established methods for model selection conditioned on a particular type of data, such as nucleotides, amino acids, or codons. A major limitation of existing model selection approaches is that they can only compare models acting upon a single type of data. Here, we extend model selection to allow comparisons between models describing different types of data by introducing the idea of adapter functions, which project aggregated models onto the originally observed sequence data. These projections are implemented in the program ModelOMatic and used to perform model selection on 3722 families from the PANDIT database, 68 genes from an arthropod phylogenomic data set, and 248 genes from a vertebrate phylogenomic data set. For the PANDIT and arthropod data, we find that amino acid models are selected for the overwhelming majority of alignments, with progressively smaller numbers of alignments selecting codon and nucleotide models, and no families selecting RY-based models. In contrast, nearly all alignments from the vertebrate data set select codon-based models. The sequence divergence, the number of sequences, and the degree of selection acting upon the protein sequences may contribute to explaining this variation in model selection. Our ModelOMatic program is fast, with most families from PANDIT taking fewer than 150 s to complete, and should therefore be easily incorporated into existing phylogenetic pipelines. ModelOMatic is available at https://code.google.com/p/modelomatic/.
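
    The following Python fragment sketches only the final step of such a comparison, assuming an AIC-style criterion; the candidate models, log-likelihoods and parameter counts are invented for illustration, and ModelOMatic's adapter functions, which make likelihoods comparable across data types, are not reproduced here.

        def aic(log_likelihood, n_params):
            # Akaike information criterion: lower is better.
            return 2 * n_params - 2 * log_likelihood

        # Hypothetical (model, lnL, free parameters) triples for one alignment.
        candidates = [("RY", -10234.1, 2), ("nucleotide", -9871.4, 5),
                      ("amino acid", -9650.2, 1), ("codon", -9662.8, 61)]
        best = min(candidates, key=lambda m: aic(m[1], m[2]))
        print("selected model:", best[0])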

  14. Automated delineation of karst sinkholes from LiDAR-derived digital elevation models

    Science.gov (United States)

    Wu, Qiusheng; Deng, Chengbin; Chen, Zuoqi

    2016-08-01

    Sinkhole mapping is critical for understanding hydrological processes and mitigating geological hazards in karst landscapes. Current methods for identifying sinkholes are primarily based on visual interpretation of low-resolution topographic maps and aerial photographs with subsequent field verification, which is labor-intensive and time-consuming. The increasing availability of high-resolution LiDAR-derived digital elevation data allows for an entirely new level of detailed delineation and analysis of small-scale geomorphologic features and landscape structures at fine scales. In this paper, we present a localized contour tree method for automated extraction of sinkholes in karst landscapes. One significant advantage of our automated approach for sinkhole extraction is that it may reduce inconsistencies and alleviate repeatability concerns associated with visual interpretation methods. In addition, the proposed method has contributed to improving the sinkhole inventory in several ways: (1) detection of non-inventoried sinkholes; (2) identification of previously inventoried sinkholes that have been filled; (3) delineation of sinkhole boundaries; and (4) characterization of sinkhole morphometric properties. We applied the method to Fillmore County in southeastern Minnesota, USA, and identified three times as many sinkholes as the existing database for the same area. The results suggest that previous visual interpretation methods might significantly underestimate the number of potential sinkholes in the region. Our method holds great potential for creating and updating sinkhole inventory databases at a regional scale in a timely manner.
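
    For a flavour of automated depression extraction from a DEM, the sketch below uses the simpler fill-and-difference approach (morphological reconstruction) rather than the localized contour tree method of the paper; the 0.5 m depth threshold and the scikit-image based implementation are assumptions of this sketch.

        import numpy as np
        from skimage.morphology import reconstruction
        from skimage.measure import label, regionprops

        def detect_depressions(dem, min_depth=0.5):
            # Fill closed depressions by reconstruction-by-erosion from the
            # DEM border, then label cells that were raised by the filling.
            seed = dem.copy()
            seed[1:-1, 1:-1] = dem.max()
            filled = reconstruction(seed, dem, method='erosion')
            depth = filled - dem                  # > 0 inside depressions
            sinks = label(depth > min_depth)
            return [(r.label, r.area, float(depth[sinks == r.label].max()))
                    for r in regionprops(sinks)]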

  15. Automation of block assignment planning using a diagram-based scenario modeling method

    Science.gov (United States)

    Hwang, In Hyuck; Kim, Youngmin; Lee, Dong Kun; Shin, Jong Gye

    2014-03-01

    Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.

  16. Automation of block assignment planning using a diagram-based scenario modeling method

    Directory of Open Access Journals (Sweden)

    Hwang In Hyuck

    2014-03-01

    Full Text Available Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.

  17. The scheme of combined application of optimization and simulation models for formation of an optimum structure of an automated control system of space systems

    Science.gov (United States)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Nikiforov, A. Yu; Zelenkov, P. V.

    2016-11-01

    With the development of automated control systems for space systems, new classes of spacecraft are emerging that require improvements to their structure and expansion of their functions. When designing an automated control system for space systems, various tasks arise, such as determining the location of elements and subsystems in space, selecting hardware, and distributing the set of functions performed by the system units, all under certain constraints on control quality and component connectivity. The problem of synthesizing the structure of an automated control system for space systems is formalized using discrete variables at various levels of system detail. A sequence of tasks and stages for forming the structure of an automated control system for space systems is developed. The authors propose a scheme for the combined application of optimization and simulation models to ensure a rational distribution of functions between the automated control system complex and the remaining system units. The proposed approach enables reasonable hardware selection, taking into account the different requirements for the operation of automated control systems of space systems.

  18. Metacognition and abstract reasoning.

    Science.gov (United States)

    Markovits, Henry; Thompson, Valerie A; Brisson, Janie

    2015-05-01

    The nature of people's meta-representations of deductive reasoning is critical to understanding how people control their own reasoning processes. We conducted two studies to examine whether people have a metacognitive representation of abstract validity and whether familiarity alone acts as a separate metacognitive cue. In Study 1, participants were asked to make a series of (1) abstract conditional inferences, (2) concrete conditional inferences with premises having many potential alternative antecedents and thus specifically conducive to the production of responses consistent with conditional logic, or (3) concrete problems with premises having relatively few potential alternative antecedents. Participants gave confidence ratings after each inference. Results show that confidence ratings were positively correlated with logical performance on abstract problems and concrete problems with many potential alternatives, but not with concrete problems with content less conducive to normative responses. Confidence ratings were higher with few alternatives than for abstract content. Study 2 used a generation of contrary-to-fact alternatives task to improve levels of abstract logical performance. The resulting increase in logical performance was mirrored by increases in mean confidence ratings. Results provide evidence for a metacognitive representation based on logical validity, and show that familiarity acts as a separate metacognitive cue.

  19. A high throughput MATLAB program for automated force-curve processing using the AdG polymer model.

    Science.gov (United States)

    O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A

    2015-02-01

    Research on biofilm formation depends on accurate and representative measurements of the steric forces related to the polymer brush on bacterial surfaces. A MATLAB program to analyze force curves from an AFM efficiently, accurately, and with minimal user bias has been developed. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the amount of time required to process 100 force curves from several days to less than 2 minutes. The use of this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison of data, thereby enabling higher quality results in a shorter period of time.
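
    A Python (rather than MATLAB) sketch of the same fitting task is shown below. It assumes the widely used exponential approximation of the AdG steric force for a sphere against a brush, F(d) ~ 50*kB*T*R*L0*Gamma^(3/2)*exp(-2*pi*d/L0); the tip radius, temperature and synthetic data are illustrative, not taken from the program described above.

        import numpy as np
        from scipy.optimize import curve_fit

        KB = 1.380649e-23  # Boltzmann constant, J/K

        def adg_force(d, L0, gamma, R=2e-8, T=298.0):
            # Approximate AdG steric force; L0 = equilibrium brush length (m),
            # gamma = grafting density (m^-2), R = tip radius, d = separation.
            return 50.0 * KB * T * R * L0 * gamma ** 1.5 * np.exp(-2.0 * np.pi * d / L0)

        # Synthetic cropped force curve standing in for AFM data.
        d = np.linspace(5e-9, 80e-9, 60)
        f = adg_force(d, 120e-9, 1e16) * (1 + 0.05 * np.random.randn(d.size))

        (L0, gamma), _ = curve_fit(adg_force, d, f, p0=(100e-9, 1e16))
        print("L0 = %.1f nm, density = %.2e m^-2" % (L0 * 1e9, gamma))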

  20. The deleuzian abstract machines

    DEFF Research Database (Denmark)

    Werner Petersen, Erik

    2005-01-01

    To most people the concept of abstract machines is connected to the name of Alan Turing and the development of the modern computer. The Turing machine is universal, axiomatic and symbolic (e.g. operating on symbols). Inspired by Foucault, Deleuze and Guattari extended the concept of abstract machines to the domain of art. From Difference and Repetition to Anti-Oedipus, the machines are conceived as binary machines based on the exclusive or inclusive use, respectively, of the three syntheses: conexa, disjuncta and conjuncta. The machines have a twofold embedment: in the desiring-production and in the social production. In Kafka: Toward a Minor Literature, Deleuze and Guattari gave the most comprehensive explanation of the abstract machine in the work of art. Like the war-machines of Virilio, the Kafka-machine operates in three gears or speeds. Furthermore, the machine is connected to spatial diagrams ...

  1. Monadic abstract interpreters

    DEFF Research Database (Denmark)

    Sergey, Ilya; Devriese, Dominique; Might, Matthew;

    2013-01-01

    ...-bounding to be independent of any particular semantics. Monads become the unifying agent between these concepts and between semantics. For instance, by plugging the same “context-insensitivity monad” into a monadically-parameterized semantics for Java or for the lambda calculus, it yields the expected context-insensitive analysis. To achieve this unification, we develop a systematic method for transforming a concrete semantics into a monadically-parameterized abstract machine. Changing the monad changes the behavior of the machine. By changing the monad, we recover a spectrum of machines, from the original concrete semantics to a monovariant, flow- and context-insensitive static analysis with a singly-threaded heap and weak updates. The monadic parameterization also suggests an abstraction over the ubiquitous monotone fixed-point computation found in static analysis. This abstraction makes it straightforward ...

  2. Model-based analysis of an automated changeover switching unit for a busbar. MODSAFE 2009 work report

    Energy Technology Data Exchange (ETDEWEB)

    Bjorkman, K.; Valkonen, J.; Ranta, J.

    2011-06-15

    Verification of digital instrumentation and control (I and C) systems is challenging, because programmable logic controllers enable complicated control functions, and the state spaces (numbers of distinct values of inputs, outputs, and internal memory) of the designs easily become too large for comprehensive manual inspection. Model checking is a promising formal method that can be used for verifying the correctness of system designs. A number of efficient model checking systems are available, offering analysis tools that are able to determine automatically whether a given state machine model satisfies the desired safety properties. Model checking can also handle delays and other time-related operations, which are crucial in safety I and C systems and challenging to design and verify. The system analysed in this research project is called an 'automated changeover switching unit for a busbar'; its purpose is to switch the power feed to the stand-by power supply in the event of voltage breaks. The system is modelled as a finite state machine and some of its key properties are verified with the NuSMV model checking tool. The time-dependent components are modelled to operate in discrete fixed-length time steps, and the lengths of the timed functions are scaled to avoid state explosion and enable efficient model checking. (orig.)
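
    The record describes modelling time-dependent components in discrete fixed-length steps and checking safety properties. The Python sketch below mimics this on a toy scale by exhaustively exploring bounded input sequences for a hypothetical changeover unit with a 3-step switching delay; the real analysis used NuSMV with temporal-logic specifications, and every name and constant here is an illustrative assumption.

        from itertools import product

        DELAY = 3  # hypothetical switching delay, in fixed-length time steps

        def step(state, main_ok):
            # State: (standby_connected, timer counting consecutive failed steps).
            standby, timer = state
            if main_ok:
                return (False, 0)                   # feed from the main supply
            timer = min(timer + 1, DELAY)
            return (timer >= DELAY, timer)          # switch once the delay elapses

        def check_safety(horizon=8):
            # Property: after DELAY consecutive failed steps, standby is connected.
            for inputs in product([True, False], repeat=horizon):
                state, failed = (False, 0), 0
                for ok in inputs:
                    state = step(state, ok)
                    failed = 0 if ok else failed + 1
                    if failed >= DELAY and not state[0]:
                        return False, inputs        # counter-example found
            return True, None

        print(check_safety())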

  3. Automated quantification and sizing of unbranched filamentous cyanobacteria by model-based object-oriented image analysis.

    Science.gov (United States)

    Zeder, Michael; Van den Wyngaert, Silke; Köster, Oliver; Felder, Kathrin M; Pernthaler, Jakob

    2010-03-01

    Quantification and sizing of filamentous cyanobacteria in environmental samples or cultures are time-consuming and are often performed by using manual or semiautomated microscopic analysis. Automation of conventional image analysis is difficult because filaments may exhibit great variations in length and patchy autofluorescence. Moreover, individual filaments frequently cross each other in microscopic preparations, as deduced by modeling. This paper describes a novel approach based on object-oriented image analysis to simultaneously determine (i) filament number, (ii) individual filament lengths, and (iii) the cumulative filament length of unbranched cyanobacterial morphotypes in fluorescent microscope images in a fully automated high-throughput manner. Special emphasis was placed on correct detection of overlapping objects by image analysis and on appropriate coverage of filament length distribution by using large composite images. The method was validated with a data set for Planktothrix rubescens from field samples and was compared with manual filament tracing, the line intercept method, and the Utermöhl counting approach. The computer program described allows batch processing of large images from any appropriate source and annotation of detected filaments. It requires no user interaction, is available free, and thus might be a useful tool for basic research and drinking water quality control.
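
    As a much-simplified sketch of automated filament measurement, the Python fragment below thresholds a fluorescence image, skeletonizes it, and measures each connected skeleton. Unlike the object-oriented method above it does not resolve crossing filaments, and the pixel-size calibration is an assumed value.

        import numpy as np
        from skimage.filters import threshold_otsu
        from skimage.morphology import skeletonize
        from skimage.measure import label

        def filament_lengths(image, pixel_size_um=0.2):
            # Threshold, reduce filaments to 1-pixel skeletons, then use the
            # pixel count of each connected component as a length estimate.
            binary = image > threshold_otsu(image)
            labels = label(skeletonize(binary))
            lengths = [np.sum(labels == i) * pixel_size_um
                       for i in range(1, labels.max() + 1)]
            return len(lengths), lengths, sum(lengths)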

  4. Formal Abstractions for Automated Verification and Synthesis of Stochastic Systems

    NARCIS (Netherlands)

    Esmaeil Zadeh Soudjani, S.

    2014-01-01

    Stochastic hybrid systems involve the coupling of discrete, continuous, and probabilistic phenomena, in which the composition of continuous and discrete variables captures the behavior of physical systems interacting with digital, computational devices. Because of their versatility and generality, m

  5. Dockomatic - automated ligand creation and docking

    Directory of Open Access Journals (Sweden)

    Hampikian Greg

    2010-11-01

    Full Text Available Abstract Background The application of computational modeling to rationally design drugs and characterize macro biomolecular receptors has proven increasingly useful due to the accessibility of computing clusters and clouds. AutoDock is a well-known and powerful software program used to model ligand to receptor binding interactions. In its current version, AutoDock requires significant amounts of user time to set up and run jobs, and to collect results. This paper presents DockoMatic, a user-friendly Graphical User Interface (GUI) application that eases and automates the creation and management of AutoDock jobs for high-throughput screening of ligand to receptor interactions. Results DockoMatic allows the user to invoke and manage AutoDock jobs on a single computer or cluster, including jobs for evaluating secondary ligand interactions. It also automates the process of collecting, summarizing, and viewing results. In addition, DockoMatic automates creation of peptide ligand .pdb files from strings of single-letter amino acid abbreviations. Conclusions DockoMatic significantly reduces the complexity of managing multiple AutoDock jobs by facilitating ligand and AutoDock job creation and management.

  6. MATHEMATICAL MODELING, AUTOMATION AND CONTROL OF THE BIOCONVERSION OF SORBITOL TO SORBOSE IN THE VITAMIN C PRODUCTION PROCESS I. MATHEMATICAL MODELING

    Directory of Open Access Journals (Sweden)

    Bonomi A.

    1997-01-01

    Full Text Available In 1990, the Biotechnology and Control Systems Groups of IPT started developing a system for the control and automation of fermentation processes, applied to the oxidation of sorbitol to sorbose by the bacterium Gluconobacter oxydans, the microbial step of the vitamin C production process, which was chosen as a case study. Initially, a thirteen-parameter model was fitted to represent the batch operation of the system, utilizing nonlinear regression analysis with the flexible polyhedron method. Based on these results, a model for the continuous process (with the same kinetic equations) was constructed and its optimum operating point obtained.
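
    The flexible polyhedron method is the Nelder-Mead simplex algorithm. The sketch below fits a deliberately reduced two-state batch model (three parameters instead of the record's thirteen) to synthetic data; the Monod-type kinetics, parameter values and variable names are assumptions of this sketch.

        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import minimize

        def batch_ode(y, t, mu_max, Ks, Yxs):
            # Hypothetical Monod kinetics: biomass X grows on substrate S.
            X, S = y
            mu = mu_max * S / (Ks + S)
            return [mu * X, -mu * X / Yxs]

        def sse(params, t_obs, y_obs):
            sim = odeint(batch_ode, y_obs[0], t_obs, args=tuple(params))
            return float(np.sum((sim - y_obs) ** 2))

        t_obs = np.linspace(0.0, 10.0, 11)
        y_obs = odeint(batch_ode, [0.1, 100.0], t_obs, args=(0.4, 5.0, 0.5))

        # 'Nelder-Mead' is SciPy's flexible polyhedron (simplex) search.
        fit = minimize(sse, x0=[0.2, 2.0, 0.3], args=(t_obs, y_obs),
                       method='Nelder-Mead')
        print(fit.x)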

  7. From nominal sets binding to functions and lambda-abstraction: connecting the logic of permutation models with the logic of functions

    CERN Document Server

    Dowek, Gilles

    2011-01-01

    Permissive-Nominal Logic (PNL) extends first-order predicate logic with term-formers that can bind names in their arguments. It takes a semantics in (permissive-)nominal sets. In PNL, the forall-quantifier or lambda-binder are just term-formers satisfying axioms, and their denotation is functions on nominal atoms-abstraction. Then we have higher-order logic (HOL) and its models in ordinary (i.e. Zermelo-Fraenkel) sets; the denotation of forall or lambda is functions on full or partial function spaces. This raises the following question: how are these two models of binding connected? What translation is possible between PNL and HOL, and between nominal sets and functions? We exhibit a translation of PNL into HOL, and from models of PNL to certain models of HOL. It is natural, but also partial: we translate a restricted subsystem of full PNL to HOL. The extra part which does not translate is the symmetry properties of nominal sets with respect to permutations. To use a little nominal jargon: we can translate na...

  8. Reasoning abstractly about resources

    Science.gov (United States)

    Clement, B.; Barrett, A.

    2001-01-01

    This paper describes a way to schedule high-level activities before distributing them across multiple rovers in order to coordinate the resultant use of shared resources regardless of how each rover decides to perform its activities. We present an algorithm for summarizing the metric resource requirements of an abstract activity based on the resource usages of its potential refinements.

  9. Beyond the abstractions?

    DEFF Research Database (Denmark)

    Olesen, Henning Salling

    2006-01-01

    The anniversary of the International Journal of Lifelong Education takes place in the middle of a conceptual landslide from lifelong education to lifelong learning. Contemporary discourses of lifelong learning are, however, abstractions behind which new functions and agendas for adult education...

  10. ESPR 2014. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-06-15

    The proceedings of ESPR 2014 include abstracts concerning the following topics: pediatric imaging: thorax, cardiovascular system, CT technique, head and neck, perinatal imaging, molecular imaging; interventional imaging; specific focus: musculoskeletal imaging in juvenile idiopathic arthritis; radiation protection; oncology; molecular imaging - nuclear medicine; uroradiology and abdominal imaging.

  11. 2002 NASPSA Conference Abstracts.

    Science.gov (United States)

    Journal of Sport & Exercise Psychology, 2002

    2002-01-01

    Contains abstracts from the 2002 conference of the North American Society for the Psychology of Sport and Physical Activity. The publication is divided into three sections: the preconference workshop, "Effective Teaching Methods in the Classroom;" symposia (motor development, motor learning and control, and sport psychology); and free…

  12. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Study on the Enrichment Regularity of Semicarbazide in Algae Tian Xiu-hui et al. (1) Abstract Semicarbazide (SEM) in three kinds of representative algae (Nitzschia closterium, Tetraselmis chui and Dicrateria sp.) and in seawater was determined using ultra performance liquid chromatography tandem mass spectrometry in this work. Accumulation of semicarbazide (SEM) in algae under laboratory conditions was studied.

  13. Abstraction through Game Play

    Science.gov (United States)

    Avraamidou, Antri; Monaghan, John; Walker, Aisha

    2012-01-01

    This paper examines the computer game play of an 11-year-old boy. In the course of building a virtual house he developed and used, without assistance, an artefact and an accompanying strategy to ensure that his house was symmetric. We argue that the creation and use of this artefact-strategy is a mathematical abstraction. The discussion…

  14. Parent Education: Abstract Bibliography.

    Science.gov (United States)

    Kremer, Barbara, Comp.

    This bibliography has been compiled to alert educators to parent education documents found in the ERIC microfiche collection and in journal literature. Abstracts of selected documents have been taken from "Research in Education (RIE)", and journal article citations from the "Current Index to Journals in Education (CIJE)". Included are published…

  15. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Study on the Long-distance Transportation of Argopecten irradians concentricus Shen Qin-long (1) Abstract The experiment was carried out in order to improve the survival rate of the scallop. The result indicated that keeping the water temperature at 4℃ was reasonable for the long-distance transportation of the scallop with oxygenated bags.

  16. THE CHINA MEDICAL ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The purpose of the China Medical Abstracts (Internal Medicine) is to promote international exchange of works done by the Chinese medical profession in the field of internal medicine. The papers selected from journals represent the newest and most important advances and progress in various specialities in internal medicine.

  17. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Study on Optimization of Enzymic Preparation of Collagen Polypeptide from Skin of Gadus macrocephalus Liu Chun-e et al. (1) Abstract Enzymolysis was used to prepare collagen peptide. The optimum conditions were determined based on one-way ANOVA and an orthogonal experimental design. The result indicated that using alkaline protease at a concentration of 4.5%,

  18. Full Abstraction for HOPLA

    DEFF Research Database (Denmark)

    Nygaard, Mikkel; Winskel, Glynn

    2003-01-01

    A fully abstract denotational semantics for the higher-order process language HOPLA is presented. It characterises contextual and logical equivalence, the latter linking up with simulation. The semantics is a clean, domain-theoretic description of processes as downwards-closed sets of computation...

  19. SPR 2015. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-04-01

    The volume contains the abstracts of the SPR (Society for Pediatric Radiology) 2015 meeting covering the following issues: fetal imaging, musculoskeletal imaging, cardiac imaging, chest imaging, oncologic imaging, tools for process improvement, child abuse, contrast-enhanced ultrasound, image gently - update of radiation dose recording/reporting/monitoring - meaningful or useless?, pediatric thoracic imaging, ALARA.

  20. Cambridge Scientific Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Meteorological and Environmental Research has been included by Cambridge Scientific Abstracts (CSA) since 2011. CSA is a retrieval system published by Cambridge Information Group. CSA was founded in the late 1950's, and became part of the CIG family in 1971. CSA's original mission was publishing secondary source materials relating to the physical sciences. Completely

  2. Bounded Rationality of Generalized Abstract Fuzzy Economies

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2014-01-01

    Full Text Available By using a nonlinear scalarization technique, the bounded rationality model M for generalized abstract fuzzy economies in finite continuous spaces is established. Furthermore, by using the model M, some new theorems for structural stability and robustness to (λ,ϵ)-equilibria of generalized abstract fuzzy economies are proved.

  3. The Complexity of Abstract Machines

    Directory of Open Access Journals (Sweden)

    Beniamino Accattoli

    2017-01-01

    Full Text Available The lambda-calculus is a peculiar computational model whose definition does not come with a notion of machine. Unsurprisingly, implementations of the lambda-calculus have been studied for decades. Abstract machines are implementation schemas for fixed evaluation strategies that are a compromise between theory and practice: they are concrete enough to provide a notion of machine and abstract enough to avoid the many intricacies of actual implementations. There is an extensive literature about abstract machines for the lambda-calculus, and yet—quite mysteriously—the efficiency of these machines with respect to the strategy that they implement has almost never been studied. This paper provides an unusual introduction to abstract machines, based on the complexity of their overhead with respect to the length of the implemented strategies. It is conceived to be a tutorial, focusing on the case study of implementing the weak head (call-by-name) strategy, and yet it is an original re-elaboration of known results. Moreover, some of the observations contained here have never appeared in print before.
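
    As a concrete taste of the tutorial's case study, here is a minimal Krivine machine, the textbook abstract machine for weak head call-by-name evaluation, written in Python with de Bruijn indices; the tuple-based term encoding is an assumption of this sketch, not taken from the paper.

        # Terms: ('var', n) | ('lam', body) | ('app', fun, arg), de Bruijn style.
        def krivine(term):
            env, stack = [], []               # closure environment, argument stack
            while True:
                tag = term[0]
                if tag == 'app':              # push the argument as a closure
                    stack.append((term[2], env))
                    term = term[1]
                elif tag == 'lam' and stack:  # pop one closure into the environment
                    term, env = term[1], [stack.pop()] + env
                elif tag == 'var':            # enter the closure bound to the index
                    term, env = env[term[1]]
                else:                         # lambda, empty stack: weak head normal form
                    return term, env

        # (\x. x) (\y. y)  evaluates to  \y. y
        identity = ('lam', ('var', 0))
        print(krivine(('app', identity, identity)))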

  4. Automating calibration, sensitivity and uncertainty analysis of complex models using the R package Flexible Modeling Environment (FME): SWAT as an example

    Science.gov (United States)

    Wu, Y.; Liu, S.

    2012-01-01

    Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically-based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration, and sensitivity and uncertainty analysis capabilities through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching a set of optimal parameters. Nonetheless, R-SWAT-FME is more attractive due to its instant visualization and its potential to take advantage of other R packages (e.g., for inverse modeling and statistical graphics). The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration, and sensitivity and uncertainty
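
    The coupling pattern, wrapping an external simulator so a generic optimizer can drive it, can be sketched in Python as below (the record's actual implementation is in R and Fortran). The './swat' executable name, parameter file format and output parsing are hypothetical placeholders, not the real SWAT interface.

        import subprocess
        import numpy as np
        from scipy.optimize import least_squares

        def run_model(params):
            # Write parameters, run the (hypothetical) external model, and
            # parse one simulated value per whitespace-separated token.
            np.savetxt('params.txt', params)
            out = subprocess.run(['./swat', 'params.txt'], text=True,
                                 capture_output=True, check=True)
            return np.array([float(v) for v in out.stdout.split()])

        def residuals(params, observed):
            return run_model(params) - observed

        # observed = np.loadtxt('streamflow_obs.txt')
        # fit = least_squares(residuals, x0=[0.5, 0.01, 100.0], args=(observed,))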

  5. Automated identification of stream-channel geomorphic features from high‑resolution digital elevation models in West Tennessee watersheds

    Science.gov (United States)

    Cartwright, Jennifer M.; Diehl, Timothy H.

    2017-01-17

    High-resolution digital elevation models (DEMs) derived from light detection and ranging (lidar) enable investigations of stream-channel geomorphology with much greater precision than previously possible. The U.S. Geological Survey has developed the DEM Geomorphology Toolbox, containing seven tools to automate the identification of sites of geomorphic instability that may represent sediment sources and sinks in stream-channel networks. These tools can be used to modify input DEMs on the basis of known locations of stormwater infrastructure, derive flow networks at user-specified resolutions, and identify possible sites of geomorphic instability including steep banks, abrupt changes in channel slope, or areas of rough terrain. Field verification of tool outputs identified several tool limitations but also demonstrated their overall usefulness in highlighting likely sediment sources and sinks within channel networks. In particular, spatial clusters of outputs from multiple tools can be used to prioritize field efforts to assess and restore eroding stream reaches.

  6. Toward Automated Façade Texture Generation for 3D Photorealistic City Modelling with Smartphones or Tablet PCs

    Science.gov (United States)

    Wang, S.

    2012-07-01

    An automated model-image fitting algorithm is proposed in this paper for generating façade texture images from pictures taken by smartphones or tablet PCs. Façade texture generation requires tremendous labour and has thus been the bottleneck of 3D photo-realistic city modelling. With advances in micro-electro-mechanical systems (MEMS), a camera, a global positioning system (GPS) receiver, and a gyroscope (G-sensor) can all be integrated into a smartphone or a tablet PC. These sensors bring the possibility of direct georeferencing for the pictures taken by smartphones or tablet PCs. Since the accuracy of these sensors does not match that of surveying instruments, the image position and orientation derived from them are not suitable for photogrammetric measurements. This paper adopts the least-squares model-image fitting (LSMIF) algorithm to iteratively improve the image's exterior orientation. The image position from GPS and the image orientation from the gyroscope are treated as initial values. By fitting the projection of the wireframe model to the extracted edge pixels in the image, the image exterior orientation elements are solved when the optimal fit is achieved. With the exact exterior orientation elements, the wireframe model of the building can be correctly projected onto the image and, therefore, the façade texture image can be extracted from the picture.

  7. Abstracts of Main Essays

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    The Position of Capitalist Study in Marx's Social Formation Theory Yang Xue-gong Xi Da-min The orientation and achievements of Marx's study of capitalism or bourgeois society are the foundation of his social formation theory. On the basis of his scientific study of capitalism, Marx evolves his concept of economic social formation, the scientific methodology for researching other social formations or social forms, the clues of the development of social formations, and the abstraction of the general laws, as well as his reflection on this abstraction. A full evaluation and acknowledgement of the position of capitalist study in Marx's social formation theory is crucial for revising Marx's social formation theory in the new era and for solving some controversial issues in the research of social formation theory.

  8. Research Abstracts of 1982.

    Science.gov (United States)

    1982-12-01

    Activity in Hamsters (Abstract #667). L. Simonson, B. Lamberts, E. Pederson and D. Reiher: Effect of Saliva and Sucrose on Adherence of S. mutans to ... presence of osteosclerosis and/or enlargement of periodontal ligament space; pain duration greater than one hour and spontaneous or severe pain; no ... Unstimulated whole saliva was collected in chilled containers from 29 CF and 29 CA recruits, along with data on smoking habits. Flow rate, pH, OSCN

  9. Introduction to abstract analysis

    CERN Document Server

    Goldstein, Marvin E

    2015-01-01

    Developed from lectures delivered at NASA's Lewis Research Center, this concise text introduces scientists and engineers with backgrounds in applied mathematics to the concepts of abstract analysis. Rather than preparing readers for research in the field, this volume offers background necessary for reading the literature of pure mathematics. Starting with elementary set concepts, the treatment explores real numbers, vector and metric spaces, functions and relations, infinite collections of sets, and limits of sequences. Additional topics include continuity and function algebras, Cauchy complet

  10. Generalized Abstract Symbolic Summaries

    Science.gov (United States)

    Person, Suzette; Dwyer, Matthew B.

    2009-01-01

    Current techniques for validating and verifying program changes often consider the entire program, even for small changes, leading to enormous V&V costs over a program's lifetime. This is due, in large part, to the use of syntactic program techniques which are necessarily imprecise. Building on recent advances in symbolic execution of heap-manipulating programs, in this paper, we develop techniques for performing abstract semantic differencing of program behaviors that offer the potential for improved precision.

  11. SPR 2014. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-05-15

    The proceedings of the SPR 2014 meeting include abstracts on the following topics: body imaging techniques: practical advice for clinical work; thoracic imaging: focus on the lungs; gastrointestinal imaging: focus on the pancreas and bowel; genitourinary imaging: focus on gonadal radiology; musculoskeletal imaging: focus on oncology; child abuse and not child abuse: focus on radiography; impact of NMR and CT imaging on management of CHD; education and communication: art and practice in pediatric radiology.

  12. Modelling and automation of the process of phosphate ion removal from waste waters

    Directory of Open Access Journals (Sweden)

    L. Lupa

    2008-03-01

    Full Text Available Phosphate removal from waste waters has become an environmental necessity, since these phosphates stimulate the growth of aquatic plants and plankton and contribute to the eutrophication process in general. The physicochemical methods of phosphate ion removal are the most effective and reliable. This paper presents studies on the removal of phosphate ions from waste waters of the fertiliser industry, using the method of co-precipitation with iron salts and with calcium hydroxide as the neutralizing agent. The optimal process conditions were established as those that allow a maximum degree of separation of the phosphate ions. The precipitate resulting from the co-precipitation process was analysed to determine its chemical composition and its thermal and structural stability, and to establish the form in which the phosphate ions occur in the precipitate. Based on these considerations, the experimental data obtained in the process of phosphate ion removal from waste waters were analysed mathematically, and equations were formulated for the dependence of the degree of phosphate separation and the residual concentration on the main parameters of the process. Finally, an automated scheme for phosphate ion removal from waste waters by co-precipitation is presented.

  13. Automated Design Space Exploration with Aspen

    Directory of Open Access Journals (Sweden)

    Kyle L. Spafford

    2015-01-01

    Full Text Available Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
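
    The following Python sketch shows the general shape of nonlinear program such a tool can derive from a performance model: minimize modelled runtime over two parameter ranges subject to a memory-capacity constraint. The cost expressions, ranges and constants are invented for illustration and are not taken from Aspen.

        import numpy as np
        from scipy.optimize import minimize

        def runtime(x):
            # Illustrative two-term cost model: compute time + memory traffic.
            tile, threads = x
            return (2.0 * 1024**3 / (threads * 1e9)
                    + 8.0 * 1024**2 / (tile * 2e8))

        # Capacity constraint: tile^2 doubles must fit in a 32 MB cache.
        memory_cap = {'type': 'ineq', 'fun': lambda x: 32e6 - 8.0 * x[0] ** 2}

        res = minimize(runtime, x0=[64.0, 8.0], method='SLSQP',
                       bounds=[(8, 4096), (1, 64)], constraints=[memory_cap])
        print(res.x, res.fun)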

  14. Semi-automated curation of metabolic models via flux balance analysis: a case study with Mycoplasma gallisepticum.

    Directory of Open Access Journals (Sweden)

    Eddy J Bautista

    Full Text Available Primarily used for metabolic engineering and synthetic biology, genome-scale metabolic modeling shows tremendous potential as a tool for fundamental research and curation of metabolism. Through a novel integration of flux balance analysis and genetic algorithms, a strategy to curate metabolic networks and facilitate identification of metabolic pathways that may not be directly inferable solely from genome annotation was developed. Specifically, metabolites involved in unknown reactions can be determined, and potentially erroneous pathways can be identified. The procedure developed allows for new fundamental insight into metabolism, as well as acting as a semi-automated curation methodology for genome-scale metabolic modeling. To validate the methodology, a genome-scale metabolic model for the bacterium Mycoplasma gallisepticum was created. Several reactions not predicted by the genome annotation were postulated and validated via the literature. The model predicted an average growth rate of 0.358 ± 0.12, closely matching the experimentally determined growth rate of M. gallisepticum of 0.244 ± 0.03. This work presents a powerful algorithm for facilitating the identification and curation of previously known and new metabolic pathways, as well as presenting the first genome-scale reconstruction of M. gallisepticum.
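
    The flux balance analysis half of that integration reduces to a linear program: maximize a biomass flux subject to steady-state mass balance S·v = 0 and flux bounds. Below is a minimal Python sketch on an invented three-reaction toy network; the genetic-algorithm layer that proposes network edits is not shown.

        import numpy as np
        from scipy.optimize import linprog

        # Toy network: uptake (v0) -> conversion (v1) -> biomass (v2).
        S = np.array([[1, -1,  0],    # metabolite A: made by v0, used by v1
                      [0,  1, -1]])   # metabolite B: made by v1, used by v2
        bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10
        c = [0, 0, -1]                # maximize v2 by minimizing -v2

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method='highs')
        print("growth flux:", -res.fun, "fluxes:", res.x)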

  15. Abstract Cauchy problems: three approaches

    CERN Document Server

    Melnikova, Irina V

    2001-01-01

    Although the theory of well-posed Cauchy problems is reasonably well understood, ill-posed problems, which arise in numerous mathematical models in physics, engineering, and finance, can be approached in a variety of ways. Historically, there have been three major strategies for dealing with such problems: semigroup, abstract distribution, and regularization methods. Semigroup and distribution methods restore well-posedness, in a modern weak sense. Regularization methods provide approximate solutions to ill-posed problems. Although these approaches were extensively developed over the last decades by many researchers, nowhere could one find a comprehensive treatment of all three approaches. Abstract Cauchy Problems: Three Approaches provides an innovative, self-contained account of these methods and, furthermore, demonstrates and studies some of the profound connections between them. The authors discuss the application of different methods not only to the Cauchy problem that is not well-posed in the classical sense, b...

  16. Automated evolutionary optimization of ion channel conductances and kinetics in models of young and aged rhesus monkey pyramidal neurons.

    Science.gov (United States)

    Rumbell, Timothy H; Draguljić, Danel; Yadav, Aniruddha; Hof, Patrick R; Luebke, Jennifer I; Weaver, Christina M

    2016-08-01

    Conductance-based compartment modeling requires tuning of many parameters to fit the neuron model to target electrophysiological data. Automated parameter optimization via evolutionary algorithms (EAs) is a common approach to accomplish this task, using error functions to quantify differences between model and target. We present a three-stage EA optimization protocol for tuning ion channel conductances and kinetics in a generic neuron model with minimal manual intervention. We use the technique of Latin hypercube sampling in a new way, to choose weights for error functions automatically so that each function influences the parameter search to a similar degree. This protocol requires no specialized physiological data collection and is applicable to commonly-collected current clamp data and either single- or multi-objective optimization. We applied the protocol to two representative pyramidal neurons from layer 3 of the prefrontal cortex of rhesus monkeys, in which action potential firing rates are significantly higher in aged compared to young animals. Using an idealized dendritic topology and models with either 4 or 8 ion channels (10 or 23 free parameters respectively), we produced populations of parameter combinations fitting the target datasets in less than 80 hours of optimization each. Passive parameter differences between young and aged models were consistent with our prior results using simpler models and hand tuning. We analyzed parameter values among fits to a single neuron to facilitate refinement of the underlying model, and across fits to multiple neurons to show how our protocol will lead to predictions of parameter differences with aging in these neurons.
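
    The Latin hypercube idea can be sketched as follows: sample the parameter space, evaluate every error function at each sample, and weight each function by the inverse of its spread so that all of them influence the search comparably. The two toy error functions, bounds and sample size below are illustrative assumptions, not the protocol's actual objectives.

        import numpy as np
        from scipy.stats import qmc

        def errors(p):
            # Two hypothetical error functions on (g_Na, g_K) conductances.
            g_na, g_k = p
            return np.array([abs(g_na - 120.0), (g_k - 36.0) ** 2])

        bounds = np.array([[0.0, 300.0], [0.0, 100.0]])
        sample = qmc.scale(qmc.LatinHypercube(d=2, seed=0).random(200),
                           bounds[:, 0], bounds[:, 1])
        spread = np.array([errors(p) for p in sample]).std(axis=0)
        weights = 1.0 / spread        # equalizes each error's influence
        total_error = lambda p: float(np.dot(weights, errors(p)))
        print("weights:", weights)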

  17. A Knowledge Based Approach for Automated Modelling of Extended Wing Structures in Preliminary Aircraft Design

    OpenAIRE

    Dorbath, Felix; Nagel, Björn; Gollnick, Volker

    2011-01-01

    This paper introduces the concept of the ELWIS model generator for Finite Element models of aircraft wing structures. The physical modelling of the structure is extended beyond the wing primary structures, to increase the level of accuracy for aircraft which diverge from existing configurations. Also the impact of novel high lift technologies on structural masses can be captured already in the early stages of design by using the ELWIS models. The ELWIS model generator is able to c...

  18. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  19. Forecasting performances of three automated modelling techniques during the economic crisis 2007-2009

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    2014-01-01

    In this work we consider the forecasting of macroeconomic variables during an economic crisis. The focus is on a specific class of models, the so-called single hidden-layer feed-forward autoregressive neural network models. What makes these models interesting in the present context is the fact that they form a class ... into a linear model selection and estimation problem. To this end, we employ three automatic modelling devices. One of them is White's QuickNet, but we also consider Autometrics, which is well known to time series econometricians, and the Marginal Bridge Estimator, which is better known to statisticians. The performances of these three model selectors are compared by looking at the accuracy of the forecasts of the estimated neural network models. We apply the neural network model and the three modelling techniques to monthly industrial production and unemployment series from the G7 countries and the four Scandinavian ones ...
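
    For concreteness, the model class can be sketched as below: a single hidden-layer feed-forward autoregressive neural network predicting y_t from p lags, fitted here by a crude random search on toy data. QuickNet, Autometrics and the Marginal Bridge Estimator select among such hidden units far more cleverly; every constant and the fitting procedure in this sketch are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def arnn_predict(lags, w_in, w_out, linear):
            # One layer of logistic hidden units plus a linear autoregression.
            hidden = 1.0 / (1.0 + np.exp(-(lags @ w_in)))
            return hidden @ w_out + lags @ linear

        p, q = 3, 4                                   # lags and hidden units
        y = np.sin(np.arange(200) / 6.0) + 0.1 * rng.standard_normal(200)
        X = np.column_stack([y[i:-(p - i)] for i in range(p)])  # lag matrix
        t = y[p:]

        best, best_err = None, np.inf
        for _ in range(2000):                         # crude random search
            params = (rng.normal(size=(p, q)), rng.normal(size=q),
                      rng.normal(size=p))
            err = np.mean((arnn_predict(X, *params) - t) ** 2)
            if err < best_err:
                best, best_err = params, err

        print("in-sample MSE: %.4f" % best_err)
        print("one-step forecast: %.3f" % arnn_predict(y[-p:], *best))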

  20. IPR 2016. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-05-15

    The volume on the meeting of pediatric radiology includes abstracts on the following issues: chest, cardiovascular system, neuroradiology, CT radiation DRLs (diagnostic reference levels) and dose reporting guidelines, genitourinary imaging, gastrointestinal radiology, oncology and nuclear medicine, whole body imaging, fetal/neonatal imaging, child abuse, oncology and hybrid imaging, value added imaging, musculoskeletal imaging, dose and radiation safety, imaging children - immobilization and distraction techniques, information - education - QI and healthcare policy, ALARA, the knowledge, skills and competences for a technologist/radiographer in pediatric radiology, full exploitation of new technological features in pediatric CT, image quality issues in pediatrics, abdominal imaging, interventional radiology, MR contrast agents, tumor - mass imaging, cardiothoracic imaging, ultrasonography.

  1. Elements of abstract algebra

    CERN Document Server

    Clark, Allan

    1984-01-01

    This concise, readable, college-level text treats basic abstract algebra in remarkable depth and detail. An antidote to the usual surveys of structure, the book presents group theory, Galois theory, and classical ideal theory in a framework emphasizing proof of important theorems.Chapter I (Set Theory) covers the basics of sets. Chapter II (Group Theory) is a rigorous introduction to groups. It contains all the results needed for Galois theory as well as the Sylow theorems, the Jordan-Holder theorem, and a complete treatment of the simplicity of alternating groups. Chapter III (Field Theory)

  2. Abstracts of Major Articles

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    On Problems in Fujian's Present Health Insurance Professionals and Related Suggestions LIN Deng-hui, WU Xiao-nan (School of Public Health, Fujian Medical University, Fuzhou 350108, China) Abstract: Based on a statistical analysis of questionnaire survey data collected from practitioners in Fujian's medical insurance management system, the paper discusses the problems relevant to the staff's quality structure in this industry as well as mechanisms for continuing education and motivation. Finally, the authors advance such suggestions as increasing practitioners' expertise and working capacity by developing disciplinary education and continuing education, and encouraging employees to become highly motivated through a well-established motivation system.

  3. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Establishment of a Method for Content Determination of Polysaccharide in Membranous Milkvetch Root Applied in Fisheries Yu Xiao-qing et al. (1) Abstract Some chemical components in the traditional Chinese medicine membranous milkvetch root can improve the disease-prevention ability of animals, and the root can be applied in fisheries. In this paper, a method for the content determination of polysaccharide in the root was established based on an orthogonal experimental design. Key words: medicine; polysaccharide in membranous milkvetch root; method of determination

  4. ESPR 2015. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-05-10

    The volume includes the abstracts of ESPR 2015 covering the following topics: PCG (post-graduate courses): radiography, fluoroscopy and general issues; nuclear medicine, interventional radiology and hybrid imaging; pediatric CT; pediatric ultrasound; MRI in childhood. Scientific sessions and task force sessions: international aspects; neuroradiology, neonatal imaging, engineering techniques to simulate injury in child abuse, CT - dose and quality, challenges in the chest, cardiovascular and chest, musculoskeletal, oncology, pediatric uroradiology and abdominal imaging, fetal and postmortem imaging, education and global challenges, neuroradiology - head and neck, gastrointestinal and genitourinary.

  5. Early Validation of Automation Plant Control Software using Simulation Based on Assumption Modeling and Validation Use Cases

    Directory of Open Access Journals (Sweden)

    Veronika Brandstetter

    2015-10-01

    Full Text Available In automation plants, technical processes must be conducted in a way that products, substances, or services are produced reliably, with sufficient quality and with minimal strain on resources. A key driver in conducting these processes is the automation plant's control software, which controls the technical plant components and thereby affects the physical, chemical, and mechanical processes that take place in automation plants. To this end, the control software of an automation plant must adhere to strict process requirements arising from the technical processes and from the physical plant design. Currently, the validation of the control software often starts late in the engineering process, once the automation plant is almost completely constructed. However, as widely acknowledged, the later the control software of the automation plant is validated, the higher the effort for correcting revealed defects is, which can lead to serious budget overruns and project delays. In this article we propose an approach that allows the early validation of automation control software against the technical plant processes and assumptions about the physical plant design by means of simulation. We demonstrate the application of our approach on the example of an actual plant project from the automation industry and present its technical implementation.

  6. Forecasting performance of three automated modelling techniques during the economic crisis 2007-2009

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this work we consider forecasting macroeconomic variables during an economic crisis. The focus is on a specific class of models, the so-called single hidden-layer feedforward autoregressive neural network models. What makes these models interesting in the present context is that they form a class ... problem. To this end we employ three automatic modelling devices. One of them is White's QuickNet, but we also consider Autometrics, well known to time series econometricians, and the Marginal Bridge Estimator, better known to statisticians and microeconometricians. The performance of these three model selectors is compared by looking at the accuracy of the forecasts of the estimated neural network models. We apply the neural network model and the three modelling techniques to monthly industrial production and unemployment series of the G7 countries and the four Scandinavian ones, and focus on forecasting ...

  7. Interactional Metadiscourse in Research Article Abstracts

    Science.gov (United States)

    Gillaerts, Paul; Van de Velde, Freek

    2010-01-01

    This paper deals with interpersonality in research article abstracts analysed in terms of interactional metadiscourse. The evolution in the distribution of three prominent interactional markers comprised in Hyland's (2005a) model, viz. hedges, boosters and attitude markers, is investigated in three decades of abstract writing in the field of…

  8. Automated Test Case Generation

    CERN Document Server

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would be interesting also for other people. It is also the topic of a workshop paper that I am introducing in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of Automated Test Case Generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems. (oracle problem, combinatorial explosion, ...) Abstract of the paper: Over the last decade code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise the execution branch coverage. Only few of them consider dependencies from outside the Code Under Test’s scope such...

  9. Comparison of different inspiratory triggering settings in automated ventilators during cardiopulmonary resuscitation in a porcine model

    Science.gov (United States)

    Fu, Yangyang; Sun, Feng; Zhang, Yazhi; Hu, Yingying; Walline, Joseph; Zhu, Huadong; Yu, Xuezhong

    2017-01-01

    Background Mechanical ventilation via automated in-hospital ventilators is quite common during cardiopulmonary resuscitation. It is not known whether different inspiratory triggering sensitivity settings of ordinary ventilators have different effects on actual ventilation, gas exchange and hemodynamics during resuscitation. Methods Eighteen pigs were enrolled in this study, anaesthetized and intubated. Continuous chest compressions and mechanical ventilation (volume-controlled mode, 100% O2, respiratory rate 10/min, and tidal volume 10 ml/kg) were performed after 3 minutes of ventricular fibrillation. Groups trig-4, trig-10 and trig-20 (six pigs each) were characterized by triggering sensitivities of 4, 10 and 20 (cmH2O for pressure-triggering and L/min for flow-triggering), respectively. Additionally, each pig in each group was mechanically ventilated using three types of inspiratory triggering (pressure-triggering, flow-triggering and turned-off triggering) of 5 minutes duration each, and each animal was matched with one of six random assortments of the three different triggering settings. Blood gas samples and respiratory and hemodynamic parameters for each period were all collected and analyzed. Results In each group, significantly lower actual respiratory rate, minute ventilation volume, mean airway pressure, arterial pH and PaO2, and higher end-tidal carbon dioxide, aortic blood pressure, coronary perfusion pressure, PaCO2 and venous oxygen saturation were observed in the ventilation periods with a turned-off triggering setting compared to those with pressure- or flow-triggering (all P<0.05), except when compared with pressure-triggering of 20 cmH2O (respiratory rate 10.5[10/11.3]/min vs 12.5[10.8/13.3]/min, P = 0.07; coronary perfusion pressure 30.3[24.5/31.6] mmHg vs 27.4[23.7/29] mmHg, P = 0.173; venous oxygen saturation 46.5[32/56.8]% vs 41.5[33.5/48.5]%, P = 0.575). Conclusions Ventilation with pressure- or flow-triggering tends to induce hyperventilation and

  10. Role of hydrogen abstraction acetylene addition mechanisms in the formation of chlorinated naphthalenes. 2. Kinetic modeling and the detailed mechanism of ring closure.

    Science.gov (United States)

    McIntosh, Grant J; Russell, Douglas K

    2014-12-26

    The dominant formation mechanisms of chlorinated phenylacetylenes, naphthalenes, and phenylvinylacetylenes in relatively low pressure and temperature (∼40 Torr and 1000 K) pyrolysis systems are explored. Mechanism elucidation is achieved through a combination of theoretical and experimental techniques, the former employing a novel simplification of kinetic modeling which utilizes rate constants in a probabilistic framework. Contemporary formation schemes of the compounds of interest generally require successive additions of acetylene to phenyl radicals. As such, infrared laser powered homogeneous pyrolyses of dichloro- or trichloroethylene were perturbed with 1,2,4- or 1,2,3-trichlorobenzene. The resulting changes in product identities were compared with the major products expected from conventional pathways, aided by the results of our previous computational work. This analysis suggests that a Bittner-Howard growth mechanism, with a novel amendment to the conventional scheme made just prior to ring closure, describes the major products well. Expected products from a number of other potentially operative channels are shown to be incongruent with experiment, further supporting the role of Bittner-Howard channels as the unique pathway to naphthalene growth. A simple quantitative analysis which performs very well is achieved by considering the reaction scheme as a probability tree, with relative rate constants being cast as branching probabilities. This analysis describes all chlorinated phenylacetylene, naphthalene, and phenylvinylacetylene congeners. The scheme is then tested in a more general system, i.e., not enforcing a hydrogen abstraction/acetylene addition mechanism, by pyrolyzing mixtures of di- and trichloroethylene without the addition of an aromatic precursor. The model indicates that these mechanisms are still likely to be operative.
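
    The paper's quantitative device, recasting relative rate constants as branching probabilities on a reaction tree, is easy to illustrate: the probability of reaching a product is the product of the branch probabilities along its path. A minimal sketch follows; the scheme and rate constants below are hypothetical, chosen only to show the arithmetic, not taken from the study.

```python
# Sketch: casting relative rate constants as branching probabilities.
# The two-level scheme and the rate constants are invented for illustration.

def branching_probabilities(rates):
    """Normalize competing rate constants into branching probabilities."""
    total = sum(rates.values())
    return {channel: k / total for channel, k in rates.items()}

# Hypothetical competing channels at two successive branching points.
step1 = branching_probabilities({"addition": 2.0e8, "abstraction": 5.0e7})
step2 = branching_probabilities({"ring_closure": 1.2e6, "fragmentation": 4.0e5})

# The yield of a product is the product of branch probabilities on its path.
naphthalene_yield = step1["addition"] * step2["ring_closure"]
print(f"P(addition) = {step1['addition']:.2f}")
print(f"P(ring closure | addition) = {step2['ring_closure']:.2f}")
print(f"Predicted relative yield = {naphthalene_yield:.2f}")
```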

  11. iGen 0.1: a program for the automated generation of models and parameterisations

    Directory of Open Access Journals (Sweden)

    D. F. Tang

    2011-09-01

    Full Text Available Complex physical systems can often be simulated using very high resolution models but this is not always practical because of computational restrictions. In this case the model must be simplified or parameterised in order to make it computationally tractable. A parameterised model is created using an ad-hoc selection of techniques which range from the formal to the purely intuitive, and as a result it is very difficult to objectively quantify the fidelity of the model to the physical system. It is rare that a parameterised model can be formally shown to simulate a physical system to within some bounded error. Here we introduce a new approach to parameterising models which allows error to be formally bounded. The approach makes use of a newly developed computer program, which we call iGen, that analyses the source code of a high-resolution model and formally derives a much faster, parameterised model that closely approximates the original, reporting bounds on the error introduced by any approximations. These error bounds can be used to formally justify conclusions about a physical system based on observations of the model's behaviour. Using increasingly complex physical systems as examples we illustrate that iGen has the ability to produce parameterisations that run typically orders of magnitude faster than the underlying, high-resolution models from which they are derived.

  12. Genetic Algorithm Based Framework for Automation of Stochastic Modeling of Multi-Season Streamflows

    Science.gov (United States)

    Srivastav, R. K.; Srinivasan, K.; Sudheer, K.

    2009-05-01

    Synthetic streamflow data generation involves the synthesis of likely streamflow patterns that are statistically indistinguishable from the observed streamflow data. The various kinds of stochastic models adopted for multi-season streamflow generation in hydrology are: i) parametric models, which hypothesize the form of the periodic dependence structure and the distributional form a priori (examples are PAR and PARMA), as well as disaggregation models that aim to preserve the correlation structure at the periodic level and the aggregated annual level; ii) nonparametric models (examples are bootstrap/kernel based methods), which characterize the laws of chance describing the streamflow process without recourse to prior assumptions as to the form or structure of these laws, such as the k-nearest neighbor (k-NN) method, the matched block bootstrap (MABB) and non-parametric disaggregation models; iii) hybrid models, which blend both parametric and non-parametric models advantageously to model the streamflows effectively. Despite the many developments that have taken place in the field of stochastic modeling of streamflows over the last four decades, accurate prediction of the storage and the critical drought characteristics has been posing a persistent challenge to the stochastic modeler. This is partly because the stochastic streamflow model parameters are usually estimated by minimizing a statistically based objective function (such as maximum likelihood (MLE) or least squares (LS) estimation) and the efficacy of the models is subsequently validated based on the accuracy of prediction of the estimates of the water-use characteristics, which requires a large number of trial simulations and inspection of many plots and tables. Even then, accurate prediction of the storage and the critical drought characteristics may not be ensured. In this study a multi-objective optimization framework is proposed to find the optimal hybrid model (blend of a simple parametric model, PAR(1) model and matched block
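
    For context, the PAR(1) component mentioned in the truncated sentence above generates each season's standardized flow from the previous season's value through season-specific lag-one correlations. A minimal sketch with invented monthly statistics (not the study's calibration):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical season-specific (monthly) PAR(1) parameters: means, standard
# deviations and lag-one correlations. Real values would be estimated from
# observed flows.
mu = np.array([30, 28, 35, 60, 90, 120, 110, 80, 55, 45, 38, 32], float)
sigma = 0.3 * mu
rho = np.full(12, 0.6)

def generate_par1(n_years):
    """Generate synthetic monthly flows from a PAR(1) model."""
    z_prev = 0.0                      # standardized flow of previous month
    flows = np.empty((n_years, 12))
    for y in range(n_years):
        for m in range(12):
            eps = rng.standard_normal()
            z = rho[m] * z_prev + np.sqrt(1.0 - rho[m] ** 2) * eps
            flows[y, m] = mu[m] + sigma[m] * z
            z_prev = z
    return flows

synthetic = generate_par1(100)
print("mean January flow:", synthetic[:, 0].mean())
```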

  13. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Acute Toxicity Test of Four Drugs to Perch Fries Zhu You-fang et al (1) Abstract The acute toxicity of four drugs to the perch Lateolabrax maculatus was studied. The results showed: (1) The LC50 values of perch to copper sulfate were 4.58 mg/L (24h), 2.93 mg/L (48h), 1.81 mg/L (72h) and 0.78 mg/L (96h) respectively; to SHA CHONG WEI they were 35.11 mg/L (24h), 15.81 mg/L (48h), 11.20 mg/L (72h), 9.36 mg/L (96h) respectively;

  14. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Determination of the Estrogenic Alkylphenols and Bisphenol A in Marine Sediments by Gas Chromatography-Mass Spectrometry Deng Xu-xiu et al. (1) Abstract Octylphenol, nonylphenol and bisphenol A are recognized environmental endocrine disruptors. A quantitative method was established for the simultaneous determination of octylphenol, nonylphenol and bisphenol A in marine sediments by gas chromatography-mass spectrometry. The test sample was extracted with methanol using an ultrasonic technique, purified with copper powder and a carbon solid phase extraction column, and derivatized with heptafluorobutyric anhydride. The analytes were then separated on an HP-5ms column and determined by gas chromatography-mass spectrometry. The recovery of the method was between 84.3% and 94.5%, and the LOQ of 4-n-octylphenol, nonylphenol and bisphenol A was 0.25 μg/kg, 0.15 μg/kg and 0.15 μg/kg, respectively. Key words octylphenol; nonylphenol; bisphenol A; gas chromatography-mass spectrometry

  15. A LARI Experience (Abstract)

    Science.gov (United States)

    Cook, M.

    2015-12-01

    (Abstract only) In 2012, Lowell Observatory launched The Lowell Amateur Research Initiative (LARI) to formally involve amateur astronomers in scientific research, bringing them to the attention of professional astronomers and having them assist with astronomical research. One of the LARI projects is the BVRI photometric monitoring of Young Stellar Objects (YSOs), wherein amateurs obtain observations to search for new outburst events and characterize the colour evolution of previously identified outbursters. A summary of the scientific and organizational aspects of this LARI project is presented, including its goals and science motivation, the process for getting involved with the project, a description of the team members, their equipment and methods of collaboration, and an overview of the programme stars, preliminary findings, and lessons learned.

  16. Contents and Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    [Ancient Mediterranean Civilizations] Title: On Poseidon's Image in Homeric Epics Author: Zhu Yizhang, Lecturer, School of History and Culture, Shandong University, Jinan, Shandong, 250100, China. Abstract: Poseidon was an important figure in the religion, myth and literature of ancient Greece. His religious functions and his mythical image in literature were mainly established by the Homeric Epics. Poseidon not only appears frequently in the Homeric Epics but also directly influences the development of the plots; therefore, he can be seen as one of the most important gods in the Epics. But the Homeric Epics do not introduce his basic image clearly. In the Homeric Epics, Poseidon carries both the divine and the human aspects of the figure, with the latter emphasized, which implies that his archetype was a mortal wanax.

  17. Abstracts of Selected Papers

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    On the Social Solidarity of Organization: An Empirical Analysis Li Hanlin Abstract: Based on 2002 survey data, this paper tries to measure solidarity in organizations. The operationalization of this measurement proceeds from two points of view: the degree of cohesion and the degree of vulnerability. To observe and measure the degree of cohesion, three subscales (social support, vertical integration and organizational identity) have been used. To observe and measure the degree of vulnerability, another three subscales (dissatisfaction, relative deprivation and anomie) have been used. Finally, the paper explores under which conditions organizational behavior and behavioral orientation converge or diverge. Key words: Organization Cohesion Vulnerability Organization Behavior

  18. Prototype Task Network Model to Simulate the Analysis of Narrow Band Sonar Data and the Effects of Automation on Critical Operator Tasks

    Science.gov (United States)

    2016-06-07

    generate appropriate functions and tasks to be modelled. The Integrated Performance Modelling Environment (IPME) software was used to build a task ... repeated, low-level cognitive analysis. This would free up operators to perform the more complex and critical tasks of analysing contact signatures ... could be created by an automated function that deals with lines that are easy to detect and identify, thereby freeing the operator to focus on more

  19. Automated Learning of Subcellular Variation among Punctate Protein Patterns and a Generative Model of Their Relation to Microtubules.

    Directory of Open Access Journals (Sweden)

    Gregory R Johnson

    2015-12-01

    Full Text Available Characterizing the spatial distribution of proteins directly from microscopy images is a difficult problem with numerous applications in cell biology (e.g., identifying motor-related proteins) and clinical research (e.g., identification of cancer biomarkers). Here we describe the design of a system that provides automated analysis of punctate protein patterns in microscope images, including quantification of their relationships to microtubules. We constructed the system using confocal immunofluorescence microscopy images from the Human Protein Atlas project for 11 punctate proteins in three cultured cell lines. These proteins have previously been characterized as being primarily located in punctate structures, but their images had all been annotated by visual examination as being simply "vesicular". We were able to show that these patterns could be distinguished from each other with high accuracy, and we were able to assign to one of these subclasses hundreds of proteins whose subcellular localization had not previously been well defined. In addition to providing these novel annotations, we built a generative approach to modeling of punctate distributions that captures the essential characteristics of the distinct patterns. Such models are expected to be valuable for representing and summarizing each pattern and for constructing systems biology simulations of cell behaviors.

  20. AN OPTIMIZATION-BASED HEURISTIC FOR A CAPACITATED LOT-SIZING MODEL IN AN AUTOMATED TELLER MACHINES NETWORK

    Directory of Open Access Journals (Sweden)

    Supatchaya Chotayakul

    2013-01-01

    Full Text Available This research studies a cash inventory problem in an ATM network to satisfy customers' cash needs over multiple periods with deterministic demand. The objective is to determine the amount of money to place in Automated Teller Machines (ATMs) and cash centers for each period over a given time horizon. The problem is modeled as a multi-echelon inventory problem with single-item capacitated lot-sizing to minimize the total cost of running the ATM network. In this study, we formulate the problem as a Mixed Integer Program (MIP) and develop an approach based on reformulating the model as a shortest path formulation for finding a near-optimal solution of the problem. This reformulation is the same as the traditional model, except that the capacity constraints, inventory balance constraints and setup constraints related to the management of the money in ATMs are relaxed. The new formulation has more variables and constraints, but a much tighter linear relaxation than the original, and is faster to solve for short-term planning. Computational results show its effectiveness, especially for large-sized problems.
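
    The shortest-path reformulation invoked here goes back to the classical observation that, once capacity constraints are relaxed, single-item lot-sizing reduces to finding a cheapest path over arcs of the form "order in period t to cover demand through period j-1". A dynamic-programming sketch in that spirit, with invented demands and costs (not the paper's model or data):

```python
# Sketch of the shortest-path view of single-item lot-sizing, with capacity
# constraints dropped as in the relaxation described above. Node t stands for
# "start of period t with zero inventory"; arc (t, j) orders in period t to
# cover demand for periods t..j-1. Demands and costs are hypothetical.

def lot_sizing_shortest_path(demand, setup, hold):
    """Cheapest path from node 0 to node T by forward dynamic programming."""
    T = len(demand)
    best = [float("inf")] * (T + 1)   # best[t] = min cost covering periods < t
    best[0] = 0.0
    pred = [None] * (T + 1)           # predecessor node, to recover the plan
    for t in range(T):
        for j in range(t + 1, T + 1):
            # Units for period m are held in inventory from period t to m.
            carry = sum(hold * (m - t) * demand[m] for m in range(t, j))
            cost = best[t] + setup + carry
            if cost < best[j]:
                best[j], pred[j] = cost, t
    return best[T], pred

demand = [40, 60, 30, 80, 20]   # hypothetical per-period cash demand
cost, pred = lot_sizing_shortest_path(demand, setup=100.0, hold=0.5)
print("minimum cost:", cost)
```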

  1. Writing a successful research abstract.

    Science.gov (United States)

    Bliss, Donna Z

    2012-01-01

    Writing and submitting a research abstract provides timely dissemination of the findings of a study and offers peer input for the subsequent development of a quality manuscript. Acceptance of abstracts is competitive. Understanding the expected content of an abstract, the abstract review process and tips for skillful writing will improve the chance of acceptance.

  2. Automated Transformation of Distributed Software Architectural Models to Finite State Process

    Directory of Open Access Journals (Sweden)

    Omid Bushehrian

    2010-12-01

    Full Text Available Software Performance Engineering (SPE) represents the collection of software engineering activities with the purpose of identification, prediction and also improvement of software performance parameters in the early stages of the software development life cycle. Various models such as queuing networks, layered queues, Petri Nets and Stochastic Process Algebras have been suggested for modeling distributed systems. A particular ability of a model is the prediction and estimation of the non-functional characteristics of a system before it has been built. The main problem is a method by which we can easily transform architectural software models into formal, simulatable models. In this paper a method for the automatic transformation of UML deployment and sequence diagrams into FSP (Finite State Process) models is presented, so that we can analyze the resulting model through discrete event simulation tools from the performance perspective. In the proposed transformation algorithm, different aspects of a software system are considered, such as the communication model of software objects, synchronization, and the physical deployment of objects.

  3. Application of Holdridge life-zone model based on the terrain factor in Xinjiang Autonomous Region

    Institute of Scientific and Technical Information of China (English)

    NI Yong-ming; OUYANG Zhi-yun; WANG Xiao-ke

    2005-01-01

    This study improved the application of the Holdridge life-zone model to simulate the distribution of desert vegetation in China, providing statistics to support eco-recovery and ecosystem reconstruction in desert areas. The study classified the desert vegetation into four types: (1) LAD: little arbor desert; (2) SD: shrub desert; (3) HLHSD: half-shrub, little half-shrub desert; (4) LHSCD: little half-shrub cushion desert. Based on this classification of Xinjiang desert vegetation, the classical Holdridge life-zone model was used to simulate the distribution of Xinjiang desert vegetation, and the resulting Kappa coefficient was compared against the standard table of accuracy represented by Kappa values. The Kappa value of the model was only 0.19, meaning the simulation result was poor. To improve the application of the life-zone model to Xinjiang desert vegetation types, a set of plot standards for terrain factors was developed, with the plot standard used as the reclassification criterion for climate sub-regimes. Then the desert vegetation in Xinjiang was simulated again. The average Kappa value of the second simulation for the respective climate regimes was 0.45. The Kappa value of the final modeling result was 0.64, a substantially better value. The modification extended the model's applicable region. Finally, the model's ecological relevance to the Xinjiang desert vegetation types was studied.
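
    For reference, the Kappa statistic used to grade these simulations corrects raw cell-by-cell map agreement for the agreement expected by chance. A minimal computation from a confusion matrix of simulated versus observed classes; the 4x4 counts below are invented for illustration, not taken from the study:

```python
import numpy as np

def kappa(confusion):
    """Cohen's Kappa from a confusion matrix (simulated vs. observed)."""
    confusion = np.asarray(confusion, float)
    n = confusion.sum()
    po = np.trace(confusion) / n                              # observed agreement
    pe = (confusion.sum(0) * confusion.sum(1)).sum() / n**2   # chance agreement
    return (po - pe) / (1.0 - pe)

# Hypothetical 4-class confusion matrix (LAD, SD, HLHSD, LHSCD).
m = [[50, 10,  5,  2],
     [ 8, 60, 12,  4],
     [ 6,  9, 70, 10],
     [ 3,  5,  8, 40]]
print(f"Kappa = {kappa(m):.2f}")
```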

  4. Automated measurement of Drosophila wings

    Directory of Open Access Journals (Sweden)

    Mezey Jason

    2003-12-01

    Full Text Available Background: Many studies in evolutionary biology and genetics are limited by the rate at which phenotypic information can be acquired. The wings of Drosophila species are a favorable target for automated analysis because of the many interesting questions in evolution and development that can be addressed with them, and because of their simple structure. Results: We have developed an automated image analysis system (WINGMACHINE) that measures the positions of all the veins and the edges of the wing blade of Drosophilid flies. A video image is obtained with the aid of a simple suction device that immobilizes the wing of a live fly. Low-level processing is used to find the major intersections of the veins. High-level processing then optimizes the fit of an a priori B-spline model of wing shape. WINGMACHINE allows the measurement of one wing per minute, including handling, imaging, analysis, and data editing. The repeatabilities of 12 vein intersections averaged 86% in a sample of flies of the same species and sex. Comparison of 2400 wings of 25 Drosophilid species shows that wing shape is quite conservative within the group, but that almost all taxa are diagnosably different from one another. Wing shape retains some phylogenetic structure, although some species have shapes very different from closely related species. The WINGMACHINE system facilitates artificial selection experiments on complex aspects of wing shape. We selected on an index which is a function of 14 separate measurements of each wing. After 14 generations, we achieved a 15 S.D. difference between up- and down-selected treatments. Conclusion: WINGMACHINE enables rapid, highly repeatable measurements of wings in the family Drosophilidae. Our approach to image analysis may be applicable to a variety of biological objects that can be represented as a framework of connected lines.
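
    The high-level stage described above, fitting an a priori B-spline model to digitized outline data, can be approximated with standard smoothing-spline routines. A sketch using scipy's FITPACK wrappers on synthetic points; this is not the WINGMACHINE code, and the smoothing factor is an arbitrary choice:

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Synthetic 2-D outline points standing in for digitized wing landmarks.
t = np.linspace(0.0, 2.0 * np.pi, 40)
x = np.cos(t) + 0.02 * np.random.default_rng(0).standard_normal(40)
y = 0.5 * np.sin(t) + 0.02 * np.random.default_rng(1).standard_normal(40)

# Fit a closed parametric B-spline; s trades smoothness against fit error.
tck, u = splprep([x, y], s=0.01, per=True)
xs, ys = splev(np.linspace(0.0, 1.0, 200), tck)
print("evaluated", len(xs), "points along the fitted spline")
```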

  5. Heating automation

    OpenAIRE

    Tomažič, Tomaž

    2013-01-01

    This degree paper presents the use and operation of peripheral devices with a microcontroller for heating automation. The main goal is to build a quality control system for heating three floors of a house and, with that, increase the efficiency of the heating devices and lower heating expenses. A heat pump, a furnace, a boiler pump, two floor-heating pumps and two radiator pumps need to be controlled by this system. For this work, we have chosen the STM32F4-Discovery development kit with five temperature sensors, an LCD disp...

  6. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based Automated Process Control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  7. Marketing automation

    OpenAIRE

    Raluca Dania TODOR

    2017-01-01

    Nowadays, the automation of the marketing process seems to be the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to the...

  8. Automation strategies in five domains - A comparison of levels of automation, function allocation and visualisation of automatic functions

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, J. (Chalmers Univ. of Technology. Division Design and Human factors. Dept. of Product and Production Development, Goeteborg (Sweden))

    2011-01-15

    This study was conducted as a field study in which control room operators and engineers from the refinery, heat and power, aviation, shipping and nuclear domains were interviewed regarding the use of automation and the visualisation of automatic functions. The purpose of the study was to collect experiences and best practices from the five studied domains on levels of automation, function allocation and visualisation of automatic functions. In total, nine different control room settings were visited. The studied settings were compared using a systemic approach based on a human-machine systems model. The results show that the 'left over principle' is still the most commonly applied approach to function allocation, but in high-risk settings the decision whether to automate or not is more carefully considered. Regarding the visualisation of automatic functions, it was found that as long as each display type (process based, functional oriented, situation oriented and task based) is applied so that it corresponds to the same level of abstraction as the technical system, the operator's mental model will be supported. No single display type can, however, readily match all levels of abstraction at the same time; all display types are still needed and serve different purposes. (Author)

  9. A grounded theory of abstraction in artificial intelligence.

    Science.gov (United States)

    Zucker, Jean-Daniel

    2003-07-29

    In artificial intelligence, abstraction is commonly used to account for the use of various levels of details in a given representation language or the ability to change from one level to another while preserving useful properties. Abstraction has been mainly studied in problem solving, theorem proving, knowledge representation (in particular for spatial and temporal reasoning) and machine learning. In such contexts, abstraction is defined as a mapping between formalisms that reduces the computational complexity of the task at stake. By analysing the notion of abstraction from an information quantity point of view, we pinpoint the differences and the complementary role of reformulation and abstraction in any representation change. We contribute to extending the existing semantic theories of abstraction to be grounded on perception, where the notion of information quantity is easier to characterize formally. In the author's view, abstraction is best represented using abstraction operators, as they provide semantics for classifying different abstractions and support the automation of representation changes. The usefulness of a grounded theory of abstraction in the cartography domain is illustrated. Finally, the importance of explicitly representing abstraction for designing more autonomous and adaptive systems is discussed.

  10. Update on mathematical modeling research to support the development of automated insulin delivery systems.

    Science.gov (United States)

    Steil, Garry M; Hipszer, Brian; Reifman, Jaques

    2010-05-01

    One year after its initial meeting, the Glycemia Modeling Working Group reconvened during the 2009 Diabetes Technology Meeting in San Francisco, CA. The discussion, involving 39 scientists, again focused on the need for individual investigators to have access to the clinical data required to develop and refine models of glucose metabolism, the need to understand the differences among the distinct models and control algorithms, and the significance of day-to-day subject variability. The key conclusion was that model-based comparisons of different control algorithms, or the models themselves, are limited by the inability to access individual model-patient parameters. It was widely agreed that these parameters, as opposed to the average parameters that are typically reported, are necessary to perform such comparisons. However, the prevailing view was that, if investigators were to make the parameters available, it would limit their ability (and that of their institution) to benefit from the invested work in developing their models. A general agreement was reached regarding the importance of each model having an insulin pharmacokinetic/pharmacodynamic profile that is not different from profiles reported in the literature (88% of the respondents agreed that the model should have similar curves or be analyzed separately) and the importance of capturing intraday variance in insulin sensitivity (91% of the respondents indicated that this could result in changes in fasting glucose of ≥15%, with 52% of the respondents believing that the variability could effect changes of ≥30%). Seventy-six percent of the participants indicated that high-fat meals were thought to effect changes in other model parameters in addition to gastric emptying. There was also widespread consensus as to how a closed-loop controller should respond to day-to-day changes in model parameters (with 76% of the participants indicating that fasting glucose should be within 15% of target, with 30% of the

  11. Automating the segmentation of medical images for the production of voxel tomographic computational models.

    Science.gov (United States)

    Caon, M; Mohyla, J

    2001-12-01

    Radiation dosimetry for the diagnostic medical imaging procedures performed on humans requires anatomically accurate, computational models. These may be constructed from medical images as voxel-based tomographic models. However, they are time consuming to produce and as a consequence, there are few available. This paper discusses the emergence of semi-automatic segmentation techniques and describes an application (iRAD) written in Microsoft Visual Basic that allows the bitmap of a medical image to be segmented interactively and semi-automatically while displayed in Microsoft Excel. iRAD will decrease the time required to construct voxel models.

  12. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly to maliciously alter click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. Performance analyses are then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
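
    As a toy illustration of the two feature classes described above (physical-model features of human interaction versus behavioral patterns of automation) feeding a binary classifier, consider the sketch below. The per-session features and their distributions are entirely invented; the paper's actual features and classifiers are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1000

# Hypothetical per-session features: queries/minute, mean inter-query gap (s),
# fraction of queries with a click, click-position entropy.
human = np.column_stack([rng.gamma(2, 1.5, n), rng.gamma(5, 3, n),
                         rng.beta(5, 2, n), rng.normal(2.0, 0.5, n)])
bot = np.column_stack([rng.gamma(20, 2, n), rng.gamma(1, 0.2, n),
                       rng.beta(1, 8, n), rng.normal(0.3, 0.2, n)])
X = np.vstack([human, bot])
y = np.array([0] * n + [1] * n)   # 0 = human, 1 = automated

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```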

  13. Analysis of complex networks using aggressive abstraction.

    Energy Technology Data Exchange (ETDEWEB)

    Colbaugh, Richard; Glass, Kristin.; Willard, Gerald

    2008-10-01

    This paper presents a new methodology for analyzing complex networks in which the network of interest is first abstracted to a much simpler (but equivalent) representation, the required analysis is performed using the abstraction, and analytic conclusions are then mapped back to the original network and interpreted there. We begin by identifying a broad and important class of complex networks which admit abstractions that are simultaneously dramatically simplifying and property preserving (we call these aggressive abstractions) and which can therefore be analyzed using the proposed approach. We then introduce and develop two forms of aggressive abstraction: 1) finite state abstraction, in which dynamical networks with uncountable state spaces are modeled using finite state systems, and 2) one-dimensional abstraction, whereby high dimensional network dynamics are captured in a meaningful way using a single scalar variable. In each case, the property preserving nature of the abstraction process is rigorously established and efficient algorithms are presented for computing the abstraction. The considerable potential of the proposed approach to complex network analysis is illustrated through case studies involving vulnerability analysis of technological networks and predictive analysis for social processes.

  14. GIBS Geospatial Data Abstraction Library (GDAL)

    Data.gov (United States)

    National Aeronautics and Space Administration — GDAL is an open source translator library for raster geospatial data formats that presents a single abstract data model to the calling application for all supported...

  15. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  16. Bringing Automated Model Checking to PLC Program Development - A CERN Case Study

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. Model checking appears to be an appropriate approach for this purpose. However, this technique is not widely used in industry yet, due to some obstacles. The main obstacles encountered when trying to apply formal verification techniques at industrial installations are the difficulty of creating models out of PLC programs and defining formally the specification requirements. In addition, models produced out of real-life programs have a huge state space, thus preventing the verification due to performance issues. Our work at CERN (European Organization for Nuclear Research) focuses on developing efficient automatic verification methods for industrial critical installations based on PLC (Programmable Logic Controller) control systems. In this paper, we present a tool generating automatically formal models out of PLC code. The tool implements a general methodology which can support several input languages, ...

  17. Reference Model of Desired Yaw Angle for Automated Lane Changing Behavior of Vehicle

    Institute of Scientific and Technical Information of China (English)

    Dianbo Ren; Guanzhe Zhang; Hangzhe Wu

    2016-01-01

    This paper studies the problem of trajectory planning and tracking for the lane-changing behavior of vehicles in automated highway systems. Based on a yaw angle acceleration model with positive and negative trapezoid constraints, and by analyzing the variation laws of the yaw motion of a vehicle during a lane-changing maneuver, a reference model of the desired yaw angle and yaw rate for lane changing is generated. From the yaw angle model, the x and y coordinates of the lane-change trajectory are calculated. Assuming constant road curvature, the differences and associations between two scenarios are analyzed: lane-changing maneuvers on a curved road and on a straight road. On this basis, the calculation method of the desired yaw angle for lane changing on a circular road is deduced. Simulation results show that, unlike the traditional lateral acceleration planning method with trapezoid constraints, the trapezoidal yaw acceleration reference model proposed in this paper yields a continuous expected yaw angular acceleration, so that step tracking of the steering angle does not need to be implemented. Because the desired yaw model is designed directly from the variation laws of the yaw movement of the vehicle during a lane-changing maneuver, rather than calculated indirectly from a lane-change trajectory model, the calculation steps are simplified.
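
    A trapezoidal yaw-acceleration reference of the kind described can be reproduced numerically: integrate a sequence of positive and negative trapezoids once to obtain a continuous yaw rate, and again to obtain the yaw angle. A sketch with invented timing and amplitude parameters (the paper's actual constraints are not reproduced); the pulse sequence is chosen symmetric so that both yaw rate and yaw angle return to zero when the lane change completes:

```python
import numpy as np

def trapezoid(t, t0, ramp, hold, peak):
    """Trapezoidal pulse starting at t0: ramp up, hold at peak, ramp down."""
    s = t - t0
    up = np.clip(s / ramp, 0.0, 1.0)
    down = np.clip((s - ramp - hold) / ramp, 0.0, 1.0)
    return peak * (up - down)

# Hypothetical timing (s) and peak (rad/s^2): a positive trapezoid, a negative
# one of twice the area, then a positive one again, symmetric about the
# mid-maneuver instant.
t = np.linspace(0.0, 7.5, 7501)
acc = (trapezoid(t, 0.5, 0.4, 1.0, 0.15)
       - 2.0 * trapezoid(t, 2.5, 0.4, 1.0, 0.15)
       + trapezoid(t, 4.5, 0.4, 1.0, 0.15))

dt = t[1] - t[0]
yaw_rate = np.cumsum(acc) * dt           # integrate acceleration -> yaw rate
yaw_angle = np.cumsum(yaw_rate) * dt     # integrate yaw rate -> yaw angle
print(f"final yaw rate:  {yaw_rate[-1]: .5f} rad/s (should be ~0)")
print(f"final yaw angle: {yaw_angle[-1]: .5f} rad   (should be ~0)")
print(f"peak yaw angle:  {yaw_angle.max(): .3f} rad mid-maneuver")
```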

  18. Automation of reverse engineering process in aircraft modeling and related optimization problems

    Science.gov (United States)

    Li, W.; Swetits, J.

    1994-01-01

    During the year of 1994, the engineering problems in aircraft modeling were studied. The initial concern was to obtain a surface model with desirable geometric characteristics. Much of the effort during the first half of the year was to find an efficient way of solving a computationally difficult optimization model. Since the smoothing technique in the proposal 'Surface Modeling and Optimization Studies of Aerodynamic Configurations' requires solutions of a sequence of large-scale quadratic programming problems, it is important to design algorithms that can solve each quadratic program in a few iterations. This research led to three papers by Dr. W. Li, which were submitted to SIAM Journal on Optimization and Mathematical Programming. Two of these papers have been accepted for publication. Even though significant progress was made during this phase of research and computation time was reduced from 30 min. to 2 min. for a sample problem, it was not good enough for on-line processing of digitized data points. After discussion with Dr. Robert E. Smith Jr., it was decided not to enforce shape constraints in order to simplify the model. As a consequence, P. Dierckx's nonparametric spline fitting approach was adopted, where one has only one control parameter for the fitting process: the error tolerance. At the same time the surface modeling software developed by Imageware was tested. Research indicated that a substantially improved fit of digitized data points can be achieved if a proper parameterization of the spline surface is chosen. A winning strategy is to incorporate Dierckx's surface fitting with a natural parameterization for aircraft parts. The report consists of 4 chapters. Chapter 1 provides an overview of reverse engineering related to aircraft modeling and some preliminary findings of the effort in the second half of the year. Chapters 2-4 present the research results by Dr. W. Li on penalty functions and conjugate gradient methods for

  19. Study on dynamic model of tractor system for automated navigation applications

    Institute of Scientific and Technical Information of China (English)

    FENG Lei; HE Yong

    2005-01-01

    This research aims at using a dynamic model of a tractor system to support navigation system design for an automatically guided agricultural tractor. The model, consisting of a bicycle model of the tractor system, was implemented in the MATLAB environment and developed based on a John Deere tractor. The simulation results from this MATLAB model were validated through field navigation tests. The accuracy of the trajectory estimation is strongly affected by the determination of the cornering stiffness of the tractor. In this simulation, the tractor cornering stiffness was identified during simulation analysis using the MATLAB model based on the recorded trajectory data. The obtained data were used in simulation analyses for various navigation operations in the field of interest. The analysis of field validation test results indicated that the developed model could accurately estimate wheel trajectories of a tractor system operating in agricultural fields at various speeds. The results also indicated that the developed system could accurately determine tractor velocity and steering angle while the tractor operates in curved fields.
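
    The bicycle model named above reduces the vehicle to a single front and rear wheel with linear tire forces, and the cornering stiffnesses enter directly in the lateral-velocity and yaw-rate dynamics. A bare-bones Euler integration is sketched below; all parameter values are hypothetical stand-ins, not those of the John Deere tractor used in the study:

```python
import numpy as np

# Hypothetical parameters for a linear bicycle model of a tractor.
m = 4500.0                  # mass (kg)
Iz = 6000.0                 # yaw moment of inertia (kg m^2)
lf, lr = 1.2, 1.4           # CG to front / rear axle (m)
Cf, Cr = 60000.0, 80000.0   # front / rear cornering stiffness (N/rad)
vx = 3.0                    # constant forward speed (m/s)

def step(vy, r, delta, dt=0.01):
    """One Euler step of the lateral-velocity / yaw-rate dynamics."""
    alpha_f = delta - (vy + lf * r) / vx      # front tire slip angle
    alpha_r = -(vy - lr * r) / vx             # rear tire slip angle
    Fyf, Fyr = Cf * alpha_f, Cr * alpha_r     # linear tire forces
    vy_dot = (Fyf + Fyr) / m - vx * r
    r_dot = (lf * Fyf - lr * Fyr) / Iz
    return vy + vy_dot * dt, r + r_dot * dt

vy, r, yaw = 0.0, 0.0, 0.0
for _ in range(500):       # 5 s with a constant 5-degree steering input
    vy, r = step(vy, r, delta=np.deg2rad(5.0))
    yaw += r * 0.01
print(f"steady yaw rate: {r:.3f} rad/s, heading change: {yaw:.2f} rad")
```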

  20. Constraint-Based Abstract Semantics for Temporal Logic

    DEFF Research Database (Denmark)

    Banda, Gourinath; Gallagher, John Patrick

    2010-01-01

    Abstract interpretation provides a practical approach to verifying properties of infinite-state systems. We apply the framework of abstract interpretation to derive an abstract semantic function for the modal mu-calculus, which is the basis for abstract model checking. The abstract semantic function is constructed directly from the standard concrete semantics together with a Galois connection between the concrete state-space and an abstract domain. There is no need for mixed or modal transition systems to abstract arbitrary temporal properties, as in previous work in the area of abstract model checking. Using the modal mu-calculus to implement CTL, the abstract semantics gives an over-approximation of the set of states in which an arbitrary CTL formula holds. Then we show that this leads directly to an effective implementation of an abstract model checking algorithm for CTL using ...
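
    The Galois connection at the heart of this construction can be made concrete on a small example: take sets of integer states as the concrete domain and intervals as the abstract domain. The transition relation below is invented; the point is only to exhibit the soundness (over-approximation) of the induced abstract post operator:

```python
# Sketch of a Galois connection between sets of integer states (concrete)
# and intervals (abstract), with a sound abstract post operator.
# The successor relation is hypothetical, chosen only for illustration.

def alpha(states):
    """Abstraction: smallest interval covering a set of states."""
    return (min(states), max(states)) if states else None

def gamma(interval):
    """Concretization: all states inside the interval."""
    if interval is None:
        return set()
    lo, hi = interval
    return set(range(lo, hi + 1))

def post(state):
    """Concrete successor relation (hypothetical)."""
    return {state + 1, 2 * state}

def abstract_post(interval):
    """Best abstract transformer: alpha . post . gamma."""
    return alpha({s2 for s in gamma(interval) for s2 in post(s)})

a = alpha({1, 2, 3})
print("abstract post of", a, "is", abstract_post(a))
# Soundness: the concretization of the abstract result covers every
# concrete successor (over-approximation).
assert {s2 for s in gamma(a) for s2 in post(s)} <= gamma(abstract_post(a))
```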

  1. Stellar Presentations (Abstract)

    Science.gov (United States)

    Young, D.

    2015-12-01

    (Abstract only) The AAVSO is in the process of expanding its education, outreach and speakers bureau program. PowerPoint presentations prepared for specific target audiences such as AAVSO members, educators, students, the general public, and Science Olympiad teams, coaches, event supervisors, and state directors will be available online for members to use. The presentations range from specific and general content relating to stellar evolution and variable stars to specific activities for a workshop environment. A presentation, even with a general topic, that works for high school students will not work for educators, Science Olympiad teams, or the general public. Each audience is unique and requires a different approach. The current environment necessitates presentations that are captivating for a younger generation that is embedded in a highly visual and sound-bite world of social media, Twitter, YouTube, and mobile devices. For educators, presentations and workshops for themselves and their students must support the Next Generation Science Standards (NGSS), the Common Core Content Standards, and the Science, Technology, Engineering and Mathematics (STEM) initiative. Current best practices for developing relevant and engaging PowerPoint presentations to deliver information to a variety of targeted audiences will be presented along with several examples.

  2. Judgement of abstract paintings

    Directory of Open Access Journals (Sweden)

    Dakulović Sandra

    2006-01-01

    Full Text Available In two experiments the judgement of twenty-one abstract paintings was investigated. In Experiment 1, subjects were asked to make similarity judgements of 210 pairs of paintings on a 7-step bipolar scale (similar-dissimilar). The multi-dimensional scaling (MDS) method was used for data analysis. The distribution of paintings within the MDS 2-D space suggested two grouping criteria: colorfulness (e.g. from Klee to Kline) and geometrization (e.g. from Vasarely to Kandinsky). In Experiment 2, subjects were asked to judge the same paintings on three factors of the instrument SDF 9 (Marković et al., 2002b): Evaluation, Arousal and Regularity. The purpose of this experiment was to specify the subjective criteria on which the (dis)similarity judgements were based. In the regression analysis the three factors of SDF 9 were defined as predictors, whereas the x and y coordinates of the MDS 2-D space were defined as dependent variables. The results showed that dimension x was reducible to the Evaluation factor, and dimension y to the Regularity factor.

  3. Exoplanets and Multiverses (Abstract)

    Science.gov (United States)

    Trimble, V.

    2016-12-01

    (Abstract only) To the ancients, the Earth was the Universe, of a size to be crossed by a god in a day, by boat or chariot, and by humans in a lifetime. Thus an exoplanet would have been a multiverse. The ideas gradually separated over centuries, with gradual acceptance of a sun-centered solar system, the stars as suns likely to have their own planets, other galaxies beyond the Milky Way, and so forth. And whenever the community divided between "just one" of anything versus "many," the "manies" have won. Discoveries beginning in 1991 and 1995 have gradually led to a battalion or two of planets orbiting other stars, very few like our own little family, and to moderately serious consideration of even larger numbers of other universes, again very few like our own. I'm betting, however, on habitable (though not necessarily inhabited) exoplanets to be found, and habitable (though again not necessarily inhabited) universes. Only the former will yield pretty pictures.

  4. Automated recognition of bird song elements from continuous recordings using dynamic time warping and hidden Markov models: a comparative study.

    Science.gov (United States)

    Kogan, J A; Margoliash, D

    1998-04-01

    The performance of two techniques is compared for automated recognition of bird song units from continuous recordings. The advantages and limitations of dynamic time warping (DTW) and hidden Markov models (HMMs) are evaluated on a large database of male songs of zebra finches (Taeniopygia guttata) and indigo buntings (Passerina cyanea), which have different types of vocalizations and have been recorded under different laboratory conditions. Depending on the quality of recordings and complexity of song, the DTW-based technique gives excellent to satisfactory performance. Under challenging conditions such as noisy recordings or presence of confusing short-duration calls, good performance of the DTW-based technique requires careful selection of templates that may demand expert knowledge. Because HMMs are trained, equivalent or even better performance of HMMs can be achieved based only on segmentation and labeling of constituent vocalizations, albeit with many more training examples than DTW templates. One weakness in HMM performance is the misclassification of short-duration vocalizations or song units with more variable structure (e.g., some calls, and syllables of plastic songs). To address these and other limitations, new approaches for analyzing bird vocalizations are discussed.
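
    The DTW side of the comparison rests on a classic dynamic program that aligns two sequences under monotonic time warping. A minimal 1-D implementation is sketched below, on toy contours rather than real birdsong features:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of the three allowed warping moves.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Toy "templates": the same contour at a slower tempo matches better than
# a different contour, despite the length mismatch.
template = np.sin(np.linspace(0, 3, 50))
slow_rendition = np.sin(np.linspace(0, 3, 80))
other_syllable = np.cos(np.linspace(0, 5, 60))
print("same contour:     ", dtw_distance(template, slow_rendition))
print("different contour:", dtw_distance(template, other_syllable))
```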

  5. Simulation of Regionally Ecological Land Based on a Cellular Automation Model: A Case Study of Beijing, China

    Directory of Open Access Journals (Sweden)

    Xiubin Li

    2012-08-01

    Full Text Available Ecological land is like the “liver” of a city and is very useful to public health. Ecological land change is a spatially dynamic non-linear process arising from the interaction between natural and anthropogenic factors at different scales. In this study, by setting up a natural development scenario, an object orientation scenario and an ecosystem priority scenario, a cellular automaton (CA) model was established to simulate the evolution pattern of ecological land in Beijing in the year 2020. Under the natural development scenario, most ecological land will be replaced by construction land and crop land. But under the object orientation and ecosystem priority scenarios, the ecological land area will increase, especially under ecosystem priority. When considering factors such as the total area of ecological land, the loss of key ecological land and the spatial patterns of land use, the scenarios rank, from best to worst, as ecosystem priority, object orientation and natural development. Future land management policies in Beijing should therefore focus on the conversion of cropland to forest, wetland protection, and the prohibition of exploitation of natural protection zones, water source areas and forest parks to maintain the safety of the regional ecosystem.
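
    A cellular automaton of this kind evolves a land-use grid by local rules. The sketch below uses a single invented conversion rule (ecological cells convert under neighborhood building pressure), not the paper's calibrated scenario rules:

```python
import numpy as np

rng = np.random.default_rng(7)
ECO, CROP, BUILT = 0, 1, 2
grid = rng.choice([ECO, CROP, BUILT], size=(50, 50), p=[0.5, 0.3, 0.2])

def step(grid, p_convert=0.05):
    """One CA step: an ecological cell may convert to built-up land with a
    probability that grows with the number of built-up neighbors
    (hypothetical rule, for illustration only)."""
    new = grid.copy()
    for i in range(1, grid.shape[0] - 1):
        for j in range(1, grid.shape[1] - 1):
            if grid[i, j] == ECO:
                built = np.count_nonzero(grid[i-1:i+2, j-1:j+2] == BUILT)
                if rng.random() < p_convert * built:
                    new[i, j] = BUILT
    return new

for _ in range(10):
    grid = step(grid)
print("ecological cells remaining:", np.count_nonzero(grid == ECO))
```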

  6. Modeling biology with HDL languages: a first step toward a genetic design automation tool inspired from microelectronics.

    Science.gov (United States)

    Gendrault, Yves; Madec, Morgan; Lallement, Christophe; Haiech, Jacques

    2014-04-01

    Nowadays, synthetic biology is a hot research topic. Each day, progress is made in increasing the complexity of artificial biological functions, tending toward complex biodevices and biosystems. Up to now, these systems have been handmade by bioengineers, which requires strong technical skills and leads to non-reusable development. Meanwhile, scientific fields that share the same design approach, such as microelectronics, have already overcome several of these issues, and designers succeed in building extremely complex systems with many evolved functions. In systems engineering, and more specifically in microelectronics, the development of the domain has been promoted by both the improvement of technological processes and electronic design automation tools. The work presented in this paper paves the way for the adaptation of microelectronics design tools to synthetic biology. Considering the similarities and differences between synthetic biology and microelectronics, the milestones of this adaptation are described. The first one concerns the modeling of biological mechanisms. To this end, a new formalism is proposed, based on an extension of the generalized Kirchhoff laws to biology. In this way, all biological mechanisms can be described in languages widely used in microelectronics. Our approach is validated on specific examples drawn from the literature.

  7. Laser performance operations model (LPOM): The computational system that automates the setup and performance analysis of the National Ignition Facility

    Science.gov (United States)

    Shaw, Michael; House, Ronald

    2015-02-01

    The National Ignition Facility (NIF) is a stadium-sized facility containing a 192-beam, 1.8 MJ, 500-TW, 351-nm laser system together with a 10-m diameter target chamber with room for many target diagnostics. NIF is the world's largest laser experimental system, providing a national center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. A computational system, the Laser Performance Operations Model (LPOM), has been developed that automates the laser setup process and accurately predicts laser energetics. LPOM uses diagnostic feedback from previous NIF shots to maintain accurate energetics models (gains and losses), as well as links to operational databases to provide `as currently installed' optical layouts for each of the 192 NIF beamlines. LPOM deploys a fully integrated laser physics model, the Virtual Beamline (VBL), in its predictive calculations in order to meet the accuracy requirements of NIF experiments, and to provide the ability to determine the damage risk to optical elements throughout the laser chain. LPOM determines the settings of the injection laser system required to achieve the desired laser output, provides equipment protection, and determines the diagnostic setup. Additionally, LPOM provides real-time post-shot data analysis and reporting for each NIF shot. The LPOM computation system is designed as a multi-host computational cluster (with 200 compute nodes, providing the capability to run full NIF simulations fully in parallel) to meet the demands of both the control systems within a shot cycle and the NIF user community outside of a shot cycle.

  8. An automated nowcasting model of significant instability events in the flight terminal area of Rio de Janeiro, Brazil

    Science.gov (United States)

    Borges França, Gutemberg; Valdonel de Almeida, Manoel; Rosette, Alessana C.

    2016-05-01

    This paper presents a novel model, based on neural network techniques, to produce short-term and local-specific forecasts of significant instability for flights in the terminal area of Galeão Airport, Rio de Janeiro, Brazil. Twelve years of data were used for neural network training/validation and testing. The data come from four sources: (1) hourly meteorological observations from surface meteorological stations at five airports distributed around the study area; (2) atmospheric profiles collected twice a day at the meteorological station at Galeão Airport; (3) rain rate data collected from a network of 29 rain gauges in the study area; and (4) lightning data regularly collected by national detection networks. An investigation was undertaken regarding the capability of a neural network to produce early warning signs, that is, to act as a nowcasting tool, for significant instability events in the study area. The automated nowcasting model was tested using five categorical statistics, with the results for forecasts of the first, second, and third hours given in parentheses: proportion correct (0.99, 0.97, and 0.94), BIAS (1.10, 1.42, and 2.31), probability of detection (0.79, 0.78, and 0.67), false-alarm ratio (0.28, 0.45, and 0.73), and threat score (0.61, 0.47, and 0.25). Possible sources of error related to the test procedure are presented and discussed. The test showed that the proposed model (or neural network) can capture the physical content inside the data set, and its performance is quite encouraging for the first and second hours when nowcasting significant instability events in the study area.
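
    The five categorical statistics quoted above all follow from a 2x2 contingency table of forecast versus observed events. A quick implementation; the counts below are invented, chosen only so that the resulting scores roughly reproduce the first-hour values reported in the abstract:

```python
def categorical_scores(hits, false_alarms, misses, correct_negatives):
    """Standard forecast verification scores from a 2x2 contingency table."""
    n = hits + false_alarms + misses + correct_negatives
    return {
        "proportion_correct": (hits + correct_negatives) / n,
        "bias": (hits + false_alarms) / (hits + misses),
        "pod": hits / (hits + misses),                 # probability of detection
        "far": false_alarms / (hits + false_alarms),   # false-alarm ratio
        "threat_score": hits / (hits + false_alarms + misses),
    }

# Hypothetical counts for a one-hour forecast of instability events.
for name, value in categorical_scores(79, 31, 21, 5069).items():
    print(f"{name}: {value:.2f}")
```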

  9. The Operations Intent and Effects Model: A Command and Control Methodology for Increased Automation

    Science.gov (United States)

    2013-06-01

    graphics overlays, in matrices, or described textually. An End-State can either be direct (e.g. destroying a bridge; when the bridge is destroyed the end ... amongst C2 information elements along with C2 processes in a cohesive way and provides: 1) a model of Intent for C2; and 2) a foundation for developing

  10. Transcranial Magnetic Stimulation: An Automated Procedure to Obtain Coil-specific Models for Field Calculations

    DEFF Research Database (Denmark)

    Madsen, Kristoffer Hougaard; Ewald, Lars; Siebner, Hartwig R.

    2015-01-01

    Background: Field calculations for transcranial magnetic stimulation (TMS) are increasingly implemented online in neuronavigation systems and in more realistic offline approaches based on finite-element methods. They are often based on simplified and/or non-validated models of the magnetic vector...

  11. Meaning-Based Scoring: A Systemic Functional Linguistics Model for Automated Test Tasks

    Science.gov (United States)

    Gleason, Jesse

    2014-01-01

    Communicative approaches to language teaching that emphasize the importance of speaking (e.g., task-based language teaching) require innovative and evidence-based means of assessing oral language. Nonetheless, research has yet to produce an adequate assessment model for oral language (Chun 2006; Downey et al. 2008). Limited by automatic speech…

  12. Automating the Simulation of SME Processes through a Discrete Event Parametric Model

    Directory of Open Access Journals (Sweden)

    Francesco Aggogeri

    2015-02-01

    Full Text Available At the factory level, the manufacturing system can be described as a group of processes governed by complex weaves of engineering strategies and technologies. Decision-making processes involve a great deal of information, driven by managerial strategies, technological implications and layout constraints. Many factors affect decisions, and their combination must be carefully managed to determine the best solutions to optimize performance. In this way, advanced simulation tools could support the decision process of many SMEs. The accessibility of these tools is limited by knowledge, cost, data availability and development time. These tools should be used to support strategic decisions rather than specific situations. In this paper, a novel approach is proposed that aims to facilitate the simulation of manufacturing processes by fast modelling and evaluation. The idea is to realize a model that can automatically be adapted to the user’s specific needs. The model must be characterized by a high degree of flexibility, configurability and adaptability in order to automatically simulate multiple, heterogeneous industrial scenarios. In this way, even an SME can easily access a complex tool, perform thorough analyses and be supported in taking strategic decisions. The parametric DES model is part of a greater software platform developed during the COPERNICO EU funded project.
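
    The parametric model itself is not reproduced in the abstract. As a rough illustration of the idea of a configuration-driven discrete-event simulation, the sketch below uses the open-source simpy library; the station list, capacities and timing parameters are invented placeholders for what a user-specific parameter set would supply.

```python
# Minimal configuration-driven job-shop simulation (simpy). All parameters
# below are hypothetical placeholders for a user-supplied configuration.
import itertools
import random

import simpy

CONFIG = {
    "stations": [("cutting", 4.0), ("milling", 6.5)],  # (name, mean minutes)
    "arrival_mean": 5.0,                               # mean inter-arrival time
}

def part(env, name, machines):
    # Each part visits every configured station in order, queueing when busy.
    for station, mean_time in CONFIG["stations"]:
        with machines[station].request() as req:
            yield req
            yield env.timeout(random.expovariate(1.0 / mean_time))
    print(f"{env.now:7.2f}  {name} finished")

def source(env, machines):
    # Parts arrive with exponentially distributed inter-arrival times.
    for i in itertools.count():
        yield env.timeout(random.expovariate(1.0 / CONFIG["arrival_mean"]))
        env.process(part(env, f"part-{i}", machines))

env = simpy.Environment()
machines = {name: simpy.Resource(env, capacity=1) for name, _ in CONFIG["stations"]}
env.process(source(env, machines))
env.run(until=60)  # simulate one hour of shop time
```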

  13. Automated Voxel Model from Point Clouds for Structural Analysis of Cultural Heritage

    Science.gov (United States)

    Bitelli, G.; Castellazzi, G.; D'Altri, A. M.; De Miranda, S.; Lambertini, A.; Selvaggi, I.

    2016-06-01

    In the context of cultural heritage, an accurate and comprehensive digital survey of a historical building is today essential in order to measure its geometry in detail for documentation or restoration purposes, for supporting special studies regarding materials and constructive characteristics, and finally for structural analysis. Some proven geomatic techniques, such as photogrammetry and terrestrial laser scanning, are increasingly used to survey buildings of different complexity and dimensions; one typical product is in the form of point clouds. We developed a semi-automatic procedure to convert point clouds, acquired by laser scanning or digital photogrammetry, into a filled volume model of the whole structure. The filled volume model, in a voxel format, can be useful for further analysis and also for the generation of a Finite Element Model (FEM) of the surveyed building. In this paper a new approach is presented with the aim of decreasing operator intervention in the workflow and obtaining a better description of the structure. To achieve this result a voxel model with variable resolution is produced. Different parameters are compared and different steps of the procedure are tested and validated in the case study of the North tower of the San Felice sul Panaro Fortress, a monumental historical building located in San Felice sul Panaro (Modena, Italy) that was hit by an earthquake in 2012.
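
    The variable-resolution voxel model is specific to the paper; the sketch below only shows the basic fixed-resolution step of turning a point cloud into an occupancy grid, with a randomly generated cloud standing in for survey data.

```python
# Fixed-resolution voxelization of an (N, 3) point cloud into a boolean
# occupancy grid. The random cloud stands in for laser-scan or
# photogrammetry output.
import numpy as np

def voxelize(points, voxel_size):
    origin = points.min(axis=0)
    idx = np.floor((points - origin) / voxel_size).astype(int)
    grid = np.zeros(tuple(idx.max(axis=0) + 1), dtype=bool)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return grid, origin

cloud = np.random.rand(10_000, 3) * 2.0          # hypothetical 2 m cube
grid, origin = voxelize(cloud, voxel_size=0.05)  # 5 cm voxels
print(grid.shape, int(grid.sum()), "occupied voxels")
```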

  14. Bilateral Image Subtraction and Multivariate Models for the Automated Triaging of Screening Mammograms

    Directory of Open Access Journals (Sweden)

    José Celaya-Padilla

    2015-01-01

    Full Text Available Mammography is the most common and effective breast cancer screening test. However, the rate of positive findings is very low, making the radiologic interpretation monotonous and biased toward errors. This work presents a computer-aided diagnosis (CADx) method aimed at automatically triaging mammogram sets. The method coregisters the left and right mammograms, extracts image features, and classifies subjects as at risk of malignant calcifications (CS), at risk of malignant masses (MS), or healthy (HS). In this study, 449 subjects (197 CS, 207 MS, and 45 HS) from a public database were used to train and evaluate the CADx. Percentile-rank (p-rank) and z-normalizations were used. For the p-rank, the CS versus HS model achieved a cross-validation accuracy of 0.797 with an area under the receiver operating characteristic curve (AUC) of 0.882; the MS versus HS model obtained an accuracy of 0.772 and an AUC of 0.842. For the z-normalization, the CS versus HS model achieved an accuracy of 0.825 with an AUC of 0.882, and the MS versus HS model obtained an accuracy of 0.698 and an AUC of 0.807. The proposed method has the potential to rank cases with a high probability of malignant findings, aiding in the prioritization of radiologists' work lists.
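
    Both normalizations named above are standard; a minimal sketch of how they might be applied to one feature column follows (the sample values are invented).

```python
# Percentile-rank (p-rank) and z-normalization of a feature column.
import numpy as np
from scipy.stats import rankdata

def percentile_rank(x):
    # Rank each value (ties averaged) and rescale to [0, 1].
    ranks = rankdata(x, method="average")
    return (ranks - 1) / (len(x) - 1)

def z_normalize(x):
    return (x - x.mean()) / x.std()

x = np.array([3.1, 0.2, 7.4, 7.4, 1.8])
print(percentile_rank(x))  # tied maxima share the same p-rank
print(z_normalize(x))
```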

  15. Natural Environment Modeling and Fault-Diagnosis for Automated Agricultural Vehicle

    DEFF Research Database (Denmark)

    Blas, Morten Rufus; Blanke, Mogens

    2008-01-01

    This paper presents results for an automatic navigation system for agricultural vehicles. The system uses stereo-vision, inertial sensors and GPS. Special emphasis has been placed on modeling the natural environment in conjunction with a fault-tolerant navigation system. The results are exemplified...

  16. DATA FOR ENVIRONMENTAL MODELING (D4EM): BACKGROUND AND EXAMPLE APPLICATIONS OF DATA AUTOMATION

    Science.gov (United States)

    Data is a basic requirement for most modeling applications. Collecting data is expensive and time consuming. High-speed internet connections and growing databases of online environmental data go a long way toward overcoming issues of data scarcity. Among the obstacles still remaining...

  17. Automating CPM-GOMS

    Science.gov (United States)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2002-01-01

    CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling. Of the methods available to model human users, the Goals, Operators, Methods, and Selection (GOMS) method [6, 21] has been the most widely used, providing accurate, often zero-parameter, predictions of the routine performance of skilled users in a wide range of procedural tasks [6, 13, 15, 27, 28]. GOMS is meant to model routine behavior. The user is assumed to have methods that apply sequences of operators to achieve a goal. Selection rules are applied when there is more than one method to achieve a goal. Many routine tasks lend themselves well to such decomposition. Decomposition produces a representation of the task as a set of nested goal states that include an initial state and a final state. The iterative decomposition into goals and nested subgoals can terminate in primitives of any desired granularity, the choice of level of detail depending on the predictions required. Although GOMS has proven useful in HCI, tools to support the…

  18. Verification of Statecharts Using Data Abstraction

    Directory of Open Access Journals (Sweden)

    Steffen Helke

    2016-01-01

    Full Text Available We present an approach for verifying Statecharts including infinite data spaces. We devise a technique for checking that a formula of the universal fragment of CTL is satisfied by a specification written as a Statechart. The approach is based on a property-preserving abstraction technique that additionally preserves structure. It is prototypically implemented in a logic-based framework using a theorem prover and a model checker. This paper reports on the following results. (1) We present a proof infrastructure for Statecharts in the theorem prover Isabelle/HOL, which constitutes a basis for defining a mechanised data abstraction process. The formalisation is based on Hierarchical Automata (HA), which allow a structural decomposition of Statecharts into Sequential Automata. (2) Based on this theory we introduce a data abstraction technique, which can be used to abstract the data space of a HA for a given abstraction function. The technique is based on constructing over-approximations. It is structure-preserving and is designed in a compositional way. (3) For reasons of practicability, we finally present two tactics supporting the abstraction that we have implemented in Isabelle/HOL. To make proofs more efficient, these tactics use the model checker SMV to check abstract models automatically.

  19. Automated Vision Test Development and Validation

    Science.gov (United States)

    2016-11-01

    Automated Vision Test Development and Validation. Steve Wright, Darrell Rousse, Alex van Atta. Special Report AFRL-SA-WP-SR-2016-0020, covering April 2014 - April 2015. The Optec Vision Test, originally produced in 1951 as the Armed Forces Vision Tester, is the sole device used to qualify…

  20. Automated sleep spindle detection using IIR filters and a Gaussian Mixture Model.

    Science.gov (United States)

    Patti, Chanakya Reddy; Penzel, Thomas; Cvetkovic, Dean

    2015-08-01

    Sleep spindle detection using modern signal processing techniques such as the Short-Time Fourier Transform and Wavelet Analysis is a common research approach. These methods are computationally intensive, especially when analysing data from overnight sleep recordings. The authors of this paper propose an alternative using pre-designed IIR filters and a multivariate Gaussian Mixture Model. Features extracted with IIR filters are clustered using a Gaussian Mixture Model without the use of any subject-independent thresholds. The algorithm was tested on a database consisting of overnight sleep PSG recordings of 5 subjects and a public online spindle database consisting of six 30-minute sleep excerpts. An overall sensitivity of 57% and a specificity of 98.24% were achieved in the overnight database group, and a sensitivity of 65.19% at a 16.9% false-positive proportion for the 6 sleep excerpts.
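
    The paper's exact filter designs and features are not given in the abstract; the sketch below only illustrates the general pipeline, an IIR band-pass filter in the classical 11-16 Hz spindle band followed by unsupervised two-component GMM clustering, on synthetic data. The sampling rate, filter order and window length are assumptions.

```python
# IIR band-pass filtering + GMM clustering of a per-window amplitude
# feature. Filter band, sampling rate and signal are assumed, not the
# paper's settings.
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.mixture import GaussianMixture

fs = 256                                   # assumed sampling rate (Hz)
sos = butter(4, [11, 16], btype="bandpass", fs=fs, output="sos")

eeg = np.random.randn(fs * 30)             # placeholder 30 s epoch
band = sosfiltfilt(sos, eeg)

win = fs // 4                              # one feature per 0.25 s window
feats = np.abs(band[: len(band) // win * win]).reshape(-1, win).mean(axis=1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(feats.reshape(-1, 1))
spindle_cluster = int(np.argmax(gmm.means_))      # higher-amplitude component
labels = gmm.predict(feats.reshape(-1, 1)) == spindle_cluster
print(f"{int(labels.sum())} of {labels.size} windows flagged as spindle-like")
```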

  1. Automated Object-Oriented Simulation Framework for Modelling of Superconducting Magnets at CERN

    CERN Document Server

    Maciejewski, Michał; Bartoszewicz, Andrzej

    The thesis aims at designing a flexible, extensible, user-friendly interface to model electrothermal transients occurring in superconducting magnets. Simulations are a fundamental tool for assessing the performance of a magnet and its protection system against the effects of a quench. The application is created using a scalable and modular architecture based on the object-oriented programming paradigm, which opens an easy way for future extensions. Moreover, each model, composed of thousands of blocks, is automatically created in MATLAB/Simulink. Additionally, the user is able to automatically run sets of simulations with varying parameters. Due to its scalability and modularity the framework can easily be used to simulate a wide range of materials and magnet configurations.

  2. Multi-Instance Learning Models for Automated Support of Analysts in Simulated Surveillance Environments

    Science.gov (United States)

    Birisan, Mihnea; Beling, Peter

    2011-01-01

    New generations of surveillance drones are being outfitted with numerous high definition cameras. The rapid proliferation of fielded sensors and supporting capacity for processing and displaying data will translate into ever more capable platforms, but with increased capability comes increased complexity and scale that may diminish the usefulness of such platforms to human operators. We investigate methods for alleviating strain on analysts by automatically retrieving content specific to their current task using a machine learning technique known as Multi-Instance Learning (MIL). We use MIL to create a real time model of the analysts' task and subsequently use the model to dynamically retrieve relevant content. This paper presents results from a pilot experiment in which a computer agent is assigned analyst tasks such as identifying caravanning vehicles in a simulated vehicle traffic environment. We compare agent performance between MIL aided trials and unaided trials.

  3. Evaluation of automated statistical shape model based knee kinematics from biplane fluoroscopy

    DEFF Research Database (Denmark)

    Baka, Nora; Kaptein, Bart L.; Giphart, J. Erik;

    2014-01-01

    State-of-the-art fluoroscopic knee kinematic analysis methods require the patient-specific bone shapes segmented from CT or MRI. Substituting the patient-specific bone shapes with personalizable models, such as statistical shape models (SSM), could eliminate the CT/MRI acquisitions, and thereby decrease costs and radiation dose (when eliminating CT). SSM based kinematics, however, have not yet been evaluated on clinically relevant joint motion parameters. Therefore, in this work the applicability of SSMs for computing knee kinematics from biplane fluoroscopic sequences was explored. Kinematic … -posterior tibial drawer, joint distraction-contraction, flexion, tibial rotation and adduction. The relationship between kinematic precision and bone shape accuracy was also investigated. The SSM based kinematics resulted in sub-millimeter (0.48-0.81 mm) and approximately 1° (0.69-0.99°) median precision…

  4. An automated model for rooftop PV systems assessment in ArcGIS using LIDAR

    Directory of Open Access Journals (Sweden)

    Mesude Bayrakci Boz

    2015-08-01

    Full Text Available As photovoltaic (PV) systems have become less expensive, building rooftops have come to be attractive for local power production. Identifying rooftops suitable for solar energy systems over large geographic areas is needed for cities to obtain more accurate assessments of production potential and likely patterns of development. This paper presents a new method for extracting roof segments and locating suitable areas for PV systems using Light Detection and Ranging (LIDAR) data and building footprints. Rooftop segments are created using seven slope (tilt) classes, five aspect (azimuth) classes and 6 different building types. Moreover, direct beam shading caused by nearby objects and the surrounding terrain is taken into account on a monthly basis. Finally, the method is implemented as an ArcGIS model in ModelBuilder and a tool is created. To show its validity, the method is applied to the city of Philadelphia, PA, USA with the criteria of slope, aspect, shading and area used to locate suitable areas for PV system installation. The results show that 33.7% of the building footprint area and 48.6% of the rooftop segments identified are suitable for PV systems. Overall, this study provides a replicable model using commercial software that is capable of extracting individual roof segments with more detailed criteria across an urban area.
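
    The ArcGIS ModelBuilder tool itself is not reproduced here, but the slope and aspect screening step it describes can be sketched directly from a digital surface model, as below; the grid, cell size, thresholds and the aspect convention (degrees clockwise from north) are all assumptions.

```python
# Slope/aspect suitability screen for rooftop cells from a DSM grid.
import numpy as np

def slope_aspect(dsm, cell_size):
    dz_dy, dz_dx = np.gradient(dsm, cell_size)   # rows ~ y, columns ~ x
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0
    return slope, aspect

dsm = np.random.rand(50, 50) * 3.0               # hypothetical 1 m height grid
slope, aspect = slope_aspect(dsm, cell_size=1.0)

# Simple mask: moderate tilt, roughly south-facing (northern hemisphere).
suitable = (slope < 40) & (aspect > 112.5) & (aspect < 247.5)
print(f"{suitable.mean():.1%} of cells pass the slope/aspect screen")
```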

  5. Using an automated emboli detection device in a porcine cardiopulmonary bypass (CPB) model: feasibility and considerations.

    Science.gov (United States)

    Schnürer, Christian; Gyoeri, Georg; Hager, Martina; Jeller, Anton; Moser, Patrizia L; Velik-Salchner, Corinna; Laufer, Guenther; Lorenz, Ingo H; Kolbitsch, Christian

    2007-12-01

    The significant risk of cerebral embolism during cardiopulmonary bypass (CPB) makes monitoring of embolic events advisable as early as the development of new operation and coagulation management strategies, for example in CPB animal models. The present study therefore evaluated, in a porcine CPB model, the feasibility of bilateral epicarotid Doppler signal recording and the quality of manual and automatic emboli detection. A total of 42 recordings (right carotid artery, n = 20; left carotid artery, n = 22) were evaluated. The frequency of emboli counts was comparable for both carotid arteries. Automatic emboli detection, however, found significantly more embolic events per pig than did post-hoc manual off-line analysis of the recordings (172 ± 217 vs. 13 ± 10). None of the brains, however, showed any emboli or infarction area either in cross-examination or in histological evaluation. In conclusion, the present study showed the feasibility of using an epicarotid Doppler device for bilateral emboli detection in a porcine CPB model. Automatic on-line emboli detection, however, reported more embolic events than did post-hoc, off-line manual analysis. Possible reasons for this discrepancy are discussed.

  6. Automated grid generation from models of complex geologic structure and stratigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Gable, C.; Trease, H.; Cherry, T.

    1996-04-01

    The construction of computational grids which accurately reflect complex geologic structure and stratigraphy for flow and transport models poses a formidable task. Even with an understanding of stratigraphy, material properties and boundary and initial conditions, the task of incorporating these data into a numerical model can be difficult and time consuming. Most GIS tools for representing complex geologic volumes and surfaces are not designed for producing optimal grids for flow and transport computation. We have developed a tool, GEOMESH, for generating finite element grids that maintain the geometric integrity of input volumes, surfaces, and geologic data and produce an optimal (Delaunay) tetrahedral grid that can be used for flow and transport computations. GEOMESH also satisfies the constraint that the geometric coupling coefficients of the grid are positive for all elements. GEOMESH generates grids for two-dimensional cross sections and three-dimensional regional models, represents faults and fractures, and can incorporate finer grids representing tunnels and well bores into larger grids. GEOMESH also permits adaptive grid refinement in three dimensions. Tools to glue, merge and insert grids together demonstrate how complex grids can be built from simpler pieces. The resulting grid can be utilized by unstructured finite element or integrated finite difference computational physics codes.
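
    GEOMESH itself enforces additional constraints (positive coupling coefficients, preservation of geologic interfaces) that the sketch below does not attempt; it only shows the core step of producing Delaunay tetrahedra from a 3-D point set, here a hypothetical two-layer stratigraphy.

```python
# Delaunay tetrahedralization of an invented two-layer point set.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)
xy = rng.random((200, 2)) * 100.0
layer1 = np.column_stack([xy, 10.0 + rng.random(200)])   # lower horizon
layer2 = np.column_stack([xy, 20.0 + rng.random(200)])   # upper horizon
points = np.vstack([layer1, layer2])

mesh = Delaunay(points)            # 3-D Delaunay => tetrahedral elements
print(mesh.simplices.shape)        # (n_tets, 4): vertex indices per element
```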

  7. Automated generation of node-splitting models for assessment of inconsistency in network meta-analysis.

    Science.gov (United States)

    van Valkenhoef, Gert; Dias, Sofia; Ades, A E; Welton, Nicky J

    2016-03-01

    Network meta-analysis enables the simultaneous synthesis of a network of clinical trials comparing any number of treatments. Potential inconsistencies between estimates of relative treatment effects are an important concern, and several methods to detect inconsistency have been proposed. This paper is concerned with the node-splitting approach, which is particularly attractive because of its straightforward interpretation, contrasting estimates from both direct and indirect evidence. However, node-splitting analyses are labour-intensive because each comparison of interest requires a separate model. It would be advantageous if node-splitting models could be estimated automatically for all comparisons of interest. We present an unambiguous decision rule to choose which comparisons to split, and prove that it selects only comparisons in potentially inconsistent loops in the network, and that all potentially inconsistent loops in the network are investigated. Moreover, the decision rule circumvents problems with the parameterisation of multi-arm trials, ensuring that model generation is trivial in all cases. Thus, our methods eliminate most of the manual work involved in using the node-splitting approach, enabling the analyst to focus on interpreting the results.
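
    The paper's actual decision rule also handles the parameterisation of multi-arm trials; a simplified sketch of the loop criterion, split a comparison only if independent indirect evidence remains once its direct edge is removed, is shown below with networkx and an invented evidence network.

```python
# A comparison is a node-splitting candidate only if it lies on a loop:
# removing its direct edge must still leave an indirect path between the
# two treatments. The network below is hypothetical.
import networkx as nx

direct_comparisons = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")]
net = nx.Graph(direct_comparisons)

def splittable(net):
    out = []
    for u, v in net.edges():
        rest = net.copy()
        rest.remove_edge(u, v)
        if nx.has_path(rest, u, v):      # independent indirect evidence
            out.append((u, v))
    return out

print(splittable(net))   # C-D lies on no loop, so it is never split
```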

  8. Compilation of Thesis Abstracts

    Science.gov (United States)

    2005-09-01

    … wavelet-based techniques show potential in providing viable information for these acoustic signals, despite the lack of statistical analysis… Applying Counterinsurgency Theory to Air Base Defense: A New Doctrinal… OFDM Systems… Modeling, Simulation and…

  9. Calibrating Charisma: The many-facet Rasch model for leader measurement and automated coaching

    Science.gov (United States)

    Barney, Matt

    2016-11-01

    No one is a leader unless others follow. Consequently, leadership is fundamentally a social judgment construct, and may be best measured via a many-facet Rasch model (MFRM) designed for this purpose. Uniquely, the MFRM allows for objective, accurate and precise estimation of leader attributes, along with identification of rater biases and other distortions of the available information. This presentation will outline a mobile computer-adaptive measurement system that measures and develops charisma, among other attributes. The approach calibrates and mass-personalizes artificially intelligent, Rasch-calibrated electronic coaching that is neither too hard nor too easy but “just right” to help each unique leader develop improved charisma.
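
    As a hedged illustration of the measurement model named above, the dichotomous many-facet Rasch probability can be written as a logistic function of person ability minus item difficulty minus rater severity; all logit values below are invented.

```python
# Dichotomous many-facet Rasch sketch: log-odds of a favourable judgement
# decompose additively across facets (all quantities in logits).
import math

def mfrm_probability(ability, item_difficulty, rater_severity):
    logit = ability - item_difficulty - rater_severity
    return 1.0 / (1.0 + math.exp(-logit))

# Strong leader (1.5), middling item (0.2), harsh rater (0.8):
print(f"{mfrm_probability(1.5, 0.2, 0.8):.3f}")   # ~0.622
```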

  10. Automated Sperm Head Detection Using Intersecting Cortical Model Optimised by Particle Swarm Optimization

    Science.gov (United States)

    Tan, Weng Chun; Mat Isa, Nor Ashidi

    2016-01-01

    In human sperm motility analysis, sperm segmentation plays an important role in determining the location of multiple sperms. To ensure an improved segmentation result, the Laplacian of Gaussian filter is implemented as a kernel in a pre-processing step before applying the image segmentation process to automatically segment and detect human spermatozoa. This study proposes an intersecting cortical model (ICM), derived from several visual cortex models, to segment the sperm head region. However, the proposed method suffers from parameter selection; thus, the ICM network is optimised using particle swarm optimization, where feature mutual information is introduced as the new fitness function. The final results showed that the proposed method is more accurate and robust than four state-of-the-art segmentation methods, achieving rates of 98.14%, 98.82%, 86.46% and 99.81% in accuracy, sensitivity, specificity and precision, respectively, after testing with 1200 sperms. The proposed algorithm is expected to be used in analysing sperm motility because of its robustness and capability. PMID:27632581
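
    The paper's fitness function is feature mutual information; the sketch below shows a generic global-best PSO loop of the kind that could tune ICM parameters, with a toy quadratic standing in for the real fitness.

```python
# Plain global-best particle swarm optimiser (maximisation).
import numpy as np

rng = np.random.default_rng(0)

def pso_maximize(fitness, lo, hi, n_particles=20, iters=50,
                 w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(lo, hi, (n_particles, lo.size))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[np.argmax(pbest_f)]
    for _ in range(iters):
        r1, r2 = rng.random((2, *x.shape))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        better = f > pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmax(pbest_f)]
    return gbest, pbest_f.max()

# Toy stand-in for the feature-mutual-information objective.
best, score = pso_maximize(lambda p: -np.sum((p - 0.3) ** 2),
                           np.zeros(3), np.ones(3))
print(best, score)   # converges near [0.3, 0.3, 0.3]
```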

  11. Automated species recognition of antbirds in a Mexican rainforest using hidden Markov models.

    Science.gov (United States)

    Trifa, Vlad M; Kirschel, Alexander N G; Taylor, Charles E; Vallejo, Edgar E

    2008-04-01

    Behavioral and ecological studies would benefit from the ability to automatically identify species from acoustic recordings. The work presented in this article explores the ability of hidden Markov models to distinguish songs from five species of antbirds that share the same territory in a rainforest environment in Mexico. When only clean recordings were used, species recognition was nearly perfect, 99.5%. With noisy recordings, performance was lower but generally exceeded 90%. Besides the quality of the recordings, performance was found to be heavily influenced by a multitude of factors, such as the size of the training set, the feature extraction method used, and the number of states in the Markov model. In general, training with noisier data also improved recognition in test recordings, because of an increased ability to generalize. Considerations for improving performance, including beamforming with sensor arrays and design of preprocessing methods particularly suited to bird songs, are discussed. Combining sensor network technology with effective event detection and species identification algorithms will enable observation of species interactions at a spatial and temporal resolution that is simply impossible with current tools. Analysis of animal behavior through real-time tracking of individuals and recording of large amounts of data with embedded devices in remote locations is thus a realistic goal.
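
    A minimal sketch of this classification scheme, one Gaussian-emission HMM per species with maximum log-likelihood classification, can be written with the hmmlearn library; the random feature matrices below stand in for real acoustic features such as MFCC frames extracted from field recordings.

```python
# One HMM per species; classify a recording by the model with the highest
# log-likelihood. Random matrices stand in for per-frame acoustic features.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

def fake_song(offset):
    return rng.normal(offset, 1.0, size=(100, 12))   # frames x coefficients

train = {"species_A": [fake_song(0.0) for _ in range(5)],
         "species_B": [fake_song(2.0) for _ in range(5)]}

models = {}
for species, songs in train.items():
    X, lengths = np.vstack(songs), [len(s) for s in songs]
    models[species] = GaussianHMM(n_components=5, n_iter=20).fit(X, lengths)

test = fake_song(2.0)
print(max(models, key=lambda sp: models[sp].score(test)))   # -> "species_B"
```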

  12. Improving the correlation of structural FEA models by the application of automated high density robotized laser Doppler vibrometry

    Science.gov (United States)

    Chowanietz, Maximilian; Bhangaonkar, Avinash; Semken, Michael; Cockrill, Martin

    2016-06-01

    Sound has had an intricate relation with the wellbeing of humans since time immemorial. It has the ability to enhance the quality of life immensely when present as music; at the same time, it can degrade its quality when manifested as noise. Hence, understanding its sources and the processes by which it is produced gains acute significance. Although various theories exist with respect to the evolution of bells, it is indisputable that they carry millennia of cultural significance, and at least a few centuries of perfection with respect to design, casting and tuning. Despite the science behind their design, the nuances pertaining to founding and tuning have largely been empirical, and conveyed from one generation to the next. Post-production assessment of bells remains largely person-centric and traditional. However, progressive bell manufacturers have started adopting methods such as finite element analysis (FEA) for informing and optimising their future designs. To establish confidence in the FEA process it is necessary to correlate the virtual model against a physical example. This is achieved by performing an experimental modal analysis (EMA) and comparing the results with those from FEA. Typically, to collect the data for an EMA, the vibratory response of the structure is measured with accelerometers. This technique has limitations; principally these are the observer effect and limited geometric resolution. In this paper, 3-dimensional laser Doppler vibrometry (LDV) has been used to measure the vibratory response with no observer effect, thanks to the non-contact nature of the technique, resulting in higher accuracy measurements as the input to the correlation process. The laser heads were mounted on an industrial robot that enables large objects to be measured and extensive data sets to be captured quickly through an automated process. This approach gives previously unobtainable geometric resolution, resulting in a higher confidence EMA. This is…

  13. Automated CT segmentation of diseased hip using hierarchical and conditional statistical shape models.

    Science.gov (United States)

    Yokota, Futoshi; Okada, Toshiyuki; Takao, Masaki; Sugano, Nobuhiko; Tada, Yukio; Tomiyama, Noriyuki; Sato, Yoshinobu

    2013-01-01

    Segmentation of the femur and pelvis is a prerequisite for patient-specific planning and simulation for hip surgery. Accurate boundary determination of the femoral head and acetabulum is the primary challenge in diseased hip joints because of deformed shapes and extreme narrowness of the joint space. To overcome this difficulty, we investigated a multi-stage method in which the hierarchical hip statistical shape model (SSM) is initially utilized to complete segmentation of the pelvis and distal femur, and then the conditional femoral head SSM is used under the condition that the regions segmented during the previous stage are known. CT data from 100 diseased patients categorized on the basis of their disease type and severity, which included 200 hemi-hips, were used to validate the method, which delivered significantly increased segmentation accuracy for the femoral head.

  14. A functional tolerance model: an approach to automate the inspection process

    Directory of Open Access Journals (Sweden)

    R. Hunter

    2008-12-01

    Full Text Available Purpose: The purpose of this paper is the definition of a framework to describe the Technological Product Specifications (TPS) and the information associated with geometric dimensioning and tolerancing, in order to integrate the design concepts into a commercial inspection system. Design/methodology/approach: A functional tolerance model provides a complete framework to define the geometric dimensioning and tolerancing and its relationship with the part geometry and the inspection process. This framework establishes a connection between a computer aided design system and a computer aided inspection system through the export of the information associated with the dimensions and tolerances of the part into a commercial CAI system. Findings: They are mainly focused on the definition of a framework that describes the relationship between the entities of dimensions and tolerances and the geometry of the part. The information imported into a CAI system allows the inspection process to be developed without the additional information provided by a physical drawing of the part. Research limitations/implications: They regard the limited access to commercial CAI systems and the lack of protocols for the exchange of data associated with the tolerances of the part. Practical implications: They involve facilitation of the inspection process development. This allows the inspection process to be realized while reducing the time spent defining the geometry to inspect and the parameters that must be controlled. Originality/value: The main value of this research is the development of a unique framework to gather the information related to the geometric dimensioning and tolerances and the geometry of the part in a common model. This model provides a complete definition and representation of the entities, attributes and relationships of the design and inspection system.

  15. Prediction method abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    This conference was held December 4-8, 1994 in Asilomar, California. The purpose of this meeting was to provide a forum for exchange of state-of-the-art information concerning the prediction of protein structure. Attention is focused on the following: comparative modeling; sequence to fold assignment; and ab initio folding.

  16. Contents & Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    China FDI Spatial Agglomeration and Trend Surface; Fragmentation of Production, New Economic Geography and Industry Location: Theory Model and Empirical Evidence; Open Source Software: An Analysis of the Cyber Community as a Producer; International Literature Citation, Technical Knowledge Diffusion and China's Technology Innovation;

  17. Abstract graph transformation

    NARCIS (Netherlands)

    Rensink, Arend; Distefano, Dino

    2005-01-01

    Graphs may be used as representations of system states in operational semantics and model checking; in the latter context, they are being investigated as an alternative to bit vectors. The corresponding transitions are obtained as derivations from graph production rules. In this paper we propose an…

  18. Abstract Graph Transformation

    NARCIS (Netherlands)

    Rensink, Arend; Distefano, Dino; Mukhopadhyay, S.; Roychoudhury, A.; Yang, Z.

    2006-01-01

    Graphs may be used as representations of system states in operational semantics and model checking; in the latter context, they are being investigated as an alternative to bit vectors. The corresponding transitions are obtained as derivations from graph production rules. In this paper we propose an…

  19. Abstraction in artificial intelligence and complex systems

    CERN Document Server

    Saitta, Lorenza

    2013-01-01

    Abstraction is a fundamental mechanism underlying both human and artificial perception, representation of knowledge, reasoning and learning. This mechanism plays a crucial role in many disciplines, notably Computer Programming, Natural and Artificial Vision, Complex Systems, Artificial Intelligence and Machine Learning, Art, and Cognitive Sciences. This book first provides the reader with an overview of the notions of abstraction proposed in various disciplines by comparing both commonalities and differences. After discussing the characterizing properties of abstraction, a formal model, the K…

  20. An automation of design and modelling tasks in NX Siemens environment with original software - cost module

    Science.gov (United States)

    Zbiciak, R.; Grabowik, C.; Janik, W.

    2015-11-01

    The design-construction process is a creative activity which strives to fulfil, as well as possible at a given moment in time, all demands and needs formulated by a user, taking into account social, technical and technological advances. The engineer's knowledge, skills and innate abilities have the greatest influence on the final product quality and cost. They also have a deciding influence on the product's technical and economic value. Taking the above into account, it seems advisable to create software tools that support an engineer in the process of manufacturing cost estimation. The Cost module is built with analytical procedures which are used for relative manufacturing cost estimation. As in the case of the Generator module, the Cost module was written in the object-oriented programming language C# in the Visual Studio environment. During the research the following eight factors, which have the greatest influence on overall manufacturing cost, were distinguished and defined: (i) gear wheel teeth type, i.e. straight or helical; (ii) gear wheel design shape, A or B, with or without wheel hub; (iii) gear tooth module; (iv) teeth number; (v) gear rim width; (vi) gear wheel material; (vii) heat treatment or thermochemical treatment; (viii) accuracy class. Knowledge of parameters (i) to (v) is indispensable for proper modelling of 3D gear wheel models in a CAD system environment. These parameters are also processed in the Cost module. The last three parameters, (vi) to (viii), are used exclusively in the Cost module. The estimation of relative manufacturing cost is based on indexes calculated for each particular parameter. Estimated in this way, the relative manufacturing cost gives an overview of the influence of design parameters on the final gear wheel manufacturing cost. This relative manufacturing cost takes values from 0.00 to 1.00; the bigger the index value, the higher the relative manufacturing cost. Verification of whether the proposed algorithm of relative manufacturing…
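
    Assuming the relative cost is an index-weighted aggregation of the eight factors (the weights and index values below are invented, since the abstract does not give them), the idea can be sketched as:

```python
# Hypothetical weighted-index reading of the relative manufacturing cost:
# each factor contributes a normalised index in [0, 1].
FACTOR_WEIGHTS = {
    "tooth_type": 0.05, "design_shape": 0.05, "module": 0.15,
    "teeth_number": 0.15, "rim_width": 0.10, "material": 0.20,
    "heat_treatment": 0.20, "accuracy_class": 0.10,
}

def relative_cost(indexes):
    assert abs(sum(FACTOR_WEIGHTS.values()) - 1.0) < 1e-9
    return sum(FACTOR_WEIGHTS[f] * indexes[f] for f in FACTOR_WEIGHTS)

example = {f: 0.5 for f in FACTOR_WEIGHTS}
example.update(material=0.8, accuracy_class=0.9)
print(f"relative manufacturing cost = {relative_cost(example):.2f}")
```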

  1. Automated Method for Estimating Nutation Time Constant Model Parameters for Spacecraft Spinning on Axis

    Science.gov (United States)

    2008-01-01

    Calculating an accurate nutation time constant (NTC), or nutation rate of growth, for a spinning upper stage is important for ensuring mission success. Spacecraft nutation, or wobble, is caused by energy dissipation anywhere in the system. Propellant slosh in the spacecraft fuel tanks is the primary source for this dissipation and, if it is in a state of resonance, the NTC can become short enough to violate mission constraints. The Spinning Slosh Test Rig (SSTR) is a forced-motion spin table where fluid dynamic effects in full-scale fuel tanks can be tested in order to obtain key parameters used to calculate the NTC. We accomplish this by independently varying nutation frequency versus the spin rate and measuring force and torque responses on the tank. This method was used to predict parameters for the Genesis, Contour, and Stereo missions, whose tanks were mounted outboard from the spin axis. These parameters are incorporated into a mathematical model that uses mechanical analogs, such as pendulums and rotors, to simulate the force and torque resonances associated with fluid slosh.
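
    If the nutation angle grows exponentially, theta(t) = theta0 * exp(t / NTC), then the NTC can be estimated as the reciprocal slope of a log-linear fit, as in this sketch on synthetic data (the 120 s constant and noise level are invented).

```python
# Estimate a nutation time constant from an exponentially growing angle
# history via a log-linear least-squares fit. Data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 300.0, 600)
theta = 0.5 * np.exp(t / 120.0) * (1 + 0.02 * rng.standard_normal(t.size))

slope, _ = np.polyfit(t, np.log(theta), 1)
print(f"estimated NTC = {1.0 / slope:.1f} s")   # ~120 s
```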

  2. ABSTRACTS WELDED PIPE AND TUBE

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    ABSTRACTS WELDED PIPE AND TUBE, Vol. 24, No. 3, May 2001. Huang Jingan (1): Strengthen Intercourse and Coordination and Promote Development Together. Liang Aiyu (11): The Production and Development of Water Supply Pipe for City Construction. From the aspects of quality, appearance, environmental protection, economic analysis, etc., this article evaluates galvanized pipe, plastic-steel composite pipe, plastic-aluminum pipe and stainless pipe for city water supply. In accordance with the requirements of city construction programming and development, it is considered that replacing galvanized pipe with plastic-aluminum pipe and plastic-steel pipe is the trend of development. The author also gives some constructive proposals for reference. Subject terms: galvanized pipe; composite pipe; stainless pipe; city water supply; evaluation. Zhao Rongbin, Li Guangjun (14): The TIG Welding of Protected Tantalum Pipe for Sheathed Thermocouples Used in Corrosive Environments. The welding of the protected tantalum pipe of sheathed thermocouples was investigated by TIG. The welding process and its key parameters are introduced, and the influence of the process on welding quality is discussed. Subject terms: welding; protected tantalum pipe; corrosion. He Defu et al. (18): Design and Research of an Automatic MIG Welding Machine for Automobile Catalytic Converters. Two different schemes for automatic MIG welding of automobile catalytic converters are compared and analysed, and a design of an automatic MIG welding machine for catalytic converters is suggested. Subject terms: environmental protection; automobile; three-way catalytic converter; MIG welding; automatic welding; PLC. Fang Chucai (24): Cold Crack Analysis of the Weld Seam Heat Affected Zone in Low Alloy High Strength Steel. During the welding of low alloy high strength steel (X65 and above), fine cracks sometimes occur in the weld (especially the inner weld) and a low-plasticity, hard, brittle structure sometimes occurs in the heat affected zone (HAZ). This…

  3. Abstracts Of Some Papers

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    On the Rule of Soft Law in Social Management Innovation. Luo Haocai, Miao Zhijiang (P1). Abstract: With social transformation, the traditional social management model based on the system of units is becoming less effective. As a support system of social management, public law needs to respond to social management innovation. Under the guidance of the "relationship" perspective and the modern public law concept of equality, the social management model should shift from unitary hard-law governance towards composite hard/soft-law governance. To promote social management innovation we should stick to the people-oriented principle and take all factors into consideration; on this basis, we need to adopt a management philosophy of softness and interaction, carry out collaborative governance, implement self-discipline and mutual-discipline mechanisms, and make use of all kinds of measures. Keywords: Social Management Innovation; Public Law; Rule of Soft Law

  4. Towards Abstract Interpretation of Epistemic Logic

    DEFF Research Database (Denmark)

    Ajspur, Mai; Gallagher, John Patrick

    The model-checking problem is to decide, given a formula φ and an interpretation M, whether M satisfies φ, written M |= φ. Model-checking algorithms for temporal logics were initially developed with finite models (such as models of hardware) in mind, so that M |= φ is decidable. As interest grew in model-checking infinite systems, other approaches were developed based on approximating the model-checking algorithm so that it still terminates with some useful output. In this work we present a model-checking algorithm for a multiagent epistemic logic containing operators for common and distributed knowledge. The model-checker is developed as a function directly from the semantics of the logic, in a style that could be applied straightforwardly to derive model-checkers for other logics. Secondly, we consider how to abstract the model-checker using abstract interpretation, yielding a procedure…

  5. ABSTRACTS OF SELECTED ARTICLES

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The Protection of Intellectual Property Rights, the Ratio of Independent R&D and the Technological Progress of Developing Countries. The ratio, in the process of technical advancement, of independent R&D (IRD) to technology import reflects the basic model of the technical improvement of a country. By introducing IRD and technical introduction, the two ways of basic technical advancement, we have in this paper expanded the model of the endogenous growth of intermediate products, and thereby discussed the decisive mechanism of the model of technical improvement of developing countries, the effect of the ratio of IRD on technical advancement and economic growth, and the role of the protection of intellectual property rights (IPR) in it. The results of our study indicate that, on the way to balanced growth, an increase in the ratio of IRD in developing countries has a positive driving effect on technical advancement, and that an increase in IPR protection will, at the same time, reinforce the protection of the achievements of domestic R&D and shore up the protection of foreign patents,…

  6. Norddesign 2012 - Book of Abstract

    DEFF Research Database (Denmark)

    Welcome to NordDesign 2012. This conference is the ninth in a row of biennial conferences organized by technical universities in the Nordic region. The first conference was held in Helsinki in 1996, and at this initial conference it was agreed to organize 10 conferences before deciding on the future. … has been organized in line with the original ideas. The topics mentioned in the call for abstracts were: Product Development: Integrated, Multidisciplinary, Product life oriented and Distributed. Multi-product Development. Innovation and Business Models. Engineering Design and Industrial Design. Conceptualisation and Innovative thinking. Research approaches and topics: Human Behaviour and Cognition. Cooperation and Multidisciplinary Design. Staging and Management of Design. Communication in Design. Design education and teaching: Programmes and Syllabuses. New Courses. Integrated and Multi-disciplinary. We…

  7. Automated Budget System

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  8. Using abstract language signals power.

    Science.gov (United States)

    Wakslak, Cheryl J; Smith, Pamela K; Han, Albert

    2014-07-01

    Power can be gained through appearances: People who exhibit behavioral signals of power are often treated in a way that allows them to actually achieve such power (Ridgeway, Berger, & Smith, 1985; Smith & Galinsky, 2010). In the current article, we examine power signals within interpersonal communication, exploring whether use of concrete versus abstract language is seen as a signal of power. Because power activates abstraction (e.g., Smith & Trope, 2006), perceivers may expect higher power individuals to speak more abstractly and therefore will infer that speakers who use more abstract language have a higher degree of power. Across a variety of contexts and conversational subjects in 7 experiments, participants perceived respondents as more powerful when they used more abstract language (vs. more concrete language). Abstract language use appears to affect perceived power because it seems to reflect both a willingness to judge and a general style of abstract thinking.

  9. A second-generation device for automated training and quantitative behavior analyses of molecularly-tractable model organisms.

    Directory of Open Access Journals (Sweden)

    Douglas Blackiston

    Full Text Available A deep understanding of cognitive processes requires functional, quantitative analyses of the steps leading from genetics and the development of nervous system structure to behavior. Molecularly-tractable model systems such as Xenopus laevis and planaria offer an unprecedented opportunity to dissect the mechanisms determining the complex structure of the brain and CNS. A standardized platform that facilitated quantitative analysis of behavior would make a significant impact on evolutionary ethology, neuropharmacology, and cognitive science. While some animal tracking systems exist, the available systems do not allow automated training (feedback to individual subjects in real time, which is necessary for operant conditioning assays. The lack of standardization in the field, and the numerous technical challenges that face the development of a versatile system with the necessary capabilities, comprise a significant barrier keeping molecular developmental biology labs from integrating behavior analysis endpoints into their pharmacological and genetic perturbations. Here we report the development of a second-generation system that is a highly flexible, powerful machine vision and environmental control platform. In order to enable multidisciplinary studies aimed at understanding the roles of genes in brain function and behavior, and aid other laboratories that do not have the facilities to undergo complex engineering development, we describe the device and the problems that it overcomes. We also present sample data using frog tadpoles and flatworms to illustrate its use. Having solved significant engineering challenges in its construction, the resulting design is a relatively inexpensive instrument of wide relevance for several fields, and will accelerate interdisciplinary discovery in pharmacology, neurobiology, regenerative medicine, and cognitive science.

  10. ABSTRACTS OF MAJOR ARTICLES

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    The Insect Culture's Contribution to China's Ancient Literature; The Agricultural Production of the Bohai Revolutionary Base Area; The Agriculture Development under the Context of Selective Transfer of Rural Labor Force: Based on the Ranis-Fei Model; An Empirical Study on the Farmers' Cognition and Willingness of Farmland Property Rights: From Anhui Province Taihe County's Data of Country Survey; Analysis of the New Migrant Workers' Situation of Spirit and Cultural Life: Based on the Comparison with the Older Generation of Migrant Workers and the Urban Youth

  11. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco;

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behaviour…

  12. A Practical Guide to Calibration of a GSSHA Hydrologic Model Using ERDC Automated Model Calibration Software - Efficient Local Search

    Science.gov (United States)

    2012-02-01

    input files. For this particular example, the GSSHA model input files with extensions "cmt" (for mapping table file) and "cif" (for channel input… the executable PAR2PAR (Doherty, 2004) will result in an update of the GSSHA model input files with extensions "cmt" and "cif" with the adjustable…

  13. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems.

  14. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems to be, nowadays, the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation, and demand for customized products and services on one side, and the need to achieve constructive dialogue with customers, immediate and flexible response, and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to improve substantially.

  15. Contents & Abstracts

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Changes in Terms of Trade, Import Tariff Reductions and Productivity Effects of China's WTO Accession. This paper uses data from 2002 to 2009, after China's WTO accession, to build a panel data model to test the effects of changes in terms of trade and import tariff reductions on output and labor productivity. Our test results show that the improvement in the price terms of trade has contributed significantly to productivity growth; the effect of the price terms of trade on productivity is manifested as a synthesis of structural and resource-allocation effects. In the case that both the overall structural effects and the resource-allocation effects prove positive, changes in the price terms of trade can lead to positive productivity effects.

  16. Formal Methods for Automated Diagnosis of Autosub 6000

    Science.gov (United States)

    Ernits, Juhan; Dearden, Richard; Pebody, Miles

    2009-01-01

    This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.

  17. Expanded abstracts with biographies. 1989 technical program

    Energy Technology Data Exchange (ETDEWEB)

    1989-01-01

    Extended abstracts of papers are presented under the headings: computing; crustal studies; economic exploration; electromagnetics; engineering and ground water; geology; gravity and magnetics; mining; poster papers; recent advances and the road ahead; rock physics; exploration applications of geologic modelling; seismic acquisition; seismic interpretation; seismic inversion; seismic modelling; seismic processing; student paper competition; and special SEG workshops.

  18. Reading Guided by Automated Graphical Representations: How Model-Based Text Visualizations Facilitate Learning in Reading Comprehension Tasks

    Science.gov (United States)

    Pirnay-Dummer, Pablo; Ifenthaler, Dirk

    2011-01-01

    Our study integrates automated natural language-oriented assessment and analysis methodologies into feasible reading comprehension tasks. With the newly developed T-MITOCAR toolset, prose text can be automatically converted into an association net which has similarities to a concept map. The "text to graph" feature of the software is based on…

  19. Automated Semiconductor Modeling.

    Science.gov (United States)

    1987-05-01

    The least-squares line takes Yc = A + B·x0 as the calculated ordinate for the value x = x0. The values of the constants A and B for which the sum of squared residuals Σ(A + B·x0 − y0)² is a minimum are found by setting the partial derivatives to zero. For A: Σ(A + B·x0 − y0) = 0. For B: Σ(A + B·x0 − y0)·x0 = 0, that is, Σ(A·x0 + B·x0² − x0·y0) = 0. Factoring A and B from the sums gives the normal equations Σy0 = N·A + B·Σx0 and Σx0·y0 = A·Σx0 + B·Σx0².
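
    The normal equations above can be solved directly; a small numerical sketch with invented sample data (np.polyfit(x, y, 1) gives the same coefficients):

```python
# Solve the least-squares normal equations for the line y = A + B*x.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

N = len(x)
M = np.array([[N, x.sum()], [x.sum(), (x * x).sum()]])
rhs = np.array([y.sum(), (x * y).sum()])
A, B = np.linalg.solve(M, rhs)
print(f"A = {A:.3f}, B = {B:.3f}")
```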

  20. Modeling and Analysis of the Braking Hydraulic System of an LHD Based on Automation Studio

    Institute of Scientific and Technical Information of China (English)

    张楠; 韩飞

    2016-01-01

    In this paper, the braking hydraulic system, one of the key units of the CY-6 underground load-haul-dump (LHD) machine, is taken as the research object. The system uses an SAHR-type brake in a single-loop, fully hydraulic braking configuration. Its working principle is analysed, and the actuators and the loads acting on them are studied. On this basis, the Automation Studio simulation software is used to build a simulation model of the complete braking hydraulic system and to simulate its dynamic working process, obtaining the performance curves and related parameters of the charging and braking processes. This provides a theoretical reference and technical support for the optimized design of this kind of vehicle braking hydraulic system, and has engineering application value.