WorldWideScience

Sample records for machine architecture inference

  1. Future database machine architectures

    OpenAIRE

    Hsiao, David K.

    1984-01-01

    There are many software database management systems available on general-purpose computers ranging from micros to super-mainframes. Database machines, acting as backend computers, can offload the database management work from the mainframe so that we can retain the same mainframe longer. However, the database backend must also demonstrate lower cost, higher performance, and newer functionality. Some of the fundamental architecture issues in the design of high-performance and great-capacity datab...

  2. Two General Architectures for Intelligent Machine Performance Degradation Assessment

    Directory of Open Access Journals (Sweden)

    Yanwei Xu

    2015-01-01

    A Markov model is well suited to inferring random events whose likelihood depends on previous events. The hidden Markov model extends this theory by representing an event through observations rather than directly through states. Owing to its success in speech recognition, it has attracted much attention in machine fault diagnosis. This paper presents two architectures for machine performance degradation assessment, which can be used to minimize machine downtime, reduce economic loss, and improve productivity. The major difference between the two architectures is whether historical data are available to build the hidden Markov models. In the case studies, bearing data with available historical data are used to demonstrate the effectiveness of the first architecture; whole-life gearbox data without historical data are then employed to demonstrate the effectiveness of the second. The results of both case studies show that the presented architectures assess machine performance degradation well.
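The abstract's core tool can be illustrated in miniature (a sketch with invented transition and emission probabilities, not the paper's models): the scaled forward algorithm below computes the log-likelihood of an observation sequence under a discrete HMM, and in degradation assessment a falling likelihood under a model trained on healthy data is a common health indicator.

```python
import math

def forward_log_likelihood(obs, start, trans, emit):
    """log P(obs | HMM) via the scaled forward algorithm.

    start[i]    : P(state i at t=0)
    trans[i][j] : P(state j at t+1 | state i at t)
    emit[i][o]  : P(observation o | state i)
    """
    n = len(start)
    alpha = [start[i] * emit[i][obs[0]] for i in range(n)]
    log_lik = 0.0
    for t in range(1, len(obs) + 1):
        s = sum(alpha)
        log_lik += math.log(s)
        alpha = [a / s for a in alpha]          # rescale to avoid underflow
        if t == len(obs):
            break
        alpha = [emit[j][obs[t]] * sum(alpha[i] * trans[i][j] for i in range(n))
                 for j in range(n)]
    return log_lik

# Two hidden health states, two observation symbols (0 = normal, 1 = anomalous).
start = [0.9, 0.1]
trans = [[0.95, 0.05], [0.10, 0.90]]
emit  = [[0.9, 0.1], [0.2, 0.8]]

healthy  = [0, 0, 0, 1, 0, 0]
degraded = [1, 1, 0, 1, 1, 1]
print(forward_log_likelihood(healthy, start, trans, emit) >
      forward_log_likelihood(degraded, start, trans, emit))  # True
```

The mostly-normal sequence scores a higher likelihood under the healthy-biased model, which is exactly the signal a degradation monitor would threshold.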

  3. Machine-to-machine communications architectures, technology, standards, and applications

    CERN Document Server

    Misic, Vojislav B

    2014-01-01

    With the number of machine-to-machine (M2M)-enabled devices projected to reach 20 to 50 billion by 2020, there is a critical need to understand the demands imposed by such systems. Machine-to-Machine Communications: Architectures, Technology, Standards, and Applications offers rigorous treatment of the many facets of M2M communication, including its integration with current technology. Presenting the work of a different group of international experts in each chapter, the book begins by supplying an overview of M2M technology. It considers proposed standards, cutting-edge applications, architectures, and traffic modeling, and includes case studies that highlight the differences between traditional and M2M communications technology. The book also details a practical scheme for forward error correction code design, investigates the effectiveness of the IEEE 802.15.4 low-data-rate wireless personal area network standard for use in M2M communications, and identifies algorithms that will ensure functionality, performance, reliability, ...

  4. Reversible machine code and its abstract processor architecture

    DEFF Research Database (Denmark)

    Axelsen, Holger Bock; Glück, Robert; Yokoyama, Tetsuo

    2007-01-01

    A reversible abstract machine architecture and its reversible machine code are presented and formalized. For machine code to be reversible, both the underlying control logic and each instruction must be reversible. A general class of machine instruction sets is proven to be reversible, building on our concept of reversible updates. The presentation is abstract and can serve as a guideline for a family of reversible processor designs. By example, we illustrate programming principles for the abstract machine architecture formalized in this paper.

  5. Bayesian inference analyses of the polygenic architecture of rheumatoid arthritis

    NARCIS (Netherlands)

    Stahl, Eli A.; Wegmann, Daniel; Trynka, Gosia; Gutierrez-Achury, Javier; Do, Ron; Voight, Benjamin F.; Kraft, Peter; Chen, Robert; Kallberg, Henrik J.; Kurreeman, Fina A. S.; Kathiresan, Sekar; Wijmenga, Cisca; Gregersen, Peter K.; Alfredsson, Lars; Siminovitch, Katherine A.; Worthington, Jane; de Bakker, Paul I. W.; Raychaudhuri, Soumya; Plenge, Robert M.

    2012-01-01

    The genetic architectures of common, complex diseases are largely uncharacterized. We modeled the genetic architecture underlying genome-wide association study (GWAS) data for rheumatoid arthritis and developed a new method using polygenic risk-score analyses to infer the total liability-scale varia...

  6. Mood Inference Machine: Framework to Infer Affective Phenomena in ROODA Virtual Learning Environment

    Directory of Open Access Journals (Sweden)

    Magalí Teresinha Longhi

    2012-02-01

    This article presents a mechanism for inferring mood states, aiming to provide virtual learning environments (VLEs) with a tool able to recognize the student's motivation. The inference model takes as its parameters personality traits, motivational factors obtained through behavioral standards, and the affective subjectivity identified in texts made available through the communication functionalities of the VLE. In the inference machine, these variables are treated under probabilistic reasoning, more precisely with Bayesian networks.
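The kind of Bayesian-network reasoning the abstract describes can be sketched with inference by enumeration over a toy network (the variables and probabilities below are invented for illustration, not taken from the ROODA model):

```python
# Toy Bayesian-network inference by enumeration: P(mood | evidence) is
# computed by summing the joint distribution over the hidden variables.
from itertools import product

# Binary variables: M = motivated, A = positive affect in texts, H = happy mood.
P_M = {True: 0.6, False: 0.4}
P_A = {True: 0.5, False: 0.5}
P_H_given = {(True, True): 0.9, (True, False): 0.6,
             (False, True): 0.5, (False, False): 0.1}

def joint(m, a, h):
    p_h = P_H_given[(m, a)]
    return P_M[m] * P_A[a] * (p_h if h else 1.0 - p_h)

def posterior_happy(evidence):
    """P(H=True | evidence); evidence maps 'M'/'A' to observed values."""
    num = den = 0.0
    for m, a, h in product([True, False], repeat=3):
        if 'M' in evidence and m != evidence['M']:
            continue
        if 'A' in evidence and a != evidence['A']:
            continue
        p = joint(m, a, h)
        den += p
        if h:
            num += p
    return num / den

# Observing positive affect in the student's texts raises P(happy mood).
print(round(posterior_happy({'A': True}), 3))  # 0.74
```

A production system would of course learn these tables from data and handle many more variables, but the enumeration step is the same computation a Bayesian-network engine performs.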

  8. FRC Separatrix inference using machine-learning techniques

    Science.gov (United States)

    Romero, Jesus; Roche, Thomas; the TAE Team

    2016-10-01

    As Field Reversed Configuration (FRC) devices approach lifetimes exceeding the characteristic time of conductive structures external to the plasma, plasma stabilization cannot be achieved solely by the flux-conserving effect of the external structures, and active control systems become necessary. An essential component of such control systems is a reconstruction method for the plasma separatrix suitable for real time. We report on a method to infer the separatrix in an FRC using the information from magnetic probes located externally to the plasma. The method uses machine-learning techniques, namely Bayesian inference with Gaussian processes, to obtain the most likely plasma current density distribution given the measurements of the magnetic field external to the plasma. From the current sources, the flux function, and in particular the separatrix, is easily computed. The reconstruction method is non-iterative and hence suitable for deterministic real-time applications. Validation results with numerical simulations and application to separatrix inference of C-2U plasma discharges will be presented.
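The underlying Bayesian step can be illustrated in miniature (an invented toy geometry, not the TAE implementation): with a linear forward model mapping current sources to probe fields and a Gaussian prior on the sources, the most likely (posterior-mean) current distribution has a closed form, from which flux surfaces would then be computed.

```python
def posterior_mean(G, b, sigma_n=0.05, sigma_p=1.0):
    """MAP estimate of 2 current sources from len(b) probe readings.
    Solves (G^T G / s_n^2 + I / s_p^2) j = G^T b / s_n^2 for 2 unknowns."""
    # Normal-equation matrix A (2x2) and right-hand side r (length 2).
    A = [[sum(G[k][i] * G[k][j] for k in range(len(b))) / sigma_n**2
          + (1.0 / sigma_p**2 if i == j else 0.0)
          for j in range(2)] for i in range(2)]
    r = [sum(G[k][i] * b[k] for k in range(len(b))) / sigma_n**2
         for i in range(2)]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(A[1][1] * r[0] - A[0][1] * r[1]) / det,
            (A[0][0] * r[1] - A[1][0] * r[0]) / det]

# Two current filaments, three probes; G[k][i] couples source i to probe k.
G = [[1.0, 0.2], [0.5, 0.5], [0.2, 1.0]]
j_true = [2.0, -1.0]
b = [sum(G[k][i] * j_true[i] for i in range(2)) for k in range(3)]  # noise-free
j_hat = posterior_mean(G, b)
print(all(abs(j_hat[i] - j_true[i]) < 0.05 for i in range(2)))  # True
```

Because the estimate is a single linear solve with no iteration, it has the deterministic runtime the abstract highlights as essential for real-time control.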

  9. Exploration of the Adaptive Neuro - Fuzzy Inference System Architecture and its Applications

    Directory of Open Access Journals (Sweden)

    Okereke Eze Aru

    2016-09-01

    This paper presents the architecture and the essential learning process underlying the fuzzy inference system and the adaptive neuro-fuzzy inference system (ANFIS), a hybrid network implemented in the framework of adaptive networks. In real computing environments, soft computing techniques including neural networks and fuzzy logic algorithms have been widely used to infer a decision from given input/output data attributes; ANFIS can build this mapping on the basis of both human knowledge and hybrid learning algorithms. This study investigates the ANFIS methodology, which is used to model nonlinear functions, to control one of the most important parameters of an induction machine, and to predict a chaotic time series, in each case yielding more effective, faster results.
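A single first-order Sugeno (TSK) inference step, the building block whose parameters ANFIS tunes, can be sketched as follows (the membership-function parameters and rules are invented for illustration):

```python
import math

def gauss(x, c, s):
    """Gaussian membership degree of x in a fuzzy set centered at c."""
    return math.exp(-((x - c) / s) ** 2)

def tsk_output(x, rules):
    """rules: list of (center, spread, (p, q)) with consequent y = p*x + q.
    The crisp output is the firing-strength-weighted mean of the consequents."""
    w = [gauss(x, c, s) for c, s, _ in rules]
    y = [p * x + q for _, _, (p, q) in rules]
    return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

# Two rules: "x is LOW  -> y = 0.1*x + 1" and "x is HIGH -> y = 2*x - 3".
rules = [(0.0, 1.0, (0.1, 1.0)), (5.0, 1.0, (2.0, -3.0))]
print(round(tsk_output(0.0, rules), 3))  # 1.0: the LOW rule dominates at x=0
```

ANFIS trains the premise parameters (centers, spreads) by backpropagation and the consequent parameters (p, q) by least squares; the forward pass it optimizes is exactly this weighted-average computation.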

  11. Architecture of explanatory inference in the human prefrontal cortex.

    Science.gov (United States)

    Barbey, Aron K; Patterson, Richard

    2011-01-01

    Causal reasoning is a ubiquitous feature of human cognition. We continuously seek to understand, at least implicitly and often explicitly, the causal scenarios in which we live, so that we may anticipate what will come next, plan a potential response and envision its outcome, decide among possible courses of action in light of their probable outcomes, make midstream adjustments in our goal-related activities as our situation changes, and so on. A considerable body of research shows that the lateral prefrontal cortex (PFC) is crucial for causal reasoning, but also that there are significant differences in the manner in which ventrolateral PFC, dorsolateral PFC, and anterolateral PFC support causal reasoning. We propose, on the basis of research on the evolution, architecture, and functional organization of the lateral PFC, a general framework for understanding its roles in the many and varied sorts of causal reasoning carried out by human beings. Specifically, the ventrolateral PFC supports the generation of basic causal explanations and inferences; dorsolateral PFC supports the evaluation of these scenarios in light of some given normative standard (e.g., of plausibility or correctness in light of real or imagined causal interventions); and anterolateral PFC supports explanation and inference at an even higher level of complexity, coordinating the processes of generation and evaluation with further cognitive processes, and especially with computations of hedonic value and emotional implications of possible behavioral scenarios - considerations that are often critical both for understanding situations causally and for deciding about our own courses of action.

  12. Logical Evaluation of Consciousness: For Incorporating Consciousness into Machine Architecture

    CERN Document Server

    Padhy, C N

    2010-01-01

    Machine Consciousness is the study of consciousness from a biological, philosophical, mathematical and physical perspective, and the design of a model that can fit into a programmable system architecture. The prime objective of the study is to make the system architecture behave consciously as a biological model does. The present work has developed a feasible definition of consciousness, which characterizes consciousness with four parameters, i.e., parasitic, symbiotic, self-referral and reproduction. The present work has also developed a biologically inspired consciousness architecture that has the following layers: quantum layer, cellular layer, organ layer and behavioral layer, and has traced the characteristics of consciousness at each layer. Finally, the work has estimated a physical and algorithmic architecture to devise a system that can behave consciously.

  13. Software architecture for time-constrained machine vision applications

    Science.gov (United States)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2013-01-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility, because they are normally oriented toward particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty for reuse, and inefficient execution on multicore processors. We present a novel software architecture for time-constrained machine vision applications that addresses these issues. The architecture is divided into three layers. The platform abstraction layer provides a high-level application programming interface for the rest of the architecture. The messaging layer provides a message-passing interface based on a dynamic publish/subscribe pattern. Topic-based filtering, in which messages are published to topics, is used to route messages from publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for machine vision applications. These modules, which include acquisition, visualization, communication, user interface, and data processing, take advantage of the power of well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, the proposed architecture is applied to a real machine vision application: a jam detector for steel pickling lines.
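The messaging layer's topic-based publish/subscribe routing can be sketched in a few lines (a minimal illustration with invented topic names, not the authors' API):

```python
from collections import defaultdict

class MessageBus:
    """Minimal topic-based publish/subscribe dispatcher: messages published
    to a topic are routed only to that topic's subscribers."""

    def __init__(self):
        self._subs = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subs[topic]:
            callback(message)

bus = MessageBus()
frames, alarms = [], []
bus.subscribe("camera/frames", frames.append)   # acquisition module
bus.subscribe("detector/jam", alarms.append)    # jam-detector module

bus.publish("camera/frames", "frame-001")
bus.publish("detector/jam", "jam at coil 3")
bus.publish("camera/frames", "frame-002")
print(frames, alarms)  # ['frame-001', 'frame-002'] ['jam at coil 3']
```

The design decouples producers from consumers: the acquisition module never needs to know which processing or visualization modules are listening, which is what makes the application modules reusable.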

  14. Digital VLSI algorithms and architectures for support vector machines.

    Science.gov (United States)

    Anguita, D; Boni, A; Ridella, S

    2000-06-01

    In this paper, we propose some very simple algorithms and architectures for a digital VLSI implementation of Support Vector Machines. We discuss the main aspects concerning the realization of the learning phase of SVMs, with special attention on the effects of fixed-point math for computing and storing the parameters of the network. Some experiments on two classification problems are described that show the efficiency of the proposed methods in reaching optimal solutions with reasonable hardware requirements.

  15. Neural architecture design based on extreme learning machine.

    Science.gov (United States)

    Bueno-Crespo, Andrés; García-Laencina, Pedro J; Sancho-Gómez, José-Luis

    2013-12-01

    Selection of the optimal neural architecture to solve a pattern classification problem entails choosing the relevant input units, the number of hidden neurons, and their corresponding interconnection weights. This problem has been widely studied, but existing solutions usually involve excessive computational cost and do not provide a unique solution. This paper proposes a new technique to efficiently design the MultiLayer Perceptron (MLP) architecture for classification using the Extreme Learning Machine (ELM) algorithm. The proposed method provides high generalization capability and a unique solution for the architecture design. Moreover, the selected final network retains only those input connections that are relevant for the classification task. Experimental results show these advantages. Copyright © 2013 Elsevier Ltd. All rights reserved.
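The core ELM idea, random fixed hidden weights with only the linear output weights fitted by (ridge) least squares, can be sketched as follows (a minimal single-input illustration, not the paper's architecture-selection procedure):

```python
import math
import random

def solve(A, b):
    """Gaussian elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def elm_fit(xs, ys, hidden=10, ridge=1e-6, seed=0):
    """Random tanh hidden layer; output weights from ridge least squares."""
    rng = random.Random(seed)
    w = [(rng.uniform(-2, 2), rng.uniform(-2, 2)) for _ in range(hidden)]
    H = [[math.tanh(a * x + b) for a, b in w] + [1.0] for x in xs]  # + bias
    n = hidden + 1
    HtH = [[sum(H[k][i] * H[k][j] for k in range(len(xs)))
            + (ridge if i == j else 0.0) for j in range(n)] for i in range(n)]
    Hty = [sum(H[k][i] * ys[k] for k in range(len(xs))) for i in range(n)]
    beta = solve(HtH, Hty)
    return lambda x: sum(b * h for b, h in
                         zip(beta, [math.tanh(a * x + c) for a, c in w] + [1.0]))

xs = [i * 0.3 for i in range(21)]
ys = [math.sin(x) for x in xs]
model = elm_fit(xs, ys)
mse = sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
baseline = sum((y - sum(ys) / len(ys)) ** 2 for y in ys) / len(ys)
print(mse < baseline)  # the random-feature fit beats a constant predictor
```

Because only the output layer is solved for, training is a single linear solve, which is the speed advantage the paper's design method builds on.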

  16. Adaptive compensation of sculptured surface machining errors by open architecture manufacturing system

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Presents the adaptive compensation of sculptured-surface machining errors using an open architecture intelligent manufacturing system to ensure real-time high-precision machining of sculptured surfaces, together with a tool deflection model constructed to predict the machining errors to be compensated and to analyze the effect of tool deflection on machining errors. Concludes from experimental results that the open architecture intelligent manufacturing system can effectively improve machining precision and reduce machining errors by 30%.

  17. The Specification of a Data Base Machine Architecture Development Facility and a Methodology for Developing Special Purpose Function Architectures,

    Science.gov (United States)

    1980-07-01

    [Abstract unavailable; the source record contains only OCR fragments of the report cover: RADC-TR-80-256, In-House Report, July 1980, Rome Air Development Center, Griffiss AFB NY, unclassified.]

  18. Biologically relevant neural network architectures for support vector machines.

    Science.gov (United States)

    Jändel, Magnus

    2014-01-01

    Neural network architectures that implement support vector machines (SVM) are investigated for the purpose of modeling perceptual one-shot learning in biological organisms. A family of SVM algorithms including variants of maximum margin, 1-norm, 2-norm and ν-SVM is considered. SVM training rules adapted for neural computation are derived. It is found that competitive queuing memory (CQM) is ideal for storing and retrieving support vectors. Several different CQM-based neural architectures are examined for each SVM algorithm. Although most of the sixty-four scanned architectures are unconvincing for biological modeling, four feasible candidates are found. The seemingly complex learning rule of a full ν-SVM implementation finds a particularly simple and natural implementation in bisymmetric architectures. Since CQM-like neural structures are thought to encode skilled action sequences and bisymmetry is ubiquitous in motor systems, it is speculated that trainable pattern recognition in low-level perception has evolved as an internalized motor programme. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Adaptive Cluster Expansion for Inferring Boltzmann Machines with Noisy Data

    CERN Document Server

    Cocco, Simona

    2011-01-01

    We introduce a procedure to infer the interactions among a set of binary variables, based on their sampled frequencies and pairwise correlations. The algorithm builds the clusters of variables contributing most to the entropy of the inferred Ising model, and rejects the small contributions due to the sampling noise. Our procedure successfully recovers benchmark Ising models even at criticality and in the low temperature phase, and is applied to neurobiological data.
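The inverse-Ising idea can be illustrated in its simplest case (a sketch, not the adaptive cluster expansion itself): for two ±1 spins with coupling J and no fields, the pairwise correlation is ⟨s1 s2⟩ = tanh(J), so a sampled correlation can be inverted to recover the interaction.

```python
import math
import random

def sample_correlation(J, n, seed=1):
    """Estimate <s1*s2> from n samples of P(s1, s2) ~ exp(J * s1 * s2)."""
    rng = random.Random(seed)
    p_align = math.exp(J) / (math.exp(J) + math.exp(-J))  # P(s2 == s1)
    corr = 0.0
    for _ in range(n):
        s1 = rng.choice([-1, 1])
        s2 = s1 if rng.random() < p_align else -s1
        corr += s1 * s2
    return corr / n

J_true = 0.5
C = sample_correlation(J_true, 200000)
J_hat = math.atanh(C)          # invert <s1*s2> = tanh(J)
print(abs(J_hat - J_true) < 0.05)
```

With many spins the inversion is no longer closed-form and sampling noise corrupts small correlations, which is precisely the regime where the paper's cluster expansion and noise-rejection step are needed.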

  20. Open architecture controller solution for custom machine systems

    Science.gov (United States)

    Anderson, Ronald L.; Reagin, J. M.; Garner, T. D.; Sweeny, T. E.

    1997-01-01

    In today's marketplace, product quality and price have become requirements for entry and are no longer sufficient to differentiate one's product and gain a competitive advantage. A key to competition in the future will be a company's ability to respond quickly to a rapidly-changing global marketplace. Developers of manufacturing equipment must play a role in the reduction of the product development cycle time by increasing the flexibility of their equipment and decreasing its cost and time to market. This paper will discuss the implementation of an open-architecture machine controller on a flip-chip placement machine and how this implementation supports the goals of reduced development time and increased equipment flexibility. The following subjects are discussed: 1) Issues related to the selection of a standard operating system, including real-time performance, preemptive multi-tasking, multi-threaded applications, and development tools. 2) The use of a common API for motion and I/O. 3) The use of rapid application development and object-oriented programming techniques on the machine controller to shorten development time and support code reuse. 4) Specific hardware and software issues related to the implementation of the flip-chip controller, including implementation details, controller performance, and human interface issues.

  1. Intelligent machines in the twenty-first century: foundations of inference and inquiry

    Science.gov (United States)

    Knuth, Kevin H.

    2003-01-01

    The last century saw the application of Boolean algebra to the construction of computing machines, which work by applying logical transformations to information contained in their memory. The development of information theory and the generalization of Boolean algebra to Bayesian inference have enabled these computing machines, in the last quarter of the twentieth century, to be endowed with the ability to learn by making inferences from data. This revolution is just beginning as new computational techniques continue to make difficult problems more accessible. Recent advances in our understanding of the foundations of probability theory have revealed implications for areas other than logic. Of relevance to intelligent machines, we recently identified the algebra of questions as the free distributive algebra, which will now allow us to work with questions in a way analogous to that which Boolean algebra enables us to work with logical statements. In this paper, we examine the foundations of inference and inquiry. We begin with a history of inferential reasoning, highlighting key concepts that have led to the automation of inference in modern machine-learning systems. We then discuss the foundations of inference in more detail using a modern viewpoint that relies on the mathematics of partially ordered sets and the scaffolding of lattice theory. This new viewpoint allows us to develop the logic of inquiry and introduce a measure describing the relevance of a proposed question to an unresolved issue. Last, we will demonstrate the automation of inference, and discuss how this new logic of inquiry will enable intelligent machines to ask questions. Automation of both inference and inquiry promises to allow robots to perform science in the far reaches of our solar system and in other star systems by enabling them not only to make inferences from data, but also to decide which question to ask, which experiment to perform, or which measurement to take given what they have learned and what they are designed to understand.

  3. Modular particle filtering FPGA hardware architecture for brain machine interfaces.

    Science.gov (United States)

    Mountney, John; Obeid, Iyad; Silage, Dennis

    2011-01-01

    As the computational complexities of neural decoding algorithms for brain machine interfaces (BMI) increase, their implementation through sequential processors becomes prohibitive for real-time applications. This work presents the field programmable gate array (FPGA) as an alternative to sequential processors for BMIs. The reprogrammable hardware architecture of the FPGA provides a near optimal platform for performing parallel computations in real-time. The scalability and reconfigurability of the FPGA accommodates diverse sets of neural ensembles and a variety of decoding algorithms. Throughput is significantly increased by decomposing computations into independent parallel hardware modules on the FPGA. This increase in throughput is demonstrated through a parallel hardware implementation of the auxiliary particle filtering signal processing algorithm.
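The particle-filtering algorithm whose stages the FPGA modules parallelize can be sketched sequentially (a toy 1-D tracking model, not a neural decoding model; the propagate, weight, and resample steps below are the ones that decompose into independent hardware modules):

```python
import math
import random

def particle_filter(observations, n=2000, proc_std=0.3, obs_std=0.3, seed=0):
    """Bootstrap particle filter: propagate, weight by the observation
    likelihood, estimate, then resample."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for z in observations:
        # Propagate each particle through the process model (random walk).
        particles = [p + rng.gauss(0.0, proc_std) for p in particles]
        # Weight each particle by the Gaussian observation likelihood.
        weights = [math.exp(-0.5 * ((z - p) / obs_std) ** 2) for p in particles]
        total = sum(weights)
        estimates.append(sum(w * p for w, p in zip(weights, particles)) / total)
        # Multinomial resampling concentrates particles on likely states.
        particles = rng.choices(particles, weights=weights, k=n)
    return estimates

# Noisy observations of a state drifting from 0 toward 2.
true_states = [0.2 * t for t in range(11)]
rng = random.Random(42)
obs = [x + rng.gauss(0.0, 0.3) for x in true_states]
est = particle_filter(obs)
print(abs(est[-1] - true_states[-1]) < 1.0)
```

Each particle's propagation and weighting is independent of the others, which is why the computation maps so naturally onto parallel FPGA modules; resampling is the main sequential bottleneck.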

  4. Integrated quality control architecture for multistage machining processes

    Science.gov (United States)

    Yang, Jie; Liu, Guixiong

    2010-12-01

    To address process quality prediction and control in multistage machining processes, an integrated quality control architecture is proposed in this paper. First, a hierarchical multiple-criteria decision model is established for the key processes, and a stratified weight matrix method is discussed. Predictive control of manufacturing quality is not only a matter for the on-site monitoring and control layer: the enterprise control layer and the remote monitoring layer also have a variety of quality prediction and control demands. Therefore, XML is used to achieve a unified description of manufacturing quality information and to enable its transfer and sharing among the different sources of quality information. This lays a good foundation for global quality prediction and control and for the analysis and diagnosis of complex data, toward a more practical, open, and standardized manufacturing quality information integration system.

  5. Architecture For The Optimization Of A Machining Process In Real Time Through Rule-Based Expert System

    Science.gov (United States)

    Serrano, Rafael; González, Luis Carlos; Martín, Francisco Jesús

    2009-11-01

    Under the project SENSOR-IA, which received financial funding from the Order of Incentives to the Regional Technology Centers of the Council of Innovation, Science and Enterprise of Andalusia, an architecture for the optimization of a machining process in real time through a rule-based expert system has been developed. The architecture consists of a sensor data acquisition and processing engine (SATD) and a rule-based expert system (SE) that communicates with the SATD. The SE has been designed as an inference engine with an algorithm for effective action, using a modus ponens model of goal-oriented rules. The pilot test demonstrated that it is possible to govern the machining process in real time based on rules contained in an SE. The tests have been done with approximate rules. Future work includes exhaustive collection of data for different tool materials and geometries in a database, to extract more precise rules.
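The modus ponens, forward-chaining style of inference the abstract describes can be sketched as follows (the rules and facts are invented for illustration, not the SENSOR-IA rule base):

```python
def forward_chain(facts, rules):
    """Apply rules (premises -> conclusion) by modus ponens until no new
    fact can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical machining rules: sensor conditions imply actions.
rules = [
    ({"vibration_high", "temperature_high"}, "tool_wear_suspected"),
    ({"tool_wear_suspected"}, "reduce_feed_rate"),
    ({"spindle_load_low"}, "increase_feed_rate"),
]
facts = forward_chain({"vibration_high", "temperature_high"}, rules)
print("reduce_feed_rate" in facts)  # True: derived in two modus ponens steps
```

In the architecture described, the SATD would assert the sensor-condition facts in real time and the derived action facts would drive the machining-parameter adjustments.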

  6. A Photometric Machine-Learning Method to Infer Stellar Metallicity

    Science.gov (United States)

    Miller, Adam A.

    2015-01-01

    Following its formation, a star's metal content is one of the few factors that can significantly alter its evolution. Measurements of stellar metallicity ([Fe/H]) typically require a spectrum, but spectroscopic surveys are limited to a few × 10^6 targets; photometric surveys, on the other hand, have detected > 10^9 stars. I present a new machine-learning method to predict [Fe/H] from photometric colors measured by the Sloan Digital Sky Survey (SDSS). The training set consists of approx. 120,000 stars with SDSS photometry and reliable [Fe/H] measurements from the SEGUE Stellar Parameters Pipeline (SSPP). For bright stars (g' […]), the scatter from the machine-learning method is similar to the scatter in [Fe/H] measurements from low-resolution spectra.

  7. Intelligent Machines in the 21st Century: Automating the Processes of Inference and Inquiry

    Science.gov (United States)

    Knuth, Kevin H.

    2003-01-01

    The last century saw the application of Boolean algebra toward the construction of computing machines, which work by applying logical transformations to information contained in their memory. The development of information theory and the generalization of Boolean algebra to Bayesian inference have enabled these computing machines, in the last quarter of the twentieth century, to be endowed with the ability to learn by making inferences from data. This revolution is just beginning as new computational techniques continue to make difficult problems more accessible. However, modern intelligent machines work by inferring knowledge using only their pre-programmed prior knowledge and the data provided. They lack the ability to ask questions, or request data that would aid their inferences. Recent advances in understanding the foundations of probability theory have revealed implications for areas other than logic. Of relevance to intelligent machines, we identified the algebra of questions as the free distributive algebra, which now allows us to work with questions in a way analogous to that which Boolean algebra enables us to work with logical statements. In this paper we describe this logic of inference and inquiry using the mathematics of partially ordered sets and the scaffolding of lattice theory, discuss the far-reaching implications of the methodology, and demonstrate its application with current examples in machine learning. Automation of both inference and inquiry promises to allow robots to perform science in the far reaches of our solar system and in other star systems by enabling them to not only make inferences from data, but also decide which question to ask, experiment to perform, or measurement to take given what they have learned and what they are designed to understand.

  8. Diverse Effects, Complex Causes: Children Use Information About Machines' Functional Diversity to Infer Internal Complexity.

    Science.gov (United States)

    Ahl, Richard E; Keil, Frank C

    2016-09-26

    Four studies explored the abilities of 80 adults and 180 children (4-9 years), from predominantly middle-class families in the Northeastern United States, to use information about machines' observable functional capacities to infer their internal, "hidden" mechanistic complexity. Children as young as 4 and 5 years old used machines' numbers of functions as indications of complexity and matched machines performing more functions with more complex "insides" (Study 1). However, only older children (6 and older) and adults used machines' functional diversity alone as an indication of complexity (Studies 2-4). The ability to use functional diversity as a complexity cue therefore emerges during the early school years, well before the use of diversity in most categorical induction tasks.

  9. Model-driven Migration of Supervisory Machine Control Architectures

    NARCIS (Netherlands)

    Graaf, B.; Weber, S.; Van Deursen, A.

    2006-01-01

    Supervisory machine control is the high-level control in advanced manufacturing machines that is responsible for the coordination of manufacturing activities. Traditionally, the design of such control systems is based on finite state machines. An alternative, more flexible approach is based on

  10. The Orthoglide: a fast machine tool with an isotropic parallel architecture (L'orthoglide : une machine-outil rapide d'architecture parallèle isotrope)

    CERN Document Server

    Wenger, Philippe; Majou, Félix

    2007-01-01

    This article presents the Orthoglide project, whose purpose is the realization of a prototype machine tool with three translational degrees of freedom. The machine's distinguishing characteristic is a parallel kinematic architecture optimized to obtain a compact workspace with homogeneous performance; the principal design criterion used to achieve this was isotropy.

  11. Using Multiple FPGA Architectures for Real-time Processing of Low-level Machine Vision Functions

    Science.gov (United States)

    Thomas H. Drayer; William E. King; Philip A. Araman; Joseph G. Tront; Richard W. Conners

    1995-01-01

    In this paper, we investigate the use of multiple Field Programmable Gate Array (FPGA) architectures for real-time machine vision processing. The use of FPGAs for low-level processing represents an excellent tradeoff between software and special purpose hardware implementations. A library of modules that implement common low-level machine vision operations is presented...

  12. From scientific instrument to industrial machine : Coping with architectural stress in embedded systems

    NARCIS (Netherlands)

    Doornbos, R.; Loo, S. van

    2012-01-01

    Architectural stress is the inability of a system design to respond to new market demands. It is an important yet often concealed issue in high tech systems. In From scientific instrument to industrial machine, we look at the phenomenon of architectural stress in embedded systems in the context of a

  13. A mental architecture modeling of inference of sensory stimuli to the teaching of the deaf

    Directory of Open Access Journals (Sweden)

    Rubens Santos Guimaraes

    2016-10-01

    Full Text Available The transmission and retention of knowledge rest on the cognitive grasp of the concepts linked to it; the repeatability of their application builds a solid foundation for education, according to the cultural and behavioral standards set by society. This cognitive ability to infer from what we observe and perceive is regarded as intrinsic to human beings, independent of their physical capacity. This article presents a conceptual model, the Digitized Mental Architecture (AMD), able to reproduce inferences about the sensory stimuli of deaf individuals, focused on the implementation of a web system that aims to improve teaching and learning for students with hearing disabilities. In this proposal, we evaluate the contextual aspects experienced by learners during the interactions between the constituent elements of a study session, based on experiments with two deaf students enrolled in regular high school. The results allow us to infer the potential of a computer communications environment to expand the possibilities of social inclusion for these students.

  14. Root architecture simulation improves the inference from seedling root phenotyping towards mature root systems.

    Science.gov (United States)

    Zhao, Jiangsan; Bodner, Gernot; Rewald, Boris; Leitner, Daniel; Nagel, Kerstin A; Nakhforoosh, Alireza

    2017-02-01

    Root phenotyping provides trait information for plant breeding. A shortcoming of high-throughput root phenotyping is the limitation to seedling plants and failure to make inferences on mature root systems. We suggest root system architecture (RSA) models to predict mature root traits and overcome the inference problem. Sixteen pea genotypes were phenotyped in (i) seedling (Petri dishes) and (ii) mature (sand-filled columns) root phenotyping platforms. The RSA model RootBox was parameterized with seedling traits to simulate the fully developed root systems. Measured and modelled root length, first-order lateral number, and root distribution were compared to determine key traits for model-based prediction. No direct relationship in root traits (tap, lateral length, interbranch distance) was evident between phenotyping systems. RootBox significantly improved the inference over phenotyping platforms. Seedling plant tap and lateral root elongation rates and interbranch distance were sufficient model parameters to predict genotype ranking in total root length with an RSpearman of 0.83. Parameterization including uneven lateral spacing via a scaling function substantially improved the prediction of architectures underlying the differently sized root systems. We conclude that RSA models can solve the inference problem of seedling root phenotyping. RSA models should be included in the phenotyping pipeline to provide reliable information on mature root systems to breeding research. © The Author 2017. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  15. NC flame pipe cutting machine tool based on open architecture CNC system

    Institute of Scientific and Technical Information of China (English)

    Xiaogen NIE; Yanbing LIU

    2009-01-01

    Based on an analysis of the principle and flame movement of a pipe cutting machine tool, a retrofit NC flame pipe cutting machine tool (NFPCM) that can meet the demands of cutting various pipes is proposed. The paper deals with the design and implementation of an open architecture CNC system for the NFPCM, many of whose aspects are similar to those of milling machines, though its machining processes and control strategies differ. The paper emphasizes the NC system structure and the method for directly creating the NC file according to the cutting type and parameters. Further, it develops the program and sets up the open, modular NC system.

  16. Modelling of internal architecture of kinesin nanomotor as a machine language.

    Science.gov (United States)

    Khataee, H R; Ibrahim, M Y

    2012-09-01

    Kinesin is a protein-based natural nanomotor that transports molecular cargoes within cells by walking along microtubules. Kinesin nanomotor is considered as a bio-nanoagent which is able to sense the cell through its sensors (i.e. its heads and tail), make the decision internally and perform actions on the cell through its actuator (i.e. its motor domain). The study maps the agent-based architectural model of internal decision-making process of kinesin nanomotor to a machine language using an automata algorithm. The applied automata algorithm receives the internal agent-based architectural model of kinesin nanomotor as a deterministic finite automaton (DFA) model and generates a regular machine language. The generated regular machine language was acceptable by the architectural DFA model of the nanomotor and also in good agreement with its natural behaviour. The internal agent-based architectural model of kinesin nanomotor indicates the degree of autonomy and intelligence of the nanomotor interactions with its cell. Thus, our developed regular machine language can model the degree of autonomy and intelligence of kinesin nanomotor interactions with its cell as a language. Modelling of internal architectures of autonomous and intelligent bio-nanosystems as machine languages can lay the foundation towards the concept of bio-nanoswarms and next phases of the bio-nanorobotic systems development.
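    The DFA formulation used above can be illustrated with a minimal sketch. The states, event alphabet, and transitions below are invented toy stand-ins for a nanomotor's stepping cycle, not the paper's actual kinesin model:

    ```python
    # Minimal DFA sketch (hypothetical states/events, not the paper's model):
    # a string of sensed events is "accepted" if it drives the machine back
    # to the bound resting state.
    def dfa_accepts(transitions, start, accepting, events):
        """Run a deterministic finite automaton over a sequence of event symbols."""
        state = start
        for e in events:
            state = transitions[(state, e)]  # exactly one successor per (state, event)
        return state in accepting

    # Toy stepping cycle: 'a' = ATP binding, 'd' = detach/step.
    T = {
        ("bound", "a"): "cocked",
        ("cocked", "d"): "bound",
        ("bound", "d"): "bound",
        ("cocked", "a"): "cocked",
    }
    print(dfa_accepts(T, "bound", {"bound"}, "adad"))  # prints True
    ```

    The set of accepted event strings is exactly a regular language, which is the sense in which the paper can describe the nanomotor's internal architecture as a machine language.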

  17. Modular reconfigurable machines incorporating modular open architecture control

    CSIR Research Space (South Africa)

    Padayachee, J

    2008-01-01

    Full Text Available degrees of freedom on a single platform. A corresponding modular Open Architecture Control (OAC) system is presented. OAC overcomes the inflexibility of fixed proprietary automation, ensuring that MRMs provide the reconfigurability and extensibility...

  18. Design of a real-time open architecture controller for reconfigurable machine tool

    CSIR Research Space (South Africa)

    Masekamela, I

    2008-06-01

    Full Text Available modular structure in form of modular machines and open architecture controllers that can quickly change the physical structure and appropriately adjust the control system to adapt to the new production requirements. The paper aims to present the design...

  19. Inferring the location of buried UXO using a support vector machine

    Science.gov (United States)

    Fernández, Juan Pablo; Sun, Keli; Barrowes, Benjamin; O'Neill, Kevin; Shamatava, Irma; Shubitidze, Fridon; Paulsen, Keith D.

    2007-04-01

    The identification of unexploded ordnance (UXO) using electromagnetic-induction (EMI) sensors involves two essentially independent steps: Each anomaly detected by the sensor has to be located fairly accurately, and its orientation determined, before one can try to find size/shape/composition properties that identify the object uniquely. The dependence on the latter parameters is linear, and can be solved for efficiently using for example the Normalized Surface Magnetic Charge model. The location and orientation, on the other hand, have a nonlinear effect on the measurable scattered field, making their determination much more time-consuming and thus hampering the ability to carry out discrimination in real time. In particular, it is difficult to resolve for depth when one has measurements taken at only one instrument elevation. In view of the difficulties posed by direct inversion, we propose using a Support Vector Machine (SVM) to infer the location and orientation of buried UXO. SVMs are a method of supervised machine learning: the user can train a computer program by feeding it features of representative examples, and the machine, in turn, can generalize this information by finding underlying patterns and using them to classify or regress unseen instances. In this work we train an SVM using measured-field information, for both synthetic and experimental data, and evaluate its ability to predict the location of different buried objects to reasonable accuracy. We explore various combinations of input data and learning parameters in search of an optimal predictive configuration.
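    A regression of this kind can be sketched with scikit-learn's SVR. The dipole-like forward model, sensor layout, and all parameter values below are illustrative assumptions, not the authors' EMI model or training setup:

    ```python
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    xs = np.linspace(-1.0, 1.0, 9)  # sensor offsets along a survey line (m)

    def featurize(depth):
        # Illustrative stand-in for an EMI forward model: a dipole-like 1/r^3
        # falloff sampled at each sensor offset, compressed with log10.
        r = np.sqrt(xs**2 + depth**2)
        return np.log10(1.0 / r**3)

    depths = rng.uniform(0.2, 1.5, 300)        # training depths (m)
    X = np.array([featurize(d) for d in depths])
    X += 0.01 * rng.standard_normal(X.shape)   # measurement noise

    model = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X, depths)
    test_depth = 0.8
    pred = float(model.predict(featurize(test_depth)[None, :])[0])
    print(f"true depth {test_depth:.2f} m, predicted {pred:.2f} m")
    ```

    Training on simulated field profiles at known depths and regressing unseen profiles mirrors the supervised-learning setup the abstract describes, sidestepping the nonlinear direct inversion for location.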

  20. Time-triggered State-machine Reliable Software Architecture for Micro Turbine Engine Control

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qi; XU Guoqiang; DING Shuiting

    2012-01-01

    Time-triggered (TT) embedded software patterns are well accepted in the aerospace industry for their high reliability. The finite-state-machine (FSM) design method is widely used for its high efficiency and predictable behavior. In this paper, a combined time-triggered and state-machine software architecture is implemented for a 25 kg thrust micro turbine engine (MTE) used in an unmanned aerial vehicle (UAV) system; a model-based-design development workflow for the airworthiness software directive DO-178B is also utilized. Experimental results show that the time-triggered state-machine software architecture and development method can shorten system development time, reduce system test cost, and help the turbine engine comply with airworthiness rules.
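    The time-triggered/state-machine combination can be sketched in a few lines: tasks run only at fixed tick multiples, and each task is a pure FSM step, so the schedule and worst-case timing are known in advance. The tick table and mode logic below are invented for illustration and bear no relation to the actual DO-178B engine controller:

    ```python
    # Illustrative time-triggered state-machine loop (toy mode logic).
    def make_engine_fsm():
        state = {"mode": "IDLE"}
        def step(tick):
            # hypothetical modes: ignite at tick 3, reach RUN at tick 5
            if state["mode"] == "IDLE" and tick >= 3:
                state["mode"] = "START"
            elif state["mode"] == "START" and tick >= 5:
                state["mode"] = "RUN"
            return state["mode"]
        return step

    def run_schedule(n_ticks, table):
        """table maps period (in ticks) -> task; a task fires when tick % period == 0."""
        trace = []
        for tick in range(n_ticks):
            for period, task in table.items():
                if tick % period == 0:
                    trace.append((tick, task(tick)))
        return trace

    fsm = make_engine_fsm()
    trace = run_schedule(8, {1: fsm})
    print(trace)
    ```

    Because tasks never block and fire only on their tick, the whole execution is a deterministic trace, which is what makes the pattern attractive for certification.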

  1. Reveal, A General Reverse Engineering Algorithm for Inference of Genetic Network Architectures

    Science.gov (United States)

    Liang, Shoudan; Fuhrman, Stefanie; Somogyi, Roland

    1998-01-01

    Given the imminent gene expression mapping covering whole genomes during development, health and disease, we seek computational methods to maximize functional inference from such large data sets. Is it possible, in principle, to completely infer a complex regulatory network architecture from input/output patterns of its variables? We investigated this possibility using binary models of genetic networks. Trajectories, or state transition tables of Boolean nets, resemble time series of gene expression. By systematically analyzing the mutual information between input states and output states, one is able to infer the sets of input elements controlling each element or gene in the network. This process is unequivocal and exact for complete state transition tables. We implemented this REVerse Engineering ALgorithm (REVEAL) in a C program, and found the problem to be tractable within the conditions tested so far. For n = 50 (elements) and k = 3 (inputs per element), the analysis of incomplete state transition tables (100 state transition pairs out of a possible 10^15) reliably produced the original rule and wiring sets. While this study is limited to synchronous Boolean networks, the algorithm is generalizable to include multi-state models, essentially allowing direct application to realistic biological data sets. The ability to adequately solve the inverse problem may enable in-depth analysis of complex dynamic systems in biology and other fields.
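    The core REVEAL step, scoring candidate input sets by mutual information against a state transition table, can be sketched as follows. The three-gene network is a toy example, not data from the paper:

    ```python
    from collections import Counter
    from itertools import combinations, product
    from math import log2

    def entropy(column):
        n = len(column)
        return -sum(c / n * log2(c / n) for c in Counter(column).values())

    def mutual_info(inputs, output):
        """I(X;Y) = H(X) + H(Y) - H(X,Y) over paired state observations."""
        joint = list(zip(inputs, output))
        return entropy(inputs) + entropy(output) - entropy(joint)

    # Toy 3-gene synchronous Boolean network: next state of gene 2 is g0 XOR g1.
    states = list(product([0, 1], repeat=3))
    transitions = [(s, (s[1], s[2], s[0] ^ s[1])) for s in states]  # full table

    # REVEAL-style search: the true input set of gene 2 is the smallest set
    # whose mutual information with gene 2's next state equals that state's entropy.
    out = [t[2] for _, t in transitions]
    for k in (1, 2):
        for subset in combinations(range(3), k):
            ins = [tuple(s[i] for i in subset) for s, _ in transitions]
            if abs(mutual_info(ins, out) - entropy(out)) < 1e-9:
                print(f"gene 2 is controlled by genes {subset}")  # -> (0, 1)
    ```

    For a complete transition table this criterion is exact, which is the "unequivocal" property the abstract claims; with incomplete tables the same score becomes a statistical estimate.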

  2. An Architecture for Hybrid Manufacturing Combining 3D Printing and CNC Machining

    Directory of Open Access Journals (Sweden)

    Marcel Müller

    2016-01-01

    Full Text Available Additive manufacturing is one of the key technologies of the 21st century. Additive manufacturing processes are often combined with subtractive manufacturing processes into hybrid manufacturing, which is useful for producing complex parts, for example, 3D printed sensor systems. Currently, several CNC machines are required for hybrid manufacturing: one machine is required for additive manufacturing and one for subtractive manufacturing. The disadvantages of conventional hybrid manufacturing methods are presented. Hybrid manufacturing with one CNC machine offers many advantages: it enables manufacturing of parts with higher accuracy, less production time, and lower costs. Using the example of fused layer modeling (FLM), we present a general approach for the integration of additive manufacturing processes into a numerical control for machine tools. The resulting CNC architecture is presented and its functionality is demonstrated. Its application is beyond the scope of this paper.

  3. Enhance the Performance of Virtual Machines by Using Cluster Computing Architecture

    Directory of Open Access Journals (Sweden)

    Chia-Ying Tseng

    2013-05-01

    Full Text Available Virtualization is a very important technology in the IaaS layer of cloud computing. A user consumes computing resources as a virtual machine (VM) provided by the system provider, and a VM's performance depends on the physical machine hosting it. A VM must be allocated all required resources when it is created; if no more resources can be allocated, the VM must be moved to another physical machine via live migration to obtain higher performance. The overhead of a VM live migration is 30 to 90 seconds, so if many virtual machines need live migration, the total cost is considerable. This paper presents how to use a cluster computing architecture to improve VM performance; it achieves a 15% performance improvement compared with VM live migration.

  4. Inference

    DEFF Research Database (Denmark)

    Møller, Jesper

    (Chapter 9.1, with the title ‘Inference'.) This contribution concerns statistical inference for parametric models used in stochastic geometry and based on quick and simple simulation free procedures as well as more comprehensive methods using Markov chain Monte Carlo (MCMC) simulations. Due to space limitations the focus

  5. Flexible architecture of data acquisition firmware based on multi-behaviors finite state machine

    Science.gov (United States)

    Arpaia, Pasquale; Cimmino, Pasquale

    2016-11-01

    A flexible firmware architecture for different kinds of data acquisition systems, ranging from high-precision bench instruments to low-cost wireless transducers networks, is presented. The key component is a multi-behaviors finite state machine, easily configurable to both low- and high-performance requirements, to diverse operating systems, as well as to on-line and batch measurement algorithms. The proposed solution was validated experimentally on three case studies with data acquisition architectures: (i) concentrated, in a high-precision instrument for magnetic measurements at CERN, (ii) decentralized, for telemedicine remote monitoring of patients at home, and (iii) distributed, for remote monitoring of building's energy loss.
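    The multi-behaviors idea, one state-machine engine configured by swappable transition tables, can be sketched briefly. The behavior names and states below are invented, not those of the cited firmware:

    ```python
    # Sketch of a multi-behavior finite state machine: the same engine runs
    # different acquisition "behaviors" by swapping its transition table.
    class MultiBehaviorFSM:
        def __init__(self, behaviors, start="idle"):
            self.behaviors = behaviors  # name -> {(state, event): next_state}
            self.state = start

        def fire(self, behavior, event):
            table = self.behaviors[behavior]
            self.state = table.get((self.state, event), self.state)  # ignore unknown events
            return self.state

    BEHAVIORS = {
        # high-precision bench instrument: explicit calibration step
        "bench": {("idle", "start"): "calibrate",
                  ("calibrate", "done"): "acquire",
                  ("acquire", "stop"): "idle"},
        # low-cost wireless node: sleep between one-shot acquisitions
        "wireless": {("idle", "start"): "acquire",
                     ("acquire", "stop"): "sleep",
                     ("sleep", "wake"): "idle"},
    }

    fsm = MultiBehaviorFSM(BEHAVIORS)
    print(fsm.fire("bench", "start"))  # prints calibrate
    ```

    Keeping the engine fixed and the tables data-driven is what lets one firmware serve both high-performance and low-cost configurations.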

  6. The Genetic Architecture of Quantitative Traits Cannot Be Inferred from Variance Component Analysis

    Science.gov (United States)

    Huang, Wen; Mackay, Trudy F. C.

    2016-01-01

    Classical quantitative genetic analyses estimate additive and non-additive genetic and environmental components of variance from phenotypes of related individuals without knowing the identities of quantitative trait loci (QTLs). Many studies have found a large proportion of quantitative trait variation can be attributed to the additive genetic variance (VA), providing the basis for claims that non-additive gene actions are unimportant. In this study, we show that arbitrarily defined parameterizations of genetic effects seemingly consistent with non-additive gene actions can also capture the majority of genetic variation. This reveals a logical flaw in using the relative magnitudes of variance components to indicate the relative importance of additive and non-additive gene actions. We discuss the implications and propose that variance component analyses should not be used to infer the genetic architecture of quantitative traits. PMID:27812106

  7. The Genetic Architecture of Quantitative Traits Cannot Be Inferred from Variance Component Analysis.

    Directory of Open Access Journals (Sweden)

    Wen Huang

    2016-11-01

    Full Text Available Classical quantitative genetic analyses estimate additive and non-additive genetic and environmental components of variance from phenotypes of related individuals without knowing the identities of quantitative trait loci (QTLs). Many studies have found a large proportion of quantitative trait variation can be attributed to the additive genetic variance (VA), providing the basis for claims that non-additive gene actions are unimportant. In this study, we show that arbitrarily defined parameterizations of genetic effects seemingly consistent with non-additive gene actions can also capture the majority of genetic variation. This reveals a logical flaw in using the relative magnitudes of variance components to indicate the relative importance of additive and non-additive gene actions. We discuss the implications and propose that variance component analyses should not be used to infer the genetic architecture of quantitative traits.

  8. From scientific instrument to industrial machine coping with architectural stress in embedded systems

    CERN Document Server

    Doornbos, Richard

    2012-01-01

    Architectural stress is the inability of a system design to respond to new market demands. It is an important yet often concealed issue in high tech systems. In From scientific instrument to industrial machine, we look at the phenomenon of architectural stress in embedded systems in the context of a transmission electron microscope system built by FEI Company. Traditionally, transmission electron microscopes are manually operated scientific instruments, but they also have enormous potential for use in industrial applications. However, this new market has quite different characteristics. There are strong demands for cost-effective analysis, accurate and precise measurements, and ease-of-use. These demands can be translated into new system qualities, e.g. reliability, predictability and high throughput, as well as new functions, e.g. automation of electron microscopic analyses, automated focusing and positioning functions. From scientific instrument to industrial machine takes a pragmatic approach to the proble...

  9. Inference

    DEFF Research Database (Denmark)

    Møller, Jesper

    2010-01-01

    Chapter 9: This contribution concerns statistical inference for parametric models used in stochastic geometry and based on quick and simple simulation free procedures as well as more comprehensive methods based on a maximum likelihood or Bayesian approach combined with Markov chain Monte Carlo

  10. A Machine Learning Method to Infer Fundamental Stellar Parameters from Photometric Light Curves

    CERN Document Server

    Miller, A A; Richards, J W; Lee, Y S; Starr, D L; Butler, N R; Tokarz, S; Smith, N; Eisner, J A

    2014-01-01

    A fundamental challenge for wide-field imaging surveys is obtaining follow-up spectroscopic observations: there are > $10^9$ photometrically cataloged sources, yet modern spectroscopic surveys are limited to ~few x $10^6$ targets. As we approach the Large Synoptic Survey Telescope (LSST) era, new algorithmic solutions are required to cope with the data deluge. Here we report the development of a machine-learning framework capable of inferring fundamental stellar parameters (Teff, log g, and [Fe/H]) using photometric-brightness variations and color alone. A training set is constructed from a systematic spectroscopic survey of variables with Hectospec/MMT. In sum, the training set includes ~9000 spectra, for which stellar parameters are measured using the SEGUE Stellar Parameters Pipeline (SSPP). We employed the random forest algorithm to perform a non-parametric regression that predicts Teff, log g, and [Fe/H] from photometric time-domain observations. Our final, optimized model produces a cross-validated root...

  11. Inferring compositional style in the neo-plastic paintings of Piet Mondrian by machine learning

    Science.gov (United States)

    Andrzejewski, David; Stork, David G.; Zhu, Xiaojin; Spronk, Ron

    2010-02-01

    We trained generative models and decision tree classifiers with positive and negative examples of the neo-plastic works of Piet Mondrian to infer his compositional principles, to generate "faux" works, and to explore the possibility of computer-based aids in authentication and attribution studies. Unlike previous computer work on this and other artists, we used "earlier state" works (intermediate versions of works created by Mondrian, revealed through x-radiography and infra-red reflectography) when training our classifiers. Such intermediate state works provide a great deal of information to a classifier, as they differ only slightly from the final works. We used methods from machine learning such as leave-one-out cross validation. Our decision tree classifier had an accuracy of roughly 70% in recognizing the genuine works of Mondrian versus computer-generated replicas with similar statistical properties. Our trained classifier reveals implicit compositional principles underlying Mondrian's works, for instance the relative visual "weights" of the four colors (red, yellow, blue and black) he used in his rectangles. We used our trained generative model to generate "faux" Mondrians, which informally possess some of the compositional attributes of genuine works by this artist.
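    The decision-tree idea can be illustrated with a hand-rolled one-split tree (a stump). The single feature and the toy data below are invented for illustration, not the study's actual features or measurements:

    ```python
    # Hand-rolled one-split decision tree ("stump") on a toy composition feature.
    # The feature (fraction of canvas area painted red) is a hypothetical stand-in.
    def best_stump(xs, ys):
        """Find the threshold on a scalar feature that minimises misclassifications."""
        best = None
        for t in sorted(set(xs)):
            for sign in (1, -1):
                preds = [1 if sign * x >= sign * t else 0 for x in xs]
                errors = sum(p != y for p, y in zip(preds, ys))
                if best is None or errors < best[0]:
                    best = (errors, t, sign)
        return best

    # label 1 = "genuine", 0 = "faux"; in this toy data genuine works use red sparingly
    red_fraction = [0.05, 0.08, 0.10, 0.12, 0.30, 0.35, 0.40, 0.45]
    labels       = [1,    1,    1,    1,    0,    0,    0,    0]
    errors, threshold, sign = best_stump(red_fraction, labels)
    print(errors, threshold, sign)  # -> 0 0.12 -1
    ```

    A full decision tree recurses this split on each resulting subset; the learned thresholds are what make such classifiers interpretable as compositional principles.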

  12. The Use of Open Source Software for Open Architecture System on CNC Milling Machine

    Directory of Open Access Journals (Sweden)

    Dalmasius Ganjar Subagio

    2012-03-01

    Full Text Available A computer numerical control (CNC) milling machine system cannot be separated from the software it requires, which must follow the provisions of Open Architecture: portability, extendability, interoperability, and scalability. When the prescribed service period of a CNC milling machine has passed and the manufacturer decides to discontinue it, the user will have problems maintaining the machine's performance. This paper aims to show that using open source software (OSS) is the way to maintain machine performance. With OSS, users no longer depend on the software built by the manufacturer, because OSS is open and can be developed independently. In this paper, USBCNC V.3.42 is used as an alternative OSS. The test results show that the workpiece matches the desired pattern and that the performance of machines using OSS is similar to that of machines using the manufacturer's software.

  13. Just Imagine! Learning to Emulate and Infer Actions with a Stochastic Generative Architecture

    Directory of Open Access Journals (Sweden)

    Fabian Schrodt

    2016-03-01

    Full Text Available Theories on embodied cognition emphasize that our mind develops by processing and inferring structures given the encountered bodily experiences. Here we propose a distributed neural network architecture that learns a stochastic generative model from experiencing bodily actions. Our modular system learns from various manifolds of action perceptions in the form of (i) relative positional motion of the individual body parts, (ii) angular motion of joints, and (iii) relatively stable top-down action identities. By Hebbian learning, this information is spatially segmented into separate neural modules that provide embodied state codes as well as temporal predictions of the state progression inside and across the modules. The network is generative in space and time, and is thus able to predict both missing sensory information and upcoming sensory information. We link the developing encodings to visuo-motor and multimodal representations that appear to be involved in action observation. Our results show that the system learns to infer action types as well as motor codes from partial sensory information by emulating observed actions with its own developing body model. We further evaluate the generative capabilities by showing that the system is able to generate internal imaginations of the learned types of actions without sensory stimulation, including visual images of the actions. The model highlights the important roles of motor cognition and embodied simulation in bootstrapping action understanding capabilities. We conclude that stochastic generative models appear very suitable both for generating goal-directed actions and for predicting observed visuo-motor trajectories and action goals.

  14. RuLearn: an Open-source Toolkit for the Automatic Inference of Shallow-transfer Rules for Machine Translation

    Directory of Open Access Journals (Sweden)

    Sánchez-Cartagena Víctor M.

    2016-10-01

    Full Text Available This paper presents ruLearn, an open-source toolkit for the automatic inference of rules for shallow-transfer machine translation from scarce parallel corpora and morphological dictionaries. ruLearn will make rule-based machine translation a very appealing alternative for under-resourced language pairs because it avoids the need for human experts to handcraft transfer rules and requires, in contrast to statistical machine translation, only a small amount of parallel corpora (a few hundred parallel sentences proved to be sufficient). The inference algorithm implemented by ruLearn has been recently published by the same authors in Computer Speech & Language (volume 32). It is able to produce rules whose translation quality is similar to that obtained by using hand-crafted rules. ruLearn generates rules that are ready for use in the Apertium platform, although they can be easily adapted to other platforms. When the rules produced by ruLearn are used together with a hybridisation strategy for integrating linguistic resources from shallow-transfer rule-based machine translation into phrase-based statistical machine translation (published by the same authors in the Journal of Artificial Intelligence Research, volume 55), they help to mitigate data sparseness. This paper also shows how to use ruLearn and describes its implementation.

  15. Bio-signal analysis system design with support vector machines based on cloud computing service architecture.

    Science.gov (United States)

    Shen, Chia-Ping; Chen, Wei-Hsin; Chen, Jia-Ming; Hsu, Kai-Ping; Lin, Jeng-Wei; Chiu, Ming-Jang; Chen, Chi-Huang; Lai, Feipei

    2010-01-01

    Today, many bio-signals such as electroencephalography (EEG) recordings are stored in digital format. Analyzing these digital bio-signals to extract useful health information is an emerging research area in biomedical engineering. In this paper, a bio-signal analyzing cloud computing architecture, called BACCA, is proposed. The system has been designed for seamless integration into the National Taiwan University Health Information System. Based on the concept of .NET Service-Oriented Architecture, the system integrates heterogeneous platforms, protocols, and applications. In this system, we add modern analytic functions such as approximate entropy and adaptive support vector machines (SVM). It is shown that the overall accuracy of EEG bio-signal analysis has increased to nearly 98% for different data sets, including open-source and clinical data sets.

  16. A MACHINE-LEARNING METHOD TO INFER FUNDAMENTAL STELLAR PARAMETERS FROM PHOTOMETRIC LIGHT CURVES

    Energy Technology Data Exchange (ETDEWEB)

    Miller, A. A. [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, MS 169-506, Pasadena, CA 91109 (United States); Bloom, J. S.; Richards, J. W.; Starr, D. L. [Department of Astronomy, University of California, Berkeley, CA 94720-3411 (United States); Lee, Y. S. [Department of Astronomy and Space Science, Chungnam National University, Daejeon 305-764 (Korea, Republic of); Butler, N. R. [School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85281 (United States); Tokarz, S. [Smithsonian Astrophysical Observatory, Cambridge, MA 02138 (United States); Smith, N.; Eisner, J. A., E-mail: amiller@astro.caltech.edu [Steward Observatory, University of Arizona, Tucson, AZ 85721 (United States)

    2015-01-10

    A fundamental challenge for wide-field imaging surveys is obtaining follow-up spectroscopic observations: there are >10^9 photometrically cataloged sources, yet modern spectroscopic surveys are limited to ~few × 10^6 targets. As we approach the Large Synoptic Survey Telescope era, new algorithmic solutions are required to cope with the data deluge. Here we report the development of a machine-learning framework capable of inferring fundamental stellar parameters (T_eff, log g, and [Fe/H]) using photometric-brightness variations and color alone. A training set is constructed from a systematic spectroscopic survey of variables with Hectospec/Multi-Mirror Telescope. In sum, the training set includes ~9000 spectra, for which stellar parameters are measured using the SEGUE Stellar Parameters Pipeline (SSPP). We employed the random forest algorithm to perform a non-parametric regression that predicts T_eff, log g, and [Fe/H] from photometric time-domain observations. Our final optimized model produces a cross-validated rms error (RMSE) of 165 K, 0.39 dex, and 0.33 dex for T_eff, log g, and [Fe/H], respectively. Examining the subset of sources for which the SSPP measurements are most reliable, the RMSE reduces to 125 K, 0.37 dex, and 0.27 dex, respectively, comparable to what is achievable via low-resolution spectroscopy. For variable stars this represents a ≈12%-20% improvement in RMSE relative to models trained with single-epoch photometric colors. As an application of our method, we estimate stellar parameters for ~54,000 known variables. We argue that this method may convert photometric time-domain surveys into pseudo-spectrographic engines, enabling the construction of extremely detailed maps of the Milky Way, its structure, and history.
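    The random-forest regression step can be sketched with scikit-learn. The synthetic features and the toy mapping to an effective temperature below are invented stand-ins, not the survey data or the SSPP labels:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(42)

    # Synthetic stand-ins for photometric features (e.g. amplitude, period, colors);
    # the mapping to "T_eff" below is a made-up smooth relation for illustration.
    n = 500
    X = rng.uniform(size=(n, 4))
    teff = 4000 + 3000 * X[:, 2] + 500 * X[:, 0] * X[:, 1]  # toy relation, in K
    teff += rng.normal(scale=50, size=n)                     # label noise

    # Non-parametric regression: train on the first 400 sources, test on the rest.
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[:400], teff[:400])
    pred = model.predict(X[400:])
    rmse = float(np.sqrt(np.mean((pred - teff[400:]) ** 2)))
    print(f"held-out RMSE: {rmse:.0f} K")
    ```

    The same pattern, spectroscopic labels for a training subset, photometric features for everything, is what lets the trained forest scale labels to sources that will never receive spectra.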

  17. Application of adaptive neuro-fuzzy inference system and cuckoo optimization algorithm for analyzing electro chemical machining process

    Science.gov (United States)

    Teimouri, Reza; Sohrabpoor, Hamed

    2013-12-01

    The electrochemical machining (ECM) process is increasing in importance due to some specific advantages that can be exploited during machining. The process offers several privileges such as a higher machining rate, better accuracy and control, and a wider range of machinable materials. The contribution of many predominant parameters makes prediction and selection of optimal values truly complex, especially when the process is programmed for machining hard materials. In the present work, adaptive neuro-fuzzy inference systems (ANFIS) were used to create predictive models, based on experimental observations, of the effects of electrolyte concentration, electrolyte flow rate, applied voltage, and feed rate on material removal rate (MRR) and surface roughness (SR). ANFIS 3D surfaces were then plotted to analyze the effects of process parameters on MRR and SR. Finally, the cuckoo optimization algorithm (COA) was used to select solutions in which the process reaches maximum material removal rate and minimum surface roughness simultaneously. Results indicate that the ANFIS technique models MRR and SR with high prediction accuracy. Results obtained when applying COA were also compared with those derived from confirmatory experiments, which validates the applicability and suitability of the proposed techniques in enhancing the performance of the ECM process.

  18. EFICAz2: enzyme function inference by a combined approach enhanced by machine learning

    Directory of Open Access Journals (Sweden)

    Skolnick Jeffrey

    2009-04-01

    Full Text Available Abstract Background We previously developed EFICAz, an enzyme function inference approach that combines predictions from non-completely overlapping component methods. Two of the four components in the original EFICAz are based on the detection of functionally discriminating residues (FDRs). FDRs distinguish between members of an enzyme family that are homofunctional (classified under the EC number of interest) or heterofunctional (annotated with another EC number or lacking enzymatic activity). Each of the two FDR-based components is associated with one of two specific kinds of enzyme families. EFICAz exhibits high precision performance, except when the maximal test to training sequence identity (MTTSI) is lower than 30%. To improve EFICAz's performance in this regime, we: i) increased the number of predictive components and ii) took advantage of consensual information from the different components to make the final EC number assignment. Results We have developed two new EFICAz components, analogous to the two FDR-based components, where the discrimination between homo- and heterofunctional members is based on the evaluation, via Support Vector Machine models, of all the aligned positions between the query sequence and the multiple sequence alignments associated with the enzyme families. Benchmark results indicate that: i) the new SVM-based components outperform their FDR-based counterparts, and ii) both SVM-based and FDR-based components generate unique predictions. We developed classification tree models to optimally combine the results from the six EFICAz components into a final EC number prediction. The new implementation of our approach, EFICAz2, exhibits highly improved prediction precision at MTTSI < 30%. A comparison between EFICAz2 and KEGG shows that: i) when both sources make EC number assignments for the same protein sequence, the assignments tend to be consistent and ii) EFICAz2 generates considerably more unique assignments than KEGG. Conclusion Performance benchmarks and the

  19. An implantable VLSI architecture for real time spike sorting in cortically controlled Brain Machine Interfaces.

    Science.gov (United States)

    Aghagolzadeh, Mehdi; Zhang, Fei; Oweiss, Karim

    2010-01-01

    Brain Machine Interface (BMI) systems demand real-time spike sorting to instantaneously decode the spike trains of simultaneously recorded cortical neurons. Real-time spike sorting, however, requires extensive computational power that is not feasible to implement in implantable BMI architectures, thereby requiring transmission of high-bandwidth raw neural data to an external computer. In this work, we describe a miniaturized, low power, programmable hardware module capable of performing this task within the resource constraints of an implantable chip. The module computes a sparse representation of the spike waveforms followed by "smart" thresholding. This cascade restricts the sparse representation to a subset of projections that preserve the discriminative features of neuron-specific spike waveforms. In addition, it further reduces telemetry bandwidth making it feasible to wirelessly transmit only the important biological information to the outside world, thereby improving the efficiency, practicality and viability of BMI systems in clinical applications.
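The sparse-projection-plus-thresholding idea can be illustrated with a toy sketch. The paper's actual sparsifying transform and threshold rule are not specified here; a PCA basis and a fixed magnitude threshold stand in for them, and the spike templates and noise levels are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical recordings: two neurons with distinct 32-sample spike
# templates, observed with additive noise.
t = np.linspace(0.0, 1.0, 32)
templates = np.stack([np.exp(-((t - 0.3) / 0.08) ** 2),
                      -np.exp(-((t - 0.5) / 0.15) ** 2)])
labels = rng.integers(0, 2, 200)
spikes = templates[labels] + rng.normal(0.0, 0.05, (200, 32))

# Offline training: learn a small projection basis (PCA here stands in for
# whatever sparsifying transform the hardware implements).
centered = spikes - spikes.mean(axis=0)
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
basis = Vt[:3]                          # keep only 3 projections

# On-chip analogue: project each waveform, then apply "smart" thresholding
# that zeroes small, non-discriminative coefficients.
coeffs = centered @ basis.T
sparse = np.where(np.abs(coeffs) > 0.2, coeffs, 0.0)

raw_bw = spikes.size                    # samples sent without compression
sparse_bw = np.count_nonzero(sparse)    # coefficients sent instead
print(f"telemetry: {raw_bw} samples -> {sparse_bw} coefficients")

# The retained projections still separate the two units:
c0 = coeffs[:, 0]
sep = abs(c0[labels == 0].mean() - c0[labels == 1].mean()) / c0.std()
print(f"unit separation along first projection: {sep:.1f} sigma")
```

The point of the sketch is the bandwidth arithmetic: a few retained coefficients per spike replace the full waveform while keeping neuron-specific features separable.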

  20. Virtual Machine Support for Many-Core Architectures: Decoupling Abstract from Concrete Concurrency Models

    Directory of Open Access Journals (Sweden)

    Stefan Marr

    2010-02-01

    Full Text Available The upcoming many-core architectures require software developers to exploit concurrency to utilize available computational power. Today's high-level language virtual machines (VMs), which are a cornerstone of software development, do not provide sufficient abstraction for concurrency concepts. We analyze concrete and abstract concurrency models and identify the challenges they impose for VMs. To provide sufficient concurrency support in VMs, we propose to integrate concurrency operations into VM instruction sets. Since there will always be VMs optimized for special purposes, our goal is to develop a methodology to design instruction sets with concurrency support. Therefore, we also propose a list of trade-offs that have to be investigated to advise the design of such instruction sets. As a first experiment, we implemented one instruction set extension for shared memory and one for non-shared memory concurrency. From our experimental results, we derived a list of requirements for a full-grown experimental environment for further research.

  1. Performance evaluation of the machine learning algorithms used in inference mechanism of a medical decision support system.

    Science.gov (United States)

    Bal, Mert; Amasyali, M Fatih; Sever, Hayri; Kose, Guven; Demirhan, Ayse

    2014-01-01

    Decision support systems are increasingly important in supporting the decision-making process in cases of uncertainty and lack of information, and they are widely used in various fields such as engineering, finance, and medicine. Medical decision support systems help healthcare personnel to select the optimal method during the treatment of patients. Decision support systems are intelligent software systems that support decision makers in their decisions. The design of decision support systems consists of four main parts: the inference mechanism, the knowledge base, the explanation module, and the active memory. The inference mechanism constitutes the basis of decision support systems. Various methods can be used in these inference mechanisms, among them decision trees, artificial neural networks, statistical methods, and rule-based methods. In decision support systems, these methods can be used separately or combined into a hybrid system. In this study, synthetic data sets with 10, 100, 1000, and 2000 records have been produced to reflect the probabilities of the ALARM network. The accuracy of 11 machine learning methods for the inference mechanism of a medical decision support system is compared on the various data sets.

  2. Performance Evaluation of the Machine Learning Algorithms Used in Inference Mechanism of a Medical Decision Support System

    Directory of Open Access Journals (Sweden)

    Mert Bal

    2014-01-01

    Full Text Available Decision support systems are increasingly important in supporting the decision-making process in cases of uncertainty and lack of information, and they are widely used in various fields such as engineering, finance, and medicine. Medical decision support systems help healthcare personnel to select the optimal method during the treatment of patients. Decision support systems are intelligent software systems that support decision makers in their decisions. The design of decision support systems consists of four main parts: the inference mechanism, the knowledge base, the explanation module, and the active memory. The inference mechanism constitutes the basis of decision support systems. Various methods can be used in these inference mechanisms, among them decision trees, artificial neural networks, statistical methods, and rule-based methods. In decision support systems, these methods can be used separately or combined into a hybrid system. In this study, synthetic data sets with 10, 100, 1000, and 2000 records have been produced to reflect the probabilities of the ALARM network. The accuracy of 11 machine learning methods for the inference mechanism of a medical decision support system is compared on the various data sets.

  3. Performance Evaluation of the Machine Learning Algorithms Used in Inference Mechanism of a Medical Decision Support System

    Science.gov (United States)

    Bal, Mert; Amasyali, M. Fatih; Sever, Hayri; Kose, Guven; Demirhan, Ayse

    2014-01-01

    Decision support systems are increasingly important in supporting the decision-making process in cases of uncertainty and lack of information, and they are widely used in various fields such as engineering, finance, and medicine. Medical decision support systems help healthcare personnel to select the optimal method during the treatment of patients. Decision support systems are intelligent software systems that support decision makers in their decisions. The design of decision support systems consists of four main parts: the inference mechanism, the knowledge base, the explanation module, and the active memory. The inference mechanism constitutes the basis of decision support systems. Various methods can be used in these inference mechanisms, among them decision trees, artificial neural networks, statistical methods, and rule-based methods. In decision support systems, these methods can be used separately or combined into a hybrid system. In this study, synthetic data sets with 10, 100, 1000, and 2000 records have been produced to reflect the probabilities of the ALARM network. The accuracy of 11 machine learning methods for the inference mechanism of a medical decision support system is compared on the various data sets. PMID:25295291
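The evaluation setup described above (several learners, synthetic data sets of growing size) can be sketched as follows. The two-class Gaussian data is an illustrative stand-in for ALARM-network records, and Gaussian naive Bayes plus 1-nearest-neighbour stand in for two of the eleven methods compared in the study.

```python
import numpy as np

rng = np.random.default_rng(2)

def make_data(n):
    """Two-class synthetic records (a stand-in for ALARM-network samples)."""
    y = rng.permutation(np.arange(n) % 2)            # balanced class labels
    X = rng.normal(0.0, 1.0, (n, 5)) + 2.0 * y[:, None]
    return X, y

def gaussian_nb(Xtr, ytr, Xte):
    """Gaussian naive Bayes with per-class feature means and variances."""
    stats = {c: (Xtr[ytr == c].mean(0), Xtr[ytr == c].var(0) + 1e-9)
             for c in (0, 1)}
    scores = np.stack([
        -0.5 * (np.log(2 * np.pi * v) + (Xte - m) ** 2 / v).sum(axis=1)
        for c, (m, v) in sorted(stats.items())])
    return scores.argmax(axis=0)

def one_nn(Xtr, ytr, Xte):
    """1-nearest-neighbour classification by Euclidean distance."""
    d = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    return ytr[d.argmin(axis=1)]

results = {}
Xte, yte = make_data(500)                            # fixed evaluation set
for n in (10, 100, 1000, 2000):                      # sizes mirroring the study
    Xtr, ytr = make_data(n)
    for name, clf in (("naive Bayes", gaussian_nb), ("1-NN", one_nn)):
        results[(n, name)] = (clf(Xtr, ytr, Xte) == yte).mean()
        print(f"n={n:5d}  {name:12s} accuracy={results[(n, name)]:.2f}")
```

Holding the evaluation set fixed while growing the training set isolates the effect of sample size on each inference method, which mirrors the study's design.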

  4. Inferring planar disorder in close-packed structures via ε-machine spectral reconstruction theory: examples from simulated diffraction patterns.

    Science.gov (United States)

    Varn, D P; Canright, G S; Crutchfield, J P

    2013-07-01

    A previous paper detailed a novel algorithm, ε-machine spectral reconstruction theory (εMSR), that infers pattern and disorder in planar-faulted, close-packed structures directly from X-ray diffraction patterns [Varn et al. (2013). Acta Cryst. A69, 197-206]. Here εMSR is applied to simulated diffraction patterns from four close-packed crystals. It is found that, for stacking structures with a memory length of three or less, εMSR reproduces the statistics of the stacking structure; the result being in the form of a directed graph called an ε-machine. For stacking structures with a memory length larger than three, εMSR returns a model that captures many important features of the original stacking structure. These include multiple stacking faults and multiple crystal structures. Further, it is found that εMSR is able to discover stacking structure in even highly disordered crystals. In order to address issues concerning the long-range order observed in many classes of layered materials, several length parameters are defined, calculable from the ε-machine, and their relevance is discussed.

  5. Implementing Scientific Simulation Codes Highly Tailored for Vector Architectures Using Custom Configurable Computing Machines

    Science.gov (United States)

    Rutishauser, David

    2006-01-01

    The motivation for this work comes from an observation that amidst the push for Massively Parallel (MP) solutions to high-end computing problems such as numerical physical simulations, large amounts of legacy code exist that are highly optimized for vector supercomputers. Because re-hosting legacy code often requires a complete re-write of the original code, which can be a very long and expensive effort, this work examines the potential to exploit reconfigurable computing machines in place of a vector supercomputer to implement an essentially unmodified legacy source code. Custom and reconfigurable computing resources could be used to emulate an original application's target platform to the extent required to achieve high performance. To arrive at an architecture that delivers the desired performance subject to limited resources involves solving a multi-variable optimization problem with constraints. Prior research in the area of reconfigurable computing has demonstrated that designing an optimum hardware implementation of a given application under hardware resource constraints is an NP-complete problem. The premise of the approach is that the general issue of applying reconfigurable computing resources to the implementation of an application, maximizing the performance of the computation subject to physical resource constraints, can be made a tractable problem by assuming a computational paradigm, such as vector processing. This research contributes a formulation of the problem and a methodology to design a reconfigurable vector processing implementation of a given application that satisfies a performance metric. A generic, parametric, architectural framework for vector processing implemented in reconfigurable logic is developed as a target for a scheduling/mapping algorithm that maps an input computation to a given instance of the architecture. 
This algorithm is integrated with an optimization framework to arrive at a specification of the architecture parameters.

  6. Particle MCMC algorithms and architectures for accelerating inference in state-space models.

    Science.gov (United States)

    Mingas, Grigorios; Bottolo, Leonardo; Bouganis, Christos-Savvas

    2017-04-01

    Particle Markov Chain Monte Carlo (pMCMC) is a stochastic algorithm designed to generate samples from a probability distribution, when the density of the distribution does not admit a closed form expression. pMCMC is most commonly used to sample from the Bayesian posterior distribution in State-Space Models (SSMs), a class of probabilistic models used in numerous scientific applications. Nevertheless, this task is prohibitive when dealing with complex SSMs with massive data, due to the high computational cost of pMCMC and its poor performance when the posterior exhibits multi-modality. This paper aims to address both issues by: 1) Proposing a novel pMCMC algorithm (denoted ppMCMC), which uses multiple Markov chains (instead of the one used by pMCMC) to improve sampling efficiency for multi-modal posteriors, 2) Introducing custom, parallel hardware architectures, which are tailored for pMCMC and ppMCMC. The architectures are implemented on Field Programmable Gate Arrays (FPGAs), a type of hardware accelerator with massive parallelization capabilities. The new algorithm and the two FPGA architectures are evaluated using a large-scale case study from genetics. Results indicate that ppMCMC achieves 1.96x higher sampling efficiency than pMCMC when using sequential CPU implementations. The FPGA architecture of pMCMC is 12.1x and 10.1x faster than state-of-the-art, parallel CPU and GPU implementations of pMCMC and up to 53x more energy efficient; the FPGA architecture of ppMCMC increases these speedups to 34.9x and 41.8x respectively and is 173x more power efficient, bringing previously intractable SSM-based data analyses within reach.
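The motivation for running multiple chains on a multi-modal posterior can be illustrated with a toy sampler. This is a plain random-walk Metropolis sketch, not the particle-filter machinery of pMCMC itself; the bimodal target and all tuning constants are invented for illustration.

```python
import math
import random

random.seed(0)

def log_target(x):
    """Toy bimodal 'posterior': an equal mixture of N(-5, 1) and N(5, 1)."""
    return math.log(0.5 * math.exp(-0.5 * (x + 5.0) ** 2)
                    + 0.5 * math.exp(-0.5 * (x - 5.0) ** 2))

def metropolis_chain(x0, n_steps, step=1.0):
    """Random-walk Metropolis: propose x' ~ N(x, step); accept with
    probability min(1, p(x') / p(x))."""
    x, samples = x0, []
    for _ in range(n_steps):
        prop = x + random.gauss(0.0, step)
        if random.random() < math.exp(min(0.0, log_target(prop) - log_target(x))):
            x = prop
        samples.append(x)
    return samples

# A single chain started in the left mode tends to stay there...
single = metropolis_chain(-5.0, 2000)
single_left = sum(s < 0 for s in single) / len(single)

# ...whereas several chains with over-dispersed starts cover both modes.
multi = [s for x0 in (-8.0, -2.0, 2.0, 8.0)
         for s in metropolis_chain(x0, 2000)]
left = sum(s < 0 for s in multi) / len(multi)

print(f"single-chain mass left of 0: {single_left:.2f}")
print(f"multi-chain mass left of 0:  {left:.2f}")
```

A single chain with a local proposal rarely crosses the low-density region between modes, so only the multi-chain run assigns mass to both; this is the sampling-efficiency problem that ppMCMC's multiple chains address, and the per-chain independence is also what makes the FPGA parallelization natural.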

  7. Inferring genetic architecture of complex traits using Bayesian integrative analysis of genome and transcriptiome data

    DEFF Research Database (Denmark)

    Ehsani, Alireza; Sørensen, Peter; Pomp, Daniel;

    2012-01-01

    Background To understand the genetic architecture of complex traits and bridge the genotype-phenotype gap, it is useful to study intermediate -omics data, e.g. the transcriptome. The present study introduces a method for simultaneous quantification of the contributions from single nucleotide polymorphisms (SNPs) and transcript abundances in explaining phenotypic variance, using Bayesian whole-omics models. Bayesian mixed models and variable selection models were used and, based on parameter samples from the model posterior distributions, explained variances were further partitioned at the level … -modal distribution of genomic values collapses when gene expressions are added to the model. Conclusions With increased availability of various -omics data, integrative approaches are promising tools for understanding the genetic architecture of complex traits. Partitioning of explained variances at the chromosome

  8. Thermal Error Modeling of a Machining Center Using Grey System Theory and Adaptive Network-Based Fuzzy Inference System

    Science.gov (United States)

    Wang, Kun-Chieh; Tseng, Pai-Chung; Lin, Kuo-Ming

    Thermal effects on machine tools are a well-recognized problem in an environment of increasing demand for product quality. The performance of a thermal error compensation system typically depends on the accuracy and robustness of the thermal error model. This work presents a novel thermal error model utilizing two mathematical schemes: grey system theory and the adaptive network-based fuzzy inference system (ANFIS). First, the measured temperature and deformation results are analyzed via grey system theory to obtain the influence ranking of temperature ascents on the thermal drift of the spindle. Then, using the highly ranked temperature ascents as inputs for the ANFIS and training these data with the hybrid learning rule, a thermal compensation model is constructed. Grey system theory effectively reduces the number of temperature sensors needed on a machine structure for prediction, and the ANFIS has the advantages of good accuracy and robustness. To test the performance of the proposed ANFIS model, a real-cutting operation test was conducted. Comparison results demonstrate that the ANFIS modeling scheme coupled with grey system theory has good predictive ability.
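The grey-system ranking step can be sketched as a standard grey relational analysis. The synthetic drift and sensor series below, and the distinguishing coefficient of 0.5, are illustrative assumptions, not values from the paper.

```python
import numpy as np

def normalize(s):
    """Min-max normalize a series to [0, 1]."""
    s = np.asarray(s, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def grey_relational_grades(reference, series_dict, zeta=0.5):
    """Grey relational grade of each candidate series with respect to the
    reference, using the global min/max deviations as in standard GRA
    (zeta is the usual distinguishing coefficient)."""
    x0 = normalize(reference)
    deltas = {k: np.abs(x0 - normalize(v)) for k, v in series_dict.items()}
    all_d = np.concatenate(list(deltas.values()))
    dmin, dmax = all_d.min(), all_d.max()
    return {k: float(((dmin + zeta * dmax) / (d + zeta * dmax)).mean())
            for k, d in deltas.items()}

# Synthetic thermal-drift record and three temperature-sensor series.
t = np.linspace(0.0, 8.0, 50)                 # hours of machining
drift = 20.0 * (1.0 - np.exp(-t / 2.0))       # spindle thermal drift (microns)
sensors = {
    "spindle bearing": 25.0 + 12.0 * (1.0 - np.exp(-t / 2.0)),  # tracks drift
    "column":          25.0 + 3.0 * t / 8.0,                    # weakly related
    "ambient":         22.0 + 0.5 * np.sin(t),                  # unrelated
}

grades = grey_relational_grades(drift, sensors)
for name, g in sorted(grades.items(), key=lambda kv: -kv[1]):
    print(f"{name:16s} grade = {g:.3f}")
```

Sensors whose temperature curves co-vary with the drift receive the highest grades, so only those need to be kept as ANFIS inputs; that is the sensor-reduction role grey system theory plays in the paper.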

  9. POGs2: a web portal to facilitate cross-species inferences about protein architecture and function in plants.

    Directory of Open Access Journals (Sweden)

    Michael Tomcal

    Full Text Available The Putative Orthologous Groups 2 Database (POGs2) (http://pogs.uoregon.edu/) integrates information about the inferred proteomes of four plant species (Arabidopsis thaliana, Zea mays, Oryza sativa, and Populus trichocarpa) in a display that facilitates comparisons among orthologs and extrapolation of annotations among species. A single-page view collates key functional data for members of each Putative Orthologous Group (POG): graphical representations of InterPro domains, predicted and established intracellular locations, and imported gene descriptions. The display incorporates POGs predicted by two different algorithms as well as gene trees, allowing users to evaluate the validity of POG memberships. The web interface provides ready access to sequences and alignments of POG members, as well as sequences, alignments, and domain architectures of closely related paralogs. A simple and flexible search interface permits queries by BLAST and by any combination of gene identifier, keywords, domain names, InterPro identifiers, and intracellular location. The concurrent display of domain architectures for orthologous proteins highlights errors in gene models and false negatives in domain predictions. The POGs2 layout is also useful for exploring candidate genes identified by transposon tagging, QTL mapping, map-based cloning, and proteomics, and for navigating between orthologous groups that belong to the same gene family.

  10. Internal architecture of the Tuxtla volcanic field, Veracruz, Mexico, inferred from gravity and magnetic data

    Science.gov (United States)

    Espindola, Juan Manuel; Lopez-Loera, Hector; Mena, Manuel; Zamora-Camacho, Araceli

    2016-09-01

    The Tuxtla Volcanic Field (TVF) is a basaltic volcanic field emerging from the plains of the western margin of the Gulf of Mexico in the Mexican State of Veracruz. Separated by hundreds of kilometers from the Trans-Mexican Volcanic Belt to the NW and the Chiapanecan Volcanic Arc to the SE, it stands detached not only in location but also in the composition of its rocks, which are predominantly alkaline. These characteristics make its origin somewhat puzzling. Furthermore, one of the large volcanoes of the field, San Martin Tuxtla, underwent an eruptive period in historical times (CE 1793). Such volcanic activity conveys particular importance to the study of the TVF from the perspective of volcanology and hazard assessment. Despite the above circumstances, few investigations about its internal structure have been reported. In this work, we present analyses of gravity and aeromagnetic data obtained from different sources. We present the complete Bouguer anomaly of the area and its separation into regional and residual components. The aeromagnetic data were processed to yield the reduction to the pole, the analytic signal, and the upward continuation to complete the interpretation of the gravity analyses. Three-dimensional density models of the regional and residual anomalies were obtained by inversion of the gravity signal adding the response of rectangular prisms at the nodes of a regular grid. We obtained a body with a somewhat flattened top at 16 km below sea level from the inversion of the regional. Three separate slender bodies with tops 6 km deep were obtained from the inversion of the residual. The gravity and magnetic anomalies, as well as the inferred source bodies that produce those geophysical anomalies, lie between the Sontecomapan and Catemaco faults, which are proposed as flower structures associated with an inferred deep-seated fault termed the Veracruz Fault. 
These fault systems along with magma intrusion at the lower crust are necessary features to

  11. The origin of modern metabolic networks inferred from phylogenomic analysis of protein architecture.

    Science.gov (United States)

    Caetano-Anollés, Gustavo; Kim, Hee Shin; Mittenthal, Jay E

    2007-05-29

    Metabolism represents a complex collection of enzymatic reactions and transport processes that convert metabolites into molecules capable of supporting cellular life. Here we explore the origins and evolution of modern metabolism. Using phylogenomic information linked to the structure of metabolic enzymes, we sort out recruitment processes and discover that most enzymatic activities were associated with the nine most ancient and widely distributed protein fold architectures. An analysis of newly discovered functions showed enzymatic diversification occurred early, during the onset of the modern protein world. Most importantly, phylogenetic reconstruction exercises and other evidence suggest strongly that metabolism originated in enzymes with the P-loop hydrolase fold in nucleotide metabolism, probably in pathways linked to the purine metabolic subnetwork. Consequently, the first enzymatic takeover of an ancient biochemistry or prebiotic chemistry was related to the synthesis of nucleotides for the RNA world.

  12. Predicting the academic success of architecture students by pre-enrolment requirement: using machine-learning techniques

    Directory of Open Access Journals (Sweden)

    Ralph Olusola Aluko

    2016-12-01

    Full Text Available In recent years, there has been an increase in the number of applicants seeking admission into architecture programmes. As expected, prior academic performance (also referred to as pre-enrolment requirement) is a major factor considered during the process of selecting applicants. In the present study, machine learning models were used to predict the academic success of architecture students based on information provided in prior academic performance. Two modeling techniques, namely K-nearest neighbour (k-NN) and linear discriminant analysis, were applied in the study. It was found that K-nearest neighbour (k-NN) outperforms the linear discriminant analysis model in terms of accuracy. In addition, grades obtained in mathematics (at ordinary level examinations) had a significant impact on the academic success of undergraduate architecture students. This paper makes a modest contribution to the ongoing discussion on the relationship between prior academic performance and academic success of undergraduate students by evaluating this proposition. One of the issues that emerges from these findings is that prior academic performance can be used as a predictor of academic success in undergraduate architecture programmes. Overall, the developed k-NN model can serve as a valuable tool during the process of selecting new intakes into undergraduate architecture programmes in Nigeria.
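A k-NN predictor of the kind described above can be sketched in a few lines. The synthetic "prior grades", the pass rule, and k = 5 are hypothetical choices, and the linear-discriminant comparison is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(3)

def knn_predict(Xtr, ytr, Xte, k=5):
    """Majority vote among the k nearest training points (Euclidean)."""
    d = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    nearest = np.argsort(d, axis=1)[:, :k]
    return (ytr[nearest].mean(axis=1) > 0.5).astype(int)

# Hypothetical pre-enrolment records: grades in mathematics, physics, and
# English (40-100), and a binary 'academic success' label in which the
# mathematics grade carries the most weight, echoing the study's finding.
X = rng.uniform(40.0, 100.0, (400, 3))
success = (0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2] > 70.0).astype(int)

Xtr, ytr, Xte, yte = X[:300], success[:300], X[300:], success[300:]
acc = (knn_predict(Xtr, ytr, Xte) == yte).mean()
print(f"hold-out accuracy: {acc:.2f}")
```

An odd k avoids voting ties, and a hold-out split gives an honest accuracy estimate; a real admissions model would of course be trained on recorded outcomes, not a synthetic rule.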

  13. On the Use of Machine Learning Methods for Characterization of Contaminant Source Zone Architecture

    Science.gov (United States)

    Zhang, H.; Mendoza-Sanchez, I.; Christ, J.; Miller, E. L.; Abriola, L. M.

    2011-12-01

    Recent research has identified the importance of DNAPL mass distribution in the evolution of down-gradient contaminant plumes and the control of source zone remediation effectiveness. Advances in the management of sites containing DNAPL source zones, however, are currently limited by the difficulty associated with characterizing subsurface DNAPL source zone 'architecture'. Specifically, knowledge of the ganglia-to-pool ratio (GTP) has been demonstrated to be useful in the assessment and prediction of system behavior. In this paper, we present an approach to the estimation of a quantity related to GTP, the pool fraction (PF), defined as the percentage of the source zone volume occupied by pools, based on observations of plume concentrations. Here we discuss the development and initial validation of an approach for PF estimation based on machine learning methods. The algorithm is constructed so that, when given new concentration data, it predicts the PF of the associated source zone. An ideal solution would use the concentration signals to estimate a single value for PF. Unfortunately, this problem is not well-posed given the data at our disposal. Thus, we relax the regression approach to one of classification. We quantize pool fraction (i.e., the interval between zero and one) into a number of intervals and employ machine learning methods to use the concentration data to determine the interval containing the PF for a given set of data. This approach is predicated on the assumption that quantities (i.e., features) derived from the concentration data of evolving plumes with similar source zone PFs will in fact be similar to one another. Thus, within the training process we must determine a suitable collection of features and build methods for evaluating and optimizing similarity in feature space that result in high accuracy in terms of predicting the correct PF interval. Moreover, the number and boundaries of these intervals must also be

  14. Late Quaternary activity along the Ferrara thrust inferred from stratigraphic architecture and geophysical surveys

    Science.gov (United States)

    Stefani, Marco; Bignardi, Samuel; Caputo, Riccardo; Minarelli, Luca; Abu-Zeid, Nasser; Santarato, Giovanni

    2010-05-01

    Since Late Miocene, the Emilia-Romagna portion of the Po Plain-Adriatic foredeep basin was progressively affected by compressional deformation, due to the northward propagation of the Apennines fold-and-thrust belt. The major tectonic structures within the basin have been recognised and are relatively well known, thanks to the widespread, even if outdated, seismic survey, performed after WW II, for hydrocarbon exploration. More recently, a large amount of surface and shallow-subsurface information has been provided by the CARG geological mapping project. The region therefore provides a valuable opportunity to discuss the genetic relationship between tectonic deformation, eustatic-paleoclimatic fluctuations, and depositional architecture. The activity of blind thrusts and fault-propagation folds induced repeated angular unconformities and impressive lateral variations in the Pliocene-Quaternary stratigraphy, causing thickness changes, from a few metres, close to the Apennines piedmont line, to more than 9 km, in fast subsiding depocenters (e.g. Lido di Savio). In the Ferrara region, the post-Miocene succession ranges from about 4 km, west of Sant'Agostino, to less than 200 m, on the Casaglia anticline, where Late Quaternary fluvial strata rest on Miocene marine marls, with an angular unconformity relationship. In this sector of the Po Plain, the tip-line of the northernmost thrust has been reconstructed north of the Po River (Occhiobello) and is associated with the growth of a large fold (Ferrara-Casaglia anticline), cross-cut by a complex splay of minor backthrusts and reverse faults. The thrust-anticline structure hosts an energy producing geothermal field, whose hydrogeological behaviour is largely influenced by the fracture pattern. The Apennines frontal thrust probably provided the seismic source for the earthquakes that severely damaged Ferrara, during the 1570 a.D. 
fall season, as documented by the structural damage still visible in many historic buildings (e

  15. An Architecture for Hybrid Manufacturing Combining 3D Printing and CNC Machining

    OpenAIRE

    Marcel Müller; Elmar Wings

    2016-01-01

    Additive manufacturing is one of the key technologies of the 21st century. Additive manufacturing processes are often combined with subtractive manufacturing processes to create hybrid manufacturing because it is useful for manufacturing complex parts, for example, 3D printed sensor systems. Currently, several CNC machines are required for hybrid manufacturing: one machine is required for additive manufacturing and one is required for subtractive manufacturing. Disadvantages of conventional h...

  17. Design of a real-time open architecture controller for a reconfigurable machine tool

    CSIR Research Space (South Africa)

    Masekamela, I

    2008-11-01

    Full Text Available The paper presents the design and the development of a real-time, open architecture controller that is used for control of reconfigurable manufacturing tools (RMTs) in reconfigurable manufacturing systems (RMS). The controller that is presented can...

  18. Hybrid Optical Inference Machines

    Science.gov (United States)

    1991-09-27

    imaging. A PC-controlled data acquisition system with digital-to-analog output was set up with serially controlled linear translation and rotation... rules in Eq. (4) of specific conclusions which are logically inferred from the knowledge base from the facts and rules in response to the queries. In... error rates, digital optical signals (binary intensity levels) are assumed for all input and output signals in the optical

  19. Balance in machine architecture: Bandwidth on board and offboard, integer/control speed and flops versus memory

    Energy Technology Data Exchange (ETDEWEB)

    Fischler, M.

    1992-04-01

    The issues to be addressed here are those of "balance" in machine architecture. By this, we mean how much emphasis must be placed on various aspects of the system to maximize its usefulness for physics. There are three components that contribute to the utility of a system: how the machine can be used, how big a problem can be attacked, and what the effective capabilities (power) of the hardware are like. The effective power issue is a matter of evaluating the impact of design decisions trading off architectural features such as memory bandwidth and interprocessor communication capabilities. What is studied is the effect these machine parameters have on how quickly the system can solve desired problems. There is a reasonable method for studying this: one selects a few representative algorithms and computes the impact of changing memory bandwidths, and so forth. The only room for controversy here is in the selection of representative problems. The issue of how big a problem can be attacked boils down to a balance of memory size versus power. Although this is a balance issue, it is very different from the effective power situation, because no firm answer can be given at this time. The power to memory ratio is highly problem dependent, and optimizing it requires several pieces of physics input, including: how big a lattice is needed for interesting results; what sort of algorithms are best to use; and how many sweeps are needed to get valid results. We seem to be at the threshold of learning things about these issues, but for now, the memory size issue will necessarily be addressed in terms of best guesses, rules of thumb, and researchers' opinions.

  1. Computer Security Primer: Systems Architecture, Special Ontology and Cloud Virtual Machines

    Science.gov (United States)

    Waguespack, Leslie J.

    2014-01-01

    With the increasing proliferation of multitasking and Internet-connected devices, security has reemerged as a fundamental design concern in information systems. The shift of IS curricula toward a largely organizational perspective of security leaves little room for focus on its foundation in systems architecture, the computational underpinnings of…

  2. Loi de commande prédictive pour le positionement des axes d'une machine outil à architecture ouverte

    OpenAIRE

    Susanu, Mara; Dumur, Didier; Tournier, Christophe; Lartigue, Claire

    2004-01-01

    National audience; Abstract: Designing a CNC for a machine tool according to an open-architecture structure improves its flexibility by allowing the integration of specific modules. In this context, the article first considers the addition of an axis control module based on an advanced predictive strategy. This strategy, which includes closed-loop anticipation, proves particularly effective in terms of setpoint tracking and ease of implementation. ...

  3. An open-source highly scalable web service architecture for the Apertium machine translation engine

    OpenAIRE

    Sánchez-Cartagena, Víctor M.; Pérez-Ortiz, Juan Antonio

    2009-01-01

    Some machine translation services, like the Google AJAX Language API, have become very popular as they make the collaboratively created contents of the web 2.0 available to speakers of many languages. One of the keys to its success is its clear and easy-to-use application programming interface (API) and a scalable and reliable service. This paper describes a highly scalable implementation of an Apertium-based translation web service, which aims to make contents available to speakers of lesser resour...

  4. Executable Architecture of Net Enabled Operations: State Machine of Federated Nodes

    Science.gov (United States)

    2009-11-01

    improvements. An important question emerged from one of the workshops on developing a state machine of federated nodes, namely that the ... process would be easier to understand if it were possible to present one or two models during the workshop. The participants in this workshop have

  5. An Energy-Efficient and Scalable Deep Learning/Inference Processor With Tetra-Parallel MIMD Architecture for Big Data Applications.

    Science.gov (United States)

    Park, Seong-Wook; Park, Junyoung; Bong, Kyeongryeol; Shin, Dongjoo; Lee, Jinmook; Choi, Sungpill; Yoo, Hoi-Jun

    2015-12-01

    Deep learning algorithms are widely used for various pattern recognition applications such as text recognition, object recognition and action recognition because of their best-in-class recognition accuracy compared to hand-crafted and shallow-learning-based algorithms. The long learning time caused by their complex structure, however, has so far limited their usage to high-cost servers or many-core GPU platforms. On the other hand, the demand for customized pattern recognition within personal devices will grow gradually as more deep learning applications are developed. This paper presents an SoC implementation that enables deep learning applications to run on low-cost platforms such as mobile or portable devices. Different from conventional works, which have adopted massively parallel architectures, this work adopts a task-flexible architecture and exploits multiple forms of parallelism to cover the complex functions of the convolutional deep belief network, one of the popular deep learning/inference algorithms. In this paper, we implement the most energy-efficient deep learning and inference processor for wearable systems. The implemented 2.5 mm × 4.0 mm deep learning/inference processor is fabricated using 65 nm 8-metal CMOS technology for a battery-powered platform with real-time deep inference and deep learning operation. It consumes 185 mW average power and 213.1 mW peak power at a 200 MHz operating frequency and 1.2 V supply voltage. It achieves 411.3 GOPS peak performance and 1.93 TOPS/W energy efficiency, which is 2.07× higher than the state of the art.
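
    The quoted efficiency figure can be checked directly, since TOPS/W is just peak throughput divided by peak power:

```python
# Sanity check of the quoted figures: efficiency = peak ops/s / peak power.
peak_gops = 411.3           # peak performance, giga-operations per second
peak_power_w = 0.2131       # 213.1 mW peak power, expressed in watts
tops_per_watt = peak_gops / 1000.0 / peak_power_w
print(round(tops_per_watt, 2))  # 1.93, matching the reported 1.93 TOPS/W
```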

  6. Semigroup based neural network architecture for extrapolation of mass unbalance for rotating machines in power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, B.H.; Velas, J.P.; Lee, K.Y [Pennsylvania State Univ., University Park, PA (United States). Dept. of Electrical Engineering

    2006-07-01

    This paper presented a mathematical method that power plant operators can use to estimate rotational mass unbalance, which is the most common source of vibration in turbine generators. An unbalanced rotor or driveshaft causes vibration and stress in the rotating part and in its supporting structure. As such, balancing the rotating part is important to minimize structural stress, minimize operator annoyance and fatigue, increase bearing life, or minimize power loss. The newly proposed method for estimating vibration on a turbine generator uses mass unbalance extrapolation based on a modified system-type neural network architecture, notably the semigroup theory used to study differential equations, partial differential equations and their combinations. Rather than relying on inaccurate vibration measurements, this method extrapolates a set of reliable mass unbalance readings from a common source of vibration. Given a set of empirical data with no analytic expression, the authors first developed an analytic description and then extended that model along a single axis. The algebraic decomposition which was used to obtain the analytic description of empirical data in the semigroup form involved the product of a coefficient vector and a basis set of vectors. The proposed approach was simulated on empirical data. The concept can also be tested in many other engineering and non-engineering problems. 23 refs., 11 figs.
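
    The algebraic decomposition the abstract describes, a coefficient vector multiplying a basis set, can be illustrated with a small least-squares fit followed by extrapolation along one axis. The two-function basis and the synthetic data below are illustrative assumptions, not the authors' semigroup model.

```python
import math

# Sketch (not the paper's algorithm) of describing empirical data as a
# product of a coefficient vector and a basis set, then extrapolating along
# one axis. The basis {1, exp(-t)} and the synthetic data are assumptions.

def fit_two_basis(ts, ys, f0, f1):
    """Least-squares fit y ~ c0*f0(t) + c1*f1(t) via 2x2 normal equations."""
    a00 = sum(f0(t) ** 2 for t in ts)
    a01 = sum(f0(t) * f1(t) for t in ts)
    a11 = sum(f1(t) ** 2 for t in ts)
    b0 = sum(f0(t) * y for t, y in zip(ts, ys))
    b1 = sum(f1(t) * y for t, y in zip(ts, ys))
    det = a00 * a11 - a01 * a01
    return (a11 * b0 - a01 * b1) / det, (a00 * b1 - a01 * b0) / det

f0 = lambda t: 1.0
f1 = lambda t: math.exp(-t)
ts = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2.0 + 3.0 * math.exp(-t) for t in ts]       # synthetic "measurements"
c0, c1 = fit_two_basis(ts, ys, f0, f1)
extrapolated = c0 * f0(4.0) + c1 * f1(4.0)        # predict beyond the data
print(round(c0, 6), round(c1, 6))  # 2.0 3.0 (coefficients recovered)
```

    Because the synthetic data lie exactly on the model, the fit recovers the generating coefficients and the extrapolation at t = 4 is exact up to rounding.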

  7. Hybrid Fuzzy Wavelet Neural Networks Architecture Based on Polynomial Neural Networks and Fuzzy Set/Relation Inference-Based Wavelet Neurons.

    Science.gov (United States)

    Huang, Wei; Oh, Sung-Kwun; Pedrycz, Witold

    2017-08-11

    This paper presents a hybrid fuzzy wavelet neural network (HFWNN) realized with the aid of polynomial neural networks (PNNs) and fuzzy inference-based wavelet neurons (FIWNs). Two types of FIWNs, fuzzy set inference-based wavelet neurons (FSIWNs) and fuzzy relation inference-based wavelet neurons (FRIWNs), are proposed. In particular, a FIWN without any fuzzy set component (viz., the premise part of a fuzzy rule) becomes a wavelet neuron (WN). To alleviate the limitations of conventional wavelet neural networks or fuzzy wavelet neural networks, whose parameters are determined on a purely random basis, the parameters of the wavelet functions standing in FIWNs or WNs are initialized by using the C-means clustering method. The overall architecture of the HFWNN is similar to that of typical PNNs. The main strategies in the design of the HFWNN are developed as follows. First, the first layer of the network consists of FIWNs (e.g., FSIWN or FRIWN) that are used to reflect the uncertainty of the data, while the second and higher layers consist of WNs, which exhibit a high level of flexibility and realize a linear combination of wavelet functions. Second, the parameters used in the design of the HFWNN are adjusted through genetic optimization. To evaluate the performance of the proposed HFWNN, several publicly available datasets are considered. Furthermore, a thorough comparative analysis is provided.

  8. Effective software design and development for the new graph architecture HPC machines.

    Energy Technology Data Exchange (ETDEWEB)

    Dechev, Damian

    2012-03-01

    Software applications need to change and adapt as modern architectures evolve. Nowadays, advances in chip design translate to increased parallelism. Exploiting such parallelism is a major challenge in modern software engineering. Multicore processors are about to introduce a significant change in the way we design and use fundamental data structures. In this work we describe the design and programming principles of a software library of highly concurrent, scalable, and nonblocking data containers. In this project we have created algorithms and data structures for handling fundamental computations in massively multithreaded contexts, and we have incorporated these into a usable library with a familiar look and feel. In this work we demonstrate the first design and implementation of a wait-free hash table. Our multiprocessor data structure design allows a large number of threads to concurrently insert, remove, and retrieve information. Nonblocking designs alleviate the problems traditionally associated with the use of mutual exclusion, such as bottlenecks and thread-safety issues. Lock-freedom provides the ability to share data without some of the drawbacks associated with locks; however, these designs remain susceptible to starvation. Furthermore, wait-freedom provides all of the benefits of lock-free synchronization with the added assurance that every thread makes progress in a finite number of steps. This implies deadlock-freedom, livelock-freedom, starvation-freedom, freedom from priority inversion, and thread-safety. The challenges of providing the desirable progress and correctness guarantees of wait-free objects make their design and implementation difficult. There are few wait-free data structures described in the literature. Using only standard atomic operations provided by the hardware, our design is portable; therefore, it is applicable to a variety of data-intensive applications including the domains of embedded systems and supercomputers. Our experimental
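
    The compare-and-swap retry loop at the heart of such nonblocking designs can be sketched as follows. Python has no hardware CAS, so the atomic primitive is simulated with a lock here; in a real lock-free container the CAS would be a single machine instruction, and it is the retry pattern that is being illustrated.

```python
import threading

# Sketch of the compare-and-swap (CAS) retry pattern behind nonblocking
# containers. Python has no hardware CAS, so it is simulated with a lock
# here; the retry loop, not the lock, is the point of the illustration.

class AtomicRef:
    def __init__(self, value):
        self._value = value
        self._guard = threading.Lock()   # stand-in for one atomic instruction

    def load(self):
        return self._value

    def compare_and_swap(self, expected, new):
        with self._guard:
            if self._value is expected:
                self._value = new
                return True
            return False

class TreiberStack:
    """Treiber-style stack: operations retry until their CAS succeeds."""
    def __init__(self):
        self._head = AtomicRef(None)

    def push(self, item):
        while True:
            old = self._head.load()
            if self._head.compare_and_swap(old, (item, old)):
                return                   # CAS succeeded; no blocking needed

    def pop(self):
        while True:
            old = self._head.load()
            if old is None:
                return None
            item, rest = old
            if self._head.compare_and_swap(old, rest):
                return item

s = TreiberStack()
s.push(1); s.push(2)
print(s.pop(), s.pop(), s.pop())  # 2 1 None
```

    A thread whose CAS fails simply re-reads the head and retries; no thread ever blocks while holding the structure.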

  9. TITAN: inference of copy number architectures in clonal cell populations from tumor whole-genome sequence data

    Science.gov (United States)

    Roth, Andrew; Khattra, Jaswinder; Ho, Julie; Yap, Damian; Prentice, Leah M.; Melnyk, Nataliya; McPherson, Andrew; Bashashati, Ali; Laks, Emma; Biele, Justina; Ding, Jiarui; Le, Alan; Rosner, Jamie; Shumansky, Karey; Marra, Marco A.; Gilks, C. Blake; Huntsman, David G.; McAlpine, Jessica N.; Aparicio, Samuel

    2014-01-01

    The evolution of cancer genomes within a single tumor creates mixed cell populations with divergent somatic mutational landscapes. Inference of tumor subpopulations has been disproportionately focused on the assessment of somatic point mutations, whereas computational methods targeting evolutionary dynamics of copy number alterations (CNA) and loss of heterozygosity (LOH) in whole-genome sequencing data remain underdeveloped. We present a novel probabilistic model, TITAN, to infer CNA and LOH events while accounting for mixtures of cell populations, thereby estimating the proportion of cells harboring each event. We evaluate TITAN on idealized mixtures, simulating clonal populations from whole-genome sequences taken from genomically heterogeneous ovarian tumor sites collected from the same patient. In addition, we show in 23 whole genomes of breast tumors that the inference of CNA and LOH using TITAN critically informs population structure and the nature of the evolving cancer genome. Finally, we experimentally validated subclonal predictions using fluorescence in situ hybridization (FISH) and single-cell sequencing from an ovarian cancer patient sample, thereby recapitulating the key modeling assumptions of TITAN. PMID:25060187

  10. OFMspert - Inference of operator intentions in supervisory control using a blackboard architecture. [operator function model expert system

    Science.gov (United States)

    Jones, Patricia S.; Mitchell, Christine M.; Rubin, Kenneth S.

    1988-01-01

    The authors propose an architecture for an expert system that can function as an operator's associate in the supervisory control of a complex dynamic system. Called OFMspert (operator function model (OFM) expert system), the architecture uses the operator function modeling methodology as the basis for the design. The authors put emphasis on the understanding capabilities, i.e., the intent-inferencing property, of an operator's associate. The authors define the generic structure of OFMspert, particularly those features that support intent inferencing. They also describe the implementation and validation of OFMspert in GT-MSOCC (Georgia Tech Multisatellite Operations Control Center), a laboratory domain designed to support research in human-computer interaction and decision aiding in complex, dynamic systems.

  11. Simulating Turing machines on Maurer machines

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2008-01-01

    In a previous paper, we used Maurer machines to model and analyse micro-architectures. In the current paper, we investigate the connections between Turing machines and Maurer machines with the purpose to gain an insight into computability issues relating to Maurer machines. We introduce ways to

  12. Computational capabilities of multilayer committee machines

    Energy Technology Data Exchange (ETDEWEB)

    Neirotti, J P [NCRG, Aston University, Birmingham (United Kingdom); Franco, L, E-mail: j.p.neirotti@aston.ac.u [Depto. de Lenguajes y Ciencias de la Computacion, Universidad de Malaga (Spain)

    2010-11-05

    We obtained an analytical expression for the computational complexity of many-layered committee machines with a finite number of hidden layers (L < ∞) using the generalization complexity measure introduced by Franco et al (2006) IEEE Trans. Neural Netw. 17 578. Although our result is valid in the large-size limit and for an overlap synaptic matrix that is ultrametric, it provides a useful tool for inferring the appropriate architecture a network must have to reproduce an arbitrary realizable Boolean function.

  13. Inferring Planet Occurrence Rates With a Q1-Q17 Kepler Planet Candidate Catalog Produced by a Machine Learning Classifier

    Science.gov (United States)

    Catanzarite, Joseph; Jenkins, Jon Michael; McCauliff, Sean D.; Burke, Christopher; Bryson, Steve; Batalha, Natalie; Coughlin, Jeffrey; Rowe, Jason; Mullally, Fergal; Thompson, Susan; Seader, Shawn; Twicken, Joseph; Li, Jie; Morris, Robert; Smith, Jeffrey; Haas, Michael; Christiansen, Jessie; Clarke, Bruce

    2015-08-01

    NASA’s Kepler Space Telescope monitored the photometric variations of over 170,000 stars, at half-hour cadence, over its four-year prime mission. The Kepler pipeline calibrates the pixels of the target apertures for each star, produces light curves with simple aperture photometry, corrects for systematic error, and detects threshold-crossing events (TCEs) that may be due to transiting planets. The pipeline estimates planet parameters for all TCEs and computes diagnostics used by the Threshold Crossing Event Review Team (TCERT) to produce a catalog of objects that are deemed either likely transiting planet candidates or false positives.We created a training set from the Q1-Q12 and Q1-Q16 TCERT catalogs and an ensemble of synthetic transiting planets that were injected at the pixel level into all 17 quarters of data, and used it to train a random forest classifier. The classifier uniformly and consistently applies diagnostics developed by the Transiting Planet Search and Data Validation pipeline components and by TCERT to produce a robust catalog of planet candidates.The characteristics of the planet candidates detected by Kepler (planet radius and period) do not reflect the intrinsic planet population. Detection efficiency is a function of SNR, so the set of detected planet candidates is incomplete. Transit detection preferentially finds close-in planets with nearly edge-on orbits and misses planets whose orbital geometry precludes transits. Reliability of the planet candidates must also be considered, as they may be false positives. Errors in detected planet radius and in assumed star properties can also bias inference of intrinsic planet population characteristics.In this work we infer the intrinsic planet population, starting with the catalog of detected planet candidates produced by our random forest classifier, and accounting for detection biases and reliabilities as well as for radius errors in the detected population.Kepler was selected as the 10th mission
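
    The completeness correction described above can be sketched as inverse-detection-efficiency weighting: a candidate the pipeline would find only 25% of the time stands in for roughly four planets. The efficiencies and reliabilities below are invented illustrative numbers, not Kepler values.

```python
# Sketch of the debiasing step: weight each detected candidate by the inverse
# of the pipeline's detection efficiency for planets like it, and discount it
# by its reliability. All efficiency/reliability values are invented numbers.

def occurrence_rate(candidates, n_stars):
    """candidates: list of (detection_efficiency, reliability) pairs."""
    corrected = sum(rel / eff for eff, rel in candidates)
    return corrected / n_stars

# A candidate found with only 25% efficiency stands in for ~4 missed planets.
cands = [(0.9, 0.95), (0.5, 0.90), (0.25, 0.80)]
print(round(occurrence_rate(cands, n_stars=1000), 4))  # 0.0061 planets/star
```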

  14. A model for Intelligent Random Access Memory architecture (IRAM) cellular automata algorithms on the Associative String Processing machine (ASTRA)

    CERN Document Server

    Rohrbach, F; Vesztergombi, G

    1997-01-01

    In the near future, computer performance will be completely determined by how long it takes to access memory. There are bottlenecks in memory latency and memory-to-processor interface bandwidth. The IRAM initiative could be the answer by putting the Processor-In-Memory (PIM). Starting from the massively parallel processing concept, one reached a similar conclusion. The MPPC (Massively Parallel Processing Collaboration) project and the 8K-processor ASTRA machine (Associative String Test bench for Research & Applications) developed at CERN can be regarded as a forerunner of the IRAM concept. The computing power of the ASTRA machine, regarded as an IRAM with 64 one-bit processors on a 64×64 bit-matrix memory chip, has been demonstrated by running statistical physics algorithms: one-dimensional stochastic cellular automata, as a simple model for dynamical phase transitions. As a relevant result for physics, the damage spreading of this model has been investigated.
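
    A one-dimensional stochastic cellular automaton with damage spreading, the benchmark mentioned above, can be sketched in a few lines: two replicas share the same random numbers but differ in a single initial cell, and the damage is their Hamming distance after some steps. The Domany-Kinzel-style rule and its parameters below are illustrative choices, not the rule run on ASTRA.

```python
import random

# Damage-spreading sketch for a 1D stochastic cellular automaton.
# Two replicas evolve with identical random numbers; the "damage" is the
# Hamming distance caused by a single flipped initial cell.

def dk_step(cells, noise, p1, p2):
    """Domany-Kinzel-style update: survival probability depends on how
    many of the two neighbours are active (0, 1 or 2)."""
    n = len(cells)
    out = []
    for i in range(n):
        k = cells[i - 1] + cells[(i + 1) % n]
        p = 0.0 if k == 0 else (p1 if k == 1 else p2)
        out.append(1 if noise[i] < p else 0)
    return out

def damage_spreading(n=64, p1=0.8, p2=0.6, steps=20, seed=1):
    rng = random.Random(seed)
    a = [1] * n
    b = list(a)
    b[n // 2] = 0                                  # single damaged site
    for _ in range(steps):
        noise = [rng.random() for _ in range(n)]   # shared between replicas
        a = dk_step(a, noise, p1, p2)
        b = dk_step(b, noise, p1, p2)
    return sum(x != y for x, y in zip(a, b))       # Hamming distance

print(damage_spreading())
```

    Whether the damage heals or spreads as p1 and p2 vary is exactly the dynamical phase transition the abstract refers to.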

  15. Machine learning on-a-chip: a high-performance low-power reusable neuron architecture for artificial neural networks in ECG classifications.

    Science.gov (United States)

    Sun, Yuwen; Cheng, Allen C

    2012-07-01

    Artificial neural networks (ANNs) are a promising machine learning technique for classifying non-linear electrocardiogram (ECG) signals and recognizing abnormal patterns suggesting risks of cardiovascular diseases (CVDs). In this paper, we propose a new reusable neuron architecture (RNA) enabling a performance-efficient and cost-effective silicon implementation for ANNs. The RNA architecture consists of a single layer of physical RNA neurons, each of which is designed to use minimal hardware resources (e.g., a single 2-input multiplier-accumulator is used to compute the dot product of two vectors). By carefully applying the principle of time sharing, RNA can multiplex this single layer of physical neurons to efficiently execute both the feed-forward and back-propagation computations of an ANN while conserving area and reducing the power dissipation of the silicon. A three-layer 51-30-12 ANN is implemented in RNA to perform the ECG classification for CVD detection. This RNA hardware also allows on-chip automatic training updates. A quantitative design space exploration in area, power dissipation, and execution speed between RNA and three other implementations representative of different reusable hardware strategies is presented and discussed. Compared with an equivalent software implementation in C executed on an embedded microprocessor, the RNA ASIC achieves three orders of magnitude improvements in both execution speed and energy efficiency.
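
    The time-sharing principle behind RNA can be sketched as a single multiply-accumulate routine reused for every dot product in the network, rather than one multiplier per synapse. The tiny 2-2-1 network and its weights below are illustrative, not the paper's 51-30-12 ANN.

```python
# Sketch of the time-sharing idea: one multiply-accumulate (MAC) routine is
# reused serially for every dot product, instead of one multiplier per weight.
# The 2-2-1 toy network and its weights are illustrative, not the paper's.

def mac_dot(weights, inputs):
    """Dot product computed one MAC operation at a time."""
    acc = 0.0
    for w, x in zip(weights, inputs):
        acc += w * x                     # single multiplier-accumulator step
    return acc

def relu(v):
    return v if v > 0 else 0.0

hidden_w = [[0.5, -0.25], [0.75, 0.125]]  # hypothetical layer-1 weights
output_w = [1.0, -2.0]                    # hypothetical layer-2 weights

x = [2.0, 4.0]
hidden = [relu(mac_dot(w, x)) for w in hidden_w]  # layer 1 reuses the MAC
y = mac_dot(output_w, hidden)                     # layer 2, same MAC again
print(hidden, y)  # [0.0, 2.0] -4.0
```

    In hardware the same physical MAC unit would be stepped through these loops cycle by cycle, trading execution time for silicon area and power.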

  16. Stiffness analysis of 3-axis machine tools with an overconstrained parallel architecture

    CERN Document Server

    Pashkevich, Anatoly; Wenger, Philippe

    2008-01-01

    The paper presents a new stiffness modelling method for overconstrained parallel manipulators, which is applied to 3-d.o.f. translational mechanisms. It is based on a multidimensional lumped-parameter model that replaces the link flexibility by localized 6-d.o.f. virtual springs. In contrast to other works, the method includes a FEA-based link stiffness evaluation and employs a new solution strategy of the kinetostatic equations, which allows computing the stiffness matrix for the overconstrained architectures and for the singular manipulator postures. The advantages of the developed technique are confirmed by application examples, which deal with comparative stiffness analysis of two translational parallel manipulators.

  17. Machine Learning

    Energy Technology Data Exchange (ETDEWEB)

    Chikkagoudar, Satish; Chatterjee, Samrat; Thomas, Dennis G.; Carroll, Thomas E.; Muller, George

    2017-04-21

    The absence of a robust and unified theory of cyber dynamics presents challenges and opportunities for using machine learning based data-driven approaches to further the understanding of the behavior of such complex systems. Analysts can also use machine learning approaches to gain operational insights. In order to be operationally beneficial, cybersecurity machine learning based models need to have the ability to: (1) represent a real-world system, (2) infer system properties, and (3) learn and adapt based on expert knowledge and observations. Probabilistic models and probabilistic graphical models provide these necessary properties and are further explored in this chapter. Bayesian networks and hidden Markov models are introduced as examples of widely used data-driven classification/modeling strategies.
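
    As a minimal illustration of the hidden Markov models mentioned above, the forward algorithm computes the probability of an observation sequence under a 2-state model; all probabilities here are illustrative.

```python
# Minimal forward-algorithm sketch for a hidden Markov model: probability of
# an observation sequence under a 2-state model. All numbers are illustrative.

def forward(obs, start_p, trans_p, emit_p):
    """P(obs) by summing over hidden state paths, one step at a time."""
    states = range(len(start_p))
    alpha = [start_p[s] * emit_p[s][obs[0]] for s in states]
    for o in obs[1:]:
        alpha = [sum(alpha[r] * trans_p[r][s] for r in states) * emit_p[s][o]
                 for s in states]
    return sum(alpha)

start = [0.6, 0.4]                       # initial state distribution
trans = [[0.7, 0.3], [0.4, 0.6]]         # transition probabilities
emit = [[0.9, 0.1], [0.2, 0.8]]          # P(observation | state)
print(round(forward([0, 1, 0], start, trans, emit), 5))  # 0.10893
```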

  18. Causal inference in economics and marketing.

    Science.gov (United States)

    Varian, Hal R

    2016-07-05

    This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual, a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference.
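
    The counterfactual step can be sketched directly: fit a model on untreated units only, predict what the treated units would have done without treatment, and average the difference. The linear model and toy data below are illustrative assumptions.

```python
# Sketch of the counterfactual step: fit a model on untreated units only,
# predict what treated units would have done without treatment, and read the
# effect off the difference. The linear model and toy data are assumptions.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Controls follow y = 2 + 3x; treated units sit 5 above the same trend.
ctrl_x, ctrl_y = [1.0, 2.0, 3.0, 4.0], [5.0, 8.0, 11.0, 14.0]
trt_x, trt_y = [2.0, 3.0], [13.0, 16.0]

a, b = fit_line(ctrl_x, ctrl_y)
counterfactual = [a + b * x for x in trt_x]     # predicted untreated outcomes
effect = sum(y - c for y, c in zip(trt_y, counterfactual)) / len(trt_y)
print(round(effect, 2))  # 5.0 -> estimated average treatment effect
```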

  19. Ecological Inference

    Science.gov (United States)

    King, Gary; Rosen, Ori; Tanner, Martin A.

    2004-09-01

    This collection of essays brings together a diverse group of scholars to survey the latest strategies for solving ecological inference problems in various fields. The last half-decade has witnessed an explosion of research in ecological inference--the process of trying to infer individual behavior from aggregate data. Although uncertainties and information lost in aggregation make ecological inference one of the most problematic types of research to rely on, these inferences are required in many academic fields, as well as by legislatures and the Courts in redistricting, by business in marketing research, and by governments in policy analysis.
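
    A classic deterministic starting point for such problems is the method of bounds: aggregate data alone bracket, but do not identify, the group-level rate. Writing the overall rate as T = X·b1 + (1-X)·b2, with both group rates confined to [0, 1], gives the bracket; the districts below are invented examples.

```python
# Deterministic "method of bounds" sketch for ecological inference.
# T = overall rate, X = population share of group 1; with both group rates
# in [0, 1], the identity T = X*b1 + (1 - X)*b2 brackets b1. Data invented.

def bounds(T, X):
    """Duncan-Davis-style bounds on the group-1 rate b1."""
    lo = max(0.0, (T - (1.0 - X)) / X)
    hi = min(1.0, T / X)
    return lo, hi

# 60% voted overall, group 1 is 30% of the population: no information at all.
print(bounds(T=0.6, X=0.3))  # (0.0, 1.0)
# 90% voted overall, group 1 is 50%: a much tighter bracket.
print(bounds(T=0.9, X=0.5))  # (0.8, 1.0)
```

    Statistical approaches to ecological inference can then be read as ways of narrowing these deterministic brackets with modeling assumptions.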

  20. Open architecture CNC system

    Energy Technology Data Exchange (ETDEWEB)

    Tal, J. [Galil Motion Control Inc., Sunnyvale, CA (United States); Lopez, A.; Edwards, J.M. [Los Alamos National Lab., NM (United States)

    1995-04-01

    In this paper, an alternative solution to the traditional CNC machine tool controller has been introduced. Software and hardware modules have been described and their incorporation in a CNC control system has been outlined. This type of CNC machine tool controller demonstrates that the technology is accessible and can be readily implemented into an open architecture machine tool controller. The benefit to the user is greater controller flexibility at an economically achievable cost. PC-based motion as well as non-motion features will provide flexibility through a Windows environment. Upgrading this type of controller system through software revisions will keep the machine tool in a competitive state with minimal effort. Software and hardware modules are mass produced, permitting competitive procurement and incorporation. Open architecture CNC systems provide diagnostics, thus enhancing maintainability and machine tool up-time. A major concern of traditional CNC systems has been operator training time. Training time can be greatly minimized by making use of Windows environment features.

  1. In-Situ and Remote-Sensing Data Fusion Using Machine Learning Techniques to Infer Urban and Fire Related Pollution Plumes

    Science.gov (United States)

    Russell, P. B.; Segal-Rozenhaimer, M.; Schmid, B.; Redemann, J.; Livingston, J. M.; Flynn, C.J.; Johnson, R. R.; Dunagan, S. E.; Shinozuka, Y.; Kacenelenbogen, M.; Chatfield, R. B.

    2014-01-01

    Airmass type characterization is key in understanding the relative contribution of various emission sources to atmospheric composition and air quality, and can be useful in bottom-up model validation and emission inventories. However, classification of pollution plumes from space is often not trivial. Sub-orbital campaigns, such as SEAC4RS (Studies of Emissions, Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys), give us a unique opportunity to study atmospheric composition in detail, by using a vast suite of in-situ instruments for the detection of trace gases and aerosols. These measurements allow identification of spatial and temporal atmospheric composition changes due to various pollution plumes resulting from urban, biogenic and smoke emissions. Nevertheless, to transfer the knowledge gathered from such campaigns into a global spatial and temporal context, there is a need to develop a workflow that can be applied to measurements from space. In this work we rely on sub-orbital in-situ and total-column remote sensing measurements of various pollution plumes taken aboard the NASA DC-8 during the 2013 SEAC4RS campaign, linking them through a neural-network (NN) algorithm to allow inference of pollution plume types from columnar aerosol and trace-gas measurements. In particular, we use the 4STAR (Spectrometer for Sky-Scanning, Sun-Tracking Atmospheric Research) airborne measurements of wavelength-dependent aerosol optical depth (AOD), particle size proxies, O3, NO2 and water vapor to classify different pollution plumes. Our method relies on assigning a-priori ground-truth labels to the various plumes, which include urban pollution, different fire types (i.e., forest and agriculture) and fire stages (i.e., fresh and aged), using cluster analysis of aerosol and trace-gas in-situ and auxiliary (e.g., trajectory) data and the training of a NN scheme to fit the best prediction parameters using 4STAR measurements as input. We explore our

  2. Causal inference in economics and marketing

    Science.gov (United States)

    Varian, Hal R.

    2016-01-01

    This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual—a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference. PMID:27382144

  3. Causal inference

    Directory of Open Access Journals (Sweden)

    Richard Shoemaker

    2014-04-01

    Full Text Available Establishing causality has been a problem throughout the history of the philosophy of science. This paper discusses the philosophy of causal inference across the different schools of thought and methods: rationalism, empiricism, the inductive method, and the hypothetico-deductive method, with their pros and cons. Starting from Hume's problem, the article also draws on the positions of Russell, Carnap, Popper and Kuhn to better understand the modern interpretation and implications of causal inference in epidemiological research.

  4. Architecture on Architecture

    DEFF Research Database (Denmark)

    Olesen, Karen

    2016-01-01

    This paper will discuss the challenges faced by architectural education today. It takes as its starting point the double commitment of any school of architecture: on the one hand the task of preserving the particular knowledge that belongs to the discipline of architecture, and on the other hand...... that is not scientific or academic but is more like a latent body of data that we find embedded in existing works of architecture. This information, it is argued, is not limited by the historical context of the work. It can be thought of as a virtual capacity – a reservoir of spatial configurations that can...... the autonomy of architecture, not as an esoteric concept but as a valid source of information in a pragmatic design practice, may help us overcome the often-proclaimed dichotomy between formal autonomy and a societally committed architecture. It follows that in architectural education there can be a close...

  5. Real-time context aware reasoning in on-board intelligent traffic systems: An Architecture for Ontology-based Reasoning using Finite State Machines

    NARCIS (Netherlands)

    Stoter, Arjan; Dalmolen, Simon; Drenth, Eduard; Cornelisse, Erik; Mulder, Wico

    2011-01-01

    In-vehicle information management is vital in intelligent traffic systems. In this paper we motivate an architecture for ontology-based context-aware reasoning for in-vehicle information management. An ontology is essential for system standardization and communication, and ontology-based reasoning

  6. Definitive Consensus for Distributed Data Inference

    OpenAIRE

    2011-01-01

    Inference from data is of key importance in many applications of informatics. The current trend in performing such a task of inference from data is to utilise machine learning algorithms. Moreover, in many applications it is either required or preferable to infer from the data in a distributed manner. Many practical difficulties arise from the fact that in many distributed applications we refrain from transferring data or parts of it due to cost...

  7. Machine Learning Markets

    CERN Document Server

    Storkey, Amos

    2011-01-01

    Prediction markets show considerable promise for developing flexible mechanisms for machine learning. Here, machine learning markets for multivariate systems are defined, and a utility-based framework is established for their analysis. This differs from the usual approach of defining static betting functions. It is shown that such markets can implement model combination methods used in machine learning, such as product-of-experts and mixture-of-experts approaches, as equilibrium pricing models, by varying agent utility functions. They can also implement models composed of local potentials, and message passing methods. Prediction markets also allow for more flexible combinations, by combining multiple different utility functions. Conversely, the market mechanisms implement inference in the relevant probabilistic models. This means that market mechanisms can be utilized for implementing parallelized model building and inference for probabilistic modelling.

  8. When Machines Design Machines!

    DEFF Research Database (Denmark)

    2011-01-01

    Until recently we were the sole designers, alone in the driving seat making all the decisions. But we have created a world of complexity way beyond human ability to understand, control, and govern. Machines now do more trades than humans on stock markets, they control our power, water, gas...... and food supplies, manage our elevators, microclimates, automobiles and transport systems, and manufacture almost everything. It should come as no surprise that machines are now designing machines. The chips that power our computers and mobile phones, the robots and commercial processing plants on which we...... depend, all are now largely designed by machines. So what of us - will we be totally usurped, or are we looking at a new symbiosis with human and artificial intelligences combined to realise the best outcomes possible? In most respects we have no choice! Human abilities alone cannot solve any of the major...

  9. Machines and Metaphors

    Directory of Open Access Journals (Sweden)

    Ángel Martínez García-Posada

    2016-10-01

    Full Text Available The edition La ley del reloj. Arquitectura, máquinas y cultura moderna (Cátedra, Madrid, 2016) registers the useful paradox of the analogy between architecture and technique. Its author, the architect Eduardo Prieto, also a philosopher, professor and writer, acknowledges the obvious distance between machines and buildings, so great that it can only be bridged by strange comparisons, since architecture does not move and machines are not habitable. Nevertheless, throughout the book, starting from the origin of the metaphor of the machine, with clarity in his essay and enlightening erudition, he points out with certainty some concomitances of high interest, drawing throughout history a beautiful cartography of the fruitful encounter between the organic and the mechanical.

  10. Engineering molecular machines

    Science.gov (United States)

    Erman, Burak

    2016-04-01

    Biological molecular motors use chemical energy, mostly in the form of ATP hydrolysis, and convert it to mechanical energy. Correlated thermal fluctuations are essential for the function of a molecular machine and it is the hydrolysis of ATP that modifies the correlated fluctuations of the system. Correlations are consequences of the molecular architecture of the protein. The idea that synthetic molecular machines may be constructed by designing the proper molecular architecture is challenging. In their paper, Sarkar et al (2016 New J. Phys. 18 043006) propose a synthetic molecular motor based on the coarse grained elastic network model of proteins and show by numerical simulations that motor function is realized, ranging from deterministic to thermal, depending on temperature. This work opens up a new range of possibilities of molecular architecture based engine design.
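
The abstract notes that correlated fluctuations are consequences of the molecular architecture. In coarse-grained elastic network models such as the Gaussian network model, those cross-correlations are proportional to the pseudoinverse of the Kirchhoff (connectivity) matrix. A minimal sketch (the bead count and connectivity are invented for illustration, not taken from Sarkar et al):

```python
# Gaussian network model sketch: correlated thermal fluctuations follow
# from the connectivity (architecture) alone, via the pseudoinverse of
# the Kirchhoff matrix.
import numpy as np

def kirchhoff(n_beads, pairs):
    """Build the Kirchhoff matrix for beads connected by the given pairs."""
    G = np.zeros((n_beads, n_beads))
    for i, j in pairs:
        G[i, j] = G[j, i] = -1.0
    np.fill_diagonal(G, -G.sum(axis=1))
    return G

# A 5-bead chain with nearest-neighbour springs.
pairs = [(i, i + 1) for i in range(4)]
G = kirchhoff(5, pairs)

# Cross-correlations of fluctuations: <dRi . dRj> proportional to pinv(G)_ij
C = np.linalg.pinv(G)
# End beads fluctuate most, and correlations decay with separation
```

Changing the connectivity changes the correlation pattern, which is the sense in which engine design becomes a question of molecular architecture.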

  11. Architectural Prototyping

    DEFF Research Database (Denmark)

    Bardram, Jakob; Christensen, Henrik Bærbak; Hansen, Klaus Marius

    2004-01-01

    A major part of software architecture design is learning how specific architectural designs balance the concerns of stakeholders. We explore the notion of "architectural prototypes", correspondingly architectural prototyping, as a means of using executable prototypes to investigate stakeholders' concerns with respect to a system under development. An architectural prototype is primarily a learning and communication vehicle used to explore and experiment with alternative architectural styles, features, and patterns in order to balance different architectural qualities. The use of architectural...

  12. Architecture on Architecture

    DEFF Research Database (Denmark)

    Olesen, Karen

    2016-01-01

    This paper will discuss the challenges faced by architectural education today. It takes as its starting point the double commitment of any school of architecture: on the one hand the task of preserving the particular knowledge that belongs to the discipline of architecture, and on the other hand...... the obligation to prepare students to perform in a profession that is largely defined by forces outside that discipline. It will be proposed that the autonomy of architecture can be understood as a unique kind of information: as architecture’s self-reliance or knowledge-about itself. A knowledge...... that is not scientific or academic but is more like a latent body of data that we find embedded in existing works of architecture. This information, it is argued, is not limited by the historical context of the work. It can be thought of as a virtual capacity – a reservoir of spatial configurations that can...

  13. Foundations of microprogramming architecture, software and applications

    CERN Document Server

    Agrawala, Ashok K

    1976-01-01

    Foundations of Microprogramming: Architecture, Software, and Applications discusses the foundations and trends in microprogramming, focusing on the architectural, software, and application aspects of microprogramming. The book reviews microprocessors, microprogramming concepts, and characteristics, as well as the architectural features in microprogrammed computers. The text explains support software and the different hierarchies or levels of languages. These include assembler languages which are mnemonic or symbolic representation of machine commands; the procedure oriented machine-dependent;

  14. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  15. High-level language computer architecture

    CERN Document Server

    Chu, Yaohan

    1975-01-01

    High-Level Language Computer Architecture offers a tutorial on high-level language computer architecture, including von Neumann architecture and syntax-oriented architecture as well as direct and indirect execution architecture. Design concepts of Japanese-language data processing systems are discussed, along with the architecture of stack machines and the SYMBOL computer system. The conceptual design of a direct high-level language processor is also described.Comprised of seven chapters, this book first presents a classification of high-level language computer architecture according to the pr

  16. Architecture on Architecture

    DEFF Research Database (Denmark)

    Olesen, Karen

    2016-01-01

    ...... the obligation to prepare students to perform in a profession that is largely defined by forces outside that discipline. It will be proposed that the autonomy of architecture can be understood as a unique kind of information: as architecture’s self-reliance or knowledge-about itself. A knowledge...... that is not scientific or academic but is more like a latent body of data that we find embedded in existing works of architecture. This information, it is argued, is not limited by the historical context of the work. It can be thought of as a virtual capacity – a reservoir of spatial configurations that can...... be transformed and reapplied endlessly through its confrontation with shifting information from outside the realms of architecture. A selection of architects’ statements on their own work will be used to demonstrate how in quite diverse contemporary practices the re-use of existing architectures is applied...

  17. Turing Automata and Graph Machines

    Directory of Open Access Journals (Sweden)

    Miklós Bartha

    2010-06-01

    Full Text Available Indexed monoidal algebras are introduced as an equivalent structure for self-dual compact closed categories, and a coherence theorem is proved for the category of such algebras. Turing automata and Turing graph machines are defined by generalizing the classical Turing machine concept, so that the collection of such machines becomes an indexed monoidal algebra. On the analogy of the von Neumann data-flow computer architecture, Turing graph machines are proposed as potentially reversible low-level universal computational devices, and a truly reversible molecular size hardware model is presented as an example.

  18. The Structure of a BamA-BamD Fusion Illuminates the Architecture of the β-Barrel Assembly Machine Core.

    Science.gov (United States)

    Bergal, Hans Thor; Hopkins, Alex Hunt; Metzner, Sandra Ines; Sousa, Marcelo Carlos

    2016-02-01

    The β-barrel assembly machine (BAM) mediates folding and insertion of integral β-barrel outer membrane proteins (OMPs) in Gram-negative bacteria. Of the five BAM subunits, only BamA and BamD are essential for cell viability. Here we present the crystal structure of a fusion between BamA POTRA4-5 and BamD from Rhodothermus marinus. The POTRA5 domain binds BamD between its tetratricopeptide repeats 3 and 4. The interface structural elements are conserved in the Escherichia coli proteins, which allowed structure validation by mutagenesis and disulfide crosslinking in E. coli. Furthermore, the interface is consistent with previously reported mutations that impair BamA-BamD binding. The structure serves as a linchpin to generate a BAM model where POTRA domains and BamD form an elongated periplasmic ring adjacent to the membrane with a central cavity approximately 30 × 60 Å wide. We propose that nascent OMPs bind this periplasmic ring prior to insertion and folding by BAM.

  19. Machine Learning in the Big Data Era: Are We There Yet?

    Energy Technology Data Exchange (ETDEWEB)

    Sukumar, Sreenivas Rangan [ORNL

    2014-01-01

    In this paper, we discuss the machine learning challenges of the Big Data era. We observe that recent innovations in being able to collect, access, organize, integrate, and query massive amounts of data from a wide variety of data sources have brought statistical machine learning under more scrutiny and evaluation for gleaning insights from the data than ever before. In that context, we pose and debate the question - Are machine learning algorithms scaling with the ability to store and compute? If yes, how? If not, why not? We survey recent developments in the state-of-the-art to discuss emerging and outstanding challenges in the design and implementation of machine learning algorithms at scale. We leverage experience from real-world Big Data knowledge discovery projects across domains of national security and healthcare to suggest our efforts be focused along the following axes: (i) the data science challenge - designing scalable and flexible computational architectures for machine learning (beyond just data-retrieval); (ii) the science-of-data challenge - the ability to understand characteristics of data before applying machine learning algorithms and tools; and (iii) the scalable predictive functions challenge - the ability to construct, learn and infer with increasing sample size, dimensionality, and categories of labels. We conclude with a discussion of opportunities and directions for future research.

  20. Learning thermodynamics with Boltzmann machines

    Science.gov (United States)

    Torlai, Giacomo; Melko, Roger G.

    2016-10-01

    A Boltzmann machine is a stochastic neural network that has been extensively used in the layers of deep architectures for modern machine learning applications. In this paper, we develop a Boltzmann machine that is capable of modeling thermodynamic observables for physical systems in thermal equilibrium. Through unsupervised learning, we train the Boltzmann machine on data sets constructed with spin configurations importance sampled from the partition function of an Ising Hamiltonian at different temperatures using Monte Carlo (MC) methods. The trained Boltzmann machine is then used to generate spin states, for which we compare thermodynamic observables to those computed by direct MC sampling. We demonstrate that the Boltzmann machine can faithfully reproduce the observables of the physical system. Further, we observe that the number of neurons required to obtain accurate results increases as the system is brought close to criticality.
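
The training pipeline described above, Monte-Carlo-sampled Ising configurations feeding an unsupervised Boltzmann machine, can be sketched at toy scale. The chain length, temperature, and hyperparameters below are illustrative only, and a small restricted Boltzmann machine trained with one-step contrastive divergence (CD-1) stands in for the authors' setup:

```python
# Sketch: importance-sample spin configurations from a periodic 1D Ising
# chain (J = 1) with Metropolis Monte Carlo, then fit a tiny restricted
# Boltzmann machine to them with CD-1.
import numpy as np

rng = np.random.default_rng(0)
N, T, n_samples = 8, 2.0, 200

def ising_samples(n_spins, temp, n_out, n_flips=50):
    """Metropolis sampling: n_flips single-spin updates between samples."""
    s = rng.choice([-1, 1], size=n_spins)
    out = []
    for _ in range(n_out):
        for _ in range(n_flips):
            i = rng.integers(n_spins)
            dE = 2.0 * s[i] * (s[i - 1] + s[(i + 1) % n_spins])
            if dE <= 0 or rng.random() < np.exp(-dE / temp):
                s[i] = -s[i]
        out.append(s.copy())
    return np.array(out)

data = (ising_samples(N, T, n_samples) + 1) // 2   # map spins {-1,1} -> {0,1}

# Tiny RBM trained with one-step contrastive divergence.
n_hid, lr = 4, 0.05
W = 0.01 * rng.standard_normal((N, n_hid))
b, c = np.zeros(N), np.zeros(n_hid)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

for epoch in range(30):
    v0 = data
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + b)          # one Gibbs step back to visibles
    ph1 = sigmoid(pv1 @ W + c)
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(data)
    b += lr * (v0 - pv1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
# W now encodes couplings learned from the sampled configurations; spin
# states generated by Gibbs sampling the RBM can then be compared against
# direct MC observables, as in the paper.
```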

  1. Architectural slicing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius

    2013-01-01

    Architectural prototyping is a widely used practice, concerned with taking architectural decisions through experiments with lightweight implementations. However, many architectural decisions are only taken when systems are already (partially) implemented. This is problematic in the context of architectural prototyping since experiments with full systems are complex and expensive and thus architectural learning is hindered. In this paper, we propose a novel technique for harvesting architectural prototypes from existing systems, "architectural slicing", based on dynamic program slicing. Given a system and a slicing criterion, architectural slicing produces an architectural prototype that contains the elements in the architecture that are dependent on the elements in the slicing criterion. Furthermore, we present an initial design and implementation of an architectural slicer for Java.

  3. Controlled English to facilitate human/machine analytical processing

    Science.gov (United States)

    Braines, Dave; Mott, David; Laws, Simon; de Mel, Geeth; Pham, Tien

    2013-06-01

    Controlled English (CE) is a human-readable information representation format that is implemented using a restricted subset of the English language, but which is unambiguous and directly accessible by simple machine processes. We have been researching the capabilities of CE in a number of contexts, and exploring the degree to which a flexible and more human-friendly information representation format could aid the intelligence analyst in a multi-agent collaborative operational environment; especially in cases where the agents are a mixture of other human users and machine processes aimed at assisting the human users. CE itself is built upon a formal logic basis, but allows users to easily specify models for a domain of interest in a human-friendly language. In our research we have been developing an experimental component known as the "CE Store" in which CE information can be quickly and flexibly processed and shared between human and machine agents. The CE Store environment contains a number of specialized machine agents for common processing tasks and also supports execution of logical inference rules that can be defined in the same CE language. This paper outlines the basic architecture of this approach, discusses some of the example machine agents that have been developed, and provides some typical examples of the CE language and the way in which it has been used to support complex analytical tasks on synthetic data sources. We highlight the fusion of human and machine processing supported through the use of the CE language and CE Store environment, and show this environment with examples of highly dynamic extensions to the model(s) and integration between different user-defined models in a collaborative setting.

  4. Machine Learning for Security

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Applied statistics, aka ‘Machine Learning’, offers a wealth of techniques for answering security questions. It’s a much hyped topic in the big data world, with many companies now providing machine learning as a service. This talk will demystify these techniques, explain the math, and demonstrate their application to security problems. The presentation will include how-to’s on classifying malware, looking into encrypted tunnels, and finding botnets in DNS data. About the speaker Josiah is a security researcher with HP TippingPoint DVLabs Research Group. He has over 15 years of professional software development experience. Josiah used to do AI, with work focused on graph theory, search, and deductive inference on large knowledge bases. As rules only get you so far, he moved from AI to using machine learning techniques identifying failure modes in email traffic. There followed digressions into clustered data storage and later integrated control systems. Current ...

  5. Human brain lesion-deficit inference remapped.

    Science.gov (United States)

    Mah, Yee-Haur; Husain, Masud; Rees, Geraint; Nachev, Parashkev

    2014-09-01

    Our knowledge of the anatomical organization of the human brain in health and disease draws heavily on the study of patients with focal brain lesions. Historically the first method of mapping brain function, it is still potentially the most powerful, establishing the necessity of any putative neural substrate for a given function or deficit. Great inferential power, however, carries a crucial vulnerability: without stronger alternatives any consistent error cannot be easily detected. A hitherto unexamined source of such error is the structure of the high-dimensional distribution of patterns of focal damage, especially in ischaemic injury-the commonest aetiology in lesion-deficit studies-where the anatomy is naturally shaped by the architecture of the vascular tree. This distribution is so complex that analysis of lesion data sets of conventional size cannot illuminate its structure, leaving us in the dark about the presence or absence of such error. To examine this crucial question we assembled the largest known set of focal brain lesions (n = 581), derived from unselected patients with acute ischaemic injury (mean age = 62.3 years, standard deviation = 17.8, male:female ratio = 0.547), visualized with diffusion-weighted magnetic resonance imaging, and processed with validated automated lesion segmentation routines. High-dimensional analysis of this data revealed a hidden bias within the multivariate patterns of damage that will consistently distort lesion-deficit maps, displacing inferred critical regions from their true locations, in a manner opaque to replication. Quantifying the size of this mislocalization demonstrates that past lesion-deficit relationships estimated with conventional inferential methodology are likely to be significantly displaced, by a magnitude dependent on the unknown underlying lesion-deficit relationship itself. Past studies therefore cannot be retrospectively corrected, except by new knowledge that would render them redundant

  6. Interaction with Machine Improvisation

    Science.gov (United States)

    Assayag, Gerard; Bloch, George; Cont, Arshia; Dubnov, Shlomo

    We describe two multi-agent architectures for improvisation-oriented musician-machine interaction systems that learn in real time from human performers. The improvisation kernel is based on sequence modeling and statistical learning. We present two frameworks of interaction with this kernel. In the first, the stylistic interaction is guided by a human operator in front of an interactive computer environment. In the second framework, the stylistic interaction is delegated to machine intelligence and therefore, knowledge propagation and decision are taken care of by the computer alone. The first framework involves a hybrid architecture using two popular composition/performance environments, Max and OpenMusic, that are put to work and communicate together, each one handling the process at a different time/memory scale. The second framework shares the same representational schemes with the first but uses an Active Learning architecture based on collaborative, competitive and memory-based learning to handle stylistic interactions. Both systems are capable of processing real-time audio/video as well as MIDI. After discussing the general cognitive background of improvisation practices, the statistical modelling tools and the concurrent agent architecture are presented. Then, an Active Learning scheme is described and considered in terms of using different improvisation regimes for improvisation planning. Finally, we provide more details about the different system implementations and describe several performances with the system.
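
The improvisation kernel above rests on sequence modelling and statistical learning; as a minimal stand-in, the sketch below learns a first-order Markov model from a note sequence and generates a stylistically similar continuation. (Such systems typically use richer variable-order models; the melody here is invented for illustration.)

```python
# Minimal "style learning" sketch: learn note transitions from a melody,
# then improvise a continuation by sampling the learned transitions.
import random
from collections import defaultdict

def learn(sequence):
    """Collect the observed successors of each symbol."""
    model = defaultdict(list)
    for a, b in zip(sequence, sequence[1:]):
        model[a].append(b)
    return model

def improvise(model, start, length, seed=0):
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        choices = model.get(out[-1])
        if not choices:                 # dead end: restart from any state
            choices = list(model)
        out.append(rng.choice(choices))
    return out

melody = ["C", "E", "G", "E", "C", "E", "G", "A", "G", "E", "C"]
model = learn(melody)
phrase = improvise(model, "C", 8)
# phrase stays within the learned vocabulary and transition style
```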

  7. Architectural Prototyping

    DEFF Research Database (Denmark)

    Bardram, Jakob; Christensen, Henrik Bærbak; Hansen, Klaus Marius

    2004-01-01

    A major part of software architecture design is learning how specific architectural designs balance the concerns of stakeholders. We explore the notion of "architectural prototypes", correspondingly architectural prototyping, as a means of using executable prototypes to investigate stakeholders......' concerns with respect to a system under development. An architectural prototype is primarily a learning and communication vehicle used to explore and experiment with alternative architectural styles, features, and patterns in order to balance different architectural qualities. The use of architectural...... prototypes in the development process is discussed, and we argue that such prototypes can play a role throughout the entire process. The use of architectural prototypes is illustrated by three distinct cases of creating software systems. We argue that architectural prototyping can provide key insights...

  9. Machine Translation

    Institute of Scientific and Technical Information of China (English)

    张严心

    2015-01-01

    As an ancillary translation tool, Machine Translation has long received increasing attention and extensive study from researchers and scholars. Understanding the definition of Machine Translation and analysing its benefits and problems is important for translators who wish to make good use of it, and helpful for developing and improving Machine Translation systems in the future.

  10. Sustainable machining

    CERN Document Server

    2017-01-01

    This book provides an overview of current sustainable machining. Its chapters cover the concept in economic, social and environmental dimensions. It provides the reader with proper ways to handle several pollutants produced during the machining process. The book is useful at both undergraduate and postgraduate levels and it is of interest to all those working with manufacturing and machining technology.

  11. Machine performance assessment and enhancement for a hexapod machine

    Energy Technology Data Exchange (ETDEWEB)

    Mou, J.I. [Arizona State Univ., Tempe, AZ (United States); King, C. [Sandia National Labs., Livermore, CA (United States). Integrated Manufacturing Systems Center

    1998-03-19

    The focus of this study is to develop a sensor fused process modeling and control methodology to model, assess, and then enhance the performance of a hexapod machine for precision product realization. Deterministic modeling technique was used to derive models for machine performance assessment and enhancement. Sensor fusion methodology was adopted to identify the parameters of the derived models. Empirical models and computational algorithms were also derived and implemented to model, assess, and then enhance the machine performance. The developed sensor fusion algorithms can be implemented on a PC-based open architecture controller to receive information from various sensors, assess the status of the process, determine the proper action, and deliver the command to actuators for task execution. This will enhance a hexapod machine's capability to produce workpieces within the imposed dimensional tolerances.
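
One common building block of the sensor-fusion step described above is combining noisy measurements of the same quantity by inverse-variance weighting, which yields the minimum-variance unbiased fused estimate. This is a generic sketch, not the authors' algorithm, and the sensor names and numbers are invented:

```python
# Inverse-variance weighted fusion of two estimates of the same quantity.

def fuse(x1, var1, x2, var2):
    """Fuse two estimates with known variances; return (estimate, variance)."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return x, var

# e.g. a laser tracker (low noise) and a kinematic model (higher noise)
x, var = fuse(10.02, 0.01 ** 2, 10.10, 0.05 ** 2)
# the fused estimate is pulled toward the more precise sensor, and its
# variance is smaller than either input variance
```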

  12. Robotic architectures

    CSIR Research Space (South Africa)

    Mtshali, M

    2010-01-01

    Full Text Available In the development of mobile robotic systems, a robotic architecture plays a crucial role in interconnecting all the sub-systems and controlling the system. The design of robotic architectures for mobile autonomous robots is a challenging...

  13. Robotic Architectures

    Directory of Open Access Journals (Sweden)

    Mbali Mtshali

    2010-01-01

    Full Text Available In the development of mobile robotic systems, a robotic architecture plays a crucial role in interconnecting all the sub-systems and controlling the system. The design of robotic architectures for mobile autonomous robots is a challenging and complex task. With a number of existing architectures and tools to choose from, a review of the existing robotic architectures is essential. This paper surveys the different paradigms in robotic architectures. A classification of the existing robotic architectures and a comparison of the attributes and properties of different proposals have been carried out. The paper also provides a view on the current state of designing robot architectures. It also proposes a conceptual model of a generalised robotic architecture for mobile autonomous robots. Defence Science Journal, 2010, 60(1), pp. 15-22, DOI: http://dx.doi.org/10.14429/dsj.60.96

  14. Study of Virtual Machine and its application

    Directory of Open Access Journals (Sweden)

    Rohaan Chandra

    2013-07-01

    Full Text Available A virtual machine is software that’s capable of executing programs as if it were a physical machine—it’s a computer within a computer. A virtual machine (VM) is a software-implemented abstraction of the underlying hardware, which is presented to the application layer of the system. Virtual machines may be based on specifications of a hypothetical computer or emulate the computer architecture and functions of a real-world computer.
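
The "computer within a computer" idea can be made concrete with a minimal stack-based virtual machine that interprets a tiny instruction set. The opcodes below are invented for illustration:

```python
# A minimal stack-based VM: programs are lists of (opcode, argument)
# instructions executed against a value stack.

def run(program):
    """Interpret the program; return the value left on top of the stack."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode {op}")
    return stack[-1]

# (2 + 3) * 4 compiled to stack code
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PUSH", 4), ("MUL", None)]
result = run(program)
```

Real VMs add control flow, memory, and I/O on top of exactly this interpret-one-instruction loop, whether they emulate a hypothetical machine or a real architecture.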

  15. Architecture & Environment

    Science.gov (United States)

    Erickson, Mary; Delahunt, Michael

    2010-01-01

    Most art teachers would agree that architecture is an important form of visual art, but they do not always include it in their curriculums. In this article, the authors share core ideas from "Architecture and Environment," a teaching resource that they developed out of a long-term interest in teaching architecture and their fascination with the…

  16. Architecture & Environment

    Science.gov (United States)

    Erickson, Mary; Delahunt, Michael

    2010-01-01

    Most art teachers would agree that architecture is an important form of visual art, but they do not always include it in their curriculums. In this article, the authors share core ideas from "Architecture and Environment," a teaching resource that they developed out of a long-term interest in teaching architecture and their fascination with the…

  17. Modular analytics management architecture for interoperability and decision support

    Science.gov (United States)

    Marotta, Stephen; Metzger, Max; Gorman, Joe; Sliva, Amy

    2016-05-01

    The Dual Node Decision Wheels (DNDW) architecture is a new approach to information fusion and decision support systems. By combining cognitive systems engineering organizational analysis tools, such as decision trees, with the Dual Node Network (DNN) technical architecture for information fusion, the DNDW can align relevant data and information products with an organization's decision-making processes. In this paper, we present the Compositional Inference and Machine Learning Environment (CIMLE), a prototype framework based on the principles of the DNDW architecture. CIMLE provides a flexible environment so heterogeneous data sources, messaging frameworks, and analytic processes can interoperate to provide the specific information required for situation understanding and decision making. It was designed to support the creation of modular, distributed solutions over large monolithic systems. With CIMLE, users can repurpose individual analytics to address evolving decision-making requirements or to adapt to new mission contexts; CIMLE's modular design simplifies integration with new host operating environments. CIMLE's configurable system design enables model developers to build analytical systems that closely align with organizational structures and processes and support the organization's information needs.

  18. Simple machines

    CERN Document Server

    Graybill, George

    2007-01-01

    Just how simple are simple machines? With our ready-to-use resource, they are simple to teach and easy to learn! Chock-full of information and activities, we begin with a look at force, motion and work, and examples of simple machines in daily life are given. With this background, we move on to different kinds of simple machines including: Levers, Inclined Planes, Wedges, Screws, Pulleys, and Wheels and Axles. An exploration of some compound machines follows, such as the can opener. Our resource is a real time-saver, as all the reading passages and student activities are provided. Presented in s

  19. Autonomous forward inference via DNA computing

    Institute of Scientific and Technical Information of China (English)

    Fu Yan; Li Gen; Li Yin; Meng Dazhi

    2007-01-01

    Recent studies have directed researchers toward building DNA computing machines with intelligence, which is measured by three main points: autonomous, programmable, and able to learn and adapt. Logical inference plays an important role in programmable information processing or computing. Here we present a new method to perform autonomous molecular forward inference for expert systems. A novel repetitive recognition site (RRS) technique is invented to design rule-molecules in the knowledge base. The inference engine runs autonomously by digesting the rule-molecule, using a Class IIB restriction enzyme, PpiI. A concentration model has been built to show the feasibility of the inference process under ideal chemical reaction conditions. Moreover, we extend the model to implement triggering communication between molecular automata, as a further application of the RRS technique.
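
The molecular engine above applies rules forward, autonomously, until no rule fires. In conventional software the same forward-chaining loop looks like the sketch below (the facts and rules are invented for illustration, not taken from the paper):

```python
# Forward-chaining inference: fire rules whose premises are satisfied,
# adding their conclusions to the fact base, until a fixed point.

def forward_chain(facts, rules):
    """rules: list of (premises, conclusion) pairs."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    (["fever", "cough"], "flu_suspected"),
    (["flu_suspected", "positive_test"], "flu"),
]
derived = forward_chain(["fever", "cough", "positive_test"], rules)
# "flu_suspected" and then "flu" are derived in successive passes
```

In the molecular setting, digestion of a rule-molecule by the restriction enzyme plays the role of one rule firing, with the reaction products triggering the next rule.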

  20. Electric machine

    Science.gov (United States)

    El-Refaie, Ayman Mohamed Fawzi [Niskayuna, NY; Reddy, Patel Bhageerath [Madison, WI

    2012-07-17

    An interior permanent magnet electric machine is disclosed. The interior permanent magnet electric machine comprises a rotor comprising a plurality of radially placed magnets each having a proximal end and a distal end, wherein each magnet comprises a plurality of magnetic segments and at least one magnetic segment towards the distal end comprises a high resistivity magnetic material.

  1. Bayesian structural inference for hidden processes.

    Science.gov (United States)

    Strelioff, Christopher C; Crutchfield, James P

    2014-04-01

    We introduce a Bayesian approach to discovering patterns in structurally complex processes. The proposed method of Bayesian structural inference (BSI) relies on a set of candidate unifilar hidden Markov model (uHMM) topologies for inference of process structure from a data series. We employ a recently developed exact enumeration of topological ε-machines. (A sequel then removes the topological restriction.) This subset of the uHMM topologies has the added benefit that inferred models are guaranteed to be ε-machines, irrespective of estimated transition probabilities. Properties of ε-machines and uHMMs allow for the derivation of analytic expressions for estimating transition probabilities, inferring start states, and comparing the posterior probability of candidate model topologies, despite process internal structure being only indirectly present in data. We demonstrate BSI's effectiveness in estimating a process's randomness, as reflected by the Shannon entropy rate, and its structure, as quantified by the statistical complexity. We also compare using the posterior distribution over candidate models and the single, maximum a posteriori model for point estimation and show that the former more accurately reflects uncertainty in estimated values. We apply BSI to in-class examples of finite- and infinite-order Markov processes, as well as to an out-of-class, infinite-state hidden process.
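The model-comparison step described above can be illustrated with a minimal sketch (not the authors' implementation): for a unifilar topology, a symmetric Dirichlet prior over each state's outgoing transition probabilities gives a closed-form marginal likelihood, and candidate topologies with equal priors can then be compared by their log evidence. The topologies `M1` and `M2` below are hypothetical examples, not drawn from the paper's enumeration.

```python
import math
from collections import defaultdict

def log_evidence(data, topology, start, alpha=1.0):
    """Dirichlet-multinomial log marginal likelihood of a symbol series under
    a unifilar HMM topology given as {(state, symbol): next_state}.
    Returns -inf if the topology cannot generate the series."""
    counts = defaultdict(lambda: defaultdict(int))
    state = start
    for sym in data:
        if (state, sym) not in topology:
            return float("-inf")
        counts[state][sym] += 1
        state = topology[(state, sym)]
    logp = 0.0
    for state, cs in counts.items():
        outs = [s for (st, s) in topology if st == state]  # out-edges of this state
        n = sum(cs.values())
        logp += math.lgamma(len(outs) * alpha) - math.lgamma(len(outs) * alpha + n)
        for sym in outs:
            logp += math.lgamma(alpha + cs.get(sym, 0)) - math.lgamma(alpha)
    return logp

# Two hypothetical candidate topologies over the alphabet {0, 1}:
# M1 is a single biased-coin state; M2 is the "golden mean" machine,
# which forbids consecutive 1s.
M1 = {("A", "0"): "A", ("A", "1"): "A"}
M2 = {("A", "0"): "A", ("A", "1"): "B", ("B", "0"): "A"}

data = "010010001010010"  # contains no "11", so both machines can generate it
for name, topo in [("M1", M1), ("M2", M2)]:
    print(name, round(log_evidence(data, topo, "A"), 2))
```

With equal priors over the two candidates, comparing posteriors reduces to comparing these log evidences; here the restricted structure of M2 is rewarded because the series never contains "11".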

  2. Architectural freedom and industrialised architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    Architectural freedom and industrialized architecture. Inge Vestergaard, Associate Professor, Cand. Arch., Aarhus School of Architecture, Denmark, Noerreport 20, 8000 Aarhus C, telephone +45 89 36 0000, e-mail inge.vestergaard@aarch.dk. Based on the repetitive architecture from the "building boom" 1960...... to 1973, it is discussed how architects can handle these Danish element and montage buildings through their transformation into aesthetically upgraded, functional and energy-efficient architecture. The method used is analysis of cases, parallels to literature studies, and producer interviews. This analysis...... compares "best practice" in Denmark and "best practice" in Austria. The modern architects accepted the fact that industrialized architecture told a story of repetition and monotony as a basic condition. This article aims to explain that architecture can be thought of as a complex and diverse design through......

  3. Spiking neuron network Helmholtz machine.

    Science.gov (United States)

    Sountsov, Pavel; Miller, Paul

    2015-01-01

    An increasing amount of behavioral and neurophysiological data suggests that the brain performs optimal (or near-optimal) probabilistic inference and learning during perception and other tasks. Although many machine learning algorithms exist that perform inference and learning in an optimal way, the complete description of how one of those algorithms (or a novel algorithm) can be implemented in the brain is currently incomplete. There have been many proposed solutions that address how neurons can perform optimal inference but the question of how synaptic plasticity can implement optimal learning is rarely addressed. This paper aims to unify the two fields of probabilistic inference and synaptic plasticity by using a neuronal network of realistic model spiking neurons to implement a well-studied computational model called the Helmholtz Machine. The Helmholtz Machine is amenable to neural implementation as the algorithm it uses to learn its parameters, called the wake-sleep algorithm, uses a local delta learning rule. Our spiking-neuron network implements both the delta rule and a small example of a Helmholtz machine. This neuronal network can learn an internal model of continuous-valued training data sets without supervision. The network can also perform inference on the learned internal models. We show how various biophysical features of the neural implementation constrain the parameters of the wake-sleep algorithm, such as the duration of the wake and sleep phases of learning and the minimal sample duration. We examine the deviations from optimal performance and tie them to the properties of the synaptic plasticity rule.
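The wake-sleep algorithm's appeal for neural implementation is its locality: every weight update is a delta rule driven only by pre- and post-synaptic activity. The sketch below is a minimal rate-based, one-hidden-layer Helmholtz machine on binary data, an illustration of the algorithm rather than the paper's spiking-network model; the layer sizes, learning rate, and training patterns are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    """Sample binary units with the given firing probabilities."""
    return (rng.random(p.shape) < p).astype(float)

n_vis, n_hid, lr = 4, 2, 0.1
R = np.zeros((n_hid, n_vis + 1))   # recognition weights (+ bias column)
G = np.zeros((n_vis, n_hid + 1))   # generative weights (+ bias column)
b = np.zeros(n_hid)                # generative bias of the hidden layer

def wake_sleep(v):
    global R, G, b
    # Wake phase: recognize the datum, then nudge the generative weights
    # toward reconstructing it (local delta rule).
    h = sample(sigmoid(R @ np.append(v, 1.0)))
    b += lr * (h - sigmoid(b))
    G += lr * np.outer(v - sigmoid(G @ np.append(h, 1.0)), np.append(h, 1.0))
    # Sleep phase: dream a fantasy from the generative model, then nudge
    # the recognition weights toward recovering the dreamed causes.
    hd = sample(sigmoid(b))
    vd = sample(sigmoid(G @ np.append(hd, 1.0)))
    R += lr * np.outer(hd - sigmoid(R @ np.append(vd, 1.0)), np.append(vd, 1.0))

# Train without supervision on two prototype patterns.
patterns = [np.array([1., 1., 0., 0.]), np.array([0., 0., 1., 1.])]
for step in range(2000):
    wake_sleep(patterns[step % 2])

print("generative reconstruction for hidden code [1, 0]:",
      np.round(sigmoid(G @ np.array([1., 0., 1.])), 2))
```

After training, sampling the generative pathway (hidden bias, then `G`) produces fantasies resembling the prototypes, while `R` maps data back to hidden causes, with no error signal beyond the local pre/post activity differences.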

  4. The Machine within the Machine

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    Although Virtual Machines are widespread across CERN, you probably won't have heard of them unless you work for an experiment. Virtual machines - known as VMs - allow you to create a separate machine within your own, allowing you to run Linux on your Mac, or Windows on your Linux - whatever combination you need.   Using a CERN Virtual Machine, Linux analysis software runs on a MacBook. When it comes to LHC data, one of the primary issues collaborations face is the diversity of computing environments among collaborators spread across the world. What if an institute cannot run the analysis software because they use different operating systems? "That's where the CernVM project comes in," says Gerardo Ganis, PH-SFT staff member and leader of the CernVM project. "We were able to respond to experimentalists' concerns by providing a virtual machine package that could be used to run experiment software. This way, no matter what hardware they have ...

  5. Green Architecture

    Science.gov (United States)

    Lee, Seung-Ho

    Today, the environment has become a central subject in many scientific disciplines and in industrial development, owing to global warming. This paper presents an analysis of the tendencies of Green Architecture in France along three axes: regulations and approaches for sustainable architecture (certificates and standards), renewable materials (green materials), and strategies (equipment) of sustainable technology. The definition of 'Green Architecture' is cited in the introduction, and the question of interdisciplinarity in the technological development of 'Green Architecture' is raised in the conclusion.

  6. Catalyst Architecture

    DEFF Research Database (Denmark)

    Kiib, Hans; Marling, Gitte; Hansen, Peter Mandal

    2014-01-01

    How can architecture promote the enriching experiences of the tolerant, the democratic, and the learning city - a city worth living in, worth supporting and worth investing in? Catalyst Architecture comprises architectural projects, which, by virtue of their location, context and their combination...... of programs, have a role in mediating positive social and/or cultural development. In this sense, we talk about architecture as a catalyst for: sustainable adaptation of the city’s infrastructure appropriate renovation of dilapidated urban districts strengthening of social cohesiveness in the city development...

  7. Software architecture

    CERN Document Server

    Vogel, Oliver; Chughtai, Arif

    2011-01-01

    As a software architect you work in a wide-ranging and dynamic environment. You have to understand the needs of your customer, design architectures that satisfy both functional and non-functional requirements, and lead development teams in implementing the architecture. And it is an environment that is constantly changing: trends such as cloud computing, service orientation, and model-driven procedures open up new architectural possibilities. This book will help you to develop a holistic architectural awareness and knowledge base that extends beyond concrete methods, techniques, and technologi

  8. MI-ANFIS: A Multiple Instance Adaptive Neuro-Fuzzy Inference System

    Science.gov (United States)

    2015-08-02

    We introduce a novel adaptive neuro-fuzzy architecture based on the framework of Multiple Instance Fuzzy Inference. The new architecture, called Multiple Instance ANFIS (MI-ANFIS), is an extension of the standard Adaptive Neuro-Fuzzy Inference System (ANFIS).

  9. Machine Learning

    CERN Document Server

    CERN. Geneva

    2017-01-01

    Machine learning, which builds on ideas in computer science, statistics, and optimization, focuses on developing algorithms to identify patterns and regularities in data, and using these learned patterns to make predictions on new observations. Boosted by its industrial and commercial applications, the field of machine learning is quickly evolving and expanding. Recent advances have seen great success in the realms of computer vision, natural language processing, and broadly in data science. Many of these techniques have already been applied in particle physics, for instance for particle identification, detector monitoring, and the optimization of computer resources. Modern machine learning approaches, such as deep learning, are only just beginning to be applied to the analysis of High Energy Physics data to approach more and more complex problems. These classes will review the framework behind machine learning and discuss recent developments in the field.

  10. Architectural freedom and industrialised architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    Architectural freedom and industrialized architecture. Inge Vestergaard, Associate Professor, Cand. Arch., Aarhus School of Architecture, Denmark, Noerreport 20, 8000 Aarhus C, telephone +45 89 36 0000, e-mail inge.vestergaard@aarch.dk. Based on the repetitive architecture from the "building boom" 1960...... compares "best practice" in Denmark and "best practice" in Austria. The modern architects accepted the fact that industrialized architecture told a story of repetition and monotony as a basic condition. This article aims to explain that architecture can be thought of as a complex and diverse design through...... to the building physics problems a new industrialized period has started, based on lightweight elements basically made of wooden structures, faced with different suitable materials meant for individual expression in the specific housing area. It is the purpose of this article to widen the different design...

  11. Terra Harvest software architecture

    Science.gov (United States)

    Humeniuk, Dave; Klawon, Kevin

    2012-06-01

    Under the Terra Harvest Program, the DIA has the objective of developing a universal Controller for the Unattended Ground Sensor (UGS) community. The mission is to define, implement, and thoroughly document an open architecture that universally supports UGS missions, integrating disparate systems, peripherals, etc. The Controller's inherent interoperability with numerous systems enables the integration of both legacy and future UGS System (UGSS) components, while the design's open architecture supports rapid third-party development to ensure operational readiness. The successful accomplishment of these objectives by the program's Phase 3b contractors is demonstrated via integration of the companies' respective plug-'n'-play contributions that include controllers, various peripherals, such as sensors, cameras, etc., and their associated software drivers. In order to independently validate the Terra Harvest architecture, L-3 Nova Engineering, along with its partner, the University of Dayton Research Institute, is developing the Terra Harvest Open Source Environment (THOSE), a Java Virtual Machine (JVM) running on an embedded Linux Operating System. The Use Cases on which the software is developed support the full range of UGS operational scenarios such as remote sensor triggering, image capture, and data exfiltration. The Team is additionally developing an ARM microprocessor-based evaluation platform that is both energy-efficient and operationally flexible. The paper describes the overall THOSE architecture, as well as the design decisions for some of the key software components. Development process for THOSE is discussed as well.

  12. Architectural Contestation

    NARCIS (Netherlands)

    Merle, J.

    2012-01-01

    This dissertation addresses the reductive reading of Georges Bataille's work done within the field of architectural criticism and theory which tends to set aside the fundamental ‘broken’ totality of Bataille's oeuvre and also to narrowly interpret it as a mere critique of architectural form, consequ

  13. Local architecture

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Local architecture refers to structures built in the countryside, such as temples, memorial halls, residences, stores, pavilions, bridges, decorated archways, and wells. Because these structures were all built by local craftsmen and villagers in the traditional local style, they are generally called local architecture.

  14. Architecture Sustainability

    NARCIS (Netherlands)

    Avgeriou, Paris; Stal, Michael; Hilliard, Rich

    2013-01-01

    Software architecture is the foundation of software system development, encompassing a system's architects' and stakeholders' strategic decisions. A special issue of IEEE Software is intended to raise awareness of architecture sustainability issues and increase interest and work in the area. The fir

  15. Architectural geometry

    NARCIS (Netherlands)

    Pottmann, Helmut; Eigensatz, Michael; Vaxman, A.; Wallner, Johannes

    2015-01-01

    Around 2005 it became apparent in the geometry processing community that freeform architecture contains many problems of a geometric nature to be solved, and many opportunities for optimization which however require geometric understanding. This area of research, which has been called architectural

  16. Architectural Contestation

    NARCIS (Netherlands)

    Merle, J.

    2012-01-01

    This dissertation addresses the reductive reading of Georges Bataille's work done within the field of architectural criticism and theory which tends to set aside the fundamental ‘broken’ totality of Bataille's oeuvre and also to narrowly interpret it as a mere critique of architectural form,

  18. Architectural Theatricality

    DEFF Research Database (Denmark)

    Tvedebrink, Tenna Doktor Olsen

    This PhD thesis is motivated by a personal interest in the theoretical, practical and creative qualities of architecture, but also by a wonder and curiosity about the cultural and social relations architecture represents through its occupation with both the sciences and the arts. Inspired by present...... initiatives in Aalborg Hospital to overcome patient undernutrition by refurbishing eating environments, this thesis engages in an investigation of the interior architectural qualities of patient eating environments. The relevance for this holistic perspective, synthesizing health, food and architecture...... environments, and a knowledge gap therefore exists in present hospital designs. Consequently, the purpose of this thesis has been to investigate whether any research-based knowledge exists supporting the hypothesis that the interior architectural qualities of eating environments influence patient food intake, health...

  19. Systemic Architecture

    DEFF Research Database (Denmark)

    Poletto, Marco; Pasquero, Claudia

    2012-01-01

    design protocols developed to describe the city as a territory of self-organization. Collecting together nearly a decade of design experiments by the authors and their practice, ecoLogicStudio, the book discusses key disciplinary definitions such as ecologic urbanism, algorithmic architecture, bottom......This is a manual investigating the subject of urban ecology and systemic development from the perspective of architectural design. It sets out to explore two main goals: to discuss the contemporary relevance of a systemic practice to architectural design, and to share a toolbox of informational......-up or tactical design, behavioural space and the boundary of the natural and the artificial realms within the city and architecture. A new kind of "real-time world-city" is illustrated in the form of an operational design manual for the assemblage of proto-architectures, the incubation of proto...

  20. Architectural Theatricality

    DEFF Research Database (Denmark)

    Tvedebrink, Tenna Doktor Olsen

    and well-being, as well as outline a set of basic design principles ‘predicting’ the future interior architectural qualities of patient eating environments. Methodologically, the thesis is based on an explorative study employing an abductive approach and a hermeneutic-interpretative strategy utilizing tactics......This PhD thesis is motivated by a personal interest in the theoretical, practical and creative qualities of architecture, but also by a wonder and curiosity about the cultural and social relations architecture represents through its occupation with both the sciences and the arts. Inspired by present...... initiatives in Aalborg Hospital to overcome patient undernutrition by refurbishing eating environments, this thesis engages in an investigation of the interior architectural qualities of patient eating environments. The relevance for this holistic perspective, synthesizing health, food and architecture...

  1. Architectural Narratives

    DEFF Research Database (Denmark)

    2010-01-01

    and architectural heritage; another group tries to embed new performative technologies in expressive architectural representation. Finally, this essay provides a theoretical framework for the analysis of the political rationales of these projects and for the architectural representation bridges the gap between......In this essay, I focus on the combination of programs and the architecture of cultural projects that have emerged within the last few years. These projects are characterized as “hybrid cultural projects,” because they intend to combine experience with entertainment, play, and learning. This essay...... identifies new rationales related to this development, and it argues that “cultural planning” has increasingly shifted its focus from a cultural institutional approach to a more market-oriented strategy that integrates art and business. The role of architecture has changed, too. It not only provides...

  2. Architectural Theatricality

    DEFF Research Database (Denmark)

    Tvedebrink, Tenna Doktor Olsen

    , is the current building of a series of Danish ‘super hospitals’ and an increased focus among architectural practices on research-based knowledge produced within the architectural sub-disciplines Healing Architecture and Evidence-Based Design. The problem is that this research does not focus on patient eating...... environments, and a knowledge gap therefore exists in present hospital designs. Consequently, the purpose of this thesis has been to investigate whether any research-based knowledge exists supporting the hypothesis that the interior architectural qualities of eating environments influence patient food intake, health...... and well-being, as well as outline a set of basic design principles ‘predicting’ the future interior architectural qualities of patient eating environments. Methodologically, the thesis is based on an explorative study employing an abductive approach and a hermeneutic-interpretative strategy utilizing tactics...

  3. Architectural Anthropology

    DEFF Research Database (Denmark)

    Stender, Marie

    -anthropology. Within the field of architecture, however, there has not yet been quite the same eagerness to include anthropological approaches in design processes. This paper discusses why this is so and how and whether architectural anthropology has different conditions and objectives than other types of design...... and other spaces that architects are preoccupied with. On the other hand, the distinction between architecture and design is not merely one of scale. Design and architecture represent – at least in Denmark – also quite different disciplinary traditions and methods. Where designers develop prototypes......, and that this will restrict the creative design process. Also, the end user of architecture is not easily identified, as a new building should not just accommodate the needs of specific residents but also those of neighbours, future residents, other citizens and maybe society as such. The paper explores the challenges...

  4. Architecture is always in the middle…

    Directory of Open Access Journals (Sweden)

    Tim Gough

    2015-12-01

    This essay proposes an ontology of architecture that takes its lead from the bread and butter of architecture: a flat ontology opposed to Cartesianism in the sense that no differentiation between realms (body/mind, high/low) is accepted. The work of Spinoza and Deleuze is referred to in order to flesh out such an ontology, whose aim is to destroy the very desire for architecture and architectural theory even to pose the question of the difference between bread-and-butter architecture and high architecture. Architecture is shown to be of the nature of an assemblage, of a machine or a haecceity (to use Deleuze and Guattari's phrase), and the implications of this in relation to the questions of composition and reception are outlined.

  5. The Tera Multithreaded Architecture and Unstructured Meshes

    Science.gov (United States)

    Bokhari, Shahid H.; Mavriplis, Dimitri J.

    1998-01-01

    The Tera Multithreaded Architecture (MTA) is a new parallel supercomputer currently being installed at the San Diego Supercomputing Center (SDSC). This machine has an architecture quite different from contemporary parallel machines. The computational processor is a custom design, and the machine uses hardware to support very fine-grained multithreading. The main memory is shared, hardware-randomized and flat. These features make the machine highly suited to the execution of unstructured mesh problems, which are difficult to parallelize on other architectures. We report the results of a study carried out during July-August 1998 to evaluate the execution of EUL3D, a code that solves the Euler equations on an unstructured mesh, on the 2-processor Tera MTA at SDSC. Our investigation shows that parallelization of an unstructured code is extremely easy on the Tera. We were able to get an existing parallel code (designed for a shared memory machine) running on the Tera by changing only the compiler directives. Furthermore, a serial version of this code was compiled to run in parallel on the Tera by judicious use of directives to invoke the "full/empty" tag bits of the machine to obtain synchronization. This version achieves 212 and 406 Mflop/s on one and two processors, respectively, and requires no attention to partitioning or placement of data, issues that would be of paramount importance in other parallel architectures.

  6. Humanizing Architecture

    DEFF Research Database (Denmark)

    Toft, Tanya Søndergaard

    2015-01-01

    The article proposes the urban digital gallery as an opportunity to explore the relationship between ‘human’ and ‘technology,’ through the programming of media architecture. It takes a curatorial perspective when proposing an ontological shift from considering media facades as visual spectacles...... agency and a sense of being by way of dematerializing architecture. This is achieved by way of programming the symbolic to provide new emotional realizations and situations of enlightenment in the public audience. This reflects a greater potential to humanize the digital in media architecture....

  7. Healing Architecture

    DEFF Research Database (Denmark)

    Folmer, Mette Blicher; Mullins, Michael; Frandsen, Anne Kathrine

    2012-01-01

    The project examines how architecture and design of space in the intensive unit promote or hinder interaction between relatives and patients. The primary starting point is the relatives. Relatives' support of and interaction with their loved ones is important in order to promote the patients' healing...... process. Therefore, knowledge of how space can support interaction is fundamental for the architect in order to make the best design solutions. Several scientific studies document that the hospital's architecture and design are important for human healing processes, including how the physical environment...... architectural and design solutions in order to improve the quality of interaction between relative and patient in the hospital's intensive unit....

  8. Architectural technology

    DEFF Research Database (Denmark)

    2005-01-01

    The booklet offers an overall introduction to the Institute of Architectural Technology and its projects and activities, and an invitation to the reader to contact the institute or the individual researcher for further information. The research, which takes place at the Institute of Architectural...... Technology at the Royal Danish Academy of Fine Arts, School of Architecture, reflects a spread between strategic, goal-oriented pilot projects, commissioned by a ministry, a fund or a private company, and, on the other hand, projects which originate from strong personal interests and enthusiasm of individual...

  10. Grammatical inference algorithms, routines and applications

    CERN Document Server

    Wieczorek, Wojciech

    2017-01-01

    This book focuses on grammatical inference, presenting classic and modern methods of grammatical inference from the perspective of practitioners. To do so, it employs the Python programming language to present all of the methods discussed. Grammatical inference is a field that lies at the intersection of multiple disciplines, with contributions from computational linguistics, pattern recognition, machine learning, computational biology, formal learning theory and many others. Though the book is largely practical, it also includes elements of learning theory, combinatorics on words, the theory of automata and formal languages, plus references to real-world problems. The listings presented here can be directly copied and pasted into other programs, thus making the book a valuable source of ready recipes for students, academic researchers, and programmers alike, as well as an inspiration for their further development.
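As a flavor of what practical grammatical-inference code looks like (a generic illustration, not an excerpt from the book), the sketch below builds a prefix tree acceptor from positive samples; this automaton is the usual starting point for state-merging inference algorithms such as RPNI.

```python
def build_pta(samples):
    """Build a prefix tree acceptor from positive sample words.
    Returns a transition map {(state, symbol): next_state} and the set of
    accepting states; state 0 is the start state."""
    delta, accept = {}, set()
    fresh = iter(range(1, 10**6))  # supply of unused state ids
    for word in samples:
        state = 0
        for ch in word:
            # Reuse an existing transition or allocate a fresh state.
            state = delta.setdefault((state, ch), next(fresh))
        accept.add(state)
    return delta, accept

def accepts(delta, accept, word):
    """Membership test: follow transitions from the start state."""
    state = 0
    for ch in word:
        if (state, ch) not in delta:
            return False
        state = delta[(state, ch)]
    return state in accept

delta, accept = build_pta(["ab", "abb", "aab"])
print(accepts(delta, accept, "abb"))  # True: an observed sample
print(accepts(delta, accept, "ba"))   # False: no such path in the tree
```

The PTA accepts exactly the sample set; inference algorithms then generalize by merging states that are consistent with the (positive and negative) data.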

  11. Architectural Mealscapes

    DEFF Research Database (Denmark)

    Tvedebrink, Tenna Doktor Olsen; Fisker, Anna Marie; Kirkegaard, Poul Henning

    2012-01-01

    the German architect Gottfried Semper developed a theory on the “four elements of Architecture” tracing the origin of architecture back to the rise of the early human settlement and the creation of fire. With the notion ‘hearth’ as the first motive in architecture and the definition of three enclosing...... motives; mounding, enclosure and roof, Semper linked the cultural and social values of the primordial fireplace with the order and shape of architecture. He claimed that any building ever made was nothing but a variation of the first primitive shelters erected around the fireplace, and that the three...... enclosing motives existed only as defenders of the “sacred flame”. In that way Semper developed the idea that any architectural scenery can be described, analyzed and explained by understanding the contextual, symbolic and social values of how the four basic motives of hearth, mounding, enclosure, and roof...

  12. Architectural Theatricality

    DEFF Research Database (Denmark)

    Tvedebrink, Tenna Doktor Olsen; Fisker, Anna Marie; Kirkegaard, Poul Henning

    2013-01-01

    and recovery through the architecture framing eating experiences, this article examines, from a theoretical perspective, two less debated concepts relating to hospitality called food design and architectural theatricality. In architectural theory the nineteenth century German architect Gottfried Semper...... is known for his writings on theatricality, understood as a holistic design approach emphasizing the contextual, cultural, ritual and social meanings rooted in architecture. Relative hereto, the International Food Design Society recently argued, in a similar holistic manner, that the methodology used...... to provide an aesthetic eating experience includes knowledge on both food and design. Based on a hermeneutic reading of Semper’s theory, our thesis is that this holistic design approach is important when debating concepts of hospitality in hospitals. We use this approach to argue for how ‘food design...

  13. Architectural Engineers

    DEFF Research Database (Denmark)

    Petersen, Rikke Premer

    The design professions have always been an amorphous phenomenon, difficult to merge under one label. New constellations continually emerge, questioning, stretching, and reconfiguring the understanding of design and the professional practices linked to it. In this paper the idea of architectural...... engineering is addressed from two perspectives – as an educational response and an occupational constellation. Architecture and engineering are two of the traditional design professions and they frequently meet in the occupational setting, but at educational institutions they remain largely estranged....... The paper builds on a multi-sited study of an architectural engineering program at the Technical University of Denmark and an architectural engineering team within an international engineering consultancy based in Denmark. They are both responding to new tendencies within the building industry where...

  14. SEMANTIC PATCH INFERENCE

    DEFF Research Database (Denmark)

    Andersen, Jesper

    2009-01-01

    Collateral evolution is the problem of updating several library-using programs in response to API changes in the used library. In this dissertation we address the issue of understanding collateral evolutions by automatically inferring a high-level specification of the changes evident in a given set ... specifications inferred by spdiff in Linux are shown. We find that the inferred specifications concisely capture the actual collateral evolution performed in the examples.

  15. Machine testning

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo

    This document is used in connection with a laboratory exercise of 3 hours' duration, part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercise includes a series of tests carried out by the student on a conventional and a numerically controlled lathe, respectively. This document...

  16. Representational Machines

    DEFF Research Database (Denmark)

    Petersson, Dag; Dahlgren, Anna; Vestberg, Nina Lager

    to the enterprises of the medium. This is the subject of Representational Machines: How photography enlists the workings of institutional technologies in search of establishing new iconic and social spaces. Together, the contributions to this edited volume span historical epochs, social environments, technological...

  17. Machine learning phases of matter

    Science.gov (United States)

    Carrasquilla, Juan; Melko, Roger G.

    2017-02-01

    Condensed-matter physics is the study of the collective behaviour of infinitely complex assemblies of electrons, nuclei, magnetic moments, atoms or qubits. This complexity is reflected in the size of the state space, which grows exponentially with the number of particles, reminiscent of the `curse of dimensionality' commonly encountered in machine learning. Despite this curse, the machine learning community has developed techniques with remarkable abilities to recognize, classify, and characterize complex sets of data. Here, we show that modern machine learning architectures, such as fully connected and convolutional neural networks, can identify phases and phase transitions in a variety of condensed-matter Hamiltonians. Readily programmable through modern software libraries, neural networks can be trained to detect multiple types of order parameter, as well as highly non-trivial states with no conventional order, directly from raw state configurations sampled with Monte Carlo.
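The core idea — classifying phases directly from raw state configurations — can be illustrated without a deep learning framework. The sketch below is our toy construction, not the authors' code: a logistic classifier learns to separate synthetic ordered from disordered Ising-like spin configurations via the magnetization, the conventional order parameter.

```python
# Toy phase classifier (illustrative sketch, not the paper's code).
# Ordered configurations have all spins aligned; disordered ones are random.
import numpy as np

rng = np.random.default_rng(0)
N = 64  # spins per configuration

def sample(ordered, n):
    if ordered:
        sign = rng.choice([-1, 1], size=(n, 1))
        return sign * np.ones((n, N))            # all spins aligned (+1 or -1)
    return rng.choice([-1, 1], size=(n, N))      # independent random spins

X = np.vstack([sample(True, 200), sample(False, 200)])
y = np.concatenate([np.ones(200), np.zeros(200)])
m = np.abs(X.mean(axis=1))                       # |magnetization| feature

# Logistic regression on the order parameter, trained by gradient descent.
w, b = 0.0, 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(w * m + b)))
    g = p - y
    w -= 1.0 * (g * m).mean()
    b -= 1.0 * g.mean()

acc = (((1 / (1 + np.exp(-(w * m + b)))) > 0.5) == y).mean()
```

The full networks in the paper learn such a feature from raw configurations; here it is supplied by hand to keep the sketch self-contained.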

  18. Template Matching on Parallel Architectures,

    Science.gov (United States)

    1985-07-01

    memory. The processors run asynchronously. Thus, according to Flynn's categories, the Butterfly is a MIMD machine. The processors of the Butterfly are...Generalized Butterfly Architecture This section describes timings for pattern matching on the generalized Butterfly. The implementations on the Butterfly...these algorithms. Thus the best implementations of the techniques on the generalized Butterfly are the same as the implementations on the real Butterfly

  19. Architectural geometry

    KAUST Repository

    Pottmann, Helmut

    2014-11-26

    Around 2005 it became apparent in the geometry processing community that freeform architecture contains many problems of a geometric nature to be solved, and many opportunities for optimization which however require geometric understanding. This area of research, which has been called architectural geometry, meanwhile contains a great wealth of individual contributions which are relevant in various fields. For mathematicians, the relation to discrete differential geometry is significant, in particular the integrable system viewpoint. Besides, new application contexts have become available for quite some old-established concepts. Regarding graphics and geometry processing, architectural geometry yields interesting new questions but also new objects, e.g. replacing meshes by other combinatorial arrangements. Numerical optimization plays a major role but in itself would be powerless without geometric understanding. Summing up, architectural geometry has become a rewarding field of study. We here survey the main directions which have been pursued, we show real projects where geometric considerations have played a role, and we outline open problems which we think are significant for the future development of both theory and practice of architectural geometry.

  20. Adding machine and calculating machine

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    In 1642 the French mathematician Blaise Pascal (1623-1662) invented a machine that could add and subtract. It had wheels, each with 1 to 10 marked off along its circumference. When the wheel at the right, representing units, made one complete circle, it engaged the wheel to its left, representing tens, and moved it forward one notch.
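The carry mechanism is easy to model in code. The following sketch is an illustrative model of the principle, not a description of the actual gearing: each wheel holds one digit, and a full revolution passes a carry to the wheel on its left.

```python
# Model of the carry mechanism in Pascal's adding machine (illustrative only).
def add(wheels, amount):
    """Add `amount` to a little-endian list of digit wheels (units first)."""
    carry = amount
    for i in range(len(wheels)):
        total = wheels[i] + carry
        wheels[i] = total % 10   # wheel position after turning
        carry = total // 10      # each full revolution engages the next wheel
    return wheels                # carry past the last wheel is lost (overflow)

wheels = [7, 9, 0]               # little-endian digits of 097
add(wheels, 5)                   # wheels is now [2, 0, 1], i.e. 102
```

Reading the digits right-to-left reproduces the decimal sum, just as the machine's dials did.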

  1. Inference in `poor` languages

    Energy Technology Data Exchange (ETDEWEB)

    Petrov, S.

    1996-10-01

    Languages with a solvable implication problem but without complete and consistent systems of inference rules (`poor` languages) are considered. The problem of the existence of a finite complete and consistent inference rule system for a `poor` language is stated independently of the language or rule syntax. Several properties of the problem are proved. An application of the results to the language of join dependencies is given.

  2. Deep Learning for Population Genetic Inference.

    Directory of Open Access Journals (Sweden)

    Sara Sheehan

    2016-03-01

    Full Text Available Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.

  3. Deep Learning for Population Genetic Inference.

    Science.gov (United States)

    Sheehan, Sara; Song, Yun S

    2016-03-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.
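The likelihood-free recipe — simulate under known parameters, reduce the data to summary statistics, and learn the inverse mapping from summaries to parameters — can be sketched on a toy problem. In the sketch below, a plain linear regression stands in for the paper's deep network, and the simulator and parameter range are our invention.

```python
# Minimal likelihood-free inference sketch (our illustration, not the
# authors' code): learn summaries -> parameter from simulated data.
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, n=100):
    """Toy simulator whose output distribution depends on theta."""
    x = rng.normal(theta, 1.0, size=n)
    return np.array([x.mean(), x.var()])      # summary statistics

# Draw training parameters from a prior and simulate their summaries.
thetas = rng.uniform(-2, 2, size=2000)
stats = np.array([simulate(t) for t in thetas])

# Fit a linear map from summaries to parameter (stand-in for a deep net).
A = np.hstack([stats, np.ones((len(stats), 1))])
coef, *_ = np.linalg.lstsq(A, thetas, rcond=None)

# Apply the learned map to fresh "observed" data of unknown parameter.
obs = simulate(1.5)
theta_hat = float(np.append(obs, 1.0) @ coef)
```

No likelihood is ever evaluated: the simulator plus the learned regression replace it, which is the essence of the framework described above.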

  4. Deep Learning for Population Genetic Inference

    Science.gov (United States)

    Sheehan, Sara; Song, Yun S.

    2016-01-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme. PMID:27018908

  5. Architectural Anthropology

    DEFF Research Database (Denmark)

    Stender, Marie

    , while recent material and spatial turns in anthropology have also brought an increasing interest in design, architecture and the built environment. Understanding the relationship between the social and the physical is at the heart of both disciplines, and they can obviously benefit from further......Architecture and anthropology have always had a common focus on dwelling, housing, urban life and spatial organisation. Current developments in both disciplines make it even more relevant to explore their boundaries and overlaps. Architects are inspired by anthropological insights and methods...... collaboration: How can qualitative anthropological approaches contribute to contemporary architecture? And just as importantly: What can anthropologists learn from architects’ understanding of spatial and material surroundings? Recent theoretical developments in anthropology stress the role of materials...

  6. Architectural Narratives

    DEFF Research Database (Denmark)

    2010-01-01

    In this essay, I focus on the combination of programs and the architecture of cultural projects that have emerged within the last few years. These projects are characterized as “hybrid cultural projects,” because they intend to combine experience with entertainment, play, and learning. This essay...... identifies new rationales related to this development, and it argues that “cultural planning” has increasingly shifted its focus from a cultural institutional approach to a more market-oriented strategy that integrates art and business. The role of architecture has changed, too. It not only provides...... a functional framework for these concepts, but tries increasingly to endow the main idea of the cultural project with a spatially aesthetic expression - a shift towards “experience architecture.” A great number of these projects typically recycle and reinterpret narratives related to historical buildings...

  7. Genesis machines

    CERN Document Server

    Amos, Martyn

    2014-01-01

    Silicon chips are out. Today's scientists are using real, wet, squishy, living biology to build the next generation of computers. Cells, gels and DNA strands are the 'wetware' of the twenty-first century. Much smaller and more intelligent, these organic computers open up revolutionary possibilities. Tracing the history of computing and revealing a brave new world to come, Genesis Machines describes how this new technology will change the way we think not just about computers - but about life itself.

  8. Semantic Vector Machines

    CERN Document Server

    Vincent, Etter

    2011-01-01

    We first present our work in machine translation, during which we used aligned sentences to train a neural network to embed n-grams of different languages into an $d$-dimensional space, such that n-grams that are the translation of each other are close with respect to some metric. Good n-grams to n-grams translation results were achieved, but full sentences translation is still problematic. We realized that learning semantics of sentences and documents was the key for solving a lot of natural language processing problems, and thus moved to the second part of our work: sentence compression. We introduce a flexible neural network architecture for learning embeddings of words and sentences that extract their semantics, propose an efficient implementation in the Torch framework and present embedding results comparable to the ones obtained with classical neural language models, while being more powerful.

  9. Multithreading architecture

    CERN Document Server

    Nemirovsky, Mario

    2013-01-01

    Multithreaded architectures now appear across the entire range of computing devices, from the highest-performing general purpose devices to low-end embedded processors. Multithreading enables a processor core to more effectively utilize its computational resources, as a stall in one thread need not cause execution resources to be idle. This enables the computer architect to maximize performance within area constraints, power constraints, or energy constraints. However, the architectural options for the processor designer or architect looking to implement multithreading are quite extensive and

  10. Machine Learning in Parliament Elections

    Directory of Open Access Journals (Sweden)

    Ahmad Esfandiari

    2012-09-01

    Full Text Available Parliament is considered one of the most important pillars of country governance. Parliamentary elections, and predicting their outcome, have long been studied by scholars from various fields such as political science. Some important features are used to model the results of consultative parliament elections: reputation and popularity, political orientation, tradesmen's support, clergymen's support, support from political wings, and the type of supportive wing. Two parameters, reputation and popularity and the support of clergymen and religious scholars, which have the greatest impact in reducing prediction error, have been used as input parameters in the implementation. In this study, the Iranian parliamentary elections are modeled and predicted using two learnable machines: a neural network and a neuro-fuzzy machine. The neuro-fuzzy machine combines the knowledge-representation ability of fuzzy sets with the learning power of neural networks. In predicting social and political behavior, the neural network is first trained by two learning algorithms on the training data set and then predicts the result on the test data. Next, the neuro-fuzzy inference machine is trained. Finally, the results of the two machines are compared.

  11. Parallel mutual information estimation for inferring gene regulatory networks on GPUs

    Directory of Open Access Journals (Sweden)

    Liu Weiguo

    2011-06-01

    Full Text Available Abstract Background Mutual information is a measure of similarity between two variables. It has been widely used in various application domains including computational biology, machine learning, statistics, image processing, and financial computing. Previously used simple histogram based mutual information estimators lack the precision in quality compared to kernel based methods. The recently introduced B-spline function based mutual information estimation method is competitive to the kernel based methods in terms of quality but at a lower computational complexity. Results We present a new approach to accelerate the B-spline function based mutual information estimation algorithm with commodity graphics hardware. To derive an efficient mapping onto this type of architecture, we have used the Compute Unified Device Architecture (CUDA) programming model to design and implement a new parallel algorithm. Our implementation, called CUDA-MI, can achieve speedups of up to 82 using double precision on a single GPU compared to a multi-threaded implementation on a quad-core CPU for large microarray datasets. We have used the results obtained by CUDA-MI to infer gene regulatory networks (GRNs) from microarray data. The comparisons to existing methods including ARACNE and TINGe show that CUDA-MI produces GRNs of higher quality in less time. Conclusions CUDA-MI is publicly available open-source software, written in CUDA and C++ programming languages. It obtains significant speedup over sequential multi-threaded implementation by fully exploiting the compute capability of commonly used CUDA-enabled low-cost GPUs.
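For reference, the simple histogram-based estimator that the B-spline method improves upon fits in a few lines of NumPy. This is our own sketch of the baseline technique, unrelated to the CUDA-MI code.

```python
# Histogram-based mutual information estimator (the simple baseline
# the abstract contrasts with B-spline and kernel estimators).
import numpy as np

def mutual_information(x, y, bins=16):
    """Estimate I(X;Y) in nats from equal-width 2D histogram counts."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                       # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)    # marginal of x
    py = pxy.sum(axis=0, keepdims=True)    # marginal of y
    nz = pxy > 0                           # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
a = rng.normal(size=5000)
b = a + 0.1 * rng.normal(size=5000)        # strongly dependent on a
c = rng.normal(size=5000)                  # independent of a
# mutual_information(a, b) is large; mutual_information(a, c) is near zero.
```

The binning is what limits precision: the small positive value returned for independent variables is pure discretization bias, which the B-spline generalization reduces.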

  12. Self assembly of interlocked architectures

    CERN Document Server

    Schergna, S

    2002-01-01

    An area of great interest is the synthesis and characterisation of molecules possessing moving parts, with the goal that they can act as 'molecular machines', carrying out tasks that molecules with fixed, conventional architectures cannot. Rotaxanes and catenanes (mechanically interlocked architectures) represent one approach toward achieving these aims, as their component wheels and/or threads are connected together but can still move in certain, controlled directions. This thesis focused on the study of structural rigidity and the preorganisation of thread binding sites as factors of major influence on template efficiency in the synthesis of hydrogen-bond-assembled supramolecular structures (rotaxanes and catenanes). Chapter One gives a brief outline of the common synthetic approaches to interlocked architectures (catenanes and rotaxanes) that are now being developed to address the problems outlined above. Chapters Two and Three concern the synthesis of novel amide-based rotaxanes containing vario...

  13. Kinematic Analysis of a New Parallel Machine Tool: the Orthoglide

    CERN Document Server

    Wenger, Philippe

    2007-01-01

    This paper describes a new parallel kinematic architecture for machining applications: the orthoglide. This machine features three fixed parallel linear joints which are mounted orthogonally and a mobile platform which moves in the Cartesian x-y-z space with fixed orientation. The main interest of the orthoglide is that it takes benefit from the advantages of the popular PPP serial machines (regular Cartesian workspace shape and uniform performances) as well as from the parallel kinematic arrangement of the links (less inertia and better dynamic performances), which makes the orthoglide well suited to high-speed machining applications. Possible extension of the orthoglide to 5-axis machining is also investigated.

  14. A New Three-DOF Parallel Mechanism: Milling Machine Applications

    CERN Document Server

    Chablat, Damien

    2000-01-01

    This paper describes a new parallel kinematic architecture for machining applications, namely, the orthoglide. This machine features three fixed parallel linear joints which are mounted orthogonally and a mobile platform which moves in the Cartesian x-y-z space with fixed orientation. The main interest of the orthoglide is that it takes benefit from the advantages of the popular PPP serial machines (regular Cartesian workspace shape and uniform performances) as well as from the parallel kinematic arrangement of the links (less inertia and better dynamic performances), which makes the orthoglide well suited to high-speed machining applications. Possible extension of the orthoglide to 5-axis machining is also investigated.

  15. Reusable State Machine Code Generator

    Science.gov (United States)

    Hoffstadt, A. A.; Reyes, C.; Sommer, H.; Andolfato, L.

    2010-12-01

    The State Machine model is frequently used to represent the behaviour of a system, allowing one to express and execute this behaviour in a deterministic way. A graphical representation such as a UML State Chart diagram tames the complexity of the system, thus facilitating changes to the model and communication between developers and domain experts. We present a reusable state machine code generator, developed by the Universidad Técnica Federico Santa María and the European Southern Observatory. The generator itself is based on the open source project architecture, and uses UML State Chart models as input. This allows for a modular design and a clean separation between generator and generated code. The generated state machine code has well-defined interfaces that are independent of the implementation artefacts such as the middle-ware. This allows using the generator in the substantially different observatory software of the Atacama Large Millimeter Array and the ESO Very Large Telescope. A project-specific mapping layer for event and transition notification connects the state machine code to its environment, which can be the Common Software of these projects, or any other project. This approach even allows to automatically create tests for a generated state machine, using techniques from software testing, such as path-coverage.
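The transition-table-driven style of code such a generator emits can be sketched in a few lines. The class, state and event names, and the notification hook below are our invention for illustration, not the actual generated interface.

```python
# Sketch of transition-table-driven state machine code with a pluggable
# notification hook (hypothetical names, not the generator's output).
class StateMachine:
    TRANSITIONS = {                     # (state, event) -> next state
        ("Idle", "start"): "Running",
        ("Running", "pause"): "Paused",
        ("Paused", "start"): "Running",
        ("Running", "stop"): "Idle",
        ("Paused", "stop"): "Idle",
    }

    def __init__(self, notify=None):
        self.state = "Idle"
        self.notify = notify            # project-specific mapping-layer hook

    def handle(self, event):
        key = (self.state, event)
        if key not in self.TRANSITIONS:
            raise ValueError(f"no transition for {key}")
        old, self.state = self.state, self.TRANSITIONS[key]
        if self.notify:                 # notify environment of the transition
            self.notify(old, event, self.state)
        return self.state

sm = StateMachine()
sm.handle("start")                      # -> "Running"
sm.handle("pause")                      # -> "Paused"
```

Keeping the table separate from the dispatch logic is what makes the generated code testable by path-coverage techniques, as the abstract notes: every (state, event) pair is an explicit datum.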

  16. Textile Architecture

    DEFF Research Database (Denmark)

    Heimdal, Elisabeth Jacobsen

    2010-01-01

    Textiles can be used as building skins, adding new aesthetic and functional qualities to architecture. Just like we as humans can put on a coat, buildings can also get dressed. Depending on our mood, or on the weather, we can change coat, and so can the building. But the idea of using textiles...

  17. Architectural Tops

    Science.gov (United States)

    Mahoney, Ellen

    2010-01-01

    The development of the skyscraper is an American story that combines architectural history, economic power, and technological achievement. Each city in the United States can be identified by the profile of its buildings. The design of the tops of skyscrapers was the inspiration for the students in the author's high-school ceramic class to develop…

  18. Architectural Mealscapes

    DEFF Research Database (Denmark)

    Tvedebrink, Tenna Doktor Olsen; Fisker, Anna Marie; Kirkegaard, Poul Henning

    2012-01-01

    are joined together. The purpose of this paper has therefore been to test the idea of a new paradigm for ‘Interior Design for Food’, taking into account the reflective perspective and critical thinking of architectural theory, for instance as developed by Semper, when studying the eating environment...

  19. Architectural Illusion.

    Science.gov (United States)

    Doornek, Richard R.

    1990-01-01

    Presents a lesson plan developed around the work of architectural muralist Richard Haas. Discusses the significance of mural painting and gives key concepts for the lesson. Lists class activities for the elementary and secondary grades. Provides a photograph of the Haas mural on the Fountainbleau Hilton Hotel, 1986. (GG)

  20. Religious Architecture

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The main religions of ancient China were Buddhism,Taoism and Islam, of which Buddhism was the most widespread. As a result, Buddhist temples and towers are found all over China, and have become important components of the country's ancient architecture.

  1. Textile Architecture

    DEFF Research Database (Denmark)

    Heimdal, Elisabeth Jacobsen

    2010-01-01

    Textiles can be used as building skins, adding new aesthetic and functional qualities to architecture. Just like we as humans can put on a coat, buildings can also get dressed. Depending on our mood, or on the weather, we can change coat, and so can the building. But the idea of using textiles...

  2. Knowledge and inference

    CERN Document Server

    Nagao, Makoto

    1990-01-01

    Knowledge and Inference discusses an important problem for software systems: How do we treat knowledge and ideas on a computer and how do we use inference to solve problems on a computer? The book talks about the problems of knowledge and inference for the purpose of merging artificial intelligence and library science. The book begins by clarifying the concept of "knowledge" from many points of view, followed by a chapter on the current state of library science and the place of artificial intelligence in library science. Subsequent chapters cover central topics in the artificial intellig

  3. Environmentally Friendly Machining

    CERN Document Server

    Dixit, U S; Davim, J Paulo

    2012-01-01

    Environment-Friendly Machining provides an in-depth overview of environmentally-friendly machining processes, covering numerous different types of machining in order to identify which practice is the most environmentally sustainable. The book discusses three systems at length: machining with minimal cutting fluid, air-cooled machining and dry machining. Also covered is a way to conserve energy during machining processes, along with useful data and detailed descriptions for developing and utilizing the most efficient modern machining tools. Researchers and engineers looking for sustainable machining solutions will find Environment-Friendly Machining to be a useful volume.

  4. Mocapy++ - a toolkit for inference and learning in dynamic Bayesian networks

    DEFF Research Database (Denmark)

    Paluszewski, Martin; Hamelryck, Thomas Wim

    2010-01-01

    Background Mocapy++ is a toolkit for parameter learning and inference in dynamic Bayesian networks (DBNs). It supports a wide range of DBN architectures and probability distributions, including distributions from directional statistics (the statistics of angles, directions and orientations...

  5. Machine Transliteration

    CERN Document Server

    Knight, K; Knight, Kevin; Graehl, Jonathan

    1997-01-01

    It is challenging to translate names and technical terms across languages with different alphabets and sound inventories. These items are commonly transliterated, i.e., replaced with approximate phonetic equivalents. For example, "computer" in English comes out as "konpyuutaa" in Japanese. Translating such items from Japanese back to English is even more challenging, and of practical interest, as transliterated items make up the bulk of text phrases not found in bilingual dictionaries. We describe and evaluate a method for performing backwards transliterations by machine. This method uses a generative model, incorporating several distinct stages in the transliteration process.

  6. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  7. High-Tech Architecture

    Directory of Open Access Journals (Sweden)

    Özlem Eşsiz

    1999-05-01

    Full Text Available The technological developments in the building and construction area bring with them new construction systems and new products. The aim of this study is to define the High Tech concept and to set out the common, basic characteristics of High Tech applications. High Tech was born and developed in Britain during the 1970s. The British architects Richard Rogers, Michael Hopkins, Norman Foster, Nicholas Grimshaw and Ian Ritchie in particular are the leaders of this style. Their architecture shows the machine aesthetic and the use of industrial revolution materials such as glass and steel. The reasons for the wide use of this technology in building construction are the ease of renewing the structural and installation systems as technology changes, and the monumentality it lends to prestige buildings. High Tech buildings, of which we have many examples, give their occupants many opportunities and can also adapt themselves over time.

  8. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  9. MUF architecture /art London

    DEFF Research Database (Denmark)

    Svenningsen Kajita, Heidi

    2009-01-01

    On muf architecture, including an interview with Liza Fior and Katherine Clarke, partners in muf architecture/art.

  10. Machine Protection

    CERN Document Server

    Schmidt, R

    2014-01-01

    The protection of accelerator equipment is as old as accelerator technology and was for many years related to high-power equipment. Examples are the protection of powering equipment from overheating (magnets, power converters, high-current cables), of superconducting magnets from damage after a quench and of klystrons. The protection of equipment from beam accidents is more recent. It is related to the increasing beam power of high-power proton accelerators such as ISIS, SNS, ESS and the PSI cyclotron, to the emission of synchrotron light by electron–positron accelerators and FELs, and to the increase of energy stored in the beam (in particular for hadron colliders such as LHC). Designing a machine protection system requires an excellent understanding of accelerator physics and operation to anticipate possible failures that could lead to damage. Machine protection includes beam and equipment monitoring, a system to safely stop beam operation (e.g. dumping the beam or stopping the beam at low energy) and an ...

  11. A Bayesian least squares support vector machines based framework for fault diagnosis and failure prognosis

    Science.gov (United States)

    Khawaja, Taimoor Saleem

    and any abnormal or novel data during real-time operation. The results of the scheme are interpreted as a posterior probability of health (1 - probability of fault). As shown through two case studies in Chapter 3, the scheme is well suited for diagnosing imminent faults in dynamical non-linear systems. Finally, the failure prognosis scheme is based on an incremental weighted Bayesian LS-SVR machine. It is particularly suited for online deployment given the incremental nature of the algorithm and the quick optimization problem solved in the LS-SVR algorithm. By way of kernelization and a Gaussian Mixture Modeling (GMM) scheme, the algorithm can estimate "possibly" non-Gaussian posterior distributions for complex non-linear systems. An efficient regression scheme associated with the more rigorous core algorithm allows for long-term predictions, fault growth estimation with confidence bounds and remaining useful life (RUL) estimation after a fault is detected. The leading contributions of this thesis are (a) the development of a novel Bayesian Anomaly Detector for efficient and reliable Fault Detection and Identification (FDI) based on Least Squares Support Vector Machines, (b) the development of a data-driven real-time architecture for long-term Failure Prognosis using Least Squares Support Vector Machines, (c) Uncertainty representation and management using Bayesian Inference for posterior distribution estimation and hyper-parameter tuning, and finally (d) the statistical characterization of the performance of diagnosis and prognosis algorithms in order to relate the efficiency and reliability of the proposed schemes.

  12. Multilanguage parallel programming of heterogeneous machines

    Energy Technology Data Exchange (ETDEWEB)

    Bisiani, R.; Forin, A.

    1988-08-01

    The authors designed and implemented a system, Agora, that supports the development of multilanguage parallel applications for heterogeneous machines. Agora hinges on two ideas: the first one is that shared memory can be a suitable abstraction to program concurrent, multilanguage modules running on heterogeneous machines. The second one is that a shared memory abstraction can be efficiently supported across different computer architectures that are not connected by a physical shared memory, for example local area network workstations or ensemble machines. Agora has been in use for more than a year. This paper describes the Agora shared memory and its software implementation on both tightly and loosely coupled architectures. Measurements of the current implementation are also included.
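    The second idea, a shared-memory abstraction with no physical shared memory, can be illustrated with a toy write-invalidate scheme; the class, node names, and protocol below are invented for illustration and do not reflect Agora's actual design:

```python
class SharedMemory:
    """Toy software shared memory: one master copy plus per-node caches with
    write-invalidate, standing in for a shared abstraction across machines
    that have no physical shared memory (illustrative only)."""
    def __init__(self):
        self.master, self.caches = {}, {}

    def attach(self, node):
        self.caches[node] = {}

    def read(self, node, key):
        cache = self.caches[node]
        if key not in cache:                      # miss: fetch master copy
            cache[key] = self.master[key]
        return cache[key]

    def write(self, node, key, value):
        self.master[key] = value
        for n, cache in self.caches.items():      # invalidate stale copies
            if n != node:
                cache.pop(key, None)
        self.caches[node][key] = value

mem = SharedMemory()
mem.attach("vax")
mem.attach("sun")
mem.write("vax", "frame", 42)
print(mem.read("sun", "frame"))   # sun observes vax's write
```

    A real implementation must also order concurrent writes and batch invalidations over the network; the sketch only shows the abstraction the programmer sees.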

  13. Catalyst Architecture

    DEFF Research Database (Denmark)

    the projects as case studies, which contribute with strategic knowledge rather than generalizing from average considerations. These are ‘strategic projects’ where we have looked for the specific and the particular (Flyvbjerg 1991). According to the case studies, we use the case study method developed by Bent......’ interpretations and architectural strategies are included in the analyses. This implies that there is a large variation of empirical knowledge about the selected problems. That is the reason why we give a short introduction to the exact use of approaches and methods in the beginning of each case study. Based...... in experience? Which design qualities do the best examples of architecture as urban catalysts have, and how can we as citizens, politicians and professionals use knowledge about this in the development of our cities as good places to live? We wish to throw light on these key questions through case studies...

  14. Kosmos = architecture

    Directory of Open Access Journals (Sweden)

    Tine Kurent

    1985-12-01

    Full Text Available The old Greek word "kosmos" means not only "cosmos", but also "the beautiful order", "the way of building", "building", "scenography", "mankind", and, in the time of the New Testament, also "pagans". The word "arhitekton", meaning first the "master of theatrical scenography", acquired the meaning of "builder" when the words "kosmos" and "kosmetes" became pejorative. The fear that architecture was not considered one of the arts before the Renaissance, since none of the Muses supervised the art of building, results from a misunderstanding of the word "kosmos". Urania was the Goddess of the activity implied in the verb "kosmein", meaning "to put in the beautiful order" - everything, from the universe to man-made space, i.e. architecture.

  15. Architectural Engineers

    DEFF Research Database (Denmark)

    Petersen, Rikke Premer

    The design professions have always been an amorphous phenomena difficult to merge under one label. New constellations continually emerge, questioning, stretching, and reconfiguring the understanding of design and the professional practices linked to it. In this paper the idea of architectural eng...... with new types of competences and be able to manoeuvre in new types of constellations, but concurrently core competences must be preserved and the time of study kept at a minimum....

  16. Architectural Theatricality

    DEFF Research Database (Denmark)

    Tvedebrink, Tenna Doktor Olsen

    putting an emphasis on architecture as unified scenery guided by the four motives hearth, enclosure, dressing and context. This theoretical framework draws on the Gastronomic Analogy put forth by James Fergusson in 1862 and an interpretation of the writings of the 19th century architect Gottfried Semper...... with the material appearance of objects, but also the imaginary world of dreams and memories which are concealed with the communicative significance of intentions when designing the future super hospitals....

  17. Pi: A Parallel Architecture Interface for Multi-Model Execution

    Science.gov (United States)

    1990-07-01

    Directory Schemes for Cache Coherence. In The 15th Annual International Symposium on Computer Architecture. IEEE Computer Society and ACM, June 1988. [3...Annual International Symposium on Computer Architecture. IEEE Computer Society and ACM, June 1986. [5] Arvind and Rishiyur S. Nikhil. A Dataflow...Overview, 1987. [9] Roberto Bisiani and Alessandro Forin. Multilanguage Parallel Programming of Heterogeneous Machines. IEEE Transactions on Computers

  18. Frances: A Tool for Understanding Computer Architecture and Assembly Language

    Science.gov (United States)

    Sondag, Tyler; Pokorny, Kian L.; Rajan, Hridesh

    2012-01-01

    Students in all areas of computing require knowledge of the computing device including software implementation at the machine level. Several courses in computer science curricula address these low-level details such as computer architecture and assembly languages. For such courses, there are advantages to studying real architectures instead of…

  20. Advanced customization in architectural design and construction

    CERN Document Server

    Naboni, Roberto

    2015-01-01

    This book presents the state of the art in advanced customization within the sector of architectural design and construction, explaining important new technologies that are boosting design, product and process innovation and identifying the challenges to be confronted as we move toward a mass customization construction industry. Advanced machinery and software integration are discussed, as well as an overview of the manufacturing techniques offered through digital methods that are acquiring particular significance within the field of digital architecture. CNC machining, Robotic Fabrication, and Additive Manufacturing processes are all clearly explained, highlighting their ability to produce personalized architectural forms and unique construction components. Cutting-edge case studies in digitally fabricated architectural realizations are described and, looking towards the future, a new model of 100% customized architecture for design and construction is presented. The book is an excellent guide to the profoun...

  1. Learning Extended Finite State Machines

    Science.gov (United States)

    Cassel, Sofia; Howar, Falk; Jonsson, Bengt; Steffen, Bernhard

    2014-01-01

    We present an active learning algorithm for inferring extended finite state machines (EFSMs), combining data flow and control behavior. Key to our learning technique is a novel learning model based on so-called tree queries. The learning algorithm uses the tree queries to infer symbolic data constraints on parameters, e.g., sequence numbers, time stamps, identifiers, or even simple arithmetic. We describe sufficient conditions for the properties that the symbolic constraints provided by a tree query in general must have to be usable in our learning model. We have evaluated our algorithm in a black-box scenario, where tree queries are realized through (black-box) testing. Our case studies include connection establishment in TCP and a priority queue from the Java Class Library.
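    The kind of model being inferred can be sketched as follows; the guard/update representation and the TCP-like handshake fragment are illustrative, not the paper's learned model:

```python
class EFSM:
    """Tiny extended finite state machine: transitions carry guards and
    updates over registers, combining control states with data flow
    (a toy sketch of the model class, not the learned TCP model)."""
    def __init__(self, transitions, state="closed", registers=None):
        self.t, self.state, self.regs = transitions, state, registers or {}

    def step(self, symbol, param):
        for (src, sym, guard, update, dst) in self.t:
            if src == self.state and sym == symbol and guard(self.regs, param):
                update(self.regs, param)
                self.state = dst
                return True
        return False                      # no enabled transition: reject

# Handshake fragment: the ACK must carry the stored sequence number plus one.
trans = [
    ("closed", "SYN", lambda r, p: True,
     lambda r, p: r.__setitem__("seq", p), "syn_sent"),
    ("syn_sent", "ACK", lambda r, p: p == r["seq"] + 1,
     lambda r, p: None, "open"),
]
m = EFSM(trans)
m.step("SYN", 100)
print(m.step("ACK", 101), m.state)
```

    The symbolic constraint `p == seq + 1` is exactly the kind of data relation the tree-query algorithm has to discover from black-box tests.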

  2. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation term controls the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method that can be formulated as a likelihood maximisation. Experiments show that PPRs can easily be used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually, since an abnormality degree is obtained for each observation.
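    A toy numeric illustration of the general idea (not the authors' algorithm): floor each observation's likelihood so abnormally frequent points stop biasing a Gaussian mean estimate; the threshold `t` plays the role of the regularised reinforcement level and is an assumption of this sketch:

```python
import numpy as np

def reinforced_mean(x, t=1e-3, n_iter=20):
    """Toy robust mean estimate in the spirit of pointwise probability
    reinforcements: each point's Gaussian likelihood is floored at t, so
    points with density below t contribute a constant to the likelihood
    and stop pulling the estimate (illustrative, not the paper's method)."""
    mu, sigma = np.median(x), np.std(x)
    for _ in range(n_iter):
        p = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        w = (p > t).astype(float)          # "reinforced" points get zero weight
        mu = np.sum(w * x) / np.sum(w)
        sigma = np.sqrt(np.sum(w * (x - mu) ** 2) / np.sum(w)) + 1e-12
    return mu

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 200), np.full(10, 50.0)])  # 10 outliers
print(np.mean(data), reinforced_mean(data))   # contaminated vs robust estimate
```

    The indicator weight `w` also gives the abnormality degree mentioned in the abstract: points that end up floored are exactly the ones flagged as AFDs.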

  3. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single- and multiple-point cutting processes; grinding; component accuracy and metrology; shear stress in cutting; cutting temperature and analysis; chatter. They also address non-traditional machining, such as: electrical discharge machining; electrochemical machining; laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  4. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code are an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  5. The Warp computer: Architecture, implementation, and performance

    Energy Technology Data Exchange (ETDEWEB)

    Annaratone, M.; Arnould, E.; Gross, T.; Kung, H.T.; Lam, M.; Menzilcioglu, O.; Webb, J.A.

    1987-12-01

    The Warp machine is a systolic array computer of linearly connected cells, each of which is a programmable processor capable of performing 10 million floating-point operations per second (10 MFLOPS). A typical Warp array includes ten cells, thus having a peak computation rate of 100 MFLOPS. The Warp array can be extended to include more cells to accommodate applications capable of using the increased computational bandwidth. Warp is integrated as an attached processor into a Unix host system. Programs for Warp are written in a high-level language supported by an optimizing compiler. This paper describes the architecture, implementation, and performance of the Warp machine. Each major architectural decision is discussed and evaluated with system, software, and application considerations. The programming model and tools developed for the machine are also described. The paper concludes with performance data for a large number of applications.

  6. Temperature based Restricted Boltzmann Machines.

    Science.gov (United States)

    Li, Guoqi; Deng, Lei; Xu, Yi; Wen, Changyun; Wang, Wei; Pei, Jing; Shi, Luping

    2016-01-13

    Restricted Boltzmann machines (RBMs), which apply graphical models to learning probability distribution over a set of inputs, have attracted much attention recently since being proposed as building blocks of multi-layer learning systems called deep belief networks (DBNs). Note that temperature is a key factor of the Boltzmann distribution that RBMs originate from. However, none of the existing schemes have considered the impact of temperature in the graphical model of DBNs. In this work, we propose temperature based restricted Boltzmann machines (TRBMs), which reveal that temperature is an essential parameter controlling the selectivity of the firing neurons in the hidden layers. We theoretically prove that the effect of temperature can be adjusted by setting the parameter of the sharpness of the logistic function in the proposed TRBMs. The performance of RBMs can be improved by adjusting the temperature parameter of TRBMs. This work provides comprehensive insight into deep belief networks and deep learning architectures from a physical point of view.
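    The role of temperature as a sharpness parameter of the logistic function can be seen directly. A minimal sketch, assuming the standard RBM conditional with temperature folded in as p(h=1|v) = sigmoid((Wv + b)/T); the weights and data below are random placeholders:

```python
import numpy as np

def hidden_activation(v, W, b, T=1.0):
    """Hidden-unit firing probabilities of an RBM with a temperature
    parameter T controlling logistic sharpness: p(h=1|v) = sigmoid((Wv+b)/T).
    Sketch of the TRBM idea; W, b, v here are arbitrary illustrative values."""
    pre = W @ v + b
    return 1.0 / (1.0 + np.exp(-pre / T))

rng = np.random.default_rng(1)
W = rng.normal(size=(8, 5))
b = rng.normal(size=8)
v = rng.normal(size=5)

cold = hidden_activation(v, W, b, T=0.1)   # low temperature: sharp, selective
hot = hidden_activation(v, W, b, T=10.0)   # high temperature: near-uniform 0.5
print(np.abs(cold - 0.5).mean() > np.abs(hot - 0.5).mean())
```

    Lowering T pushes every firing probability toward 0 or 1 (more selective hidden units); raising T flattens them toward 0.5, which is the selectivity effect the abstract describes.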

  7. Nanorobot architecture for medical target identification

    Science.gov (United States)

    Cavalcanti, Adriano; Shirinzadeh, Bijan; Freitas, Robert A., Jr.; Hogg, Tad

    2008-01-01

    This work presents an innovative approach to the development of nanorobots with sensors for medicine. The nanorobots operate in a virtual environment comparing random, thermal and chemical control techniques. The nanorobot architecture model has nanobioelectronics as the basis for manufacturing integrated system devices with embedded nanobiosensors and actuators, which facilitates its application for medical target identification and drug delivery. The nanorobot interaction with the described workspace shows how time actuation is improved based on sensor capabilities. Therefore, our work addresses the control and the architecture design for developing practical molecular machines. Advances in nanotechnology are enabling manufacturing nanosensors and actuators through nanobioelectronics and biologically inspired devices. Analysis of integrated system modeling is one important aspect for supporting nanotechnology in the fast development towards one of the most challenging new fields of science: molecular machines. The use of 3D simulation can provide interactive tools for addressing nanorobot choices on sensing, hardware architecture design, manufacturing approaches, and control methodology investigation.

  8. Nanorobot architecture for medical target identification

    Energy Technology Data Exchange (ETDEWEB)

    Cavalcanti, Adriano [CAN Center for Automation in Nanobiotech, Melbourne VIC 3168 (Australia); Shirinzadeh, Bijan [Robotics and Mechatronics Research Laboratory, Department of Mechanical Engineering, Monash University, Clayton, Melbourne VIC 3800 (Australia); Freitas, Robert A Jr [Institute for Molecular Manufacturing, Pilot Hill, CA 95664 (United States); Hogg, Tad [Hewlett-Packard Laboratories, Palo Alto, CA 94304 (United States)

    2008-01-09

    This work presents an innovative approach to the development of nanorobots with sensors for medicine. The nanorobots operate in a virtual environment comparing random, thermal and chemical control techniques. The nanorobot architecture model has nanobioelectronics as the basis for manufacturing integrated system devices with embedded nanobiosensors and actuators, which facilitates its application for medical target identification and drug delivery. The nanorobot interaction with the described workspace shows how time actuation is improved based on sensor capabilities. Therefore, our work addresses the control and the architecture design for developing practical molecular machines. Advances in nanotechnology are enabling manufacturing nanosensors and actuators through nanobioelectronics and biologically inspired devices. Analysis of integrated system modeling is one important aspect for supporting nanotechnology in the fast development towards one of the most challenging new fields of science: molecular machines. The use of 3D simulation can provide interactive tools for addressing nanorobot choices on sensing, hardware architecture design, manufacturing approaches, and control methodology investigation.

  9. Machine learning methods for planning

    CERN Document Server

    Minton, Steven

    1993-01-01

    Machine Learning Methods for Planning provides information pertinent to learning methods for planning and scheduling. This book covers a wide variety of learning methods and learning architectures, including analogical, case-based, decision-tree, explanation-based, and reinforcement learning.Organized into 15 chapters, this book begins with an overview of planning and scheduling and describes some representative learning systems that have been developed for these tasks. This text then describes a learning apprentice for calendar management. Other chapters consider the problem of temporal credi

  10. Architectural dreaming

    Institute of Scientific and Technical Information of China (English)

    Mark Godfrey

    2004-01-01

    For the first 800 years of its existence, Beijing retained essentially the same character: a walled palace city at its centre, organised on a strict north-south axis and contained within a sea of courtyard houses along lanes too narrow for cars. Today Beijing’s basic unit of architectural scale has become the skyscraper, erected a dozen at a time and facing massive highways. Ancient buildings are so small and unloved by comparison that they threaten to disappear from view. Fuelled by an economic boom, and propelled by banks flush with mortgage cash, Beijing has embarked on the largest building campaign the world has ever seen.

  11. Catalyst Architecture

    DEFF Research Database (Denmark)

    Kiib, Hans; Marling, Gitte; Hansen, Peter Mandal

    2014-01-01

    of programs, have a role in mediating positive social and/or cultural development. In this sense, we talk about architecture as a catalyst for: sustainable adaptation of the city’s infrastructure appropriate renovation of dilapidated urban districts strengthening of social cohesiveness in the city development...... meaningful for everyone. The exhibited works are designed by SANAA, Diller Scofidio + Renfro, James Corner Field Operation, JBMC Arquitetura e Urbanismo, Atelier Bow-Wow, Ateliers Jean Nouvel, COBE, Transform, BIG, Topotek1, Superflex, and by visual artist Jane Maria Petersen....

  12. The Bayes Inference Engine

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, K.M.; Cunningham, G.S.

    1996-04-01

    The authors are developing a computer application, called the Bayes Inference Engine, to provide the means to make inferences about models of physical reality within a Bayesian framework. The construction of complex nonlinear models is achieved by a fully object-oriented design. The models are represented by a data-flow diagram that may be manipulated by the analyst through a graphical programming environment. Maximum a posteriori solutions are achieved using a general, gradient-based optimization algorithm. The application incorporates a new technique of estimating and visualizing the uncertainties in specific aspects of the model.
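    A minimal stand-in for the Engine's gradient-based maximum a posteriori optimization, using a linear-Gaussian model so the result can be checked against the closed form (the real application handles nonlinear data-flow models; all names and values below are illustrative):

```python
import numpy as np

def map_estimate(y, A, sigma=0.1, tau=1.0, lr=1e-4, n_steps=5000):
    """Gradient-based MAP estimate for y = A x + noise with a Gaussian prior:
    minimize 0.5*||y - A x||^2/sigma^2 + 0.5*||x||^2/tau^2 by gradient descent
    (a sketch of the optimization step, not the Engine's object model)."""
    x = np.zeros(A.shape[1])
    for _ in range(n_steps):
        grad = A.T @ (A @ x - y) / sigma**2 + x / tau**2
        x -= lr * grad
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))
x_true = rng.normal(size=5)
y = A @ x_true + 0.1 * rng.normal(size=20)

x_map = map_estimate(y, A)
# For this linear-Gaussian model the MAP solution has a closed form:
x_exact = np.linalg.solve(A.T @ A / 0.1**2 + np.eye(5), A.T @ y / 0.1**2)
print(np.allclose(x_map, x_exact, atol=1e-3))
```

    The point of the comparison is that a general gradient optimizer, like the Engine's, recovers the same posterior mode the analytic formula gives when one exists.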

  13. Foundations of Inference

    CERN Document Server

    Knuth, Kevin H

    2010-01-01

    We present a foundation for inference that unites and significantly extends the approaches of Kolmogorov and Cox. Our approach is based on quantifying finite lattices of logical statements in a way that satisfies general lattice symmetries. With other applications in mind, our derivations assume minimal symmetries, relying on neither complementarity nor continuity or differentiability. Each relevant symmetry corresponds to an axiom of quantification, and these axioms are used to derive a unique set of rules governing quantification of the lattice. These rules form the familiar probability calculus. We also derive a unique quantification of divergence and information. Taken together these results form a simple and clear foundation for the quantification of inference.
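    For orientation, the "familiar probability calculus" that the quantification axioms single out is the standard sum and product rules (stated here in conventional notation; the lattice-theoretic derivation itself is in the paper):

```latex
% Sum rule: valuation of the lattice join (logical "or") given context t
p(x \lor y \mid t) = p(x \mid t) + p(y \mid t) - p(x \land y \mid t)

% Product rule: valuation of the meet (logical "and") by chaining contexts
p(x \land y \mid t) = p(x \mid t)\, p(y \mid x \land t)
```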

  14. Making Type Inference Practical

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Oxhøj, Nicholas; Palsberg, Jens

    1992-01-01

    We present the implementation of a type inference algorithm for untyped object-oriented programs with inheritance, assignments, and late binding. The algorithm significantly improves our previous one, presented at OOPSLA'91, since it can handle collection classes, such as List, in a useful way. Abo....... Experiments indicate that the implementation type checks as much as 100 lines per second. This results in a mature product, on which a number of tools can be based, for example a safety tool, an image compression tool, a code optimization tool, and an annotation tool. This may make type inference for object...

  15. Automation of printing machine

    OpenAIRE

    Sušil, David

    2016-01-01

    This bachelor's thesis focuses on the automation of the printing machine and on comparing two types of printing machines. The first chapter deals with the history of printing, typesetting, printing techniques and various kinds of bookbinding. The second chapter describes the difference between sheet-fed printing machines and offset printing machines, the difference between two representatives of rotary machines, the technological process of the products on these machines, the description of the mac...

  16. Advanced Electrical Machines and Machine-Based Systems for Electric and Hybrid Vehicles

    Directory of Open Access Journals (Sweden)

    Ming Cheng

    2015-09-01

    Full Text Available The paper presents a number of advanced solutions on electric machines and machine-based systems for the powertrain of electric vehicles (EVs. Two types of systems are considered, namely the drive systems designated to the EV propulsion and the power split devices utilized in the popular series-parallel hybrid electric vehicle architecture. After reviewing the main requirements for the electric drive systems, the paper illustrates advanced electric machine topologies, including a stator permanent magnet (stator-PM motor, a hybrid-excitation motor, a flux memory motor and a redundant motor structure. Then, it illustrates advanced electric drive systems, such as the magnetic-geared in-wheel drive and the integrated starter generator (ISG. Finally, three machine-based implementations of the power split devices are expounded, built up around the dual-rotor PM machine, the dual-stator PM brushless machine and the magnetic-geared dual-rotor machine. As a conclusion, the development trends in the field of electric machines and machine-based systems for EVs are summarized.

  17. Machine musicianship

    Science.gov (United States)

    Rowe, Robert

    2002-05-01

    The training of musicians begins by teaching basic musical concepts, a collection of knowledge commonly known as musicianship. Computer programs designed to implement musical skills (e.g., to make sense of what they hear, perform music expressively, or compose convincing pieces) can similarly benefit from access to a fundamental level of musicianship. Recent research in music cognition, artificial intelligence, and music theory has produced a repertoire of techniques that can make the behavior of computer programs more musical. Many of these were presented in a recently published book/CD-ROM entitled Machine Musicianship. For use in interactive music systems, we are interested in those which are fast enough to run in real time and that need only make reference to the material as it appears in sequence. This talk will review several applications that are able to identify the tonal center of musical material during performance. Beyond this specific task, the design of real-time algorithmic listening through the concurrent operation of several connected analyzers is examined. The presentation includes discussion of a library of C++ objects that can be combined to perform interactive listening and a demonstration of their capability.
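    One widely used real-time approach to the tonal-center identification mentioned here is the Krumhansl-Schmuckler key-finding algorithm: correlate a pitch-class histogram against rotated key profiles. This is a generic sketch of that algorithm, not necessarily the book's exact implementation:

```python
import numpy as np

# Krumhansl-Kessler major-key probe-tone profile, tonic first.
MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                  2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def tonal_center(pitch_classes):
    """Estimate the major-key tonal center of a stream of pitch classes by
    correlating their histogram with all 12 rotations of the key profile."""
    hist = np.bincount(np.asarray(pitch_classes) % 12, minlength=12)
    scores = [np.corrcoef(hist, np.roll(MAJOR, k))[0, 1] for k in range(12)]
    return NOTES[int(np.argmax(scores))]

# A C-major passage (scale degrees plus repeated tonic triad tones).
print(tonal_center([0, 2, 4, 5, 7, 9, 11, 0, 4, 7]))
```

    Because it needs only a running histogram, the method fits the real-time, sequential-listening constraint the abstract emphasizes; minor-key profiles are handled the same way with a second template.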

  18. Inference as Prediction

    Science.gov (United States)

    Watson, Jane

    2007-01-01

    Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…

  19. From green architecture to architectural green

    DEFF Research Database (Denmark)

    Earon, Ofri

    2011-01-01

    . Architectural green could signify green architecture with inclusive interrelations between green and space, built and unbuilt, inside and outside. The aim of the term is to reflect a new focus in green architecture – its architectural performance. Ecological issues are not underestimated or ignored, but so far...... they have overshadowed the architectural potential of green architecture. The paper questions how a green space should perform, look like and function. Two examples are chosen to demonstrate thorough integrations between green and space. The examples are public buildings categorized as pavilions. One...... is not limited to the architecture of pavilions and can be applied in other architectural forms and functions. The paper ends by questioning the potential of architectural green in urbanity....

  20. Machine learning methods for nanolaser characterization

    CERN Document Server

    Zibar, Darko; Winther, Ole; Moerk, Jesper; Schaeffer, Christian

    2016-01-01

    Nanocavity lasers, which are an integral part of an on-chip integrated photonic network, are setting stringent requirements on the sensitivity of the techniques used to characterize the laser performance. Current characterization tools cannot provide detailed knowledge about nanolaser noise and dynamics. In this progress article, we will present tools and concepts from Bayesian machine learning and digital coherent detection that offer novel approaches for highly sensitive laser noise characterization and inference of laser dynamics. The goal of the paper is to trigger new research directions that combine the fields of machine learning and nanophotonics for characterizing nanolasers and eventually integrated photonic networks.

  1. Electrical machines mathematical fundamentals of machine topologies

    CERN Document Server

    Gerling, Dieter

    2015-01-01

    Electrical Machines and Drives play a powerful role in industry, with ever-increasing importance. This fact requires the understanding of machine and drive principles by engineers of many different disciplines. Therefore, this book is intended to give a comprehensive deduction of these principles. Special attention is given to the precise mathematical derivation of the necessary formulae to calculate machines and drives and to the discussion of simplifications (if applied) with the associated limits. The book shows how the different machine topologies can be deduced from general fundamentals, and how they are linked together. This book addresses graduate students, researchers, and developers of Electrical Machines and Drives who are interested in getting knowledge about the principles of machine and drive operation and in detecting the mathematical and engineering specialties of the different machine and drive topologies together with their mutual links. The detailed - but nevertheless compact - mat...

  2. Lab architecture

    Science.gov (United States)

    Crease, Robert P.

    2008-04-01

    There are few more dramatic illustrations of the vicissitudes of laboratory architecture than the contrast between Building 20 at the Massachusetts Institute of Technology (MIT) and its replacement, the Ray and Maria Stata Center. Building 20 was built hurriedly in 1943 as temporary housing for MIT's famous Rad Lab, the site of wartime radar research, and it remained a productive laboratory space for over half a century. A decade ago it was demolished to make way for the Stata Center, an architecturally striking building designed by Frank Gehry to house MIT's computer science and artificial intelligence labs (above). But in 2004 - just two years after the Stata Center officially opened - the building was criticized for being unsuitable for research and became the subject of still ongoing lawsuits alleging design and construction failures.

  3. Laser machining of advanced materials

    CERN Document Server

    Dahotre, Narendra B

    2011-01-01

    Advanced materials: Introduction; Applications; Structural ceramics; Biomaterials; Composites; Intermetallics. Machining of advanced materials: Introduction; Fabrication techniques; Mechanical machining; Chemical machining (CM); Electrical machining; Radiation machining; Hybrid machining. Laser machining: Introduction; Absorption of laser energy and multiple reflections; Thermal effects; Laser machining of structural ceramics; Introdu

  4. The deleuzian abstract machines

    DEFF Research Database (Denmark)

    Werner Petersen, Erik

    2005-01-01

    production. In Kafka: Toward a Minor Literature, Deleuze and Guatari gave the most comprehensive explanation to the abstract machine in the work of art. Like the war-machines of Virilio, the Kafka-machine operates in three gears or speeds. Furthermore, the machine is connected to spatial diagrams...

  5. Machine learning analysis of binaural rowing sounds

    DEFF Research Database (Denmark)

    Johard, Leonard; Ruffaldi, Emanuele; Hoffmann, Pablo F.

    2011-01-01

    Techniques for machine hearing are increasing their potentiality due to new application domains. In this work we are addressing the analysis of rowing sounds in natural context for the purpose of supporting a training system based on virtual environments. This paper presents the acquisition methodology and the evaluation of different machine learning techniques for classifying rowing-sound data. We see that a combination of principal component analysis and shallow networks performs equally well as deep architectures, while being much faster to train....

  6. Machine Learning Analysis of Binaural Rowing Sounds

    Directory of Open Access Journals (Sweden)

    Filippeschi Alessandro

    2011-12-01

    Full Text Available Techniques for machine hearing are increasing their potentiality due to new application domains. In this work we are addressing the analysis of rowing sounds in natural context for the purpose of supporting a training system based on virtual environments. This paper presents the acquisition methodology and the evaluation of different machine learning techniques for classifying rowing-sound data. We see that a combination of principal component analysis and shallow networks performs equally well as deep architectures, while being much faster to train.
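
    The PCA-plus-shallow-classifier pipeline this abstract describes can be sketched in pure standard-library Python. This is a toy stand-in, not the paper's system: power-iteration PCA to one component and a nearest-centroid rule in place of the shallow network, on synthetic two-class data rather than rowing sounds.

```python
import random

def mean_vec(vs):
    n = len(vs)
    return [sum(v[i] for v in vs) / n for i in range(len(vs[0]))]

def pca_first_component(vs, iters=200):
    m = mean_vec(vs)
    centered = [[x - mi for x, mi in zip(v, m)] for v in vs]
    d = len(m)
    cov = [[sum(c[i] * c[j] for c in centered) / len(vs)
            for j in range(d)] for i in range(d)]
    w = [1.0] * d  # power iteration for the dominant eigenvector
    for _ in range(iters):
        w = [sum(cov[i][j] * w[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        w = [x / norm for x in w]
    return m, w

def project(v, m, w):
    return sum((x - mi) * wi for x, mi, wi in zip(v, m, w))

# synthetic two-class "sound feature" data
random.seed(0)
class_a = [[random.gauss(0, 0.3), random.gauss(0, 0.3)] for _ in range(50)]
class_b = [[random.gauss(3, 0.3), random.gauss(3, 0.3)] for _ in range(50)]
m, w = pca_first_component(class_a + class_b)
cent_a = sum(project(v, m, w) for v in class_a) / 50
cent_b = sum(project(v, m, w) for v in class_b) / 50

def classify(v):
    # nearest centroid in the 1-D PCA space
    z = project(v, m, w)
    return 'a' if abs(z - cent_a) < abs(z - cent_b) else 'b'

accuracy = (sum(classify(v) == 'a' for v in class_a)
            + sum(classify(v) == 'b' for v in class_b)) / 100
```

    The point the abstract makes is visible even here: once PCA has compressed the features, a very simple readout suffices.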

  7. Unorganized machines for seasonal streamflow series forecasting.

    Science.gov (United States)

    Siqueira, Hugo; Boccato, Levy; Attux, Romis; Lyra, Christiano

    2014-05-01

    Modern unorganized machines--extreme learning machines and echo state networks--provide an elegant balance between processing capability and mathematical simplicity, circumventing the difficulties associated with the conventional training approaches of feedforward/recurrent neural networks (FNNs/RNNs). This work performs a detailed investigation of the applicability of unorganized architectures to the problem of seasonal streamflow series forecasting, considering scenarios associated with four Brazilian hydroelectric plants and four distinct prediction horizons. Experimental results indicate the pertinence of these models to the focused task.
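
    An extreme learning machine, one of the two "unorganized machines" named here, trains only its readout: the hidden weights stay random and the output weights come from a ridge least-squares solve. A stdlib-only sketch on a toy one-period "seasonal" signal (all constants and the test signal are ours, not from the study):

```python
import math
import random

def elm_train(X, y, n_hidden=20, ridge=1e-6, seed=1):
    rng = random.Random(seed)
    d = len(X[0])
    # random, untrained hidden layer (the "unorganized" part)
    W = [[rng.uniform(-4, 4) for _ in range(d)] for _ in range(n_hidden)]
    b = [rng.uniform(-4, 4) for _ in range(n_hidden)]
    H = [[math.tanh(sum(wi * xi for wi, xi in zip(w, x)) + bo)
          for w, bo in zip(W, b)] for x in X]
    # output weights via ridge least squares: (H^T H + ridge*I) beta = H^T y
    n = n_hidden
    A = [[sum(h[i] * h[j] for h in H) + (ridge if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    rhs = [sum(h[i] * yi for h, yi in zip(H, y)) for i in range(n)]
    for col in range(n):  # Gaussian elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    beta = [0.0] * n
    for i in reversed(range(n)):
        beta[i] = (rhs[i] - sum(A[i][j] * beta[j]
                                for j in range(i + 1, n))) / A[i][i]
    return W, b, beta

def elm_predict(x, W, b, beta):
    return sum(bk * math.tanh(sum(wi * xi for wi, xi in zip(w, x)) + bo)
               for w, bo, bk in zip(W, b, beta))

# toy "seasonal" signal: one sine period sampled at 60 points
X = [[-1 + 2 * i / 59] for i in range(60)]
y = [math.sin(math.pi * x[0]) for x in X]
W, b, beta = elm_train(X, y)
mse = sum((elm_predict(x, W, b, beta) - yi) ** 2
          for x, yi in zip(X, y)) / len(y)
```

    Because only the linear readout is fitted, "training" is a single linear solve, which is the mathematical simplicity the abstract refers to.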

  8. Decoding the architectural theory

    Institute of Scientific and Technical Information of China (English)

    Gu Mengchao

    2008-01-01

    Starting from an illustration of the definition and concept of architectural theory, the author establishes his own understanding of the framework of architectural theory and of theoretical innovation with Chinese characteristics.

  9. Causal inference in econometrics

    CERN Document Server

    Kreinovich, Vladik; Sriboonchitta, Songsak

    2016-01-01

    This book is devoted to the analysis of causal inference which is one of the most difficult tasks in data analysis: when two phenomena are observed to be related, it is often difficult to decide whether one of them causally influences the other one, or whether these two phenomena have a common cause. This analysis is the main focus of this volume. To get a good understanding of the causal inference, it is important to have models of economic phenomena which are as accurate as possible. Because of this need, this volume also contains papers that use non-traditional economic models, such as fuzzy models and models obtained by using neural networks and data mining techniques. It also contains papers that apply different econometric models to analyze real-life economic dependencies.

  10. Russell and Humean Inferences

    Directory of Open Access Journals (Sweden)

    João Paulo Monteiro

    2001-12-01

    Full Text Available Russell's The Problems of Philosophy tries to establish a new theory of induction, at the same time that Hume is there accused of an "irrational scepticism about induction". But a careful analysis of the theory of knowledge explicitly acknowledged by Hume reveals that, contrary to the standard interpretation in the XXth century, possibly influenced by Russell, Hume deals exclusively with causal inference (which he never classifies as "causal induction", although now we are entitled to do so), never with inductive inference in general, mainly generalizations about sensible qualities of objects (whether, e.g., "all crows are black" or not is not among Hume's concerns). Russell's theories are thus only false alternatives to Hume's, whether in his 1912 or his 1948 works.

  11. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  12. INFERENCES FROM ROSSI TRACES

    Energy Technology Data Exchange (ETDEWEB)

    KENNETH M. HANSON; JANE M. BOOKER

    2000-09-08

    The authors present an uncertainty analysis of data taken using the Rossi technique, in which the horizontal oscilloscope sweep is driven sinusoidally in time, while the vertical axis follows the signal amplitude. The analysis is done within a Bayesian framework. Complete inferences are obtained by using the Markov chain Monte Carlo technique, which produces random samples from the posterior probability distribution expressed in terms of the parameters.
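
    The sampling step this abstract mentions can be illustrated with a minimal random-walk Metropolis chain. This assumes nothing about the actual Rossi-trace model: the toy posterior below is just the mean of Gaussian data under a flat prior.

```python
import math
import random

def log_post(mu, data, sigma=1.0):
    # flat prior on mu; Gaussian likelihood (constants dropped)
    return -sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2)

random.seed(2)
data = [random.gauss(1.5, 1.0) for _ in range(100)]

mu, lp, samples = 0.0, log_post(0.0, data), []
for step in range(20000):
    prop = mu + random.gauss(0, 0.3)   # random-walk proposal
    lp_prop = log_post(prop, data)
    # Metropolis acceptance test
    if lp_prop >= lp or random.random() < math.exp(lp_prop - lp):
        mu, lp = prop, lp_prop
    if step >= 2000:                   # discard burn-in
        samples.append(mu)

posterior_mean = sum(samples) / len(samples)
```

    With a flat prior the posterior mean should match the sample mean of the data, which makes the chain easy to sanity-check.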

  13. Inferring Microbial Fitness Landscapes

    Science.gov (United States)

    2016-02-25

    experiments on evolving microbial populations. Although these experiments have produced examples of remarkable phenomena – e.g. the emergence of mutator...what specific mutations, avian influenza viruses will adapt to novel human hosts; or how readily infectious bacteria will escape antibiotics or the...infer from data the determinants of microbial evolution with sufficient resolution that we can quantify

  14. SUSTAINABLE ARCHITECTURE : WHAT ARCHITECTURE STUDENTS THINK

    OpenAIRE

    SATWIKO, PRASASTO

    2013-01-01

    Sustainable architecture has become a hot issue lately as the impacts of climate change become more intense. Architecture educations have responded by integrating knowledge of sustainable design in their curriculum. However, in the real life, new buildings keep coming with designs that completely ignore sustainable principles. This paper discusses the results of two national competitions on sustainable architecture targeted for architecture students (conducted in 2012 and 2013). The results a...

  15. Continuous Integrated Invariant Inference Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed project will develop a new technique for invariant inference and embed this and other current invariant inference and checking techniques in an...

  16. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  17. Lightweight enterprise architectures

    CERN Document Server

    Theuerkorn, Fenix

    2004-01-01

    State of Architecture: Architectural Chaos; Relation of Technology and Architecture; The Many Faces of Architecture; The Scope of Enterprise Architecture; The Need for Enterprise Architecture; The History of Architecture; The Current Environment; Standardization Barriers; The Need for Lightweight Architecture in the Enterprise; The Cost of Technology; The Benefits of Enterprise Architecture; The Domains of Architecture; The Gap between Business and IT; Where Does LEA Fit? LEA's Framework: Frameworks, Methodologies, and Approaches; The Framework of LEA; Types of Methodologies; Types of Approaches; Actual System Environmen

  18. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templates.

  19. Modeling Architectural Patterns Using Architectural Primitives

    NARCIS (Netherlands)

    Zdun, Uwe; Avgeriou, Paris

    2005-01-01

    Architectural patterns are a key point in architectural documentation. Regrettably, there is poor support for modeling architectural patterns, because the pattern elements are not directly matched by elements in modeling languages, and, at the same time, patterns support an inherent variability that

  1. Probabilistic Inferences in Bayesian Networks

    OpenAIRE

    Ding, Jianguo

    2010-01-01

    This chapter summarizes the popular inference methods in Bayesian networks. The results demonstrate that evidence can be propagated across a Bayesian network along any link, whether in a forward, backward or intercausal style. The belief updating of Bayesian networks can be obtained by various available inference techniques. Theoretically, exact inference in Bayesian networks is feasible and manageable. However, the computation and inference is NP-hard. That means, in applications, in ...

  2. Multimodel inference and adaptive management

    Science.gov (United States)

    Rehme, S.E.; Powell, L.A.; Allen, C.R.

    2011-01-01

    Ecology is an inherently complex science coping with correlated variables, nonlinear interactions and multiple scales of pattern and process, making it difficult for experiments to result in clear, strong inference. Natural resource managers, policy makers, and stakeholders rely on science to provide timely and accurate management recommendations. However, the time necessary to untangle the complexities of interactions within ecosystems is often far greater than the time available to make management decisions. One method of coping with this problem is multimodel inference. Multimodel inference assesses uncertainty by calculating likelihoods among multiple competing hypotheses, but multimodel inference results are often equivocal. Despite this, there may be pressure for ecologists to provide management recommendations regardless of the strength of their study’s inference. We reviewed papers in the Journal of Wildlife Management (JWM) and the journal Conservation Biology (CB) to quantify the prevalence of multimodel inference approaches, the resulting inference (weak versus strong), and how authors dealt with the uncertainty. Thirty-eight percent and 14%, respectively, of articles in the JWM and CB used multimodel inference approaches. Strong inference was rarely observed, with only 7% of JWM and 20% of CB articles resulting in strong inference. We found the majority of weak inference papers in both journals (59%) gave specific management recommendations. Model selection uncertainty was ignored in most recommendations for management. We suggest that adaptive management is an ideal method to resolve uncertainty when research results in weak inference.
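
    Multimodel inference in this sense is commonly operationalized with Akaike weights over the candidate models. A sketch with hypothetical AIC scores, flagging the result as weak when no single model reaches a dominant weight:

```python
import math

# hypothetical AIC scores for three competing models of one data set
aic = {'m1': 100.0, 'm2': 101.2, 'm3': 107.5}
best = min(aic.values())
# relative likelihood of each model: exp(-0.5 * delta_AIC)
rel_lik = {k: math.exp(-0.5 * (v - best)) for k, v in aic.items()}
total = sum(rel_lik.values())
akaike_weights = {k: v / total for k, v in rel_lik.items()}

# if no weight dominates, several models remain plausible: weak inference
strong = max(akaike_weights.values()) >= 0.9
```

    Here two models carry most of the weight, which is the equivocal outcome the review found to be typical.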

  3. Machine Protection: Availability for Particle Accelerators

    CERN Document Server

    Apollonio, Andrea; Schmidt, Ruediger

    2015-03-16

    Machine availability is a key indicator for the performance of the next generation of particle accelerators. Availability requirements need to be carefully considered during the design phase to achieve challenging objectives in different fields, as e.g. particle physics and material science. For existing and future High-Power facilities, such as ESS (European Spallation Source) and HL-LHC (High-Luminosity LHC), operation with unprecedented beam power requires highly dependable Machine Protection Systems (MPS) to avoid any damage-induced downtime. Due to the high complexity of accelerator systems, finding the optimal balance between equipment safety and accelerator availability is challenging. The MPS architecture, as well as the choice of electronic components, have a large influence on the achievable level of availability. In this thesis novel methods to address the availability of accelerators and their protection systems are presented. Examples of studies related to dependable MPS architectures are given i...

  4. Measure Transformer Semantics for Bayesian Machine Learning

    Science.gov (United States)

    Borgström, Johannes; Gordon, Andrew D.; Greenberg, Michael; Margetson, James; van Gael, Jurgen

    The Bayesian approach to machine learning amounts to inferring posterior distributions of random variables from a probabilistic model of how the variables are related (that is, a prior distribution) and a set of observations of variables. There is a trend in machine learning towards expressing Bayesian models as probabilistic programs. As a foundation for this kind of programming, we propose a core functional calculus with primitives for sampling prior distributions and observing variables. We define combinators for measure transformers, based on theorems in measure theory, and use these to give a rigorous semantics to our core calculus. The original features of our semantics include its support for discrete, continuous, and hybrid measures, and, in particular, for observations of zero-probability events. We compile our core language to a small imperative language that has a straightforward semantics via factor graphs, data structures that enable many efficient inference algorithms. We use an existing inference engine for efficient approximate inference of posterior marginal distributions, treating thousands of observations per second for large instances of realistic models.
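
    The paper's headline feature, conditioning on a zero-probability observation of a continuous variable, corresponds to weighting prior samples by a likelihood density rather than a probability. A toy importance-sampling sketch (the model and numbers are ours, not the paper's semantics or inference engine):

```python
import math
import random

def gauss_density(y, mu, sigma=0.1):
    # Gaussian likelihood density of an exact-valued observation
    return (math.exp(-((y - mu) ** 2) / (2 * sigma ** 2))
            / (sigma * math.sqrt(2 * math.pi)))

random.seed(3)
weights, thetas = [], []
for _ in range(50000):
    theta = random.random()          # sample the Uniform(0, 1) prior
    # observing y == 0.8 exactly has probability zero; score each
    # sample by the likelihood *density* instead (importance weight)
    weights.append(gauss_density(0.8, theta))
    thetas.append(theta)

posterior_mean = sum(w * t for w, t in zip(weights, thetas)) / sum(weights)
```

    The posterior concentrates near the observed value, slightly pulled away from 0.8 by the truncation of the prior at 1.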

  5. A novel architecture for information retrieval system based on semantic web

    Science.gov (United States)

    Zhang, Hui

    2011-12-01

    Nowadays, the web has enabled an explosive growth of information sharing (there are currently over 4 billion pages covering most areas of human endeavor), so that the web faces a new challenge of information overload. The challenge before us is not only to help people locate relevant information precisely but also to access and aggregate a variety of information from different resources automatically. Current web documents are in human-oriented formats suitable for presentation, but machines cannot understand the meaning of a document. To address this issue, Berners-Lee proposed the concept of the semantic web. With semantic web technology, web information can be understood and processed by machines, which opens new possibilities for automatic web information processing. A main problem of semantic web information retrieval is that when there is not enough knowledge in the retrieval system, it returns a large number of meaningless results to users because of the huge volume of information. In this paper, we present the architecture of an information retrieval system based on the semantic web. In addition, our system employs an inference engine to check whether a query should be posed to the keyword-based search engine or to the semantic search engine.

  6. Evaluating current processors performance and machines stability

    CERN Document Server

    Esposito, R; Tortone, G; Taurino, F M

    2003-01-01

    Accurately estimating the performance of currently available processors is becoming a key activity, particularly in the HENP environment, where high computing power is crucial. This document describes the methods and programs, open-source or freeware, used to benchmark processors, memory and disk subsystems, and network connection architectures. These tools are also useful for stress-testing new machines, before their acquisition or before their introduction into a production environment, where high uptimes are required.

  7. Machine learning for identifying botnet network traffic

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup

    2013-01-01

    Due to the promise of non-invasive and resilient detection, botnet detection based on network traffic analysis has drawn special attention from the research community. Furthermore, many authors have turned their attention to the use of machine learning algorithms as the means of inferring botnet-related knowledge from the monitored traffic. This paper presents a review of contemporary botnet detection methods that use machine learning as a tool for identifying botnet-related traffic. The main goal of the paper is to provide a comprehensive overview of the field by summarizing current scientific efforts. The contribution of the paper is three-fold. First, the paper provides a detailed insight into the existing detection methods by investigating which bot-related heuristics were assumed by the detection systems and how different machine learning techniques were adapted in order to capture botnet-related knowledge...

  8. Machine learning a Bayesian and optimization perspective

    CERN Document Server

    Theodoridis, Sergios

    2015-01-01

    This tutorial text gives a unifying perspective on machine learning by covering both probabilistic and deterministic approaches, which rely on optimization techniques, as well as Bayesian inference, which is based on a hierarchy of probabilistic models. The book presents the major machine learning methods as they have been developed in different disciplines, such as statistics, statistical and adaptive signal processing and computer science. Focusing on the physical reasoning behind the mathematics, all the various methods and techniques are explained in depth, supported by examples and problems, giving an invaluable resource to the student and researcher for understanding and applying machine learning concepts. The book builds carefully from the basic classical methods to the most recent trends, with chapters written to be as self-contained as possible, making the text suitable for different courses: pattern recognition, statistical/adaptive signal processing, statistical/Bayesian learning, as well as shor...

  9. Untyped Memory in the Java Virtual Machine

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    We have implemented a virtual execution environment that executes legacy binary code on top of the type-safe Java Virtual Machine by recompiling native code instructions to type-safe bytecode. As it is essentially impossible to infer static typing into untyped machine code, our system emulates untyped memory on top of Java's type system. While this approach allows executing native code on any off-the-shelf JVM, the resulting runtime performance is poor. We propose a set of virtual machine extensions that add type-unsafe memory objects to the JVM. We contend that these JVM extensions do not relax Java's type system, as the same functionality can be achieved in pure Java, albeit much less efficiently.

  10. Comparison of Machine Learning methods for incipient motion in gravel bed rivers

    Science.gov (United States)

    Valyrakis, Manousos

    2013-04-01

    Soil erosion and sediment transport of natural gravel bed streams are important processes which affect both the morphology as well as the ecology of earth's surface. For gravel bed rivers at near incipient flow conditions, particle entrainment dynamics are highly intermittent. This contribution reviews the use of modern Machine Learning (ML) methods implemented for short-term prediction of entrainment instances of individual grains exposed in fully developed near boundary turbulent flows. Results obtained by network architectures of variable complexity based on two different ML methods namely the Artificial Neural Network (ANN) and the Adaptive Neuro-Fuzzy Inference System (ANFIS) are compared in terms of different error and performance indices, computational efficiency and complexity as well as predictive accuracy and forecast ability. Different model architectures are trained and tested with experimental time series obtained from mobile particle flume experiments. The experimental setup consists of a Laser Doppler Velocimeter (LDV) and a laser optics system, which acquire data for the instantaneous flow and particle response respectively, synchronously. The first is used to record the flow velocity components directly upstream of the test particle, while the latter tracks the particle's displacements. The lengthy experimental data sets (millions of data points) are split into the training and validation subsets used to perform the corresponding learning and testing of the models. It is demonstrated that the ANFIS hybrid model, which is based on neural learning and fuzzy inference principles, better predicts the critical flow conditions above which sediment transport is initiated. In addition, it is illustrated that empirical knowledge can be extracted, validating the theoretical assumption that particle ejections occur due to energetic turbulent flow events.
Such a tool may find application in management and regulation of stream flows downstream of dams for stream

  11. Machine tool structures

    CERN Document Server

    Koenigsberger, F

    1970-01-01

    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration c

  12. Nanotechnology and statistical inference

    Science.gov (United States)

    Vesely, Sara; Vesely, Leonardo; Vesely, Alessandro

    2017-08-01

    We discuss some problems that arise when applying statistical inference to data with the aim of disclosing new functionalities. A predictive model analyzes the data taken from experiments on a specific material to assess the likelihood that another product, with similar structure and properties, will exhibit the same functionality. It doesn't have much predictive power if variability occurs as a consequence of a specific, non-linear behavior. We exemplify our discussion on some experiments with biased dice.
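
    The biased-dice setting can be made concrete with a simple log-likelihood-ratio computation. The rolls and the "biased" hypothesis below are hypothetical, chosen only to illustrate the inference step:

```python
import math

rolls = [6, 3, 6, 6, 1, 6, 6, 2, 6, 6]   # hypothetical data: 7 sixes in 10

def log_lik(p_six):
    # non-six faces share the remaining probability equally
    return sum(math.log(p_six if r == 6 else (1 - p_six) / 5) for r in rolls)

# log likelihood ratio: "biased toward six" (p_six = 0.5) vs fair (1/6)
llr = log_lik(0.5) - log_lik(1 / 6)
```

    A positive ratio favours the biased hypothesis; the caveat of the abstract is that such comparisons lose predictive power when the underlying variability is non-linear rather than sampling noise.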

  13. Foundations of Inference

    Directory of Open Access Journals (Sweden)

    Kevin H. Knuth

    2012-06-01

    Full Text Available We present a simple and clear foundation for finite inference that unites and significantly extends the approaches of Kolmogorov and Cox. Our approach is based on quantifying lattices of logical statements in a way that satisfies general lattice symmetries. With other applications such as measure theory in mind, our derivations assume minimal symmetries, relying on neither negation nor continuity nor differentiability. Each relevant symmetry corresponds to an axiom of quantification, and these axioms are used to derive a unique set of quantifying rules that form the familiar probability calculus. We also derive a unique quantification of divergence, entropy and information.
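
    For orientation, the quantifying rules such a derivation recovers are the familiar ones of the probability calculus; in standard notation (a restatement in our symbols, not the paper's):

```latex
% sum rule, from the lattice join/meet symmetries
p(x \lor y \mid t) = p(x \mid t) + p(y \mid t) - p(x \land y \mid t)
% product rule, from associativity of combination
p(x \land y \mid t) = p(x \mid t)\, p(y \mid x \land t)
% and the derived divergence takes the relative-entropy form
D(p \Vert q) = \sum_i p_i \log \frac{p_i}{q_i}
```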

  14. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  15. Generic patch inference

    DEFF Research Database (Denmark)

    Andersen, Jesper; Lawall, Julia

    2010-01-01

    A key issue in maintaining Linux device drivers is the need to keep them up to date with respect to evolutions in Linux internal libraries. Currently, there is little tool support for performing and documenting such changes. In this paper we present a tool, spdiff, that identifies common changes...... developers can use it to extract an abstract representation of the set of changes that others have made. Our experiments on recent changes in Linux show that the inferred generic patches are more concise than the corresponding patches found in commits to the Linux source tree while being safe with respect...

  16. Architecture as Design Study.

    Science.gov (United States)

    Kauppinen, Heta

    1989-01-01

    Explores the use of analogies in architectural design, the importance of Gestalt theory and aesthetic canons in understanding and being sensitive to architecture. Emphasizes the variation between public and professional appreciation of architecture. Notes that an understanding of architectural process enables students to improve the aesthetic…

  17. Architectural design decisions

    NARCIS (Netherlands)

    Jansen, Antonius Gradus Johannes

    2008-01-01

    A software architecture can be considered as the collection of key decisions concerning the design of the software of a system. Knowledge about this design, i.e. architectural knowledge, is key for understanding a software architecture and thus the software itself. Architectural knowledge is mostly

  18. Can You Hear Architecture

    DEFF Research Database (Denmark)

    Ryhl, Camilla

    2016-01-01

    Taking an offset in the understanding of architectural quality being based on multisensory architecture, the paper aims to discuss the current acoustic discourse in inclusive design and its implications for the integration of inclusive design in architectural discourse and practice as well...... design and architectural quality for people with a hearing disability and a newly conducted qualitative evaluation research in Denmark as well as architectural theories on multisensory aspects of architectural experiences, the paper uses examples of existing Nordic building cases to discuss the role...... of acoustics in both inclusive design and multisensory architecture....

  19. Rosen's (M,R) system as an X-machine.

    Science.gov (United States)

    Palmer, Michael L; Williams, Richard A; Gatherer, Derek

    2016-11-07

    Robert Rosen's (M,R) system is an abstract biological network architecture that is allegedly both irreducible to sub-models of its component states and non-computable on a Turing machine. (M,R) stands as an obstacle to both reductionist and mechanistic presentations of systems biology, principally due to its self-referential structure. If (M,R) has the properties claimed for it, computational systems biology will not be possible, or at best will be a science of approximate simulations rather than accurate models. Several attempts have been made, at both empirical and theoretical levels, to disprove this assertion by instantiating (M,R) in software architectures. So far, these efforts have been inconclusive. In this paper, we attempt to demonstrate why - by showing how both finite state machine and stream X-machine formal architectures fail to capture the self-referential requirements of (M,R). We then show that a solution may be found in communicating X-machines, which remove self-reference using parallel computation, and then synthesise such machine architectures with object-orientation to create a formal basis for future software instantiations of (M,R) systems.
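
    The stream X-machine idea the paper builds on, a finite control whose transitions are labelled with functions acting on a memory, can be sketched directly. The toy accumulator below is our own illustration of the formalism; it does not instantiate (M,R):

```python
class StreamXMachine:
    """Finite control whose arcs carry processing functions
    phi: (memory, input) -> (output, new memory)."""

    def __init__(self, start, memory, table):
        self.state, self.memory = start, memory
        self.table = table  # (state, input class) -> (next state, phi)

    def step(self, sym):
        kind = 'cmd' if isinstance(sym, str) else 'num'
        self.state, phi = self.table[(self.state, kind)]
        out, self.memory = phi(self.memory, sym)
        return out

def add(mem, x):    return mem + x, mem + x   # emit and store running total
def ignore(mem, x): return None, mem          # consume input, no output
def toggle(mem, x): return None, mem          # memory untouched on toggles

table = {
    ('on',  'num'): ('on',  add),
    ('on',  'cmd'): ('off', toggle),
    ('off', 'num'): ('off', ignore),
    ('off', 'cmd'): ('on',  toggle),
}
m = StreamXMachine('on', 0, table)
outputs = [m.step(s) for s in [1, 2, 'switch', 5, 'switch', 3]]
# outputs: [1, 3, None, None, None, 6]
```

    The paper's argument is that even this richer machine class cannot capture (M,R)'s self-reference on its own; communicating X-machines remove it by distributing the loop across parallel components.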

  1. Integration of process planning and production scheduling with particle swarm optimization (PSO) algorithm and fuzzy inference systems

    Science.gov (United States)

    Yang, Yahong; Zhao, Fuqing; Hong, Yi; Yu, Dongmei

    2005-12-01

    Integration of process planning with scheduling, considering the manufacturing system's capability, cost and capacity in its workshop, is a critical issue. The concurrency between them can also eliminate redundant processes and optimize the entire production cycle, but most integrated process planning and scheduling methods consider only the time aspects of the alternative machines when constructing schedules. In this paper, a fuzzy inference system (FIS) for choosing alternative machines for integrated process planning and scheduling of a job shop manufacturing system is presented. Instead of choosing alternative machines randomly, machines are selected based on their reliability. The mean-time-to-failure (MTF) values are input to a fuzzy inference mechanism, which outputs the machine reliability. The machine is then penalized based on the fuzzy output. The most reliable machine has the higher priority to be chosen. In order to overcome the problem of under-utilized machines, sometimes caused by an unreliable machine, particle swarm optimization (PSO) has been used to balance the load across all the machines. A simulation study shows that the system can be used as an alternative way of choosing machines in integrated process planning and scheduling.
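
    The MTF-to-reliability fuzzy step described here can be sketched with triangular memberships and a Sugeno-style weighted average. All membership breakpoints and rule outputs below are invented for illustration; the paper's actual rule base is not given in the abstract.

```python
def tri(x, a, b, c):
    # triangular membership function peaking at b over the support (a, c)
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def reliability(mtf):
    # rule strengths for three fuzzy sets over MTF in [0, 1000] hours
    low = tri(mtf, -1, 0, 400)
    med = tri(mtf, 200, 500, 800)
    high = tri(mtf, 600, 1000, 1001)
    # Sugeno-style weighted average of crisp rule outputs 0.1 / 0.5 / 0.9
    w = low + med + high
    return (0.1 * low + 0.5 * med + 0.9 * high) / w
```

    A scheduler can then penalize each candidate machine in proportion to `1 - reliability(mtf)` before the PSO load-balancing stage.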

  2. Statistical inferences in phylogeography

    DEFF Research Database (Denmark)

    Nielsen, Rasmus; Beaumont, Mark A

    2009-01-01

    In conventional phylogeographic studies, historical demographic processes are elucidated from the geographical distribution of individuals represented on an inferred gene tree. However, the interpretation of gene trees in this context can be difficult, as the same demographic/geographical process can randomly lead to multiple different genealogies. Likewise, the same gene trees can arise under different demographic models. This problem has led to the emergence of many statistical methods for making phylogeographic inferences. A popular phylogeographic approach based on nested clade analysis is challenged by the fact that a certain amount of the interpretation of the data is left to the subjective choices of the user, and it has been argued that the method performs poorly in simulation studies. More rigorous statistical methods based on coalescence theory have been developed. However, these methods may also be challenged by computational problems or poor model choice. In this review, we will describe the development of statistical methods in phylogeographic analysis, and discuss some of the challenges facing these methods.

  3. Moment inference from tomograms

    Science.gov (United States)

    Day-Lewis, F. D.; Chen, Y.; Singha, K.

    2007-01-01

    Time-lapse geophysical tomography can provide valuable qualitative insights into hydrologic transport phenomena associated with aquifer dynamics, tracer experiments, and engineered remediation. Increasingly, tomograms are used to infer the spatial and/or temporal moments of solute plumes; these moments provide quantitative information about transport processes (e.g., advection, dispersion, and rate-limited mass transfer) and controlling parameters (e.g., permeability, dispersivity, and rate coefficients). The reliability of moments calculated from tomograms is, however, poorly understood because classic approaches to image appraisal (e.g., the model resolution matrix) are not directly applicable to moment inference. Here, we present a semi-analytical approach to construct a moment resolution matrix based on (1) the classic model resolution matrix and (2) image reconstruction from orthogonal moments. Numerical results for radar and electrical-resistivity imaging of solute plumes demonstrate that moment values calculated from tomograms depend strongly on plume location within the tomogram, survey geometry, regularization criteria, and measurement error. Copyright 2007 by the American Geophysical Union.
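The spatial moments the authors infer from tomograms can be illustrated with a short sketch. The grid, plume shape, and units below are invented for illustration; the idea is just that the zeroth moment gives total mass, the first gives the centroid (advection), and the second central moment gives the spread (dispersion):

```python
# Illustrative sketch (not the paper's code): spatial moments of a
# synthetic 1-D concentration profile sampled on a regular grid.

def spatial_moments(x, c):
    m0 = sum(c)                                            # zeroth moment: total mass (up to grid spacing)
    centroid = sum(xi * ci for xi, ci in zip(x, c)) / m0   # first moment: plume centre
    variance = sum(ci * (xi - centroid) ** 2
                   for xi, ci in zip(x, c)) / m0           # second central moment: spread
    return m0, centroid, variance

x = [i * 0.5 for i in range(21)]                    # positions, 0..10 m (assumed)
c = [max(0.0, 1.0 - abs(xi - 4.0)) for xi in x]     # triangular plume centred at 4 m
m0, centroid, variance = spatial_moments(x, c)
print(m0, centroid, variance)
```

The paper's point is that when `c` comes from a tomogram rather than direct sampling, regularization and survey geometry bias exactly these quantities, which is what the moment resolution matrix is built to quantify.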

  4. Affecting Change in Architectural Education

    Directory of Open Access Journals (Sweden)

    Leonard R. Bachman

    2012-09-01

    Full Text Available Architecture concerns not so much an explicit body of transmittable knowledge and protocols as it does a set of implicit understandings, sensitivities and sensibilities. The education of an architect therefore concerns the mission of endowing candidates with those implicit traits. This is not to say that architects do not possess and wield prodigious amounts of explicit cognitive knowledge, because they certainly do. But that explicit component of architectural know-how is actually vested in and deployed by the architect not so much because the knowledge has been invented, discovered, or developed by architects; but rather because they have assimilated it from other disciplines in a special way that gives architects abductive and hermeneutic insight into vast, detailed, and complex design challenges. Engineers make better machines, artists make more meaningful artifacts, and psychologists provide better human environments; but architects are trained to see the underlying opportunity and potential celebration of how those constituent menus might become a feast. In any unresolved complex of space, material and form, architects grasp a unique essence in how they perceive the “happily ever after” of what it might be and how that vision might be made whole and concrete. By the time a student of architecture is fully indoctrinated, this grasp of an underlying ideal essence is so potent that it becomes the student’s identity… and the purpose of that insight becomes an irresistible intention.

  5. Scaling Support Vector Machines On Modern HPC Platforms

    Energy Technology Data Exchange (ETDEWEB)

    You, Yang; Fu, Haohuan; Song, Shuaiwen; Randles, Amanda; Kerbyson, Darren J.; Marquez, Andres; Yang, Guangwen; Hoisie, Adolfy

    2015-02-01

    We designed and implemented MIC-SVM, a highly efficient parallel SVM for x86 based multicore and many-core architectures, such as the Intel Ivy Bridge CPUs and Intel Xeon Phi co-processor (MIC). We propose various novel analysis methods and optimization techniques to fully utilize the multilevel parallelism provided by these architectures and serve as general optimization methods for other machine learning tools.

  6. Modular reconfigurable machine tools: design, control and evaluation

    CSIR Research Space (South Africa)

    Padayachee, J

    2009-11-01

    Full Text Available -process capacity scaling. Scalable production capacity and adjustable system functionality are the key objectives of reconfigurable manufacturing. Index terms: Reconfigurable Manufacturing Systems, Modular Reconfigurable Machines, Open Architecture Control...] identify the fixed mechanical architectures and proprietary control systems found in CNC and DMT equipment as the specific drawback in effectively implementing these classes of equipment in RMS. Koren et al.[3] proposed the development of reconfigurable...

  7. Design of Demining Machines

    CERN Document Server

    Mikulic, Dinko

    2013-01-01

    In a constant effort to eliminate mine danger, the international mine action community has been improving the safety, efficiency and cost-effectiveness of clearance methods. Demining machines have become necessary when conducting humanitarian demining, where mechanization provides greater safety and productivity. Design of Demining Machines describes the development and testing of modern demining machines in humanitarian demining.   Relevant data for the design of demining machines are included to explain the machinery implemented, along with some innovative and inspiring development solutions. Development technologies, companies and projects are discussed to provide a comprehensive estimate of the effects of various design factors and to guide the proper selection of optimal parameters when designing demining machines.   Covering the dynamic processes occurring in machine assemblies and their components to give a broader understanding of the demining machine as a whole, Design of Demining Machines is primarily tailored as a tex...

  8. Applied machining technology

    CERN Document Server

    Tschätsch, Heinz

    2010-01-01

    Machining and cutting technologies are still crucial for many manufacturing processes. This reference presents all important machining processes in a comprehensive and coherent way. It includes many examples of concrete calculations, problems and solutions.

  9. Machining with abrasives

    CERN Document Server

    Jackson, Mark J

    2011-01-01

    Abrasive machining is key to obtaining the desired geometry and surface quality in manufacturing. This book discusses the fundamentals and advances in the abrasive machining processes. It provides a complete overview of developing areas in the field.

  10. Women, Men, and Machines.

    Science.gov (United States)

    Form, William; McMillen, David Byron

    1983-01-01

    Data from the first national study of technological change show that proportionately more women than men operate machines, are more exposed to machines that have alienating effects, and suffer more from the negative effects of technological change. (Author/SSH)

  11. Machine medical ethics

    CERN Document Server

    Pontier, Matthijs

    2015-01-01

    The essays in this book, written by researchers from both humanities and sciences, describe various theoretical and experimental approaches to adding medical ethics to a machine in medical settings. Medical machines are in close proximity with human beings, and getting closer: with patients who are in vulnerable states of health, who have disabilities of various kinds, with the very young or very old, and with medical professionals. In such contexts, machines are undertaking important medical tasks that require emotional sensitivity, knowledge of medical codes, human dignity, and privacy. As machine technology advances, ethical concerns become more urgent: should medical machines be programmed to follow a code of medical ethics? What theory or theories should constrain medical machine conduct? What design features are required? Should machines share responsibility with humans for the ethical consequences of medical actions? How ought clinical relationships involving machines to be modeled? Is a capacity for e...

  12. Brain versus Machine Control.

    Directory of Open Access Journals (Sweden)

    Jose M Carmena

    2004-12-01

    Full Text Available Dr. Octopus, the villain of the movie "Spiderman 2", is a fusion of man and machine. Neuroscientist Jose Carmena examines the facts behind this fictional account of a brain-machine interface.

  13. A Universal Reactive Machine

    DEFF Research Database (Denmark)

    Andersen, Henrik Reif; Mørk, Simon; Sørensen, Morten U.

    1997-01-01

    Turing showed the existence of a model universal for the set of Turing machines in the sense that given an encoding of any Turing machine as input the universal Turing machine simulates it. We introduce the concept of universality for reactive systems and construct a CCS process universal...

  14. SPEEDY: An Eclipse-based IDE for invariant inference

    Directory of Open Access Journals (Sweden)

    David R. Cok

    2014-04-01

    Full Text Available SPEEDY is an Eclipse-based IDE for exploring techniques that assist users in generating correct specifications, particularly including invariant inference algorithms and tools. It integrates with several back-end tools that propose invariants and will incorporate published algorithms for inferring object and loop invariants. Though the architecture is language-neutral, current SPEEDY targets C programs. Building and using SPEEDY has confirmed earlier experience demonstrating the importance of showing and editing specifications in the IDEs that developers customarily use, automating as much of the production and checking of specifications as possible, and showing counterexample information directly in the source code editing environment. As in previous work, automation of specification checking is provided by back-end SMT solvers. However, reducing the effort demanded of software developers using formal methods also requires a GUI design that guides users in writing, reviewing, and correcting specifications and automates specification inference.
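Dynamic invariant inference of the kind such back-end tools build on can be sketched in miniature. The candidate-predicate grammar and instrumented loop below are assumptions in the spirit of Daikon-style inference, not SPEEDY's actual algorithms: run the program, record variable states at the loop head, and keep the candidates that never fail.

```python
# Toy dynamic invariant inference: candidates that hold on every observed
# state at the loop head are reported as likely loop invariants.

def traces():
    # Instrumented loop summing the first n naturals; yield state at the loop head.
    n, i, s = 10, 0, 0
    while i <= n:
        yield {"i": i, "s": s, "n": n}
        s += i
        i += 1

# Hypothetical candidate grammar (a real tool enumerates these systematically).
candidates = {
    "i >= 0":          lambda v: v["i"] >= 0,
    "i <= n":          lambda v: v["i"] <= v["n"],
    "2*s == i*(i-1)":  lambda v: 2 * v["s"] == v["i"] * (v["i"] - 1),
    "s == n":          lambda v: v["s"] == v["n"],   # holds once, so it is rejected
}
invariants = [name for name, check in candidates.items()
              if all(check(v) for v in traces())]
print(invariants)
```

A tool like SPEEDY would then hand surviving candidates to an SMT solver to check them statically rather than merely failing to refute them on observed runs.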

  15. Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration

    DEFF Research Database (Denmark)

    Franz, Michael; Gal, Andreas; Probst, Christian

    2006-01-01

    As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers....... Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which...

  16. Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration

    DEFF Research Database (Denmark)

    Franz, Michael; Gal, Andreas; Probst, Christian

    2006-01-01

    As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers....... Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which...... then execute on top of an existing software portability platform such as Java. The result is a considerably reduced implementation effort....

  17. Inferring attitudes from mindwandering.

    Science.gov (United States)

    Critcher, Clayton R; Gilovich, Thomas

    2010-09-01

    Self-perception theory posits that people understand their own attitudes and preferences much as they understand others', by interpreting the meaning of their behavior in light of the context in which it occurs. Four studies tested whether people also rely on unobservable "behavior," their mindwandering, when making such inferences. It is proposed here that people rely on the content of their mindwandering to decide whether it reflects boredom with an ongoing task or a reverie's irresistible pull. Having the mind wander to positive events, to concurrent as opposed to past activities, and to many events rather than just one tends to be attributed to boredom and therefore leads to perceived dissatisfaction with an ongoing task. Participants appeared to rely spontaneously on the content of their wandering minds as a cue to their attitudes, but not when an alternative cause for their mindwandering was made salient.

  18. Bayesian inference in geomagnetism

    Science.gov (United States)

    Backus, George E.

    1988-01-01

    The inverse problem in empirical geomagnetic modeling is investigated, with critical examination of recently published studies. Particular attention is given to the use of Bayesian inference (BI) to select the damping parameter lambda in the uniqueness portion of the inverse problem. The mathematical bases of BI and stochastic inversion are explored, with consideration of bound-softening problems and resolution in linear Gaussian BI. The problem of estimating the radial magnetic field B(r) at the earth core-mantle boundary from surface and satellite measurements is then analyzed in detail, with specific attention to the selection of lambda in the studies of Gubbins (1983) and Gubbins and Bloxham (1985). It is argued that the selection method is inappropriate and leads to lambda values much larger than those that would result if a reasonable bound on the heat flow at the CMB were assumed.
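The role of the damping parameter lambda debated above can be illustrated on a toy regularized inverse problem. The operator and data below are invented, and this is generic Tikhonov damping, m = argmin ||Gm - d||^2 + lambda ||m||^2, rather than Backus's Bayesian formulation:

```python
# Minimal sketch: solve the damped 2-parameter normal equations
# (G^T G + lambda I) m = G^T d by hand and watch lambda shrink the model.

def damped_solve(G, d, lam):
    a11 = sum(g[0] * g[0] for g in G) + lam
    a12 = sum(g[0] * g[1] for g in G)
    a22 = sum(g[1] * g[1] for g in G) + lam
    b1 = sum(g[0] * di for g, di in zip(G, d))
    b2 = sum(g[1] * di for g, di in zip(G, d))
    det = a11 * a22 - a12 * a12            # 2x2 Cramer's rule
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

G = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]   # toy linear forward operator
d = [1.0, 2.0, 3.1]                        # noisy observations
for lam in (0.0, 1.0, 100.0):
    print(lam, damped_solve(G, d, lam))    # larger lambda -> smaller model norm
```

The point of the Bayesian-inference debate in the record is precisely how this lambda should be chosen: too large a value over-damps the model, which is the criticism levelled at the cited studies.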

  19. Inferring the eccentricity distribution

    CERN Document Server

    Hogg, David W; Bovy, Jo

    2010-01-01

    Standard maximum-likelihood estimators for binary-star and exoplanet eccentricities are biased high, in the sense that the estimated eccentricity tends to be larger than the true eccentricity. As with most non-trivial observables, a simple histogram of estimated eccentricities is not a good estimate of the true eccentricity distribution. Here we develop and test a hierarchical probabilistic method for performing the relevant meta-analysis, that is, inferring the true eccentricity distribution, taking as input the likelihood functions for the individual-star eccentricities, or samplings of the posterior probability distributions for the eccentricities (under a given, uninformative prior). The method is a simple implementation of a hierarchical Bayesian model; it can also be seen as a kind of heteroscedastic deconvolution. It can be applied to any quantity measured with finite precision--other orbital parameters, or indeed any astronomical measurements of any kind, including magnitudes, parallaxes, or photometr...
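The hierarchical scheme ("a kind of heteroscedastic deconvolution") can be sketched in a toy form. The details below are assumptions: each star supplies posterior samples of its eccentricity taken under a uniform interim prior, the population is modelled as a Beta(a, b) distribution, and the population parameters are scored by importance-averaging the population density over each star's samples:

```python
# Toy hierarchical inference of an eccentricity distribution from
# per-star posterior samplings (synthetic data, assumed Beta population).
import math
import random

def beta_pdf(e, a, b):
    norm = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return norm * e ** (a - 1) * (1 - e) ** (b - 1)

def hierarchical_loglike(samplings, a, b):
    # For each star, average the population density over its posterior samples,
    # then sum the logs across stars (uniform interim prior cancels out).
    return sum(math.log(sum(beta_pdf(e, a, b) for e in s) / len(s))
               for s in samplings)

random.seed(1)
true_e = [random.betavariate(1, 3) for _ in range(200)]       # low-e population
samplings = [[min(max(e + random.gauss(0, 0.02), 1e-3), 1 - 1e-3)
              for _ in range(50)] for e in true_e]            # noisy "posteriors"
grid = [(a, b) for a in (0.5, 1, 2, 3) for b in (0.5, 1, 2, 3)]
best = max(grid, key=lambda ab: hierarchical_loglike(samplings, *ab))
print(best)
```

With most mass at low eccentricity, parameter pairs with b > a should score far better than their mirror images, which is the hedge against the bias a naive histogram of point estimates would show.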

  20. Inferring deterministic causal relations

    CERN Document Server

    Daniusis, Povilas; Mooij, Joris; Zscheischler, Jakob; Steudel, Bastian; Zhang, Kun; Schoelkopf, Bernhard

    2012-01-01

    We consider two variables that are related to each other by an invertible function. While it has previously been shown that the dependence structure of the noise can provide hints to determine which of the two variables is the cause, we presently show that even in the deterministic (noise-free) case, there are asymmetries that can be exploited for causal inference. Our method is based on the idea that if the function and the probability density of the cause are chosen independently, then the distribution of the effect will, in a certain sense, depend on the function. We provide a theoretical analysis of this method, showing that it also works in the low noise regime, and link it to information geometry. We report strong empirical results on various real-world data sets from different domains.
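The asymmetry described above can be sketched with the slope-based estimator from the information-geometric causal inference (IGCI) literature; the exact estimator and data are assumptions for illustration. After normalizing both variables to [0, 1], the average log-slope of the function tends to be smaller in the causal direction:

```python
# Sketch of a slope-based IGCI score: C(X -> Y) is the mean log-slope of
# the mapping from x to y over consecutive sorted points; the direction
# with the smaller score is inferred as causal.
import math
import random

def igci_score(x, y):
    def norm(v):
        lo, hi = min(v), max(v)
        return [(vi - lo) / (hi - lo) for vi in v]
    x, y = norm(x), norm(y)
    pairs = sorted(zip(x, y))
    total, n = 0.0, 0
    for (x1, y1), (x2, y2) in zip(pairs, pairs[1:]):
        if x2 != x1 and y2 != y1:
            total += math.log(abs((y2 - y1) / (x2 - x1)))
            n += 1
    return total / n

random.seed(0)
x = [random.random() for _ in range(2000)]
y = [xi ** 3 for xi in x]      # deterministic, invertible mechanism x -> y
direction = "x->y" if igci_score(x, y) < igci_score(y, x) else "y->x"
print(direction)
```

Because the cause's density (uniform here) was chosen independently of the cubic mechanism, the log-slope expectation is negative in the causal direction and positive in the reverse, so the correct direction is recovered even with zero noise.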

  1. Asynchronized synchronous machines

    CERN Document Server

    Botvinnik, M M

    1964-01-01

    Asynchronized Synchronous Machines focuses on the theoretical research on asynchronized synchronous (AS) machines, which are "hybrids” of synchronous and induction machines that can operate with slip. Topics covered in this book include the initial equations; vector diagram of an AS machine; regulation in cases of deviation from the law of full compensation; parameters of the excitation system; and schematic diagram of an excitation regulator. The possible applications of AS machines and its calculations in certain cases are also discussed. This publication is beneficial for students and indiv

  2. Quantum machine learning.

    Science.gov (United States)

    Biamonte, Jacob; Wittek, Peter; Pancotti, Nicola; Rebentrost, Patrick; Wiebe, Nathan; Lloyd, Seth

    2017-09-13

    Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.

  3. Precision machine design

    CERN Document Server

    Slocum, Alexander H

    1992-01-01

    This book is a comprehensive engineering exploration of all aspects of precision machine design - both component and system design considerations for precision machines. It addresses both theoretical analysis and practical implementation, providing many real-world design case studies as well as numerous examples of existing components and their characteristics. Fast becoming a classic, this book includes examples of analysis techniques along with the philosophy of the solution method. It explores the physics of errors in machines, how such knowledge can be used to build an error budget for a machine, and how error budgets can be used to design more accurate machines.

  4. Admissibility of logical inference rules

    CERN Document Server

    Rybakov, VV

    1997-01-01

    The aim of this book is to present the fundamental theoretical results concerning inference rules in deductive formal systems. Primary attention is focused on: admissible or permissible inference rules; the derivability of the admissible inference rules; the structural completeness of logics; and the bases for admissible and valid inference rules. There is particular emphasis on propositional non-standard logics (primarily superintuitionistic and modal logics) but general logical consequence relations and classical first-order theories are also considered. The book is basically self-contained and

  5. SCADA Architecture for Natural Gas plant

    Directory of Open Access Journals (Sweden)

    Turc Traian

    2009-12-01

    Full Text Available The paper describes the SCADA architecture of a natural gas plant. The main purpose of a SCADA system is remote monitoring and control of an industrial plant. The SCADA hardware architecture is based on a multi-drop system, allowing a large number of different field devices to be connected. The SCADA server gathers data from the gas plant and stores them in a MySQL database. The SCADA server is connected to SCADA client applications and offers an intuitive and user-friendly HMI. The main benefit of using SCADA is the real-time display of the gas plant's state. The main contribution of the authors consists in designing a SCADA architecture based on a multi-drop system and a Human Machine Interface.

  6. X: A Comprehensive Analytic Model for Parallel Machines

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ang; Song, Shuaiwen; Brugel, Eric; Kumar, Akash; Chavarría-Miranda, Daniel; Corporaal, Henk

    2016-05-23

    To continuously comply with Moore’s Law, modern parallel machines become increasingly complex. Effectively tuning application performance for these machines therefore becomes a daunting task. Moreover, identifying performance bottlenecks at the application and architecture level, as well as evaluating various optimization strategies, becomes extremely difficult when numerous correlated factors are entangled. To tackle these challenges, we present a visual analytical model named “X”. It is intuitive and sufficiently flexible to track all the typical features of a parallel machine.

  7. Kinematic performance analysis of a parallel-chain hexapod machine

    Energy Technology Data Exchange (ETDEWEB)

    Jing Song; Jong-I Mou; Calvin King

    1998-05-18

    Inverse and forward kinematic models were derived to analyze the performance of a parallel-chain hexapod machine. Analytical models were constructed for both ideal and real structures. Performance assessment and enhancement algorithms were developed to determine the strut lengths for both ideal and real structures. The strut lengths determined from both cases can be used to analyze the effect of structural imperfections on machine performance. In an open-architecture control environment, strut length errors can be fed back to the controller to compensate for the displacement errors and thus improve the machine's accuracy in production.
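The inverse-kinematic step described above is simple to state for an ideal structure: each strut length is the distance between its base joint and the platform joint after the platform's rigid-body transform. The joint layout and pose parametrization below are assumptions for illustration, not the paper's machine geometry:

```python
# Sketch of hexapod inverse kinematics for an ideal structure:
# strut_i = | (R * b_i + t) - a_i | for base joints a_i, platform joints b_i.
import math

def strut_lengths(base, platform, pose):
    """pose = (x, y, z, yaw): platform translation plus rotation about z only."""
    x, y, z, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    lengths = []
    for (ax, ay, az), (bx, by, bz) in zip(base, platform):
        # Rotate the platform joint about z, then translate by (x, y, z).
        px = c * bx - s * by + x
        py = s * bx + c * by + y
        pz = bz + z
        lengths.append(math.dist((ax, ay, az), (px, py, pz)))
    return lengths

# Toy geometry: six joints evenly spaced on circles of radius 1 (base) and 0.5 (platform).
angles = [i * math.pi / 3 for i in range(6)]
base = [(math.cos(a), math.sin(a), 0.0) for a in angles]
plat = [(0.5 * math.cos(a), 0.5 * math.sin(a), 0.0) for a in angles]
print(strut_lengths(base, plat, (0.0, 0.0, 1.0, 0.0)))
```

For a real structure, the same calculation runs on the imperfect joint locations, and the difference between the two sets of strut lengths is what gets fed back to the open-architecture controller as a compensation.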

  8. Neural networks for perception human and machine perception

    CERN Document Server

    Wechsler, Harry

    1991-01-01

    Neural Networks for Perception, Volume 1: Human and Machine Perception focuses on models for understanding human perception in terms of distributed computation and examples of PDP models for machine perception. This book addresses both theoretical and practical issues related to the feasibility of both explaining human perception and implementing machine perception in terms of neural network models. The book is organized into two parts. The first part focuses on human perception. Topics on network model of object recognition in human vision, the self-organization of functional architecture in t

  9. Bootstrapping phylogenies inferred from rearrangement data

    Directory of Open Access Journals (Sweden)

    Lin Yu

    2012-08-01

    Full Text Available Abstract Background Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well established probabilistic models. Results We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Conclusions Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its

  10. Exascale Hardware Architectures Working Group

    Energy Technology Data Exchange (ETDEWEB)

    Hemmert, S; Ang, J; Chiang, P; Carnes, B; Doerfler, D; Leininger, M; Dosanjh, S; Fields, P; Koch, K; Laros, J; Noe, J; Quinn, T; Torrellas, J; Vetter, J; Wampler, C; White, A

    2011-03-15

    The ASC Exascale Hardware Architecture working group is challenged to provide input on the following areas impacting the future use and usability of potential exascale computer systems: processor, memory, and interconnect architectures, as well as the power and resilience of these systems. Going forward, there are many challenging issues that will need to be addressed. First, power constraints in processor technologies will lead to steady increases in parallelism within a socket. Additionally, all cores may not be fully independent nor fully general purpose. Second, there is a clear trend toward less balanced machines, in terms of compute capability compared to memory and interconnect performance. In order to mitigate the memory issues, memory technologies will introduce 3D stacking, eventually moving on-socket and likely on-die, providing greatly increased bandwidth but unfortunately also likely providing smaller memory capacity per core. Off-socket memory, possibly in the form of non-volatile memory, will create a complex memory hierarchy. Third, communication energy will dominate the energy required to compute, such that interconnect power and bandwidth will have a significant impact. All of the above changes are driven by the need for greatly increased energy efficiency, as current technology will prove unsuitable for exascale, due to unsustainable power requirements of such a system. These changes will have the most significant impact on programming models and algorithms, but they will be felt across all layers of the machine. There is clear need to engage all ASC working groups in planning for how to deal with technological changes of this magnitude. The primary function of the Hardware Architecture Working Group is to facilitate codesign with hardware vendors to ensure future exascale platforms are capable of efficiently supporting the ASC applications, which in turn need to meet the mission needs of the NNSA Stockpile Stewardship Program. This issue is

  11. Modeling Architectural Patterns' Behavior Using Architectural Primitives

    NARCIS (Netherlands)

    Kamal, Ahmad Waqas; Avgeriou, Paris; Morrison, R; Balasubramaniam, D; Falkner, K

    2008-01-01

    Architectural patterns have an impact on both the structure and the behavior of a system at the architecture design level. However, it is challenging to model patterns' behavior in a systematic way because modeling languages do not provide the appropriate abstractions and because each pattern

  12. Modeling Architectural Patterns’ Behavior Using Architectural Primitives

    NARCIS (Netherlands)

    Waqas Kamal, Ahmad; Avgeriou, Paris

    2008-01-01

    Architectural patterns have an impact on both the structure and the behavior of a system at the architecture design level. However, it is challenging to model patterns’ behavior in a systematic way because modeling languages do not provide the appropriate abstractions and because each pattern

  13. Perspex machine: VII. The universal perspex machine

    Science.gov (United States)

    Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of projective geometry with the Turing machine. It uses a total arithmetic, called transreal arithmetic, that contains real arithmetic and allows division by zero. Transreal arithmetic is redefined here. The new arithmetic has both a positive and a negative infinity which lie at the extremes of the number line, and a number nullity that lies off the number line. We prove that nullity, 0/0, is a number. Hence a number may have one of four signs: negative, zero, positive, or nullity. It is, therefore, impossible to encode the sign of a number in one bit, as floating-point arithmetic attempts to do, resulting in the difficulty of having both positive and negative zeros and NaNs. Transrational arithmetic is consistent with Cantor arithmetic. In an extension to real arithmetic, the product of zero, an infinity, or nullity with its reciprocal is nullity, not unity. This avoids the usual contradictions that follow from allowing division by zero. Transreal arithmetic has a fixed algebraic structure and does not admit options as IEEE, floating-point arithmetic does. Most significantly, nullity has a simple semantics that is related to zero. Zero means "no value" and nullity means "no information." We argue that nullity is as useful to a manufactured computer as zero is to a human computer. The perspex machine is intended to offer one solution to the mind-body problem by showing how the computable aspects of mind and, perhaps, the whole of mind relates to the geometrical aspects of body and, perhaps, the whole of body. We review some of Turing's writings and show that he held the view that his machine has spatial properties. In particular, that it has the property of being a 7D lattice of compact spaces. Thus, we read Turing as believing that his machine relates computation to geometrical bodies. We simplify the perspex machine by substituting an augmented Euclidean geometry for projective geometry. This leads to a general
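The total division described above can be sketched directly; the encodings below (Python floats for the infinities, a sentinel for nullity) are assumptions for illustration, not the paper's representation:

```python
# Illustrative sketch of transreal division: division is total, with
# x/0 = +inf or -inf for nonzero x, 0/0 = nullity, and inf/inf = nullity
# (an infinity times its reciprocal carries no information).

POS_INF, NEG_INF, NULLITY = float("inf"), float("-inf"), "nullity"

def transreal_div(a, b):
    if a == NULLITY or b == NULLITY:
        return NULLITY                      # nullity absorbs: "no information"
    if b == 0:
        if a == 0:
            return NULLITY                  # 0/0 is the number nullity
        return POS_INF if a > 0 else NEG_INF
    if b in (POS_INF, NEG_INF):
        if a in (POS_INF, NEG_INF):
            return NULLITY                  # inf * (1/inf) = inf * 0 = nullity
        return 0.0                          # finite / infinite = 0
    if a in (POS_INF, NEG_INF):
        return POS_INF if (a > 0) == (b > 0) else NEG_INF
    return a / b                            # ordinary real division

print(transreal_div(1.0, 0.0), transreal_div(0.0, 0.0), transreal_div(-3.0, 0.0))
```

Note the contrast with IEEE 754, where 0/0 yields a NaN that is unequal even to itself and zeros carry signs; transreal arithmetic instead gives every quotient exactly one of four signs: negative, zero, positive, or nullity.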

  14. The IVOA Architecture

    Science.gov (United States)

    Arviset, C.; Gaudet, S.; IVOA Technical Coordination Group

    2012-09-01

    Astronomy produces large amounts of data of many kinds, coming from various sources: science space missions, ground based telescopes, theoretical models, compilation of results, etc. These data and associated processing services are made available via the Internet by "providers", usually large data centres or smaller teams (see Figure 1). The "consumers", be they individual researchers, research teams or computer systems, access these services to do their science. However, inter-connection amongst all these services and between providers and consumers is usually not trivial. The Virtual Observatory (VO) is the necessary "middle layer" framework enabling interoperability between all these providers and consumers in a seamless and transparent manner. Like the web which enables end users and machines to access transparently documents and services wherever and however they are stored, the VO enables the astronomy community to access data and service resources wherever and however they are provided. Over the last decade, the International Virtual Observatory Alliance (IVOA) has been defining various standards to build the VO technical framework for the providers to share their data and services ("Sharing"), and to allow users to find ("Finding") these resources, to get them ("Getting") and to use them ("Using"). To enable these functionalities, the definition of some core astronomically-oriented standards ("VO Core") has also been necessary. This paper will present the official and current IVOA Architecture[1], describing the various building blocks of the VO framework (see Figure 2) and their relation to all existing and in-progress IVOA standards. Additionally, it will show examples of these standards in action, connecting VO "consumers" to VO "providers".

  15. Exporting Humanist Architecture

    DEFF Research Database (Denmark)

    Nielsen, Tom

    2016-01-01

    values and ethical stands involved in the export of Danish Architecture. Abstract: Danish architecture has, in a sense, been driven by an unwritten contract between the architects and the democratic state and its institutions. This contract may be viewed as an ethos – an architectural tradition with inherent aesthetic and moral values. Today, however, Danish architecture is also an export commodity. That raises questions, which should be debated as openly as possible. What does it mean for architecture and architects to practice in cultures and under political systems that do not use architecture as a way of generating humanism, freedom or equality? The essay outlines the background story, identifies a number of positions in relation to architecture exports and discusses some of the dilemmas that arise when Danish architecture is seen in an export perspective....

  16. Religious architecture: anthropological perspectives

    NARCIS (Netherlands)

    O. Verkaaik

    2013-01-01

    Religious Architecture: Anthropological Perspectives develops an anthropological perspective on modern religious architecture, including mosques, churches and synagogues. Borrowing from a range of theoretical perspectives on space-making and material religion, this volume looks at how religious buildings

  17. Rhein-Ruhr architecture

    DEFF Research Database (Denmark)

    2002-01-01

    Catalogue for the exhibition 'Rhein - Ruhr architecture', Meldahls Smedie, 15 March - 28 April 2002. 99 pages.

  18. An Inference Language for Imaging

    DEFF Research Database (Denmark)

    Pedemonte, Stefano; Catana, Ciprian; Van Leemput, Koen

    2014-01-01

    We introduce iLang, a language and software framework for probabilistic inference. The iLang framework enables the definition of directed and undirected probabilistic graphical models and the automated synthesis of high performance inference algorithms for imaging applications. The iLang framework...

  19. Interactive Instruction in Bayesian Inference

    DEFF Research Database (Denmark)

    Khan, Azam; Breslav, Simon; Hornbæk, Kasper

    2017-01-01

    An instructional approach is presented to improve human performance in solving Bayesian inference problems. Starting from the original text of the classic Mammography Problem, the textual expression is modified and visualizations are added according to Mayer’s principles of instruction. The results suggest that an instructional approach to improving human performance in Bayesian inference is a promising direction.
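    The Bayesian computation behind the Mammography Problem is a single application of Bayes' rule. A minimal sketch, using the figures commonly quoted for this classic problem (1% prevalence, 80% sensitivity, 9.6% false-positive rate; these numbers are assumptions here, not taken from the record):

```python
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' rule: P(disease | positive test)."""
    true_pos = prior * sensitivity                 # P(positive and disease)
    false_pos = (1 - prior) * false_positive_rate  # P(positive and no disease)
    return true_pos / (true_pos + false_pos)

# Commonly quoted Mammography Problem figures (assumed for illustration).
p = posterior(prior=0.01, sensitivity=0.80, false_positive_rate=0.096)
print(round(p, 3))  # roughly 0.078: a positive test implies only ~8% chance of disease
```

The counter-intuitive gap between the 80% sensitivity and the ~8% posterior is exactly what makes the problem a standard test case for instruction.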

  20. Causal Inference and Developmental Psychology

    Science.gov (United States)

    Foster, E. Michael

    2010-01-01

    Causal inference is of central importance to developmental psychology. Many key questions in the field revolve around improving the lives of children and their families. These include identifying risk factors that if manipulated in some way would foster child development. Such a task inherently involves causal inference: One wants to know whether…

  2. Machine Phase Fullerene Nanotechnology: 1996

    Science.gov (United States)

    Globus, Al; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    NASA has used exotic materials for spacecraft and experimental aircraft to good effect for many decades. In spite of many advances, transportation to space still costs about $10,000 per pound. Drexler has proposed a hypothetical nanotechnology based on diamond and investigated the properties of such molecular systems. These studies and others suggest enormous potential for aerospace systems. Unfortunately, methods to realize diamondoid nanotechnology are at best highly speculative. Recent computational efforts at NASA Ames Research Center and computation and experiment elsewhere suggest that a nanotechnology of machine phase functionalized fullerenes may be synthetically relatively accessible and of great aerospace interest. Machine phase materials are (hypothetical) materials consisting entirely or in large part of microscopic machines. In a sense, most living matter fits this definition. To begin investigation of fullerene nanotechnology, we used molecular dynamics to study the properties of carbon nanotube based gears and gear/shaft configurations. Experiments on C60 and quantum calculations suggest that benzyne may react with carbon nanotubes to form gear teeth. Han has computationally demonstrated that molecular gears fashioned from (14,0) single-walled carbon nanotubes and benzyne teeth should operate well at 50-100 gigahertz. Results suggest that rotation can be converted to rotating or linear motion, and linear motion may be converted into rotation. Preliminary results suggest that these mechanical systems can be cooled by a helium atmosphere. Furthermore, Deepak has successfully simulated using helical electric fields generated by a laser to power fullerene gears once a positive and negative charge have been added to form a dipole. Even with mechanical motion, cooling, and power, creating a viable nanotechnology requires support structures, computer control, a system architecture, a variety of components, and some approach to manufacture. Additional

  3. Variational Program Inference

    CERN Document Server

    Harik, Georges

    2010-01-01

    We introduce a framework for representing a variety of interesting problems as inference over the execution of probabilistic model programs. We represent a "solution" to such a problem as a guide program which runs alongside the model program and influences the model program's random choices, leading the model program to sample from a different distribution than from its priors. Ideally the guide program influences the model program to sample from the posteriors given the evidence. We show how the KL-divergence between the true posterior distribution and the distribution induced by the guided model program can be efficiently estimated (up to an additive constant) by sampling multiple executions of the guided model program. In addition, we show how to use the guide program as a proposal distribution in importance sampling to statistically prove lower bounds on the probability of the evidence and on the probability of a hypothesis and the evidence. We can use the quotient of these two bounds as an estimate of ...
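    The guide-as-proposal idea above can be illustrated with a bare-bones importance sampler. A minimal sketch, in which every specific is an assumption for illustration (a model of ten fair coin flips, the evidence "at least eight heads", and a heads-biased guide distribution); it computes the unbiased estimate E_q[1[evidence] * p(x)/q(x)] of the evidence probability and checks it against exact enumeration:

```python
import math
import random

def model_logprob(flips):
    # Model program prior: 10 independent fair coin flips.
    return len(flips) * math.log(0.5)

def guide_sample(rng, p=0.8):
    # Guide program: biases each flip toward heads, steering samples
    # toward the region where the evidence holds.
    return [rng.random() < p for _ in range(10)]

def guide_logprob(flips, p=0.8):
    return sum(math.log(p) if f else math.log(1 - p) for f in flips)

def estimate_evidence(n_samples=20000, seed=0):
    # Importance sampling: E_q[ 1[evidence] * p(x)/q(x) ] = P(evidence).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        flips = guide_sample(rng)
        if sum(flips) >= 8:  # the observed evidence
            total += math.exp(model_logprob(flips) - guide_logprob(flips))
    return total / n_samples

est = estimate_evidence()
exact = (math.comb(10, 8) + math.comb(10, 9) + math.comb(10, 10)) / 2 ** 10
print(est, exact)  # the two values should agree to roughly two decimal places
```

Plain importance sampling is unbiased rather than a strict bound; the paper's lower-bound construction is a refinement of this basic estimator.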

  4. Gauging Variational Inference

    Energy Technology Data Exchange (ETDEWEB)

    Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ahn, Sungsoo [Korea Advanced Inst. Science and Technology (KAIST), Daejeon (Korea, Republic of); Shin, Jinwoo [Korea Advanced Inst. Science and Technology (KAIST), Daejeon (Korea, Republic of)

    2017-05-25

    Computing the partition function is the most important statistical inference task arising in applications of Graphical Models (GM). Since it is computationally intractable, approximate methods have been used to resolve the issue in practice, where mean-field (MF) and belief propagation (BP) are arguably the most popular and successful approaches of a variational type. In this paper, we propose two new variational schemes, coined Gauged-MF (G-MF) and Gauged-BP (G-BP), improving MF and BP, respectively. Both provide lower bounds for the partition function by utilizing the so-called gauge transformation which modifies factors of the GM while keeping the partition function invariant. Moreover, we prove that both G-MF and G-BP are exact for GMs with a single loop of a special structure, even though the bare MF and BP perform badly in this case. Our extensive experiments, on complete GMs of relatively small size and on large GMs (up to 300 variables), confirm that the newly proposed algorithms outperform and generalize MF and BP.
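    The mean-field lower bound that G-MF improves upon can be sketched on a tiny model. The following is an illustrative naive mean-field bound for a 4-spin Ising ring (the model and its parameters are assumptions, not taken from the paper), checked against the exact partition function by enumeration:

```python
import itertools
import math

# Tiny pairwise graphical model: a 4-spin Ising ring, s_i in {-1, +1},
# with log-weight  J * sum_edges s_i s_j + h * sum_i s_i.
J, h = 0.3, 0.1
n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
nbrs = {i: [] for i in range(n)}
for a, b in edges:
    nbrs[a].append(b)
    nbrs[b].append(a)

def log_weight(s):
    return J * sum(s[i] * s[j] for i, j in edges) + h * sum(s)

# Exact partition function by brute-force enumeration (2^4 states).
Z = sum(math.exp(log_weight(s)) for s in itertools.product((-1, 1), repeat=n))

# Naive mean-field: factorized q with means mu_i, coordinate-ascent updates.
mu = [0.5] * n
for _ in range(200):
    for i in range(n):
        mu[i] = math.tanh(h + J * sum(mu[j] for j in nbrs[i]))

def entropy(q):
    return -(q * math.log(q) + (1 - q) * math.log(1 - q))

# ELBO = E_q[log weight] + H(q) <= log Z, so exp(ELBO) lower-bounds Z.
elbo = (J * sum(mu[i] * mu[j] for i, j in edges) + h * sum(mu)
        + sum(entropy((1 + m) / 2) for m in mu))
bound = math.exp(elbo)
print(bound <= Z)  # True: the mean-field value never exceeds the true Z
```

The gauge transformation in the paper reshapes the factors before this variational step, tightening exactly this kind of bound.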

  5. Statistical Inference and String Theory

    CERN Document Server

    Heckman, Jonathan J

    2013-01-01

    In this note we expose some surprising connections between string theory and statistical inference. We consider a large collective of agents sweeping out a family of nearby statistical models for an M-dimensional manifold of statistical fitting parameters. When the agents making nearby inferences align along a d-dimensional grid, we find that the pooled probability that the collective reaches a correct inference is the partition function of a non-linear sigma model in d dimensions. Stability under perturbations to the original inference scheme requires the agents of the collective to distribute along two dimensions. Conformal invariance of the sigma model corresponds to the condition of a stable inference scheme, directly leading to the Einstein field equations for classical gravity. By summing over all possible arrangements of the agents in the collective, we reach a string theory. We also use this perspective to quantify how much an observer can hope to learn about the internal geometry of a superstring com...

  6. Architectural Knitted Surfaces

    DEFF Research Database (Denmark)

    Mossé, Aurélie

    2010-01-01

    WGSN reports from the Architectural Knitted Surfaces workshop recently held at Shenkar College of Engineering and Design, Tel Aviv, which offered a cutting-edge insight into interactive knitted surfaces. With the increasing role of smart textiles in architecture, the Architectural Knitted Surfaces...

  7. Engineering artificial machines from designable DNA materials for biomedical applications.

    Science.gov (United States)

    Qi, Hao; Huang, Guoyou; Han, Yulong; Zhang, Xiaohui; Li, Yuhui; Pingguan-Murphy, Belinda; Lu, Tian Jian; Xu, Feng; Wang, Lin

    2015-06-01

    Deoxyribonucleic acid (DNA) emerges as building bricks for the fabrication of nanostructures with completely artificial architecture and geometry. The amazing ability of DNA in building two- and three-dimensional structures raises the possibility of developing smart nanomachines with versatile controllability for various applications. Here, we overview the recent progress in engineering DNA machines for specific bioengineering and biomedical applications.

  8. TensorFlow: A system for large-scale machine learning

    OpenAIRE

    2016-01-01

    TensorFlow is a machine learning system that operates at large scale and in heterogeneous environments. TensorFlow uses dataflow graphs to represent computation, shared state, and the operations that mutate that state. It maps the nodes of a dataflow graph across many machines in a cluster, and within a machine across multiple computational devices, including multicore CPUs, general-purpose GPUs, and custom designed ASICs known as Tensor Processing Units (TPUs). This architecture gives flexib...
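    The dataflow-graph execution model described above can be illustrated with a toy evaluator (this is a sketch of the general idea only, not TensorFlow's actual API; all names are invented):

```python
# Toy dataflow graph: nodes hold an operation and input edges;
# evaluation walks the graph and memoizes shared subgraphs.

class Node:
    def __init__(self, op, *inputs):
        self.op, self.inputs = op, inputs

    def eval(self, cache=None):
        cache = {} if cache is None else cache
        if self not in cache:  # shared subgraphs are computed only once
            args = [n.eval(cache) for n in self.inputs]
            cache[self] = self.op(*args)
        return cache[self]

def const(v):
    return Node(lambda: v)

# Build (a + b) * (a + b) -- the shared "a + b" node is evaluated once.
a, b = const(2.0), const(3.0)
s = Node(lambda x, y: x + y, a, b)
out = Node(lambda x, y: x * y, s, s)
print(out.eval())  # 25.0
```

Representing the computation as an explicit graph is what lets a system like TensorFlow partition nodes across devices and machines before running them.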

  9. Machinability of advanced materials

    CERN Document Server

    Davim, J Paulo

    2014-01-01

    Machinability of Advanced Materials addresses the level of difficulty involved in machining a material, or multiple materials, with the appropriate tooling and cutting parameters.  A variety of factors determine a material's machinability, including tool life rate, cutting forces and power consumption, surface integrity, limiting rate of metal removal, and chip shape. These topics, among others, and multiple examples comprise this research resource for engineering students, academics, and practitioners.

  10. Pattern recognition & machine learning

    CERN Document Server

    Anzai, Y

    1992-01-01

    This is the first text to provide a unified and self-contained introduction to visual pattern recognition and machine learning. It is useful as a general introduction to artificial intelligence and knowledge engineering, and no previous knowledge of pattern recognition or machine learning is necessary. Basics for various pattern recognition and machine learning methods are covered. Translated from Japanese, the book also features chapter exercises, keywords, and summaries.

  11. Support vector machines applications

    CERN Document Server

    Guo, Guodong

    2014-01-01

    Support vector machines (SVM) have both a solid mathematical background and good performance in practical applications. This book focuses on the recent advances and applications of the SVM in different areas, such as image processing, medical practice, computer vision, pattern recognition, machine learning, applied statistics, business intelligence, and artificial intelligence. The aim of this book is to create a comprehensive source on support vector machine applications, especially some recent advances.

  12. Machining of titanium alloys

    CERN Document Server

    2014-01-01

    This book presents a collection of examples illustrating the recent research advances in the machining of titanium alloys. These materials have excellent strength and fracture toughness as well as low density and good corrosion resistance; however, machinability is still poor due to their low thermal conductivity and high chemical reactivity with cutting tool materials. This book presents solutions to enhance machinability in titanium-based alloys and serves as a useful reference to professionals and researchers in aerospace, automotive and biomedical fields.

  13. Rotating electrical machines

    CERN Document Server

    Le Doeuff, René

    2013-01-01

    In this book a general matrix-based approach to modeling electrical machines is promulgated. The model uses instantaneous quantities for key variables and enables the user to easily take into account associations between rotating machines and static converters (such as in variable speed drives).   General equations of electromechanical energy conversion are established early in the treatment of the topic and then applied to synchronous, induction and DC machines. The primary characteristics of these machines are established for steady state behavior as well as for variable speed scenarios. I

  14. Chaotic Boltzmann machines.

    Science.gov (United States)

    Suzuki, Hideyuki; Imura, Jun-ichi; Horio, Yoshihiko; Aihara, Kazuyuki

    2013-01-01

    The chaotic Boltzmann machine proposed in this paper is a chaotic pseudo-billiard system that works as a Boltzmann machine. Chaotic Boltzmann machines are shown numerically to have computing abilities comparable to conventional (stochastic) Boltzmann machines. Since no randomness is required, efficient hardware implementation is expected. Moreover, the ferromagnetic phase transition of the Ising model is shown to be characterised by the largest Lyapunov exponent of the proposed system. In general, a method to relate probabilistic models to nonlinear dynamics by derandomising Gibbs sampling is presented.
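    The stochastic Gibbs sampling that the chaotic billiard dynamics derandomizes can be sketched on a two-unit toy model (the model and parameters are assumptions for illustration, not from the paper):

```python
import math
import random

# Tiny two-unit "Boltzmann machine" as an Ising pair: s_i in {-1, +1},
# p(s) proportional to exp(J*s1*s2 + h*(s1 + s2)).
J, h = 0.5, 0.2

def gibbs_mean(n_sweeps=60000, burn=1000, seed=1):
    # Conventional (stochastic) Gibbs sampling of the per-unit mean.
    rng = random.Random(seed)
    s = [1, 1]
    total, count = 0.0, 0
    for t in range(n_sweeps):
        for i in (0, 1):
            field = h + J * s[1 - i]
            p_up = 1.0 / (1.0 + math.exp(-2.0 * field))  # P(s_i=+1 | other)
            s[i] = 1 if rng.random() < p_up else -1
        if t >= burn:
            total += s[0] + s[1]
            count += 2
    return total / count

# Exact per-unit mean by enumerating the four states.
states = [(a, b) for a in (-1, 1) for b in (-1, 1)]
w = [math.exp(J * a * b + h * (a + b)) for a, b in states]
Z = sum(w)
exact = sum(wi * (a + b) / 2 for wi, (a, b) in zip(w, states)) / Z

m = gibbs_mean()
print(round(m, 2), round(exact, 2))  # the two should agree closely
```

The chaotic Boltzmann machine replaces the `rng.random()` draw with deterministic pseudo-billiard dynamics, which is what makes randomness-free hardware implementations attractive.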

  15. Tribology in machine design

    CERN Document Server

    Stolarski, Tadeusz

    1999-01-01

    "Tribology in Machine Design is strongly recommended for machine designers, and engineers and scientists interested in tribology. It should be in the engineering library of companies producing mechanical equipment." (Applied Mechanics Review) Tribology in Machine Design explains the role of tribology in the design of machine elements. It shows how algorithms developed from the basic principles of tribology can be used in a range of practical applications within mechanical devices and systems. The computer offers today's designer the possibility of greater stringen

  16. Debugging the virtual machine

    Energy Technology Data Exchange (ETDEWEB)

    Miller, P.; Pizzi, R.

    1994-09-02

    A computer program is really nothing more than a virtual machine built to perform a task. The program's source code expresses abstract constructs using low level language features. When a virtual machine breaks, it can be very difficult to debug because typical debuggers provide only low level machine implementation information to the software engineer. We believe that the debugging task can be simplified by introducing aspects of the abstract design into the source code. We introduce OODIE, an object-oriented language extension that allows programmers to specify a virtual debugging environment which includes the design and abstract data types of the virtual machine.

  17. Electrical machines & drives

    CERN Document Server

    Hammond, P

    1985-01-01

    Containing approximately 200 problems (100 worked), the text covers a wide range of topics concerning electrical machines, placing particular emphasis upon electrical-machine drive applications. The theory is concisely reviewed and focuses on features common to all machine types. The problems are arranged in order of increasing levels of complexity and discussions of the solutions are included where appropriate to illustrate the engineering implications. This second edition includes an important new chapter on mathematical and computer simulation of machine systems and revised discussions o

  18. Machine learning with R

    CERN Document Server

    Lantz, Brett

    2013-01-01

    Written as a tutorial to explore and understand the power of R for machine learning, this practical guide covers all of the need-to-know topics in a very systematic way. For each machine learning approach, each step in the process is detailed, from preparing the data for analysis to evaluating the results. These steps will build the knowledge you need to apply them to your own data science tasks. Intended for those who want to learn how to use R's machine learning capabilities and gain insight from your data. Perhaps you already know a bit about machine learning, but have never used R; or

  19. Induction machine handbook

    CERN Document Server

    Boldea, Ion

    2002-01-01

    Often called the workhorse of industry, the advent of power electronics and advances in digital control are transforming the induction motor into the racehorse of industrial motion control. Now, the classic texts on induction machines are nearly three decades old, while more recent books on electric motors lack the necessary depth and detail on induction machines.The Induction Machine Handbook fills industry's long-standing need for a comprehensive treatise embracing the many intricate facets of induction machine analysis and design. Moving gradually from simple to complex and from standard to

  20. Multi-Level Audio Classification Architecture

    Directory of Open Access Journals (Sweden)

    Jozef Vavrek

    2015-01-01

    A multi-level classification architecture for solving a binary discrimination problem is proposed in this paper. The main idea of the proposed solution is derived from the fact that solving one binary discrimination problem multiple times can reduce the overall misclassification error. We aimed our effort towards building a classification architecture employing the combination of multiple binary SVM (Support Vector Machine) classifiers for solving a two-class discrimination problem. Therefore, we developed a binary discrimination architecture employing the SVM classifier (BDASVM) with the intention to use it for classification of broadcast news (BN) audio data. The fundamental element of the BDASVM is the binary decision (BD) algorithm that performs discrimination between each pair of acoustic classes utilizing a decision function modeled by a separating hyperplane. The overall classification accuracy is conditioned by finding the optimal parameters for the discrimination function, resulting in higher computational complexity. The final form of the proposed BDASVM is created by combining four BDSVM discriminators supplemented by a decision table. Experimental results show that the proposed classification architecture can decrease the overall classification error in comparison with the binary decision trees SVM (BDTSVM) architecture.
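    The pattern of combining pairwise binary discriminators through a vote tally can be sketched as follows; simple nearest-centroid separators stand in for the paper's SVM hyperplanes, and the class names and data are invented purely for illustration:

```python
# One-vs-one combination of pairwise binary discriminators with a vote
# tally (a toy stand-in for the SVM-based BDASVM scheme).

def centroid(points):
    n = len(points)
    return [sum(p[k] for p in points) / n for k in range(len(points[0]))]

def make_binary_discriminator(pos, neg):
    # Nearest-centroid rule: a crude linear separator replacing the SVM.
    cp, cn = centroid(pos), centroid(neg)
    def discriminate(x):
        dp = sum((a - b) ** 2 for a, b in zip(x, cp))
        dn = sum((a - b) ** 2 for a, b in zip(x, cn))
        return dp < dn  # True -> the "positive" class wins this pair
    return discriminate

def train_one_vs_one(data):
    # data: {label: [points]}; one discriminator per unordered class pair.
    labels = sorted(data)
    pairs = [(a, b) for i, a in enumerate(labels) for b in labels[i + 1:]]
    discs = {(a, b): make_binary_discriminator(data[a], data[b])
             for a, b in pairs}
    def predict(x):
        votes = {l: 0 for l in labels}
        for (a, b), d in discs.items():
            votes[a if d(x) else b] += 1  # tally acts as the decision table
        return max(labels, key=lambda l: votes[l])
    return predict

data = {"speech": [(0, 0), (0, 1), (1, 0)],
        "music":  [(5, 5), (5, 6), (6, 5)],
        "noise":  [(0, 6), (1, 6), (0, 5)]}
predict = train_one_vs_one(data)
print(predict((0.2, 0.3)), predict((5.5, 5.5)))  # speech music
```

Each pairwise discriminator sees only a two-class problem, which is the property the multi-level architecture exploits to cut the overall error.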

  1. Kernel methods for phenotyping complex plant architecture.

    Science.gov (United States)

    Kawamura, Koji; Hibrand-Saint Oyant, Laurence; Foucher, Fabrice; Thouroude, Tatiana; Loustau, Sébastien

    2014-02-07

    The Quantitative Trait Loci (QTL) mapping of plant architecture is a critical step for understanding the genetic determinism of plant architecture. Previous studies adopted simple measurements, such as plant-height, stem-diameter and branching-intensity for QTL mapping of plant architecture. Many of these quantitative traits were generally correlated to each other, which gives rise to statistical problems in the detection of QTL. We aim to test the applicability of kernel methods to phenotyping inflorescence architecture and its QTL mapping. We first test Kernel Principal Component Analysis (KPCA) and Support Vector Machines (SVM) over an artificial dataset of simulated inflorescences with different types of flower distribution, which is coded as a sequence of flower-number per node along a shoot. The ability of discriminating the different inflorescence types by SVM and KPCA is illustrated. We then apply the KPCA representation to the real dataset of rose inflorescence shoots (n=1460) obtained from a 98 F1 hybrid mapping population. We find kernel principal components with high heritability (>0.7), and the QTL analysis identifies a new QTL, which was not detected by a trait-by-trait analysis of simple architectural measurements. The main tools developed in this paper could be used to tackle the general problem of QTL mapping of complex (sequences, 3D structure, graphs) phenotypic traits.
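    The KPCA step can be sketched in a few lines: build a kernel matrix, double-center it, and extract the leading component by power iteration. An RBF kernel on 1-D points is assumed here purely for illustration (the paper applies kernels to flower-count sequences):

```python
import math
import random

def rbf(x, y, gamma=0.5):
    return math.exp(-gamma * (x - y) ** 2)

def kpca_first_component(xs, gamma=0.5, iters=500, seed=0):
    n = len(xs)
    K = [[rbf(xs[i], xs[j], gamma) for j in range(n)] for i in range(n)]
    # Double-center the kernel matrix (centering in feature space).
    row = [sum(K[i]) / n for i in range(n)]
    tot = sum(row) / n
    Kc = [[K[i][j] - row[i] - row[j] + tot for j in range(n)]
          for i in range(n)]
    # Power iteration for the leading eigenvector of the centered kernel.
    rng = random.Random(seed)
    v = [rng.random() - 0.5 for _ in range(n)]
    for _ in range(iters):
        w = [sum(Kc[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v  # coordinates of the points on the first kernel component

# Two well-separated clusters of 1-D "phenotypes".
xs = [0.0, 0.1, 0.2, 0.3, 10.0, 10.1, 10.2, 10.3]
v = kpca_first_component(xs)
print(all(a * b > 0 for a in v[:4] for b in v[:4]),
      all(a * b < 0 for a in v[:4] for b in v[4:]))  # True True
```

The first kernel component assigns one sign per cluster, which is the kind of low-dimensional, heritable coordinate the paper feeds into QTL analysis.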

  2. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution. The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing...

  3. Virtual machine vs Real Machine: Security Systems

    Directory of Open Access Journals (Sweden)

    Dr. C. Suresh Gnana Das

    2009-08-01

    This paper argues that the operating system and applications currently running on a real machine should relocate into a virtual machine. This structure enables services to be added below the operating system, and to do so without trusting or modifying the operating system or applications. To demonstrate the usefulness of this structure, we describe three services that take advantage of it: secure logging, intrusion prevention and detection, and environment migration. In particular, we can provide services below the guest operating system without trusting or modifying it. We believe providing services at this layer is especially useful for enhancing security and mobility. This position paper describes the general benefits and challenges that arise from running most applications in a virtual machine, and then describes some example services and alternative ways to provide those services.

  4. Teaching machines to find mantle composition

    Science.gov (United States)

    Atkins, Suzanne; Tackley, Paul; Trampert, Jeannot; Valentine, Andrew

    2017-04-01

    The composition of the mantle affects many geodynamical processes by altering factors such as the density, the location of phase changes, and melting temperature. The inferences we make about mantle composition also determine how we interpret the changes in velocity, reflections, attenuation and scattering seen by seismologists. However, the bulk composition of the mantle is very poorly constrained. Inferences are made from meteorite samples, rock samples from the Earth, and geophysical data. All of these approaches require significant assumptions, and the inferences made are subject to large uncertainties. Here we present a new method for inferring mantle composition, based on pattern-recognition machine learning, which uses large-scale in situ observations of the mantle to make fully probabilistic inferences of composition for convection simulations. Our method has an advantage over other petrological approaches because we use large-scale geophysical observations. This means that we average over much greater length scales and do not need to rely on extrapolating from localised samples of the mantle or planetary disk. Another major advantage of our method is that it is fully probabilistic. This allows us to include all of the uncertainties inherent in the inference process, giving us far more information about the reliability of the result than other methods. Finally, our method includes the impact of composition on mantle convection. This allows us to make much more precise inferences from geophysical data than other geophysical approaches, which attempt to invert one observation with no consideration of the relationship between convection and composition. We use a sampling-based inversion method, using hundreds of convection simulations run using StagYY with self-consistent mineral physics properties calculated using the PerpleX package. The observations from these simulations are used to train a neural network to make a probabilistic inference

  5. Optimization methods for logical inference

    CERN Document Server

    Chandru, Vijay

    2011-01-01

    Merging logic and mathematics in deductive inference: an innovative, cutting-edge approach. Optimization methods for logical inference? Absolutely, say Vijay Chandru and John Hooker, two major contributors to this rapidly expanding field. And even though "solving logical inference problems with optimization methods may seem a bit like eating sauerkraut with chopsticks. . . it is the mathematical structure of a problem that determines whether an optimization model can help solve it, not the context in which the problem occurs." Presenting powerful, proven optimization techniques for logic in
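    The core problem the book attacks, inference over clauses, can be illustrated by a brute-force entailment check; the book's optimization formulations replace this exponential enumeration with integer programming. The clause encoding below is an assumption for illustration:

```python
import itertools

# Clausal entailment by exhaustive search: KB |= q  iff  KB plus the
# negation of q is unsatisfiable. A clause is a list of (variable, sign)
# literals; a literal holds when the model assigns `sign` to `variable`.

def satisfiable(clauses, variables):
    for assign in itertools.product((False, True), repeat=len(variables)):
        model = dict(zip(variables, assign))
        if all(any(model[v] == sign for v, sign in clause)
               for clause in clauses):
            return True
    return False

def entails(kb, literal, variables):
    v, sign = literal
    # Add the negated query and look for a counter-model.
    return not satisfiable(kb + [[(v, not sign)]], variables)

# KB: (a), (not a or b), (not b or c)  -- chained implications a->b->c.
kb = [[("a", True)], [("a", False), ("b", True)], [("b", False), ("c", True)]]
print(entails(kb, ("c", True), ["a", "b", "c"]))   # True
print(entails(kb, ("c", False), ["a", "b", "c"]))  # False
```

An integer-programming formulation encodes each clause as a linear inequality over 0-1 variables, so the same entailment question becomes a feasibility check for an optimization solver.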

  6. A Dualistic Model To Describe Computer Architectures

    Science.gov (United States)

    Nitezki, Peter; Engel, Michael

    1985-07-01

    The Dualistic Model for Computer Architecture Description uses a hierarchy of abstraction levels to describe a computer in arbitrary steps of refinement from the top of the user interface to the bottom of the gate level. In our Dualistic Model the description of an architecture may be divided into two major parts called "Concept" and "Realization". The Concept of an architecture on each level of the hierarchy is an Abstract Data Type that describes the functionality of the computer and an implementation of that data type relative to the data type of the next lower level of abstraction. The Realization on each level comprises a language describing the means of user interaction with the machine, and a processor interpreting this language in terms of the language of the lower level. The surface of each hierarchical level, the data type and the language express the behaviour of a machine at this level, whereas the implementation and the processor describe the structure of the algorithms and the system. In this model the Principle of Operation maps the object and computational structure of the Concept onto the structures of the Realization. Describing a system in terms of the Dualistic Model is therefore a process of refinement starting at a mere description of behaviour and ending at a description of structure. This model has proven to be a very valuable tool in exploiting the parallelism in a problem and it is very transparent in discovering the points where parallelism is lost in a special architecture. It has successfully been used in a project on a survey of Computer Architecture for Image Processing and Pattern Analysis in Germany.

  7. The ATLAS Higgs Machine Learning Challenge

    CERN Document Server

    Cowan, Glen; The ATLAS collaboration; Bourdarios, Claire

    2015-01-01

    High Energy Physics has been using Machine Learning techniques (commonly known as Multivariate Analysis) since the 1990s with Artificial Neural Net and more recently with Boosted Decision Trees, Random Forest etc. Meanwhile, Machine Learning has become a full blown field of computer science. With the emergence of Big Data, data scientists are developing new Machine Learning algorithms to extract meaning from large heterogeneous data. HEP has exciting and difficult problems like the extraction of the Higgs boson signal, and at the same time data scientists have advanced algorithms: the goal of the HiggsML project was to bring the two together by a “challenge”: participants from all over the world and any scientific background could compete online to obtain the best Higgs to tau tau signal significance on a set of ATLAS fully simulated Monte Carlo signal and background. Instead of HEP physicists browsing through machine learning papers and trying to infer which new algorithms might be useful for HEP, then c...

  8. Bioinspired Architecture Selection for Multitask Learning

    Directory of Open Access Journals (Sweden)

    Andrés Bueno-Crespo

    2017-06-01

    Faced with a new concept to learn, our brain does not work in isolation: it uses all previously learned knowledge. In addition, the brain is able to isolate the knowledge that does not benefit us, and to use what is actually useful. In machine learning, we do not usually benefit from the knowledge of other learned tasks. However, there is a methodology called Multitask Learning (MTL), which is based on the idea that learning a task along with other related tasks produces a transfer of information between them, which can be advantageous for learning the first one. This paper presents a new method to completely design MTL architectures, by including the selection of the most helpful subtasks for the learning of the main task, and the optimal network connections. In this sense, the proposed method realizes a complete design of the MTL schemes. The method is simple and uses the advantages of the Extreme Learning Machine to automatically design an MTL machine, eliminating those factors that hinder, or do not benefit, the learning process of the main task. This architecture is unique and is obtained without trial-and-error methodologies that increase the computational complexity. The results obtained over several real problems show the good performance of the networks designed with this method.

  9. Stirling machine operating experience

    Energy Technology Data Exchange (ETDEWEB)

    Ross, B. [Stirling Technology Co., Richland, WA (United States); Dudenhoefer, J.E. [Lewis Research Center, Cleveland, OH (United States)

    1994-09-01

    Numerous Stirling machines have been built and operated, but the operating experience of these machines is not well known. It is important to examine this operating experience in detail, because it largely substantiates the claim that Stirling machines are capable of reliable and lengthy operating lives. The amount of data that exists is impressive, considering that many of the machines that have been built are developmental machines intended to show proof of concept, and are not expected to operate for lengthy periods of time. Some Stirling machines (typically free-piston machines) achieve long life through non-contact bearings, while other Stirling machines (typically kinematic) have achieved long operating lives through regular seal and bearing replacements. In addition to engine and system testing, life testing of critical components is also considered. The record in this paper is not complete, due to the reluctance of some organizations to release operational data and because several organizations were not contacted. The authors intend to repeat this assessment in three years, hoping for even greater participation.

  10. Perpetual Motion Machine

    Directory of Open Access Journals (Sweden)

    D. Tsaousis

    2008-01-01

    Ever since the first century A.D. there have been descriptions of known devices, as well as attempts to manufacture them, for the creation of perpetual motion machines. Although physics has concluded, through the two laws of thermodynamics, that a perpetual motion machine is impossible to manufacture, inventors of every age and educational level appear to claim that they have invented something «entirely new», or that they have improved somebody else’s invention, which «will function henceforth perpetually»! However, the failure to manufacture a perpetual motion machine so far does not mean that the countless historical accounts of these fictional machines are of no interest. The discussion of each version of a perpetual motion machine on the one hand gives the chance to comprehend the inventor’s level of knowledge in each period and his way of thinking, and on the other hand to locate the points where this «perpetual motion machine» clashes with the laws of nature, and why it could never have been manufactured or have functioned. The presentation of a new «perpetual motion machine» has excited our interest in locating its weak points. According to its designer, the machine functions with the work produced by the buoyant force

  11. Machine Intelligence and Explication

    NARCIS (Netherlands)

    Wieringa, Roelf J.

    1987-01-01

    This report is an MA ("doctoraal") thesis submitted to the Department of Philosophy, University of Amsterdam. It attempts to answer the question whether machines can think by conceptual analysis. Ideally, a conceptual analysis should give plausible explications of the concepts of "machine" and "inte

  12. Microsoft Azure machine learning

    CERN Document Server

    Mund, Sumit

    2015-01-01

    The book is intended for those who want to learn how to use Azure Machine Learning. Perhaps you already know a bit about Machine Learning, but have never used ML Studio in Azure; or perhaps you are an absolute newbie. In either case, this book will get you up-and-running quickly.

  13. Reactive Turing machines

    NARCIS (Netherlands)

    Baeten, J.C.M.; Luttik, B.; Tilburg, P.J.A. van

    2013-01-01

    We propose reactive Turing machines (RTMs), extending classical Turing machines with a process-theoretical notion of interaction, and use them to define a notion of executable transition system. We show that every computable transition system with a bounded branching degree is simulated modulo diverge

  15. Coordinate measuring machines

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo

    This document is used in connection with three exercises of 2 hours duration as a part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercises concern three aspects of coordinate measuring: 1) Measuring and verification of tolerances on coordinate measuring machines, 2) Traceability...

  16. Simple Machine Junk Cars

    Science.gov (United States)

    Herald, Christine

    2010-01-01

    During the month of May, the author's eighth-grade physical science students study the six simple machines through hands-on activities, reading assignments, videos, and notes. At the end of the month, they can easily identify the six types of simple machine: inclined plane, wheel and axle, pulley, screw, wedge, and lever. To conclude this unit,…

  17. Human Machine Learning Symbiosis

    Science.gov (United States)

    Walsh, Kenneth R.; Hoque, Md Tamjidul; Williams, Kim H.

    2017-01-01

    Human Machine Learning Symbiosis is a cooperative system where both the human learner and the machine learner learn from each other to create an effective and efficient learning environment adapted to the needs of the human learner. Such a system can be used in online learning modules so that the modules adapt to each learner's learning state both…

  18. Machine learning with R

    CERN Document Server

    Lantz, Brett

    2015-01-01

    Perhaps you already know a bit about machine learning but have never used R, or perhaps you know a little R but are new to machine learning. In either case, this book will get you up and running quickly. It would be helpful to have a bit of familiarity with basic programming concepts, but no prior experience is required.

  19. Statistical inference via fiducial methods

    NARCIS (Netherlands)

    Salomé, Diemer

    1998-01-01

    In this thesis the attention is restricted to inductive reasoning using a mathematical probability model. A statistical procedure prescribes, for every theoretically possible set of data, the inference about the unknown of interest. ... See: Summary

  20. On principles of inductive inference

    OpenAIRE

    Kostecki, Ryszard Paweł

    2011-01-01

    We propose an intersubjective epistemic approach to foundations of probability theory and statistical inference, based on relative entropy and category theory, and aimed to bypass the mathematical and conceptual problems of existing foundational approaches.

  1. Cognitive Architectures and Autonomy: A Comparative Review

    Science.gov (United States)

    Thórisson, Kristinn; Helgasson, Helgi

    2012-05-01

    One of the original goals of artificial intelligence (AI) research was to create machines with very general cognitive capabilities and a relatively high level of autonomy. It has taken the field longer than many had expected to achieve even a fraction of this goal; the community has focused on building specific, targeted cognitive processes in isolation, and as of yet no system exists that integrates a broad range of capabilities or presents a general solution to autonomous acquisition of a large set of skills. Among the reasons for this are the highly limited machine learning and adaptation techniques available, and the inherent complexity of integrating numerous cognitive and learning capabilities in a coherent architecture. In this paper we review selected systems and architectures built expressly to address integrated skills. We highlight principles and features of these systems that seem promising for creating generally intelligent systems with some level of autonomy, and discuss them in the context of the development of future cognitive architectures. Autonomy is a key property for any system to be considered generally intelligent, in our view; we use this concept as an organizing principle for comparing the reviewed systems. Features that remain largely unaddressed in present research, but seem nevertheless necessary for such efforts to succeed, are also discussed.

  2. BIOLOGICAL NANOROBOT ARCHITECTURE FOR MEDICAL TARGET IDENTIFICATION

    Directory of Open Access Journals (Sweden)

    S. Paul and Dipti*

    2012-07-01

    Full Text Available This work presents an innovative approach to the development of biological nanorobots with sensors for medicine. The biological nanorobots operate in a virtual environment based on random, thermal and chemical control techniques. The biological nanorobot architecture model takes nano-bioelectronics as the basis for manufacturing integrated system devices with embedded nano-biosensors and actuators, which facilitates their application in medical target identification and drug delivery. The interaction of the biological nanorobots with the described workspace shows how they detect the target area and deliver the drug. Our work therefore addresses both the control and the architecture design needed to develop practical molecular machines. Advances in nanotechnology are enabling the manufacture of nanosensors and actuators through nano-bioelectronics and biologically inspired devices. Analysis of integrated system modeling is one important aspect of supporting nanotechnology's fast development towards one of the most challenging new fields of science: molecular machines. The use of 3D simulation can provide interactive tools for addressing nanorobot choices in sensing, hardware architecture design, manufacturing approaches, and control methodology investigation.

  3. Type Inference for Guarded Recursive Data Types

    OpenAIRE

    Stuckey, Peter J.; Sulzmann, Martin

    2005-01-01

    We consider type inference for guarded recursive data types (GRDTs) -- a recent generalization of algebraic data types. We reduce type inference for GRDTs to unification under a mixed prefix. Thus, we obtain efficient type inference. Inference is incomplete because the set of type constraints allowed to appear in the type system is only a subset of those type constraints generated by type inference. Hence, inference only succeeds if the program is sufficiently type annotated. We present refin...

  4. 15 CFR 700.31 - Metalworking machines.

    Science.gov (United States)

    2010-01-01

    ... Drilling and tapping machines Electrical discharge, ultrasonic and chemical erosion machines Forging..., power driven Machining centers and way-type machines Manual presses Mechanical presses, power...

  5. Statistical Inference in Graphical Models

    Science.gov (United States)

    2008-06-17

    Probabilistic Network Library (PNL). While not fully mature, PNL does provide the most commonly-used algorithms for inference and learning with the efficiency...of C++, and also offers interfaces for calling the library from MATLAB and R. Notably, both BNT and PNL provide learning and inference algorithms...mature and has been used for research purposes for several years, it is written in MATLAB and thus is not suitable to be used in real-time settings. PNL

  6. Implementing Deep Inference in Tom

    OpenAIRE

    Kahramanogullari, Ozan; Moreau, Pierre-Etienne; Reilles, Antoine

    2005-01-01

    ISSN 1430-211X; The calculus of structures is a proof theoretical formalism which generalizes sequent calculus with the feature of deep inference: in contrast to sequent calculus, the calculus of structures does not rely on the notion of main connective and, like in term rewriting, it permits the application of the inference rules at any depth inside a formula. Tom is a pattern matching processor that integrates term rewriting facilities into imperative languages. In this paper, relying on th...

  7. An Inference Language for Imaging

    DEFF Research Database (Denmark)

    Pedemonte, Stefano; Catana, Ciprian; Van Leemput, Koen

    2014-01-01

    We introduce iLang, a language and software framework for probabilistic inference. The iLang framework enables the definition of directed and undirected probabilistic graphical models and the automated synthesis of high performance inference algorithms for imaging applications. The iLang framewor......-accelerated primitives specializes iLang to the spatial data-structures that arise in imaging applications. We illustrate the framework through a challenging application: spatio-temporal tomographic reconstruction with compressive sensing....

  8. Bayesian Inference: with ecological applications

    Science.gov (United States)

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies as well as students in advanced undergraduate statistics.. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.
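The conjugate updating with which such introductions usually begin can be shown in a few lines. A minimal sketch with an assumed beta-binomial example (the data and prior are made up, not taken from the book):

```python
# Conjugate Bayesian update for a binomial proportion: a Beta(a, b) prior
# combined with k successes in n trials yields a Beta(a + k, b + n - k)
# posterior. Numbers below are illustrative.
a_prior, b_prior = 1.0, 1.0   # uniform prior on the unknown rate
k, n = 7, 10                  # e.g. 7 of 10 tagged animals resighted

a_post = a_prior + k
b_post = b_prior + (n - k)
posterior_mean = a_post / (a_post + b_post)

print(a_post, b_post)            # 8.0 4.0
print(round(posterior_mean, 3))  # 0.667
```

Hierarchical models extend exactly this mechanism by placing further priors on the prior's parameters.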

  9. Statistical Inference: The Big Picture.

    Science.gov (United States)

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.

  10. LHC Report: machine development

    CERN Multimedia

    Rogelio Tomás García for the LHC team

    2015-01-01

    Machine development weeks are carefully planned in the LHC operation schedule to optimise and further study the performance of the machine. The first machine development session of Run 2 ended on Saturday, 25 July. Despite various hiccoughs, it allowed the operators to make great strides towards improving the long-term performance of the LHC.   The main goals of this first machine development (MD) week were to determine the minimum beam-spot size at the interaction points given existing optics and collimation constraints; to test new beam instrumentation; to evaluate the effectiveness of performing part of the beam-squeezing process during the energy ramp; and to explore the limits on the number of protons per bunch arising from the electromagnetic interactions with the accelerator environment and the other beam. Unfortunately, a series of events reduced the machine availability for studies to about 50%. The most critical issue was the recurrent trip of a sextupolar corrector circuit –...

  11. Micro-machining.

    Science.gov (United States)

    Brinksmeier, Ekkard; Preuss, Werner

    2012-08-28

    Manipulating bulk material at the atomic level is considered to be the domain of physics, chemistry and nanotechnology. However, precision engineering, especially micro-machining, has become a powerful tool for controlling the surface properties and sub-surface integrity of the optical, electronic and mechanical functional parts in a regime where continuum mechanics is left behind and the quantum nature of matter comes into play. The surprising subtlety of micro-machining results from the extraordinary precision of tools, machines and controls expanding into the nanometre range-a hundred times more precise than the wavelength of light. In this paper, we will outline the development of precision engineering, highlight modern achievements of ultra-precision machining and discuss the necessity of a deeper physical understanding of micro-machining.

  12. Introduction to machine learning.

    Science.gov (United States)

    Baştanlar, Yalin; Ozuysal, Mustafa

    2014-01-01

    The machine learning field, which can be briefly defined as enabling computers to make successful predictions using past experiences, has exhibited impressive development recently with the help of the rapid increase in the storage capacity and processing power of computers. Together with many other disciplines, machine learning methods have been widely employed in bioinformatics. The difficulties and cost of biological analyses have led to the development of sophisticated machine learning approaches for this application area. In this chapter, we first review the fundamental concepts of machine learning such as feature assessment, unsupervised versus supervised learning and types of classification. Then, we point out the main issues of designing machine learning experiments and their performance evaluation. Finally, we introduce some supervised learning methods.
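A toy supervised learner makes the supervised/unsupervised distinction concrete. The following 1-nearest-neighbour classifier is a generic illustration (data invented, not from the chapter): labelled examples are memorised, and a query point receives the label of its closest training example.

```python
# 1-nearest-neighbour classification: the simplest supervised learner.

def euclidean(p, q):
    # Euclidean distance between two feature vectors
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def predict_1nn(train, x):
    # train: list of (features, label) pairs; x: query feature vector
    nearest = min(train, key=lambda pair: euclidean(pair[0], x))
    return nearest[1]

train = [((1.0, 1.0), "benign"),
         ((1.2, 0.9), "benign"),
         ((4.0, 4.2), "malignant")]
print(predict_1nn(train, (3.8, 4.0)))  # malignant
```

An unsupervised method would receive the same feature vectors without the labels and could only group them.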

  13. Man-machine collaboration using facial expressions

    Science.gov (United States)

    Dai, Ying; Katahera, S.; Cai, D.

    2002-09-01

    For flexible man-machine collaboration, understanding facial expressions and gestures cannot be neglected. We propose a hierarchical recognition approach for understanding human emotions. In this method, facial AFs (action features) are first extracted and recognized using histograms of optical flow. Then, based on the facial AFs, facial expressions are classified into two classes, one presenting positive emotions and the other negative ones. The expressions in the positive class and those in the negative class are then further classified into the more complex emotions they reveal. Finally, a system architecture for coordinating the recognition of facial action features and facial expressions in man-machine collaboration is proposed.

  14. Remote online machine fault diagnostic system

    Science.gov (United States)

    Pan, Min-Chun; Li, Po-Ching

    2004-07-01

    The study aims at implementing a remote online machine fault diagnostic system built on both the BCB software-development environment and Internet transmission communication. Various signal-processing computation schemes for signal analysis and pattern recognition are implemented in the BCB graphical user interface. Machine fault diagnostic capability can then be extended by using the socket application program interface with the TCP/IP protocol. In the study, the effectiveness of the developed remote diagnostic system is validated by monitoring a transmission-element test rig. A complete monitoring cycle, including data acquisition, signal processing, feature extraction, pattern recognition through ANNs, and online video monitoring, is demonstrated.
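The feature-extraction stage of such a monitoring cycle can be sketched with two classic condition indicators, RMS and kurtosis; the signal below is synthetic and the paper's actual computation schemes are not reproduced here. Kurtosis rises when periodic impacts from a localised fault appear in an otherwise smooth vibration signal.

```python
import math

# RMS (overall energy) and kurtosis (impulsiveness) of a vibration signal.

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

def kurtosis(x):
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x) / n
    return sum((v - mu) ** 4 for v in x) / (n * var ** 2)

# Synthetic signals: a clean sine, and the same sine with fault impacts.
healthy = [math.sin(2 * math.pi * k / 32) for k in range(256)]
faulty = [v + (3.0 if k % 64 == 0 else 0.0) for k, v in enumerate(healthy)]

print(round(kurtosis(healthy), 2))           # 1.5 for a pure sine
print(kurtosis(faulty) > kurtosis(healthy))  # True: impacts raise kurtosis
```

A pattern-recognition stage (the paper uses ANNs) would then classify the machine state from such feature vectors.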

  15. Parameter Identifiability in Statistical Machine Learning: A Review.

    Science.gov (United States)

    Ran, Zhi-Yong; Hu, Bao-Gang

    2017-05-01

    This review examines the relevance of parameter identifiability for statistical models used in machine learning. In addition to defining main concepts, we address several issues of identifiability closely related to machine learning, showing the advantages and disadvantages of state-of-the-art research and demonstrating recent progress. First, we review criteria for determining the parameter structure of models from the literature. This has three related issues: parameter identifiability, parameter redundancy, and reparameterization. Second, we review the deep influence of identifiability on various aspects of machine learning from theoretical and application viewpoints. In addition to illustrating the utility and influence of identifiability, we emphasize the interplay among identifiability theory, machine learning, mathematical statistics, information theory, optimization theory, information geometry, Riemann geometry, symbolic computation, Bayesian inference, algebraic geometry, and others. Finally, we present a new perspective together with the associated challenges.
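A minimal illustration of the identifiability problem (invented, not from the review): in the model y = (a·b)·x only the product a·b affects the output, so the pair (a, b) is parameter-redundant and cannot be recovered from data. Reparameterizing with c = a·b removes the redundancy.

```python
# A non-identifiable model: distinct parameter pairs with the same product
# are observationally indistinguishable.

def model(a, b, xs):
    return [a * b * x for x in xs]

xs = [0.5, 1.0, 2.0, 4.0]
print(model(2.0, 3.0, xs) == model(6.0, 1.0, xs))  # True: same product, same predictions
print(model(2.0, 3.0, xs) == model(2.0, 4.0, xs))  # False: product differs
```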

  16. Abductive inference and delusional belief.

    Science.gov (United States)

    Coltheart, Max; Menzies, Peter; Sutton, John

    2010-01-01

    Delusional beliefs have sometimes been considered as rational inferences from abnormal experiences. We explore this idea in more detail, making the following points. First, the abnormalities of cognition that initially prompt the entertaining of a delusional belief are not always conscious and since we prefer to restrict the term "experience" to consciousness we refer to "abnormal data" rather than "abnormal experience". Second, we argue that in relation to many delusions (we consider seven) one can clearly identify what the abnormal cognitive data are which prompted the delusion and what the neuropsychological impairment is which is responsible for the occurrence of these data; but one can equally clearly point to cases where this impairment is present but delusion is not. So the impairment is not sufficient for delusion to occur: a second cognitive impairment, one that affects the ability to evaluate beliefs, must also be present. Third (and this is the main thrust of our paper), we consider in detail what the nature of the inference is that leads from the abnormal data to the belief. This is not deductive inference and it is not inference by enumerative induction; it is abductive inference. We offer a Bayesian account of abductive inference and apply it to the explanation of delusional belief.
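The Bayesian account of abductive inference can be sketched as posterior scoring of candidate explanations: the "best explanation" is the hypothesis with the highest P(H|D) ∝ P(D|H)·P(H). The example below is generic and made up, not one of the authors' seven delusion cases.

```python
# Abduction as Bayesian hypothesis comparison for an observed wet lawn.
priors = {"rain": 0.3, "sprinkler": 0.1}            # P(H), assumed
likelihoods = {"rain": 0.9, "sprinkler": 0.95}      # P(wet lawn | H), assumed

unnorm = {h: priors[h] * likelihoods[h] for h in priors}
z = sum(unnorm.values())
posterior = {h: p / z for h, p in unnorm.items()}   # P(H | D)

best = max(posterior, key=posterior.get)
print(best)                           # rain
print(round(posterior["rain"], 2))    # 0.74
```

On the authors' account, the second impairment in delusion lies in the belief-evaluation step, i.e. in failing to weigh the prior appropriately in exactly this computation.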

  17. Active inference, communication and hermeneutics.

    Science.gov (United States)

    Friston, Karl J; Frith, Christopher D

    2015-07-01

    Hermeneutics refers to interpretation and translation of text (typically ancient scriptures) but also applies to verbal and non-verbal communication. In a psychological setting it nicely frames the problem of inferring the intended content of a communication. In this paper, we offer a solution to the problem of neural hermeneutics based upon active inference. In active inference, action fulfils predictions about how we will behave (e.g., predicting we will speak). Crucially, these predictions can be used to predict both self and others--during speaking and listening respectively. Active inference mandates the suppression of prediction errors by updating an internal model that generates predictions--both at fast timescales (through perceptual inference) and slower timescales (through perceptual learning). If two agents adopt the same model, then--in principle--they can predict each other and minimise their mutual prediction errors. Heuristically, this ensures they are singing from the same hymn sheet. This paper builds upon recent work on active inference and communication to illustrate perceptual learning using simulated birdsongs. Our focus here is the neural hermeneutics implicit in learning, where communication facilitates long-term changes in generative models that are trying to predict each other. In other words, communication induces perceptual learning and enables others to (literally) change our minds and vice versa. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
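The core mechanism, suppressing prediction error by updating an internal model, can be sketched in one dimension. This is an illustrative gradient-descent toy (values assumed); real active inference additionally weights errors by their precision.

```python
# One-dimensional prediction-error minimisation: an agent holds an internal
# estimate mu of a hidden cause, predicts the observation it should receive,
# and repeatedly nudges mu to reduce the prediction error.

def minimise_prediction_error(mu, observation, rate=0.2, steps=100):
    for _ in range(steps):
        error = observation - mu   # prediction error
        mu = mu + rate * error     # update the internal model
    return mu

mu = minimise_prediction_error(mu=0.0, observation=5.0)
print(round(mu, 4))  # 5.0: the estimate converges to the observed value
```

Two such agents predicting each other's output correspond to the mutual error minimisation described above.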

  18. Concurrent Process Planning for Machined Parts

    Institute of Scientific and Technical Information of China (English)

    吴丹; 王先逵; 李志忠

    2002-01-01

    Detailed manufacturing information about parts can help designers produce better designs. This information is conveyed to the designer through micro-circles within the concurrent design process for machined parts, focusing on instantaneous product design and process planning. The process has three key elements: a hierarchical architecture design of the concurrent process planning system, modeling and reengineering of the concurrent process planning, and modeling of information. The approach is successfully implemented and applied to the concurrent design and process planning of some complicated parts.

  19. Enterprise architecture management

    DEFF Research Database (Denmark)

    Rahimi, Fatemeh; Gøtze, John; Møller, Charles

    2017-01-01

    Despite the growing interest in enterprise architecture management, researchers and practitioners lack a shared understanding of its applications in organizations. Building on findings from a literature review and eight case studies, we develop a taxonomy that categorizes applications of enterprise architecture management based on three classes of enterprise architecture scope. Organizations may adopt enterprise architecture management to help form, plan, and implement IT strategies; help plan and implement business strategies; or to further complement the business strategy-formation process. The findings challenge the traditional IT-centric view of enterprise architecture management application and suggest enterprise architecture management as an approach that could support the consistent design and evolution of an organization as a whole.

  20. Architecture and Stages

    DEFF Research Database (Denmark)

    Kiib, Hans

    2009-01-01

    Architecture and Art as Fuel. New development zones for shopping and entertainment and space for festivals inside the city can be coupled with art and architecture and become ‘open minded' public domains based on cultural exchange and mutual learning. This type of space could be labelled as "experiencescape" - a space between tourism, culture, learning and economy. Strategies related to these challenges involve new architectural concepts and art as ‘engines' for change. New expressive architecture and old industrial buildings are often combined into hybrid narratives, linking the past with the future. But this is not enough. The agenda is to develop architectural spaces where social interaction and learning are enhanced by art and fun. How can we develop new architectural designs in our inner cities and waterfronts where eventscapes, learning labs and temporal use are merged with everyday...

  1. Knowledge and Architectural Practice

    DEFF Research Database (Denmark)

    Verbeke, Johan

    2017-01-01

    This paper focuses on the specific knowledge residing in architectural practice. It is based on the research of 35 PhD fellows in the ADAPT-r (Architecture, Design and Art Practice Training-research) project. The ADAPT-r project innovates architectural research by combining expertise from academia and from practice in order to highlight and extract the specific kind of knowledge which resides and is developed in architectural practice (creative practice research). The paper discusses three ongoing and completed PhD projects and focuses on their outcomes and contribution to the field. Specific to these research projects is that the researcher is within academia but stays immersed in architectural practice. The projects contribute to a better understanding of architectural practice, how it develops and what kind of knowledge is crucial. Furthermore, the paper develops a reflection...

  2. Software Architecture Technology Initiative

    Science.gov (United States)

    2008-04-01

    Software Architecture Technology Initiative, SATURN 2008. Presented at the SEI Software Architecture Technology User Network (SATURN) Workshop, 30 April to 1 May 2008, Pittsburgh, PA. © 2008 Carnegie Mellon University.

  3. Architecture humanitarian emergencies

    DEFF Research Database (Denmark)

    Gomez-Guillamon, Maria; Eskemose Andersen, Jørgen; Contreras, Jorge Lobos

    2013-01-01

    Introduced by scientific articles concerning architecture and human rights in light of cultures, emergencies, social equality and sustainability, democracy, economy, artistic development and science in architecture. Concluding in a definition of needs for new roles, processes and education..., Architettura di Alghero in Italy, Architecture and Design of Kocaeli University in Turkey, University of Aguascalientes in Mexico, Architectura y Urbanismo of University of Chile and Escuela de Architectura of Universidad Austral in Chile.

  4. Indirect Fourier transform in the context of statistical inference.

    Science.gov (United States)

    Muthig, Michael; Prévost, Sylvain; Orglmeister, Reinhold; Gradzielski, Michael

    2016-09-01

    Inferring structural information from the intensity of a small-angle scattering (SAS) experiment is an ill-posed inverse problem, so the determination of a solution is in general non-trivial. In this work, the indirect Fourier transform (IFT), which determines the pair distance distribution function from the intensity and hence yields structural information, is discussed within two different statistical inference approaches, namely a frequentist one and a Bayesian one, in order to determine a solution objectively. From the frequentist approach the cross-validation method is obtained as a good practical objective function for selecting an IFT solution. Moreover, modern machine learning methods are employed to suppress oscillatory behaviour of the solution, hence extracting only meaningful features of the solution. By comparing the results yielded by the different methods presented here, the reliability of the outcome can be improved and thus the approach should enable more reliable information to be deduced from SAS experiments.
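The cross-validation idea can be sketched on a toy smoothing problem: leave-one-out error selects the bandwidth of a kernel smoother, analogous to selecting a regularized IFT solution. The data are synthetic and the paper's actual estimator is not reproduced here.

```python
import math
import random

# Leave-one-out cross-validation to choose the bandwidth h of a
# Nadaraya-Watson kernel smoother on noisy samples of sin(3x).

random.seed(0)
xs = [i / 20 for i in range(41)]                         # grid on [0, 2]
ys = [math.sin(3 * x) + random.gauss(0, 0.15) for x in xs]

def smooth(x0, h, skip=None):
    # Gaussian-kernel weighted average at x0, optionally leaving one point out.
    num = den = 0.0
    for i, (x, y) in enumerate(zip(xs, ys)):
        if i == skip:
            continue
        w = math.exp(-0.5 * ((x - x0) / h) ** 2)
        num += w * y
        den += w
    return num / den

def loo_error(h):
    # Sum of squared leave-one-out prediction errors.
    return sum((ys[i] - smooth(xs[i], h, skip=i)) ** 2 for i in range(len(xs)))

candidates = [0.02, 0.05, 0.1, 0.2, 0.5, 1.0]
best_h = min(candidates, key=loo_error)
print(best_h)
```

Very large bandwidths oversmooth the curve away and very small ones chase the noise; the cross-validated choice sits between the extremes, which is exactly the role the paper assigns to its objective function.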

  5. Towards a Media Architecture

    DEFF Research Database (Denmark)

    Ebsen, Tobias

    2010-01-01

    This text explores the concept of media architecture as a phenomenon of visual culture that describes the use of screen technology in new spatial configurations in practices of architecture and art. I shall argue that this phenomenon is not necessarily a revolutionary new approach, but rather a result of conceptual changes in both modes of visual representation and in expressions of architecture. These are changes that may be described as an evolution of ideas and consequent experiments that can be traced back to changes in the history of art and the various styles and ideologies of architecture.

  6. Grid Architecture 2

    Energy Technology Data Exchange (ETDEWEB)

    Taft, Jeffrey D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-01-01

    The report describes work done on Grid Architecture under the auspices of the Department of Energy Office of Electricity Delivery and Energy Reliability in 2015. As described in the first Grid Architecture report, the primary purpose of this work is to provide stakeholder insight about grid issues so as to enable superior decision making on their part. Doing this requires the creation of various work products, including oft-times complex diagrams, analyses, and explanations. This report provides architectural insights into several important grid topics and also describes work done to advance the science of Grid Architecture as well.

  8. IT Architecture For Dummies

    CERN Document Server

    Hausman, Kalani Kirk

    2010-01-01

    A solid introduction to the practices, plans, and skills required for developing a smart system architecture. Information architecture combines IT skills with business skills in order to align the IT structure of an organization with the mission, goals, and objectives of its business. This friendly introduction to IT architecture walks you through the myriad issues and complex decisions that many organizations face when setting up IT systems to work in sync with business procedures. Veteran IT professional and author Kirk Hausman explains the business value behind IT architecture and provides

  9. Elements of Architecture

    DEFF Research Database (Denmark)

    Elements of Architecture explores new ways of engaging architecture in archaeology. It conceives of architecture both as the physical evidence of past societies and as existing beyond the physical environment, considering how people in the past have not just dwelled in buildings but have existed... and affective impacts of these material remains. The contributions in this volume investigate the way time, performance and movement, both physically and emotionally, are central aspects of understanding architectural assemblages. It is a book about the constellations of people, places and things that emerge...

  10. Accuracy Analysis and Calibration of Gantry Hybrid Machine Tool

    Institute of Scientific and Technical Information of China (English)

    唐晓强; 李铁民; 尹文生; 汪劲松

    2003-01-01

    The kinematic accuracy is a key factor in the design of parallel or hybrid machine tools. This analysis improved the accuracy of a 4-DOF (degree of freedom) gantry hybrid machine tool based on a 3-DOF planar parallel manipulator by compensating for various positioning errors. The machine tool architecture was described with the inverse kinematic solution. The control parameter error model was used to analyze the accuracy of the 3-DOF planar parallel manipulator and to develop a kinematic calibration method. The experimental results prove that the calibration method reduces the cutter nose errors from ±0.50 mm to ±0.03 mm for a horizontal movement of 600 mm by compensating for errors in the slider home position, the guide way distance and the extensible strut home position. The calibration method will be useful for similar types of parallel kinematic machines.
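A one-axis sketch of the calibration idea, far simpler than the 3-DOF manipulator model and with invented numbers: identify a home-position offset and scale error from measured positions by linear least squares, then compensate the commanded positions.

```python
# Least-squares identification of an axis offset and scale error, then
# command compensation. Values are illustrative.
commanded = [0.0, 100.0, 200.0, 300.0, 400.0, 500.0, 600.0]   # mm
# Simulated measurements with a 0.5 mm home offset and a 1.0002 scale error:
measured = [0.5 + 1.0002 * c for c in commanded]

# Ordinary least squares for measured = offset + scale * commanded.
n = len(commanded)
sx, sy = sum(commanded), sum(measured)
sxx = sum(c * c for c in commanded)
sxy = sum(c * m for c, m in zip(commanded, measured))
scale = (n * sxy - sx * sy) / (n * sxx - sx * sx)
offset = (sy - scale * sx) / n

def compensate(target):
    # Command that makes the measured position land on the target.
    return (target - offset) / scale

print(round(offset, 4), round(scale, 6))  # 0.5 1.0002
```

The paper's calibration plays the same role for the slider home positions, guide way distance and extensible strut home positions, with the error model derived from the inverse kinematics.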

  11. Machine Learning and Radiology

    Science.gov (United States)

    Wang, Shijun; Summers, Ronald M.

    2012-01-01

In this paper, we give a short introduction to machine learning and survey its applications in radiology. We focus on six categories of applications: medical image segmentation; registration; computer-aided detection and diagnosis; brain function or activity analysis and neurological disease diagnosis from fMR images; content-based image retrieval systems for CT or MRI images; and text analysis of radiology reports using natural language processing (NLP) and natural language understanding (NLU). This survey shows that machine learning plays a key role in many radiology applications. Machine learning identifies complex patterns automatically and helps radiologists make intelligent decisions on radiology data such as conventional radiographs, CT, MRI, and PET images and radiology reports. In many applications, the performance of machine learning-based automatic detection and diagnosis systems has been shown to be comparable to that of a well-trained and experienced radiologist. Technology development in machine learning and radiology will benefit each other in the long run. Key contributions and common characteristics of machine learning techniques in radiology are discussed. We also discuss the problem of translating machine learning applications to the radiology clinical setting, including advantages and potential barriers. PMID:22465077

  12. The basic anaesthesia machine.

    Science.gov (United States)

    Gurudatt, Cl

    2013-09-01

After WTG Morton's first public demonstration of ether as an anaesthetic agent in 1846, anaesthesiologists for many years did not require a machine to deliver anaesthesia to patients. After the introduction of oxygen and nitrous oxide as compressed gases in cylinders, it became necessary to mount these cylinders on a metal frame, which stimulated many attempts to construct an anaesthesia machine. HEG Boyle modified Gwathmey's machine in 1917, and it became popular as the Boyle anaesthesia machine. Although many changes have been made to the original Boyle machine, the basic structure remains the same; the subsequent changes have mainly improved patient safety. Knowing the details of the basic machine will help the trainee understand the later improvements, and it is important for every practising anaesthesiologist to have a thorough knowledge of the basic anaesthesia machine for the safe conduct of anaesthesia.

  13. The basic anaesthesia machine

    Directory of Open Access Journals (Sweden)

    C L Gurudatt

    2013-01-01

Full Text Available After WTG Morton's first public demonstration of ether as an anaesthetic agent in 1846, anaesthesiologists for many years did not require a machine to deliver anaesthesia to patients. After the introduction of oxygen and nitrous oxide as compressed gases in cylinders, it became necessary to mount these cylinders on a metal frame, which stimulated many attempts to construct an anaesthesia machine. HEG Boyle modified Gwathmey's machine in 1917, and it became popular as the Boyle anaesthesia machine. Although many changes have been made to the original Boyle machine, the basic structure remains the same; the subsequent changes have mainly improved patient safety. Knowing the details of the basic machine will help the trainee understand the later improvements, and it is important for every practising anaesthesiologist to have a thorough knowledge of the basic anaesthesia machine for the safe conduct of anaesthesia.

  14. Inferring physical properties of galaxies from their emission line spectra

    CERN Document Server

    Ucci, Graziano; Gallerani, Simona; Pallottini, Andrea

    2016-01-01

We present a new approach based on Supervised Machine Learning (SML) algorithms to infer key physical properties of galaxies (density, metallicity, column density and ionization parameter) from their emission line spectra. We introduce a numerical code (called GAME, GAlaxy Machine learning for Emission lines) implementing this method and test it extensively. GAME delivers excellent predictive performance, especially for estimates of metallicity and column density. We compare GAME with the most widely used diagnostics (e.g. the R$_{23}$ and [NII]$\lambda$6584/H$\alpha$ indicators), showing that it provides much better accuracy and a wider applicability range. GAME is particularly suitable for use in combination with Integral Field Unit (IFU) spectroscopy, both for rest-frame optical/UV nebular lines and for far-infrared/sub-mm lines arising from Photo-Dissociation Regions. Finally, GAME can also be applied to the analysis of synthetic galaxy maps built from numerical simulations.
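The general idea behind such SML inference (learn a mapping from emission-line features to a physical property on a library of labelled spectra, then apply it to observations) can be sketched with a toy nearest-neighbour regressor; the features, the target function and all numbers below are synthetic stand-ins, and this is not the GAME code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training library: each row summarizes a spectrum by two
# line-ratio features; the target is a made-up "metallicity" that depends
# on them nonlinearly (a stand-in for a photoionization-model grid).
X_train = rng.uniform(0.0, 1.0, size=(500, 2))
y_train = np.sin(3 * X_train[:, 0]) + X_train[:, 1] ** 2

def predict(x, k=5):
    """k-nearest-neighbour regression: average the targets of the k
    library spectra closest to the observed feature vector x."""
    d = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argsort(d)[:k]].mean()

# "Observed" feature vector and the inferred property.
x_obs = np.array([0.3, 0.6])
estimate = predict(x_obs)
print(estimate)
```

Codes like GAME use more powerful learners (ensembles of decision trees, for example) and physically motivated training grids, but the library-then-regress structure is the same.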

  15. Kernel learning at the first level of inference.

    Science.gov (United States)

    Cawley, Gavin C; Talbot, Nicola L C

    2014-05-01

    Kernel learning methods, whether Bayesian or frequentist, typically involve multiple levels of inference, with the coefficients of the kernel expansion being determined at the first level and the kernel and regularisation parameters carefully tuned at the second level, a process known as model selection. Model selection for kernel machines is commonly performed via optimisation of a suitable model selection criterion, often based on cross-validation or theoretical performance bounds. However, if there are a large number of kernel parameters, as for instance in the case of automatic relevance determination (ARD), there is a substantial risk of over-fitting the model selection criterion, resulting in poor generalisation performance. In this paper we investigate the possibility of learning the kernel, for the Least-Squares Support Vector Machine (LS-SVM) classifier, at the first level of inference, i.e. parameter optimisation. The kernel parameters and the coefficients of the kernel expansion are jointly optimised at the first level of inference, minimising a training criterion with an additional regularisation term acting on the kernel parameters. The key advantage of this approach is that the values of only two regularisation parameters need be determined in model selection, substantially alleviating the problem of over-fitting the model selection criterion. The benefits of this approach are demonstrated using a suite of synthetic and real-world binary classification benchmark problems, where kernel learning at the first level of inference is shown to be statistically superior to the conventional approach, improves on our previous work (Cawley and Talbot, 2007) and is competitive with Multiple Kernel Learning approaches, but with reduced computational expense.
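The first-level idea above (jointly optimising the expansion coefficients and the kernel parameter against a training criterion with an additional regularisation term on the kernel parameter) can be sketched on a toy regression problem; the objective, step sizes and data are illustrative and do not reproduce the paper's LS-SVM formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(40, 1))
y = np.sin(3 * X[:, 0])  # regression stand-in for the classification task

def kernel(X, gamma):
    """RBF kernel matrix with a single learnable width parameter gamma."""
    d2 = (X[:, None, 0] - X[None, :, 0]) ** 2
    return np.exp(-gamma * d2)

def loss(alpha, gamma, lam=1e-3, mu=1e-3):
    K = kernel(X, gamma)
    r = K @ alpha - y
    # training error + regularizer on the coefficients + regularizer on
    # the kernel parameter (the "additional regularisation term")
    return r @ r + lam * (alpha @ K @ alpha) + mu * gamma ** 2

# Joint first-level optimisation: gradient descent on alpha (analytic
# gradient) and gamma (finite differences) simultaneously.
alpha, gamma, lr, eps = np.zeros(40), 1.0, 1e-4, 1e-5
L0 = loss(alpha, gamma)
for _ in range(500):
    K = kernel(X, gamma)
    grad_a = 2 * K @ (K @ alpha - y) + 2 * 1e-3 * (K @ alpha)
    grad_g = (loss(alpha, gamma + eps) - loss(alpha, gamma - eps)) / (2 * eps)
    alpha -= lr * grad_a
    gamma -= lr * grad_g
print(loss(alpha, gamma) < L0)  # the joint objective has decreased
```

Only the two regularisation weights (`lam`, `mu`) would remain for model selection, which is the advantage the abstract describes.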

  16. Part Machinability Evaluation System

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

In the early design period, estimating the machinability of a part or of the whole product makes it possible to consider the product's function and process requirements at the same time, and thereby to globally optimize design decisions. This paper presents a part machinability evaluation system, discusses the general restrictions on part machinability, and implements checks of these restrictions based on the relation between the tool scan space and the part model. During system development, extensibility and understandability were considered, and an independent restriction-algorithm library and a general function library were set up. Additionally, the system has an interpreter and a knowledge manager.

  17. Fundamentals of machine design

    CERN Document Server

    Karaszewski, Waldemar

    2011-01-01

    A forum of researchers, educators and engineers involved in various aspects of Machine Design provided the inspiration for this collection of peer-reviewed papers. The resultant dissemination of the latest research results, and the exchange of views concerning the future research directions to be taken in this field will make the work of immense value to all those having an interest in the topics covered. The book reflects the cooperative efforts made in seeking out the best strategies for effecting improvements in the quality and the reliability of machines and machine parts and for extending

  18. Machine Tool Software

    Science.gov (United States)

    1988-01-01

A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has used Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of the APT programming language for the control of metal-cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  19. Analysis of synchronous machines

    CERN Document Server

    Lipo, TA

    2012-01-01

    Analysis of Synchronous Machines, Second Edition is a thoroughly modern treatment of an old subject. Courses generally teach about synchronous machines by introducing the steady-state per phase equivalent circuit without a clear, thorough presentation of the source of this circuit representation, which is a crucial aspect. Taking a different approach, this book provides a deeper understanding of complex electromechanical drives. Focusing on the terminal rather than on the internal characteristics of machines, the book begins with the general concept of winding functions, describing the placeme

  20. Database machine performance

    Energy Technology Data Exchange (ETDEWEB)

    Cesarini, F.; Salza, S.

    1987-01-01

    This book is devoted to the important problem of database machine performance evaluation. The book presents several methodological proposals and case studies, that have been developed within an international project supported by the European Economic Community on Database Machine Evaluation Techniques and Tools in the Context of the Real Time Processing. The book gives an overall view of the modeling methodologies and the evaluation strategies that can be adopted to analyze the performance of the database machine. Moreover, it includes interesting case studies and an extensive bibliography.

  1. Virtual Machine Introspection

    Directory of Open Access Journals (Sweden)

    S C Rachana

    2014-06-01

Full Text Available Cloud computing is an Internet-based computing solution which provides resources in an effective manner. A very serious issue in cloud computing is security, which is a major obstacle to the adoption of the cloud. The most important threats to cloud computing are multitenancy, availability, loss of control, loss of data, outside attacks, DoS attacks, malicious insiders, etc. Among the many security issues in the cloud, virtual machine security is one of the most serious, so monitoring of virtual machines is essential. The paper proposes a Virtual Machine Introspection (VMI) system to secure virtual machines from Distributed Denial of Service (DDoS) and zombie attacks.

  2. Virtual Machine Introspection

    Directory of Open Access Journals (Sweden)

    S C Rachana

    2015-11-01

Full Text Available Cloud computing is an Internet-based computing solution which provides resources in an effective manner. A very serious issue in cloud computing is security, which is a major obstacle to the adoption of the cloud. The most important threats to cloud computing are multitenancy, availability, loss of control, loss of data, outside attacks, DoS attacks, malicious insiders, etc. Among the many security issues in the cloud, virtual machine security is one of the most serious, so monitoring of virtual machines is essential. The paper proposes a Virtual Machine Introspection (VMI) system to secure virtual machines from Distributed Denial of Service (DDoS) and zombie attacks.

  3. Machine Learning for Hackers

    CERN Document Server

    Conway, Drew

    2012-01-01

    If you're an experienced programmer interested in crunching data, this book will get you started with machine learning-a toolkit of algorithms that enables computers to train themselves to automate useful tasks. Authors Drew Conway and John Myles White help you understand machine learning and statistics tools through a series of hands-on case studies, instead of a traditional math-heavy presentation. Each chapter focuses on a specific problem in machine learning, such as classification, prediction, optimization, and recommendation. Using the R programming language, you'll learn how to analyz

  4. JACOS: AI-based simulation system for man-machine system behavior in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, Kazuo; Yokobayashi, Masao; Tanabe, Fumiya [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kawase, Katsumi [CSK Corp., Tokyo (Japan); Komiya, Akitoshi [Computer Associated Laboratory, Inc., Hitachinaka, Ibaraki (Japan)

    2001-08-01

A prototype of a computer simulation system named JACOS (JAERI COgnitive Simulation system) has been developed at JAERI (Japan Atomic Energy Research Institute) to simulate man-machine system behavior in which the cognitive behavior of a human operator and the plant behavior affect each other. The objective of this system development is to provide man-machine system analysts with detailed information on the cognitive process of an operator and on the plant behavior affected by the operator's actions in accidental situations at a nuclear power plant. The simulation system consists of an operator model and a plant model which are coupled dynamically. The operator model simulates an operator's cognitive behavior in accidental situations based on the decision ladder model of Rasmussen, and is implemented using the AI techniques of the distributed cooperative inference method with the so-called blackboard architecture. Rule-based behavior is simulated using knowledge represented as If-Then rules. Knowledge-based behavior is simulated using knowledge represented with MFM (Multilevel Flow Modeling) and a qualitative reasoning method. Cognitive characteristics of attentional narrowing, limitation of short-term memory, and knowledge recall from long-term memory are also taken into account. The plant model of a 3-loop PWR is developed using the best-estimate thermal-hydraulic analysis code RELAP5/MOD2. This report is prepared as a user's manual for JACOS. The first chapter describes both the operator and plant models in detail. The second chapter gives instructions for program installation, building a knowledge base for the operator model, executing a simulation, and analyzing simulation results. Examples of simulation with JACOS are shown in the third chapter. (author)

  5. Inferring Parametric Energy Consumption Functions at Different Software Levels

    DEFF Research Database (Denmark)

    Liqat, Umer; Georgiou, Kyriakos; Kerrison, Steve;

    2016-01-01

We have developed a tool for experimentation with static analysis which infers energy consumption functions that depend on the input data sizes of programs. It infers such energy functions at two levels, the instruction set architecture (ISA) and the intermediate code (LLVM IR) levels, and reflects them upwards to the higher source code level. This required the development of a translation from LLVM IR to an intermediate representation and its integration with existing components, a translation from ISA to the same representation, a resource analyzer, an ISA-level energy model, and a mapping from this model to LLVM IR. The approach has been applied to programs written in the XC language running on XCore architectures, but is general enough to be applied to other languages. Experimental results show that our LLVM IR level analysis is reasonably accurate (less than 6.4% average error vs. hardware measurements) and more powerful than analysis at the ISA level.

  6. Cognitive optical networks: architectures and techniques

    Science.gov (United States)

    Grebeshkov, Alexander Y.

    2017-04-01

This article analyzes architectures and techniques for optical networks, taking into account a cognitive methodology based on the continuous cycle "Observe-Orient-Plan-Decide-Act-Learn" and the ability of cognitive systems to adjust themselves through an adaptive process in response to changes in the environment. A cognitive optical network architecture includes a cognitive control layer with a knowledge base for the control of software-configurable devices such as reconfigurable optical add-drop multiplexers, flexible optical transceivers and software-defined receivers. Some techniques for cognitive optical networks, such as flexible-grid technology, broker-oriented techniques and machine learning, are examined. Software-defined optical networks and the integration of wireless and optical networks with radio-over-fiber and fiber-wireless techniques in the context of cognitive technologies are discussed.
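The "Observe-Orient-Plan-Decide-Act-Learn" cycle described in this record can be sketched as a minimal control loop; the link state, the actions and their effects below are hypothetical placeholders, not part of the article.

```python
# Minimal skeleton of an Observe-Orient-Plan-Decide-Act-Learn cycle.
# The "link" dict stands in for a monitored optical link; the actions
# and their effects are toy placeholders.

class CognitiveController:
    def __init__(self):
        self.knowledge = {}  # stand-in for the knowledge base

    def observe(self, link):
        # measure the environment
        return {"errors": link["errors"], "load": link["load"]}

    def orient_plan_decide(self, state):
        # consult past outcomes; fall back to a default rule
        if state["errors"] > 0.005:
            return self.knowledge.get("high_error", "increase_margin")
        return "keep_config"

    def act(self, link, action):
        if action == "increase_margin":
            link["errors"] *= 0.5  # toy effect of reconfiguration
        return link

    def learn(self, state, action, new_state):
        # remember actions that improved the observed state
        if new_state["errors"] < state["errors"]:
            self.knowledge["high_error"] = action

ctl = CognitiveController()
link = {"errors": 0.04, "load": 0.7}
for _ in range(3):  # run a few control cycles
    state = ctl.observe(link)
    action = ctl.orient_plan_decide(state)
    link = ctl.act(link, action)
    ctl.learn(state, action, ctl.observe(link))
print(link["errors"] < 0.01)
```

A real cognitive layer would replace the rule lookup with trained models and the toy action with reconfiguration of ROADMs or transceivers, but the closed observe-decide-act-learn loop is the defining structure.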

  7. Development of Architectures for Internet Telerobotics Systems

    CERN Document Server

    Bambang, Riyanto

    2008-01-01

This paper presents our experience in developing and implementing an Internet telerobotics system, i.e. a robot system controlled and monitored remotely through the Internet. A robot manipulator with five degrees of freedom, called Mentor, is employed. A client-server architecture was chosen as the platform for our Internet telerobotics system. Three generations of telerobotics systems have evolved in this research. The first generation was based on CGI and a two-tiered architecture, where a client presents a graphical user interface to the user and uses the user's data entry and actions to issue requests to a robot server running on a different machine. The second generation was developed using Java. We also employ Java 3D for creating and manipulating the 3D geometry of the manipulator links and for constructing the structures used in rendering that geometry, resulting in a 3D simulation of robot movement presented to the users (clients) through their web browser. Recent development in our In...

  8. Information security considerations in open systems architectures

    Energy Technology Data Exchange (ETDEWEB)

    Klein, S.A. (Atlantic Research Corp., Rockville, MD (United States)); Menendez, J.N. (Atlantic Research Corp., Hanover, MD (United States))

    1993-02-01

This paper is part of a series of papers invited by the IEEE Power Control Center Working Group concerning the changing designs of modern control centers. Papers invited by the Working Group discuss the following issues: Benefits of Openness, Criteria for Evaluating Open EMS Systems, Hardware Design, Configuration Management, Security, Project Management, Data Bases, SCADA, Inter and Intra-System Communications, and Man Machine Interfaces. This paper discusses information security and issues related to its achievement in open systems architectures. Beginning with a discussion of the goals of information security and their relation to open systems, the paper provides examples of the threats to electric utility computer systems and the consequences associated with these threats, presents basic countermeasures applicable to all computer systems, and discusses issues specific to open systems architectures.

  9. Algorithms versus architectures for computational chemistry

    Science.gov (United States)

    Partridge, H.; Bauschlicher, C. W., Jr.

    1986-01-01

The algorithms employed are computationally intensive and, as a result, increased performance (both algorithmic and architectural) is required to improve accuracy and to treat larger molecular systems. Several benchmark quantum chemistry codes are examined on a variety of architectures. While these codes are only a small portion of a typical quantum chemistry library, they illustrate many of the computationally intensive kernels and data manipulation requirements of some applications. Furthermore, understanding the performance of the existing algorithms on present and proposed supercomputers serves as a guide for future program and algorithm development. The algorithms investigated are: (1) a sparse symmetric matrix-vector product; (2) a four-index integral transformation; and (3) the calculation of diatomic two-electron Slater integrals. The vectorization strategies for these algorithms are examined for both the Cyber 205 and the Cray XMP. In addition, multiprocessor implementations of the algorithms are examined on the Cray XMP and on the MIT static dataflow machine proposed by Dennis.

  10. Refinery burner simulation design architecture summary.

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, Guylaine M.; McDonald, Michael James; Halbgewachs, Ronald D.

    2011-10-01

This report describes the architectural design for a high-fidelity simulation of a refinery and refinery burner, including demonstrations of the impacts on the refinery if errors occur during the refining process. The refinery burner model and simulation are part of the capabilities within the Sandia National Laboratories Virtual Control System Environment (VCSE). Three components comprise the simulation: HMIs developed with commercial SCADA software, a PLC controller, and visualization software, all running on different machines. This design, documented after the simulation development, incorporates aspects not traditionally seen in an architectural design but that were utilized in this particular demonstration. Key to the success of this model development, and presented in this report, are the multiple aspects of model design and development that must be considered to capture the necessary representation fidelity of the physical systems.

  11. On Detailing in Contemporary Architecture

    DEFF Research Database (Denmark)

    Kristensen, Claus; Kirkegaard, Poul Henning

    2010-01-01

... tactility can blur the meaning of the architecture and turn it into an empty statement. The present paper outlines detailing in contemporary architecture and discusses the issue with respect to architectural quality. Architectural cases considered sublime pieces of architecture will be presented...

  12. Some relations between quantum Turing machines and Turing machines

    CERN Document Server

    Sicard, A; Sicard, Andrés; Vélez, Mario

    1999-01-01

For quantum Turing machines we present three elements: its components, its time evolution operator and its local transition function. The components are related to deterministic Turing machines, the time evolution operator is related to reversible Turing machines, and the local transition function is related to probabilistic and reversible Turing machines.

  13. A catalog of architectural primitives for modeling architectural patterns

    NARCIS (Netherlands)

    Zdun, Uwe; Avgeriou, Paris

    Architectural patterns are a fundamental aspect of the architecting process and subsequently the architectural documentation. Unfortunately, there is only poor support for modeling architectural patterns for two reasons. First, patterns describe recurring design solutions and hence do not directly

  14. Machining of hard-to-machine materials

    OpenAIRE

    2016-01-01

This bachelor thesis studies the machining of hard-to-machine materials. The first part classifies hard-to-machine materials, followed by their analysis. The next part focuses on the machinability of the individual alloys. The final part of the thesis is devoted to an experiment, its statistical processing, and the subsequent evaluation.

  15. Machine (bulk) harvest

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This is a summary of machine harvesting activities on Neal Smith National Wildlife Refuge between 1991 and 2008. Information is provided for each year about...

  16. Machine Vision Handbook

    CERN Document Server

    2012-01-01

The automation of visual inspection is becoming more and more important in modern industry as a consistent, reliable means of judging the quality of raw materials and manufactured goods. The Machine Vision Handbook equips the reader with the practical details required to engineer integrated mechanical-optical-electronic-software systems. Machine vision is first set in the context of basic information on light, natural vision, colour sensing and optics. The physical apparatus required for mechanized image capture – lenses, cameras, scanners and light sources – is discussed, followed by detailed treatment of various image-processing methods, including an introduction to the QT image processing system. QT is unique to this book, and provides an example of a practical machine vision system along with extensive libraries of useful commands, functions and images which can be implemented by the reader. The main text of the book is completed by studies of a wide variety of applications of machine vision in insp...

  17. Digitally-Driven Architecture

    Directory of Open Access Journals (Sweden)

    Henriette Bier

    2014-07-01

Full Text Available The shift from mechanical to digital forces architects to reposition themselves: Architects generate digital information, which can be used not only in designing and fabricating building components but also in embedding behaviours into buildings. This implies that, similar to the way that industrial design and fabrication with its concepts of standardisation and serial production influenced modernist architecture, digital design and fabrication influences contemporary architecture. While standardisation focused on processes of rationalisation of form, mass-customisation as a new paradigm that replaces mass-production addresses non-standard, complex, and flexible designs. Furthermore, knowledge about the designed object can be encoded in digital data pertaining not just to the geometry of a design but also to its physical or other behaviours within an environment. Digitally-driven architecture implies, therefore, not only digitally-designed and fabricated architecture, it also implies architecture – built form – that can be controlled, actuated, and animated by digital means. In this context, this sixth Footprint issue examines the influence of digital means as pragmatic and conceptual instruments for actuating architecture. The focus is not so much on computer-based systems for the development of architectural designs, but on architecture incorporating digital control, sensing, actuating, or other mechanisms that enable buildings to interact with their users and surroundings in real time in the real world through physical or sensory change and variation.

  18. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  19. Enterprise architecture intelligence

    NARCIS (Netherlands)

    Veneberg, R.K.M.; Iacob, Maria Eugenia; van Sinderen, Marten J.; Bodenstaff, L.; Reichert, M.U.; Rinderle-Ma, S.; Grossmann, G.

    2014-01-01

    Combining enterprise architecture and operational data is complex (especially when considering the actual ‘matching’ of data with enterprise architecture objects), and little has been written on how to do this. Therefore, in this paper we aim to fill this gap and propose a method to combine

  20. Aesthetics of sustainable architecture

    NARCIS (Netherlands)

    Lee, S.

    2011-01-01

    The purpose of this book is to reveal, explore and further the debate on the aesthetic potentials of sustainable architecture and its practice. This book opens a new area of scholarship and discourse in the design and production of sustainable architecture, one that is based in aesthetics. The chapt

  1. SMRF architecture concepts

    NARCIS (Netherlands)

    Rossum, W.L. van; Wit, J.J.M. de; Otten, M.P.G.; Huizing, A.G.

    2011-01-01

This paper presents three valuable applications of scalable multifunction RF (SMRF) systems. These systems provide radar, ESM, and communication functionality using a single front-end architecture. With the use of a novel system design tool, concepts for SMRF architectures for airborne, ground-based and nav

  2. Architecture and energy

    DEFF Research Database (Denmark)

    Marsh, Rob; Lauring, Michael

    2011-01-01

Traditional low-energy architecture has not necessarily led to reduced energy consumption. A paradigm shift is proposed promoting pluralistic energy-saving strategies.

  3. Aesthetics of sustainable architecture

    NARCIS (Netherlands)

    Lee, S.

    2011-01-01

    The purpose of this book is to reveal, explore and further the debate on the aesthetic potentials of sustainable architecture and its practice. This book opens a new area of scholarship and discourse in the design and production of sustainable architecture, one that is based in aesthetics. The chapt

  4. Teaching American Indian Architecture.

    Science.gov (United States)

    Winchell, Dick

    1991-01-01

    Reviews "Native American Architecture," by Nabokov and Easton, an encyclopedic work that examines technology, climate, social structure, economics, religion, and history in relation to house design and the "meaning" of space among tribes of nine regions. Describes this book's use in a college course on Native American architecture. (SV)

  5. Architecture and Stages

    DEFF Research Database (Denmark)

    Kiib, Hans

    2009-01-01

    as "experiencescape" - a space between tourism, culture, learning and economy. Strategies related to these challenges involve new architectural concepts and art as ‘engines' for a change. New expressive architecture and old industrial buildings are often combined into hybrid narratives, linking the past...

  6. Teaching American Indian Architecture.

    Science.gov (United States)

    Winchell, Dick

    1991-01-01

    Reviews "Native American Architecture," by Nabokov and Easton, an encyclopedic work that examines technology, climate, social structure, economics, religion, and history in relation to house design and the "meaning" of space among tribes of nine regions. Describes this book's use in a college course on Native American architecture. (SV)

  7. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  8. Emerging supercomputer architectures

    Energy Technology Data Exchange (ETDEWEB)

    Messina, P.C.

    1987-01-01

    This paper will examine the current and near-future trends for commercially available high-performance computers with architectures that differ from the mainstream "supercomputer" systems in use for the last few years. These emerging supercomputer architectures are just beginning to have an impact on the field of high performance computing. 7 refs., 1 tab.

  9. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  10. Applying neuroscience to architecture.

    Science.gov (United States)

    Eberhard, John P

    2009-06-25

    Architectural practice and neuroscience research use our brains and minds in much the same way. However, the link between neuroscience knowledge and architectural design--with rare exceptions--has yet to be made. The concept of linking these two fields is a challenge worth considering.

  11. Key characteristics for software for open architecture controllers

    Science.gov (United States)

    Pfeffer, Lawrence E.; Tran, Hy D.

    1997-01-01

    Software development time, cost, and ease of (re)use are now among the major issues in development of advanced machines, whether for machine tools, automation systems, or process systems. Two keys to reducing development time are powerful, user-friendly development tools and software architectures that provide clean, well-documented interfaces to the various real-time functions that such machines require. Examples of essential functions are signal conditioning, servo control, trajectory generation, calibration/registration, coordination of asynchronous events, task sequencing, communication with external systems, and user interfaces. There are a number of existing standards that can help with software development, such as the IEEE POSIX standards for operating systems and real-time services; software tools to complement these standards are beginning to see use. This paper will detail some of the existing standards, some new tools, and development activities relevant to advanced, 'smart' machines.

  12. An Inference Language for Imaging

    DEFF Research Database (Denmark)

    Pedemonte, Stefano; Catana, Ciprian; Van Leemput, Koen

    2014-01-01

    We introduce iLang, a language and software framework for probabilistic inference. The iLang framework enables the definition of directed and undirected probabilistic graphical models and the automated synthesis of high-performance inference algorithms for imaging applications. The iLang framework is composed of a set of language primitives and of an inference engine based on a message-passing system that integrates cutting-edge computational tools, including proximal algorithms and high-performance Hamiltonian Markov Chain Monte Carlo techniques. A set of domain-specific, highly optimized GPU-accelerated primitives specializes iLang to the spatial data structures that arise in imaging applications. We illustrate the framework through a challenging application: spatio-temporal tomographic reconstruction with compressive sensing.
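    The abstract above references Markov Chain Monte Carlo sampling. As a minimal illustration of the idea (a generic random-walk Metropolis sampler, not iLang's actual engine, and with all names and numbers chosen for the example), sampling a one-dimensional Gaussian can be sketched as:

    ```python
    import math
    import random

    def metropolis(log_density, n_samples, x0=0.0, step=1.0, seed=42):
        """Random-walk Metropolis: propose x' = x + N(0, step), accept
        with probability min(1, p(x')/p(x)) using the log-density ratio."""
        rng = random.Random(seed)
        x = x0
        samples = []
        for _ in range(n_samples):
            proposal = x + rng.gauss(0.0, step)
            delta = log_density(proposal) - log_density(x)
            if rng.random() < math.exp(min(0.0, delta)):
                x = proposal  # accept the proposal
            samples.append(x)   # otherwise keep the current state
        return samples

    # Target: standard normal, log p(x) = -x^2 / 2 up to a constant.
    samples = metropolis(lambda x: -0.5 * x * x, n_samples=5000)
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    ```

    The sample mean and variance should approach 0 and 1 respectively; Hamiltonian variants such as those iLang integrates replace the random-walk proposal with gradient-informed trajectories to explore the posterior far more efficiently.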

  13. Locative inferences in medical texts.

    Science.gov (United States)

    Mayer, P S; Bailey, G H; Mayer, R J; Hillis, A; Dvoracek, J E

    1987-06-01

    Medical research relies on epidemiological studies conducted on a large set of clinical records that have been collected from physicians recording individual patient observations. These clinical records are recorded for the purpose of individual care of the patient with little consideration for their use by a biostatistician interested in studying a disease over a large population. Natural language processing of clinical records for epidemiological studies must deal with temporal, locative, and conceptual issues. This makes text understanding and data extraction of clinical records an excellent area for applied research. While much has been done in making temporal or conceptual inferences in medical texts, parallel work in locative inferences has not been done. This paper examines the locative inferences as well as the integration of temporal, locative, and conceptual issues in the clinical record understanding domain by presenting an application that utilizes two key concepts in its parsing strategy--a knowledge-based parsing strategy and a minimal lexicon.

  14. Sick, the spectroscopic inference crank

    CERN Document Server

    Casey, Andrew R

    2016-01-01

    There exists an inordinate amount of spectral data in both public and private astronomical archives which remain severely under-utilised. The lack of reliable open-source tools for analysing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this Article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick can be used to provide a nearest-neighbour estimate of model parameters, a numerically optimised point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalise on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-di...

  15. Tests of Machine Intelligence

    CERN Document Server

    Legg, Shane

    2007-01-01

    Although the definition and measurement of intelligence is clearly of fundamental importance to the field of artificial intelligence, no general survey of definitions and tests of machine intelligence exists. Indeed few researchers are even aware of alternatives to the Turing test and its many derivatives. In this paper we fill this gap by providing a short survey of the many tests of machine intelligence that have been proposed.

  16. Metalworking and machining fluids

    Science.gov (United States)

    Erdemir, Ali; Sykora, Frank; Dorbeck, Mark

    2010-10-12

    Improved boron-based metalworking and machining fluids. Boric acid and boron-based additives, when mixed with certain carrier fluids (such as water, cellulose and/or cellulose derivatives, polyhydric alcohol, polyalkylene glycol, polyvinyl alcohol, starch, or dextrin) in solid and/or solvated forms, result in improved metalworking and machining of metallic work pieces. Fluids manufactured with boric acid or boron-based additives effectively reduce friction and prevent galling and severe wear problems on cutting and forming tools.

  17. mlpy: Machine Learning Python

    CERN Document Server

    Albanese, Davide; Merler, Stefano; Riccadonna, Samantha; Jurman, Giuseppe; Furlanello, Cesare

    2012-01-01

    mlpy is a Python Open Source Machine Learning library built on top of NumPy/SciPy and the GNU Scientific Libraries. mlpy provides a wide range of state-of-the-art machine learning methods for supervised and unsupervised problems and it is aimed at finding a reasonable compromise among modularity, maintainability, reproducibility, usability and efficiency. mlpy is multiplatform, it works with Python 2 and 3 and it is distributed under GPL3 at the website http://mlpy.fbk.eu.

  18. Human-machine interactions

    Science.gov (United States)

    Forsythe, J. Chris; Xavier, Patrick G.; Abbott, Robert G.; Brannon, Nathan G.; Bernard, Michael L.; Speed, Ann E.

    2009-04-28

    Digital technology utilizing a cognitive model based on human naturalistic decision-making processes, including pattern recognition and episodic memory, can reduce the dependency of human-machine interactions on the abilities of a human user and can enable a machine to more closely emulate human-like responses. Such a cognitive model can enable digital technology to use cognitive capacities fundamental to human-like communication and cooperation to interact with humans.

  19. Machine Learning with Distances

    Science.gov (United States)

    2015-02-16

    ...and demonstrated their usefulness in experiments. The goal of machine learning is to find useful knowledge behind data. ... However, direct divergence approximators still suffer from the curse of dimensionality. A possible cure for this problem is to combine them... obtain the global optimal solution or even a good local solution without any prior knowledge. For this reason, we decided to introduce the unit-norm...

  20. mlpy: Machine Learning Python

    OpenAIRE

    Albanese, Davide; Visintainer, Roberto; Merler, Stefano; Riccadonna, Samantha; Jurman, Giuseppe; Furlanello, Cesare

    2012-01-01

    mlpy is a Python Open Source Machine Learning library built on top of NumPy/SciPy and the GNU Scientific Libraries. mlpy provides a wide range of state-of-the-art machine learning methods for supervised and unsupervised problems and it is aimed at finding a reasonable compromise among modularity, maintainability, reproducibility, usability and efficiency. mlpy is multiplatform, it works with Python 2 and 3 and it is distributed under GPL3 at the website http://mlpy.fbk.eu.

  1. Product Architecture Modularity Strategies

    DEFF Research Database (Denmark)

    Mikkola, Juliana Hsuan

    2003-01-01

    The focus of this paper is to integrate various perspectives on product architecture modularity into a general framework, and also to propose a way to measure the degree of modularization embedded in product architectures. Various trade-offs between modular and integral product architectures and how components and interfaces influence the degree of modularization are considered. In order to gain a better understanding of product architecture modularity as a strategy, a theoretical framework and propositions are drawn from various academic literature sources. Based on the literature review, the following key elements of product architecture are identified: components (standard and new-to-the-firm), interfaces (standardization and specification), degree of coupling, and substitutability. A mathematical function, termed the modularization function, is introduced to measure the degree of modularization...

  2. Architecture as Ars Combinatoria

    Directory of Open Access Journals (Sweden)

    Francesco Cacciatore

    2015-11-01

    Full Text Available Architecture today no longer reflects the limits of experimentation and innovation. It is an extremely low-tech art whose field of possibilities consists of choosing and using that which has already been done in the past. Because of this, the key skill a designer should have is the ability to recognize the complexity in the forms around them: cities offer the widest programming opportunities for architecture, in the two-way relationship that is established between the city that takes form in architecture and an architecture that finds its place in the contemporary city. Thus, architecture has its foundation in hospitality, an element that differentiates it significantly from design.

  3. Architecture as liminal Space

    Directory of Open Access Journals (Sweden)

    Nilly Harag

    2015-11-01

    Full Text Available The point of departure of the architectural project has to stem from the combination of inner and outer journeys in between the real or imagined limits. The pressing challenge is to destabilize the neat division of architecture into separate bodies of knowledge and pose the architect’s mode of action on the threshold between the concrete and the universal. Architecture is a lens, an instrument one looks through to bring new perspectives into focus, enabling the transformation of experience from a magnified self-concentrated space to a wide horizon. Architecture narrates relations between spaces and examines its validity through signifying practices of design. Design for itself becomes the language of the current, of the immediate fashion. Architecture can fulfill peoples’ dreams and miraculously can provide them tools to invent new ones: Curiosity is the first motive to act.

  4. Can architecture be barbaric?

    Science.gov (United States)

    Hürol, Yonca

    2009-06-01

    The title of this article is adapted from Theodor W. Adorno's famous dictum: 'To write poetry after Auschwitz is barbaric.' After the catastrophic earthquake in Kocaeli, Turkey on the 17th of August 1999, in which more than 40,000 people died or were lost, Necdet Teymur, who was then the dean of the Faculty of Architecture of the Middle East Technical University, referred to Adorno in one of his 'earthquake poems' and asked: 'Is architecture possible after 17th of August?' The main objective of this article is to interpret Teymur's question in respect of its connection to Adorno's philosophy with a view to make a contribution to the politics and ethics of architecture in Turkey. Teymur's question helps in providing a new interpretation of a critical approach to architecture and architectural technology through Adorno's philosophy. The paper also presents a discussion of Adorno's dictum, which serves for a better understanding of its universality/particularity.

  5. Aesthetic quality inference for online fashion shopping

    Science.gov (United States)

    Chen, Ming; Allebach, Jan

    2014-03-01

    On-line fashion communities in which participants post photos of personal fashion items for viewing and possible purchase by others are becoming increasingly popular. Generally, these photos are taken by individuals who have no training in photography with low-cost mobile phone cameras. It is desired that photos of the products have high aesthetic quality to improve the users' online shopping experience. In this work, we design features for aesthetic quality inference in the context of online fashion shopping. Psychophysical experiments are conducted to construct a database of the photos' aesthetic evaluation, specifically for photos from an online fashion shopping website. We then extract both generic low-level features and high-level image attributes to represent the aesthetic quality. Using a support vector machine framework, we train a predictor of the aesthetic quality rating based on the feature vector. Experimental results validate the efficacy of our approach. Metadata such as the product type are also used to further improve the result.
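    The abstract describes training an SVM-based predictor on extracted image features. A minimal self-contained sketch of that idea, using a linear SVM trained by sub-gradient descent on the regularized hinge loss (toy two-dimensional "features" and labels invented for the example; the paper's actual features and ratings are richer):

    ```python
    import random

    def train_linear_svm(X, y, lam=0.01, eta=0.1, epochs=200, seed=0):
        """Train a linear SVM by stochastic sub-gradient descent on
        lam/2 * ||w||^2 + hinge loss. Labels must be in {-1, +1}."""
        rng = random.Random(seed)
        w = [0.0] * len(X[0])
        b = 0.0
        for _ in range(epochs):
            order = list(range(len(X)))
            rng.shuffle(order)
            for i in order:
                margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
                w = [(1.0 - eta * lam) * wj for wj in w]  # regularization shrink
                if margin < 1:  # point inside the margin: hinge sub-gradient
                    w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
                    b += eta * y[i]
        return w, b

    def predict(w, b, x):
        return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

    # Toy "photo features": [sharpness, clutter]; +1 = high aesthetic quality.
    X = [[0.9, 0.1], [0.8, 0.2], [0.7, 0.1], [0.2, 0.9], [0.1, 0.8], [0.3, 0.7]]
    y = [1, 1, 1, -1, -1, -1]
    w, b = train_linear_svm(X, y)
    preds = [predict(w, b, x) for x in X]
    ```

    A production system would instead feed the generic low-level features and high-level attributes the authors describe into a library SVM implementation, but the margin-based training objective is the same.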

  6. Minimalism in architecture: Abstract conceptualization of architecture

    Directory of Open Access Journals (Sweden)

    Vasilski Dragana

    2015-01-01

    Full Text Available Minimalism in architecture contains the idea of the minimum as a leading creative tendency, to be considered and interpreted through the phenomena of empathy and abstraction. In Western culture, the root of this idea is found in the empathy of Wilhelm Worringer and the abstraction of Kasimir Malevich. In his dissertation 'Abstraction and Empathy', Worringer presented his thesis on the psychology of style, through which he explained two opposing basic forms: abstraction and empathy. His conclusion on empathy as a psychological basis of observational expression is significant due to its verbal congruence with contemporary minimalist expression. His intuition was further reinforced by the figure of Malevich. Abstraction, as an expression of inner unfettered inspiration, has played a crucial role in the development of modern art and architecture of the twentieth century. Abstraction, one of the basic methods of learning in psychology (separating relevant from irrelevant features, Carl Jung), is used to discover ideas. Minimalism in architecture emphasizes the level of abstraction to which the individual functions are reduced. Different types of abstraction are present, in the form as well as the function of the basic elements: walls and windows. The case study is an example of Sou Fujimoto, who is unequivocal in his commitment to the autonomy of abstract conceptualization of architecture.

  7. Eight challenges in phylodynamic inference

    Directory of Open Access Journals (Sweden)

    Simon D.W. Frost

    2015-03-01

    Full Text Available The field of phylodynamics, which attempts to enhance our understanding of infectious disease dynamics using pathogen phylogenies, has made great strides in the past decade. Basic epidemiological and evolutionary models are now well characterized with inferential frameworks in place. However, significant challenges remain in extending phylodynamic inference to more complex systems. These challenges include accounting for evolutionary complexities such as changing mutation rates, selection, reassortment, and recombination, as well as epidemiological complexities such as stochastic population dynamics, host population structure, and different patterns at the within-host and between-host scales. An additional challenge exists in making efficient inferences from an ever increasing corpus of sequence data.

  8. Automatic Inference of DATR Theories

    CERN Document Server

    Barg, P

    1996-01-01

    This paper presents an approach for the automatic acquisition of linguistic knowledge from unstructured data. The acquired knowledge is represented in the lexical knowledge representation language DATR. A set of transformation rules that establish inheritance relationships and a default-inference algorithm make up the basis components of the system. Since the overall approach is not restricted to a special domain, the heuristic inference strategy uses criteria to evaluate the quality of a DATR theory, where different domains may require different criteria. The system is applied to the linguistic learning task of German noun inflection.

  9. Perception, illusions and Bayesian inference.

    Science.gov (United States)

    Nour, Matthew M; Nour, Joseph M

    2015-01-01

    Descriptive psychopathology makes a distinction between veridical perception and illusory perception. In both cases a perception is tied to a sensory stimulus, but in illusions the perception is of a false object. This article re-examines this distinction in light of new work in theoretical and computational neurobiology, which views all perception as a form of Bayesian statistical inference that combines sensory signals with prior expectations. Bayesian perceptual inference can solve the 'inverse optics' problem of veridical perception and provides a biologically plausible account of a number of illusory phenomena, suggesting that veridical and illusory perceptions are generated by precisely the same inferential mechanisms.
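    The Bayesian account sketched above combines a sensory signal with a prior expectation. For a Gaussian prior and Gaussian likelihood this has a closed form: precisions add, and the posterior mean is the precision-weighted average of prior mean and observation. A minimal sketch (the numbers are hypothetical, chosen only to illustrate the computation):

    ```python
    def gaussian_posterior(mu_prior, var_prior, mu_obs, var_obs):
        """Combine a Gaussian prior with a Gaussian likelihood:
        posterior precision = sum of precisions (1/variance);
        posterior mean = precision-weighted average of the two means."""
        prec = 1.0 / var_prior + 1.0 / var_obs
        mean = (mu_prior / var_prior + mu_obs / var_obs) / prec
        return mean, 1.0 / prec

    # A prior expectation of 0.0 and a noisy sensory reading of 2.0,
    # with equal variances, yield a percept halfway between the two.
    mean, var = gaussian_posterior(0.0, 1.0, 2.0, 1.0)
    # → mean == 1.0, var == 0.5
    ```

    On this view an illusion arises when a strong prior (small `var_prior`) pulls the percept away from the stimulus, which is exactly the weighting this formula makes explicit.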

  10. Object-Oriented Type Inference

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Palsberg, Jens

    1991-01-01

    We present a new approach to inferring types in untyped object-oriented programs with inheritance, assignments, and late binding. It guarantees that all messages are understood, annotates the program with type information, allows polymorphic methods, and can be used as the basis of an optimizing...

  11. Parallel processor simulator for multiple optic channel architectures

    Science.gov (United States)

    Wailes, Tom S.; Meyer, David G.

    1992-12-01

    A parallel processing architecture based on multiple-channel optical communication is described and compared with existing interconnection strategies for parallel computers. The proposed multiple channel architecture (MCA) uses MQW-DBR lasers to provide a large number of independent, selectable channels (or virtual buses) for data transport. Arbitrary interconnection patterns as well as machine partitions can be emulated via appropriate channel assignments. Hierarchies of parallel architectures and simultaneous execution of parallel tasks are also possible. Described are a basic overview of the proposed architecture, various channel allocation strategies that can be utilized by the MCA, and a summary of advantages of the MCA compared with traditional interconnection techniques. Also described is a comprehensive multiple-processor simulator that has been developed to execute parallel algorithms using the MCA as a data transport mechanism between processors and memory units. Simulation results -- including average channel load, effective channel utilization, and average network latency for different algorithms and different transmission speeds -- are also presented.

  12. Hybrid Neural Network Architecture for On-Line Learning

    CERN Document Server

    Chen, Yuhua; Wang, Lei

    2008-01-01

    Approaches to machine intelligence based on brain models have stressed the use of neural networks for generalization. Here we propose the use of a hybrid neural network architecture that uses two kinds of neural networks simultaneously: (i) a surface learning agent that quickly adapts to new modes of operation; and (ii) a deep learning agent that is very accurate within a specific regime of operation. The two networks of the hybrid architecture perform complementary functions that improve the overall performance. The performance of the hybrid architecture has been compared with that of back-propagation perceptrons and the CC and FC networks for chaotic time-series prediction, the CATS benchmark test, and smooth function approximation. It has been shown that the hybrid architecture provides superior performance based on the RMS error criterion.

  13. Novel cascade FPGA accelerator for support vector machines classification.

    Science.gov (United States)

    Papadonikolakis, Markos; Bouganis, Christos-Savvas

    2012-07-01

    Support vector machines (SVMs) are a powerful machine learning tool, providing state-of-the-art accuracy for many classification problems. However, SVM classification is a computationally complex task, suffering from linear dependencies on the number of support vectors and on the problem's dimensionality. This paper presents a fully scalable field-programmable gate array (FPGA) architecture for the acceleration of SVM classification, which exploits the device heterogeneity and the dynamic-range diversities among the dataset attributes. An adaptive and fully customized processing unit is proposed, which utilizes the available heterogeneous resources of a modern FPGA device in an efficient way with respect to the problem's characteristics. The implementation results demonstrate the efficiency of the heterogeneous architecture, presenting a speed-up factor of 2-3 orders of magnitude compared to the CPU implementation. The proposed architecture outperforms other proposed FPGA and graphics processing unit (GPU) approaches by more than seven times. Furthermore, based on the special properties of the heterogeneous architecture, this paper introduces the first FPGA-oriented cascade SVM classifier scheme, which exploits FPGA reconfigurability and intensifies the custom-arithmetic properties of the heterogeneous architecture. The results show that the proposed cascade scheme is able to increase the heterogeneous classifier throughput even further, without introducing any penalty on resource utilization.
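    The classification cost the abstract refers to comes from the SVM decision function, which performs one kernel evaluation per support vector, so its cost grows linearly with both the number of support vectors and the input dimensionality. A minimal sketch with an RBF kernel (toy support vectors and coefficients, not the paper's FPGA design):

    ```python
    import math

    def rbf_kernel(a, b, gamma=1.0):
        """Gaussian (RBF) kernel: exp(-gamma * ||a - b||^2)."""
        sq = sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        return math.exp(-gamma * sq)

    def svm_decision(x, support_vectors, alphas, labels, bias):
        """f(x) = sum_i alpha_i * y_i * K(sv_i, x) + b.
        One kernel evaluation per support vector: O(n_sv * dim) per input."""
        total = bias
        for sv, alpha, y in zip(support_vectors, alphas, labels):
            total += alpha * y * rbf_kernel(sv, x)
        return total

    # Toy model with two support vectors of opposite class.
    svs = [[0.0, 0.0], [2.0, 2.0]]
    alphas = [1.0, 1.0]
    labels = [1, -1]
    f_near_pos = svm_decision([0.1, 0.0], svs, alphas, labels, bias=0.0)
    f_near_neg = svm_decision([1.9, 2.0], svs, alphas, labels, bias=0.0)
    ```

    The inner sum is exactly the dependency the FPGA architecture parallelizes: each support-vector term is independent, so many can be evaluated concurrently in hardware.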

  14. EVALUATION OF MACHINE TOOL QUALITY

    Directory of Open Access Journals (Sweden)

    Ivan Kuric

    2011-12-01

    Full Text Available The paper deals with aspects of the quality and accuracy of machine tools. As the accuracy of machine tools is a key factor in product quality, it is important to know the methods for evaluating the quality and accuracy of machine tools. Several aspects of machine tool diagnostics are described, such as reliability.

  15. An HTS machine laboratory prototype

    DEFF Research Database (Denmark)

    Mijatovic, Nenad; Jensen, Bogi Bech; Træholt, Chresten

    2012-01-01

    This paper describes the Superwind HTS machine laboratory setup, a small-scale HTS machine designed and built as part of the efforts to identify and tackle some of the challenges HTS machine design may face. One of these challenges is the Torque Transfer Element (TTE), which...

  16. Assessing Implicit Knowledge in BIM Models with Machine Learning

    DEFF Research Database (Denmark)

    Krijnen, Thomas; Tamke, Martin

    2015-01-01

    The promise which comes along with Building Information Models is that they are information rich, machine readable and represent the insights of multiple building disciplines within single or linked models. However, this knowledge has to be stated explicitly in order to be understood. Trained architects and engineers are able to deduce non-explicitly stated information, which is often the core of the transported architectural information. This paper investigates how machine learning approaches allow a computational system to deduce implicit knowledge from a set of BIM models.

  18. Machining of fiber reinforced composites

    Science.gov (United States)

    Komanduri, Ranga; Zhang, Bi; Vissa, Chandra M.

    Factors involved in machining of fiber-reinforced composites are reviewed. Consideration is given to properties of composites reinforced with boron filaments, glass fibers, aramid fibers, carbon fibers, and silicon carbide fibers and to polymer (organic) matrix composites, metal matrix composites, and ceramic matrix composites, as well as to the processes used in conventional machining of boron-titanium composites and of composites reinforced by each of these fibers. Particular attention is given to the methods of nonconventional machining, such as laser machining, water jet cutting, electrical discharge machining, and ultrasonic assisted machining. Also discussed are safety precautions which must be taken during machining of fiber-containing composites.

  19. Data-Centric Enterprise Architecture

    OpenAIRE

    Zeinab Rajabi; Maryam Nooraei Abade

    2012-01-01

    Enterprises choose an Enterprise Architecture (EA) solution in order to overcome dynamic business challenges and to coordinate various enterprise elements. In this article, a solution is suggested for Enterprise Architecture development. The solution focuses on architecture data in the Enterprise Architecture development process. A data-centric architecture approach is preferred over a product-centric architecture approach. We suggest using Enterprise Ontology (EO) as context for collecting architec...

  20. Machine vision and the OMV

    Science.gov (United States)

    Mcanulty, M. A.

    1986-01-01

    The Orbital Maneuvering Vehicle (OMV) is intended to close with orbiting targets for relocation or servicing. It will be controlled via video signals and thruster activation based upon Earth or space station directives. A human operator is squarely in the middle of the control loop for close work. Without directly addressing future, more autonomous versions of a remote servicer, several techniques that will doubtless be important in a future increase of autonomy also have some direct application to the current situation, particularly in the area of image enhancement and predictive analysis. Several techniques are presented, and a few have been implemented, which support a machine vision capability proposed to be adequate for detection, recognition, and tracking. Once feasibly implemented, they must then be further modified to operate together in real time. This may be achieved by two courses: the use of an array processor and some initial steps toward data reduction. The methodology of adapting to a vector architecture is discussed in preliminary form, and a highly tentative rationale for data reduction at the front end is also discussed. As a by-product, a working implementation of the most advanced graphic display technique, ray-casting, is described.

  1. Analysis and Application Research on Chemical Abstract Machine

    Institute of Scientific and Technical Information of China (English)

    赵恒; 王振宇; 曹万华; 叶俊民

    2003-01-01

    This paper analyzes and studies the form and the ability of the Chemical Abstract Machine, or CHAM, in describing system software architecture. After some extension, the CHAM is applied to formally describe the software architecture of a command and control system. It is expected that the specification of the system requirements and the software test plan can be automatically generated from the formal software architecture description at the software architecture level.

  2. Engineered CVD Diamond Coatings for Machining and Tribological Applications

    Science.gov (United States)

    Dumpala, Ravikumar; Chandran, Maneesh; Ramachandra Rao, M. S.

    2015-07-01

    Diamond is an allotrope of carbon and is unique because of its extreme hardness (~100 GPa) and low friction coefficient; fracture toughness can be tuned by controlling the grain size of the coatings from a few microns to a few nanometers. In this review, the characteristics and performance of CVD diamond coatings deposited on cemented tungsten carbide (WC-Co) substrates are discussed with an emphasis on WC-Co grade selection, substrate pretreatment, nanocrystallinity and microcrystallinity of the coating, mechanical and tribological characteristics, coating architecture, and interfacial adhesion integrity. An engineered coating-substrate architecture is essential for CVD diamond coatings to perform well under harsh and highly abrasive machining and tribological conditions.

  3. Machining of Metal Matrix Composites

    CERN Document Server

    2012-01-01

    Machining of Metal Matrix Composites provides the fundamentals and recent advances in the study of machining of metal matrix composites (MMCs). Each chapter is written by an international expert in this important field of research. Machining of Metal Matrix Composites gives the reader information on machining of MMCs with a special emphasis on aluminium matrix composites. Chapter 1 provides the mechanics and modelling of chip formation for traditional machining processes. Chapter 2 is dedicated to surface integrity when machining MMCs. Chapter 3 describes the machinability aspects of MMCs. Chapter 4 contains information on traditional machining processes and Chapter 5 is dedicated to the grinding of MMCs. Chapter 6 describes the dry cutting of MMCs with SiC particulate reinforcement. Finally, Chapter 7 is dedicated to computational methods and optimization in the machining of MMCs. Machining of Metal Matrix Composites can serve as a useful reference for academics, manufacturing and materials researchers, manu...

  4. Science Driven Supercomputing Architectures: Analyzing Architectural Bottlenecks with Applications and Benchmark Probes

    Energy Technology Data Exchange (ETDEWEB)

    Kamil, S.; Yelick, K.; Kramer, W.T.; Oliker, L.; Shalf, J.; Shan,H.; Strohmaier, E.

    2005-09-26

    There is a growing gap between the peak speed of parallel computing systems and the actual delivered performance for scientific applications. In general this gap is caused by inadequate architectural support for the requirements of modern scientific applications, as commercial applications, and the much larger market they represent, have driven the evolution of computer architectures. This gap has raised the importance of developing better benchmarking methodologies to characterize and understand the performance requirements of scientific applications, and to communicate them efficiently in order to influence the design of future computer architectures. This improved understanding of the performance behavior of scientific applications will allow improved performance predictions, development of adequate benchmarks for identification of hardware and application features that work well or poorly together, and a more systematic performance evaluation in procurement situations. The Berkeley Institute for Performance Studies has developed a three-level approach to evaluating the design of high-end machines and the software that runs on them: (1) a suite of representative applications; (2) a set of application kernels; and (3) benchmarks to measure key system parameters. The three levels yield different types of information, all of which are useful in evaluating systems, and enable NSF and DOE centers to select computer architectures better suited for scientific applications. The analysis will further allow the centers to engage vendors in discussions of strategies to alleviate the present architectural bottlenecks using quantitative information. These may include small hardware changes or larger ones that may be of interest to non-scientific workloads. Providing quantitative models to the vendors allows them to assess the benefits of technology alternatives using their own internal cost-models in the broader marketplace, ideally facilitating the development of future computer

  5. Carbon-Carbon Piston Architectures

    Science.gov (United States)

    Rivers, H. Kevin (Inventor); Ransone, Philip O. (Inventor); Northam, G. Burton (Inventor); Schwind, Francis A. (Inventor)

    2000-01-01

    An improved structure for carbon-carbon composite piston architectures is disclosed. The improvement consists of replacing the knitted fiber, three-dimensional piston preform architecture described in U.S. Pat. No. 4,909,133 (Taylor et al.) with a two-dimensional lay-up or molding of carbon fiber fabric or tape. Initially, the carbon fabric or tape layers are prepregged with carbonaceous organic resins and/or pitches and are laid up or molded about a mandrel, to form a carbon-fiber reinforced organic-matrix composite part shaped like a "U" channel, a "T"-bar, or a combination of the two. The molded carbon-fiber reinforced organic-matrix composite part is then pyrolized in an inert atmosphere, to convert the organic matrix materials to carbon. At this point, cylindrical piston blanks are cored from the "U"-channel, "T"-bar, or combination part. These blanks are then densified by reimpregnation with resins or pitches which are subsequently carbonized. Densification is also accomplished by direct infiltration with carbon by vapor deposition processes. Once the desired density has been achieved, the piston billets are machined to final piston dimensions; coated with oxidation sealants; and/or coated with a catalyst. When compared to conventional steel or aluminum alloy pistons, the use of carbon-carbon composite pistons reduces the overall weight of the engine; allows for operation at higher temperatures without a loss of strength; allows for quieter operation; reduces the heat loss; and reduces the level of hydrocarbon emissions.

  6. Travels in Architectural History

    Directory of Open Access Journals (Sweden)

    Davide Deriu

    2016-11-01

    Full Text Available Travel is a powerful force in shaping the perception of the modern world and plays an ever-growing role within architectural and urban cultures. Inextricably linked to political and ideological issues, travel redefines places and landscapes through new transport infrastructures and buildings. Architecture, in turn, is reconstructed through visual and textual narratives produced by scores of modern travellers — including writers and artists along with architects themselves. In the age of the camera, travel is bound up with new kinds of imaginaries; private records and recollections often mingle with official, stereotyped views, as the value of architectural heritage increasingly rests on the mechanical reproduction of its images. Whilst students often learn about architectural history through image collections, the place of the journey in the formation of the architect itself shifts. No longer a lone and passionate antiquarian or an itinerant designer, the modern architect eagerly hops on buses, trains, and planes in pursuit of personal as well as professional interests. Increasingly built on a presumption of mobility, architectural culture integrates travel into cultural debates and design experiments. By addressing such issues from a variety of perspectives, this collection, a special 'Architectural Histories' issue on travel, prompts us to rethink the mobile conditions in which architecture has historically been produced and received.

  7. Fractal Geometry of Architecture

    Science.gov (United States)

    Lorenz, Wolfgang E.

    In Fractals smaller parts and the whole are linked together. Fractals are self-similar, as those parts are, at least approximately, scaled-down copies of the rough whole. In architecture, such a concept has also been known for a long time. Not only architects of the twentieth century called for an overall idea that is mirrored in every single detail, but also Gothic cathedrals and Indian temples offer self-similarity. This study mainly focuses upon the question whether this concept of self-similarity makes architecture with fractal properties more diverse and interesting than Euclidean Modern architecture. The first part gives an introduction and explains Fractal properties in various natural and architectural objects, presenting the underlying structure by computer programmed renderings. In this connection, differences between the fractal, architectural concept and true, mathematical Fractals are worked out to become aware of limits. This is the basis for dealing with the problem whether fractal-like architecture, particularly facades, can be measured so that different designs can be compared with each other under the aspect of fractal properties. Finally the usability of the Box-Counting Method, an easy-to-use measurement method of Fractal Dimension is analyzed with regard to architecture.
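The Box-Counting Method the study analyses can be sketched in a few lines: count the occupied grid boxes at shrinking scales and fit the slope of log N(s) against log(1/s). The sketch below is our own minimal illustration (the point set and scales are invented); it recovers a dimension close to 1 for a straight line of points.

```python
# Box-counting dimension estimate (illustrative sketch; function names are ours).
# Counts occupied grid boxes at shrinking scales and fits log N(s) vs log(1/s).
import math

def box_count(points, box_size):
    """Number of grid boxes of side `box_size` containing at least one point."""
    return len({(int(x // box_size), int(y // box_size)) for x, y in points})

def box_counting_dimension(points, sizes):
    """Least-squares slope of log N(s) against log(1/s)."""
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(box_count(points, s)) for s in sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# A straight line of points should have dimension close to 1.
line = [(i / 1000.0, i / 1000.0) for i in range(1000)]
dim = box_counting_dimension(line, [0.1, 0.05, 0.025, 0.0125])
print(round(dim, 2))  # ≈ 1.0
```

For a facade, the point set would instead be the pixel coordinates of an edge-detected elevation drawing.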

  8. Avionics Architecture for Exploration Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Avionics Architectures for Exploration Project team will develop a system level environment and architecture that will accommodate equipment from multiple...

  9. Integrated Data Assimilation Architecture Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Data Assimilation Architecture (IDAA) is a middleware architecture that facilitates the incorporation of heterogeneous sensing and control devices...

  10. Non-conventional electrical machines

    CERN Document Server

    Rezzoug, Abderrezak

    2013-01-01

    The developments of electrical machines are due to the convergence of material progress, improved calculation tools, and new feeding sources. Among the many recent machines, the authors have chosen, in this first book, to relate the progress in slow speed machines, high speed machines, and superconducting machines. The first part of the book is dedicated to materials and an overview of magnetism, mechanics, and heat transfer.

  11. Inferred motion perception of light sources in 3D scenes is color-blind.

    Science.gov (United States)

    Gerhard, Holly E; Maloney, Laurence T

    2013-01-01

    In everyday scenes, the illuminant can vary spatially in chromaticity and luminance, and change over time (e.g. sunset). Such variation generates dramatic image effects too complex for any contemporary machine vision system to overcome, yet human observers are remarkably successful at inferring object properties separately from lighting, an ability linked with estimation and tracking of light field parameters. Which information does the visual system use to infer light field dynamics? Here, we specifically ask whether color contributes to inferred light source motion. Observers viewed 3D surfaces illuminated by an out-of-view moving collimated source (sun) and a diffuse source (sky). In half of the trials, the two sources differed in chromaticity, thereby providing more information about motion direction. Observers discriminated light motion direction above chance, and only the least sensitive observer benefited slightly from the added color information, suggesting that color plays only a very minor role for inferring light field dynamics.

  12. Inferred Motion Perception of Light Sources in 3D Scenes is Color-Blind

    Directory of Open Access Journals (Sweden)

    Holly E. Gerhard

    2013-04-01

    Full Text Available In everyday scenes, the illuminant can vary spatially in chromaticity and luminance, and change over time (e.g. sunset). Such variation generates dramatic image effects too complex for any contemporary machine vision system to overcome, yet human observers are remarkably successful at inferring object properties separately from lighting, an ability linked with estimation and tracking of light field parameters. Which information does the visual system use to infer light field dynamics? Here, we specifically ask whether color contributes to inferred light source motion. Observers viewed 3D surfaces illuminated by an out-of-view moving collimated source (sun) and a diffuse source (sky). In half of the trials, the two sources differed in chromaticity, thereby providing more information about motion direction. Observers discriminated light motion direction above chance, and only the least sensitive observer benefited slightly from the added color information, suggesting that color plays only a very minor role for inferring light field dynamics.

  13. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2009-01-01

    Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability-techniques (like fault trees...... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability....
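Exact inference in the discrete part of a Bayesian network reduces to enumeration over the joint distribution followed by normalisation. The toy sketch below is our own illustration with invented probabilities; it covers only the discrete case, not the hybrid networks with continuous variables that the paper surveys.

```python
# Exact inference by enumeration in a tiny discrete Bayesian network.
# Network: Rain -> WetGrass; all probabilities are invented for illustration.
P_rain = {True: 0.2, False: 0.8}
P_wet_given_rain = {True: {True: 0.9, False: 0.1},
                    False: {True: 0.2, False: 0.8}}

def posterior_rain_given_wet():
    """P(Rain | WetGrass=True) by enumerating the joint and normalising."""
    joint = {r: P_rain[r] * P_wet_given_rain[r][True] for r in (True, False)}
    z = sum(joint.values())          # P(WetGrass=True)
    return {r: p / z for r, p in joint.items()}

post = posterior_rain_given_wet()
print(round(post[True], 3))  # → 0.529
```

Observing wet grass raises the probability of rain from the prior 0.2 to about 0.53, which is the usual "explaining the evidence" behaviour such models are built for.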

  14. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...
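A Hawkes process with an exponential kernel can be simulated by Ogata's thinning algorithm, a standard construction and a common starting point before any Bayesian parameter estimation. The sketch below is our own minimal illustration with arbitrary parameter values.

```python
# Simulating a self-exciting Hawkes process by Ogata's thinning algorithm.
# Parameter values are arbitrary illustrations (alpha/beta < 1 for stability).
import math
import random

def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    """Events of a Hawkes process with conditional intensity
    lambda(t) = mu + alpha * sum(exp(-beta * (t - t_i)) for t_i < t)."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while t < t_max:
        # Intensity just after t upper-bounds the intensity until the next
        # event, because the exponential kernel only decays in between.
        lam_bar = mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)
        if t >= t_max:
            break
        lam = mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam / lam_bar:   # accept ("thin") the candidate
            events.append(t)
    return events

ev = simulate_hawkes(mu=1.0, alpha=0.5, beta=2.0, t_max=50.0)
print(len(ev), ev[:3])
```

Inference would then proceed by evaluating the process likelihood on such event times, which is where the conditional-intensity and cluster-based approaches the paper compares come in.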

  15. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...

  17. On principles of inductive inference

    CERN Document Server

    Kostecki, Ryszard Paweł

    2011-01-01

    We discuss the mathematical and conceptual problems of the main approaches to the foundations of probability theory and statistical inference, and propose a new foundational approach, aimed at improving the mathematical structure of the theory and bypassing the old conceptual problems. In particular, we introduce the intersubjective interpretation of probability, which is designed to deal with the troubles of the 'subjective' and 'objective' Bayesian interpretations.

  18. Regular inference as vertex coloring

    NARCIS (Netherlands)

    Costa Florêncio, C.; Verwer, S.

    2012-01-01

    This paper is concerned with the problem of supervised learning of deterministic finite state automata, in the technical sense of identification in the limit from complete data, by finding a minimal DFA consistent with the data (regular inference). We solve this problem by translating it in its enti

  19. Type inference for COBOL systems

    NARCIS (Netherlands)

    Deursen, A. van; Moonen, L.M.F.

    1998-01-01

    Types are a good starting point for various software reengineering tasks. Unfortunately, programs requiring reengineering most desperately are written in languages without an adequate type system (such as COBOL). To solve this problem, we propose a method of automated type inference for these lang

  1. Statistical inference on variance components

    NARCIS (Netherlands)

    Verdooren, L.R.

    1988-01-01

    In several sciences but especially in animal and plant breeding, the general mixed model with fixed and random effects plays a great role. Statistical inference on variance components means tests of hypotheses about variance components, constructing confidence intervals for them, estimating them,

  2. Covering, Packing and Logical Inference

    Science.gov (United States)

    1993-10-01

    of Operations Research 43 (1993). [34] *Hooker, J. N., Generalized resolution for 0-1 linear inequalities, Annals of Mathematics and A 16 271-286. [35...Hooker, J. N. and C. Fedjki, Branch-and-cut solution of inference prob- lems in propositional logic, Annals of Mathematics and AI 1 (1990) 123-140. [40

  3. Mathematical Programming and Logical Inference

    Science.gov (United States)

    1990-12-01

    solution of inference problems in propositional logic, to appear in Annals of Mathematics and Al. (271 Howard, R. A., and J. E. Matheson, Influence...1981). (281 Jeroslow, R., and J. Wang, Solving propositional satisfiability problems, to appear in Annals of Mathematics and Al. [29] Nilsson, N. J

  4. An Introduction to Causal Inference

    Science.gov (United States)

    2009-11-02

    legitimize causal inference, has removed causation from its natural habitat, and distorted its face beyond recognition. This exclusivist attitude is...In contrast, when the mediation problem is approached from an exclusivist potential-outcome viewpoint, void of the structural guidance of Eq. (28

  5. Semantic Architecture for Web application Security

    Directory of Open Access Journals (Sweden)

    Abdul Razzaq

    2012-03-01

    Full Text Available The growth of web applications has facilitated humanity in almost all aspects of life, especially e-health, e-business and e-communication, but these applications are exposed to web attacks, unauthorized access, evil intentions and treacherous engagements. Various strategies have been formulated over time in the form of intrusion detection systems, encryption devices, and firewalls, but these have proved to be ineffective. In this paper, we propose a system with a semantic architecture that is capable of performing detection semantically in the context of the HTTP protocol, the data, and the target application. The knowledge base of the system is the ontological representation of the communication protocol, attack data and the application profile, which can be refined and expanded over time. Unlike the traditional signature-based approach, the semantic architecture analyses the HTTP request with the help of semantic rules and knowledge inferred by reasoning over the knowledge base through an inference engine. The non-signature-based approach enhances the capability of the system to detect unknown attacks with a low false positive rate. The system is evaluated by comparison with existing open source solutions and shows significant improvement in terms of detection ability with a low alarm rate.

  6. Towards Adaptive Evolutionary Architecture

    DEFF Research Database (Denmark)

    Bak, Sebastian HOlt; Rask, Nina; Risi, Sebastian

    2016-01-01

    This paper presents first results from an interdisciplinary project, in which the fields of architecture, philosophy and artificial life are combined to explore possible futures of architecture. Through an interactive evolutionary installation, called EvoCurtain, we investigate aspects of how...... to the development of designs tailored to the individual preferences of inhabitants, changing the roles of architects and designers entirely. Architecture-as-it-could-be is a philosophical approach conducted through artistic methods to anticipate the technological futures of human-centered development within...

  7. ARCHITECTURE INFORMS HISTORY

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Clusters of ancient architecture in central China have recently been entered on the world heritage list. A group of ancient architecture in Dengfeng, central China’s Henan Province, was added to the world heritage list at the 34th session of the World Heritage Committee in Brazil on August 1 this year. The architectural collection is China’s 39th property inscribed on the list, and the third world heritage site in the province after the Longmen Grottoes and Yinxu in Anyang, site of the capital of the late Shang Dynasty (1600-1046 B.C.).

  8. RECONSTRUCTING DECONSTRUCTION IN ARCHITECTURE

    Directory of Open Access Journals (Sweden)

    IDHAM Noor Cholis

    2013-12-01

    Full Text Available This paper examines deconstruction in architecture and the forms used, triggered by the dispute over form and its scientification claimed by deconstructivists. Deconstruction terminology is studied in the first part in relation to the field of architecture as base knowledge. Some sample works of known deconstructionist architects are assessed in order to understand how their buildings are deconstructed and what forms they used. Discussion about form and its relation to other fields is then pursued by examining the involvement of the terminology of science, aesthetic pattern, and human life. The results of this discussion give a clear understanding of how forms related to deconstruction in architecture are used.

  9. Computer architecture technology trends

    CERN Document Server

    1991-01-01

    Please note this is a Short Discount publication. This year's edition of Computer Architecture Technology Trends analyses the trends which are taking place in the architecture of computing systems today. Due to the sheer number of different applications to which computers are being applied, there seems no end to the different adoptions which proliferate. There are, however, some underlying trends which appear. Decision makers should be aware of these trends when specifying architectures, particularly for future applications. This report is fully revised and updated and provides insight in

  10. Architectural Knitted Surfaces

    DEFF Research Database (Denmark)

    Mossé, Aurélie

    2010-01-01

    WGSN reports from the Architectural Knitted Surfaces workshop recently held at Shenkar College of Engineering and Design, Tel Aviv, which offered a cutting-edge insight into interactive knitted surfaces. With the increasing role of smart textiles in architecture, the Architectural Knitted Surfaces...... workshop brought together architects and interior and textile designers to highlight recent developments in intelligent knitting. The five-day workshop was led by architects Ayelet Karmon and Mette Ramsgaard Thomsen, together with Amir Cang and Eyal Sheffer from the Knitting Laboratory, in collaboration...... with Amir Marcowitz and Yair Reshef for their expertise in interaction design....

  11. Machinability evaluation of machinable ceramics with fuzzy theory

    Institute of Scientific and Technical Information of China (English)

    YU Ai-bing; ZHONG Li-jun; TAN Ye-fa

    2005-01-01

    The property parameters and machining output parameters were selected for machinability evaluation of machinable ceramics. Based on fuzzy evaluation theory, a two-stage fuzzy evaluation approach was applied to consider these parameters. A two-stage fuzzy comprehensive evaluation model was proposed to evaluate the machinability of machinable ceramic materials. Ce-ZrO2/CePO4 composites were fabricated and machined for the evaluation of machinable ceramics. Material removal rates and specific normal grinding forces were measured. The parameters concerned with machinability were selected as the alternative set. Five grades were chosen for the machinability evaluation of machinable ceramics. Machinability grades of machinable ceramics were determined through fuzzy operation. Ductile marks are observed on the Ce-ZrO2/CePO4 machined surface. The five prepared Ce-ZrO2/CePO4 composites are classified into three machinability grades according to the fuzzy comprehensive evaluation results. The machinability grades of Ce-ZrO2/CePO4 composites are related to CePO4 content.
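The core step of a fuzzy comprehensive evaluation combines a weight vector over criteria with a criterion-by-grade membership matrix. The single-stage sketch below is our own illustration (the paper uses a two-stage model, and all weights and memberships here are invented) of that composition step.

```python
# Single-stage fuzzy comprehensive evaluation sketch (the cited work uses a
# two-stage variant; all numbers below are invented for illustration).
# The result is a membership vector over machinability grades.
def fuzzy_evaluate(weights, membership):
    """Weighted-average composition of the weight vector with the membership
    matrix: one row per criterion, one column per machinability grade."""
    n_grades = len(membership[0])
    return [sum(w * row[g] for w, row in zip(weights, membership))
            for g in range(n_grades)]

weights = [0.5, 0.3, 0.2]          # importance of three criteria
membership = [                     # membership degrees over grades
    [0.7, 0.2, 0.1],               # e.g. material removal rate
    [0.2, 0.5, 0.3],               # e.g. grinding force
    [0.1, 0.4, 0.5],               # e.g. surface quality
]
scores = fuzzy_evaluate(weights, membership)
grade = scores.index(max(scores))  # 0 = best grade, 2 = worst
print([round(s, 2) for s in scores], grade)
```

The grade with the largest membership is taken as the evaluated machinability grade, here the best one.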

  12. Virtual Machine Language 2.1

    Science.gov (United States)

    Riedel, Joseph E.; Grasso, Christopher A.

    2012-01-01

    VML (Virtual Machine Language) is an advanced computing environment that allows spacecraft to operate using mechanisms ranging from simple, time-oriented sequencing to advanced, multicomponent reactive systems. VML has developed in four evolutionary stages. VML 0 is a core execution capability providing multi-threaded command execution, integer data types, and rudimentary branching. VML 1 added named parameterized procedures, extensive polymorphism, data typing, branching, looping, issuance of commands using run-time parameters, and named global variables. VML 2 added for loops, data verification, telemetry reaction, and an open flight adaptation architecture. VML 2.1 contains major advances in control flow capabilities for executable state machines. On the resource requirements front, VML 2.1 features a reduced memory footprint in order to fit more capability into modestly sized flight processors, and endian-neutral data access for compatibility with Intel little-endian processors. Sequence packaging has been improved with object-oriented programming constructs and the use of implicit (rather than explicit) time tags on statements. Sequence event detection has been significantly enhanced with multi-variable waiting, which allows a sequence to detect and react to conditions defined by complex expressions with multiple global variables. This multi-variable waiting serves as the basis for implementing parallel rule checking, which in turn, makes possible executable state machines. The new state machine feature in VML 2.1 allows the creation of sophisticated autonomous reactive systems without the need to develop expensive flight software. Users specify named states and transitions, along with the truth conditions required, before taking transitions. 
Transitions with the same signal name allow separate state machines to coordinate actions: the conditions distributed across all state machines necessary to arm a particular signal are evaluated, and once found true, that
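The executable-state-machine idea described above (named states, guarded transitions, and multi-variable wait conditions) can be illustrated in a few lines of ordinary code. The sketch below is our own and is not VML syntax; all state and variable names are invented.

```python
# Minimal executable state machine with multi-variable wait conditions,
# in the spirit of the VML 2.1 description (our own sketch, not VML syntax;
# state and variable names are invented).
class StateMachine:
    def __init__(self, transitions, state):
        self.transitions = transitions   # {state: [(condition, next_state)]}
        self.state = state

    def step(self, env):
        """Take the first transition whose guard holds on `env`, a dict of
        global variables; remain in the current state otherwise."""
        for condition, nxt in self.transitions.get(self.state, []):
            if condition(env):
                self.state = nxt
                break
        return self.state

# A transition fires only when several variables jointly satisfy its guard.
sm = StateMachine({"IDLE": [(lambda e: e["power"] and e["temp"] < 40, "ARMED")],
                   "ARMED": [(lambda e: e["go"], "RUNNING")]}, "IDLE")
sm.step({"power": True, "temp": 55, "go": False})   # guard fails: stays IDLE
sm.step({"power": True, "temp": 30, "go": False})   # joint guard holds: ARMED
sm.step({"power": True, "temp": 30, "go": True})    # RUNNING
print(sm.state)
```

Several such machines sharing the same variable environment gives the flavour of the coordinated, signal-armed state machines the abstract describes.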

  13. MACHINE MOTION EQUATIONS

    Directory of Open Access Journals (Sweden)

    Florian Ion Tiberiu Petrescu

    2015-09-01

    Full Text Available This paper presents original dynamic machine motion equations. The equation of motion of the machine that generates the angular speed of the shaft (which varies with position and rotation speed) is deduced from conservation of the kinetic energy of the machine. An additional variation of angular speed is added by multiplying by the dynamic coefficient D (generated by the forces outside the mechanism and/or by the forces generated by the elasticity of the system). Kinetic energy conservation shows the angular speed variation (of the shaft with inertial masses), while the dynamic coefficient introduces the variation of w with the forces acting in the mechanism. By deriving the first equation of motion of the machine one can obtain the second, dynamic equation of motion. From the second equation of motion of the machine the angular acceleration of the shaft is determined. The paper shows the distribution of the forces on the mechanism for internal combustion heat engines. Dynamically, the velocities can be distributed in the same way as the forces. Practically, in dynamic regimes, the velocities have the same timing as the forces. Calculations are made for an engine with a single cylinder. Exemplification is first done for a classic distribution mechanism, and then for the module B distribution mechanism of an Otto engine type.
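The kinetic-energy-conservation step the abstract outlines amounts to this: if the total kinetic energy E = J(phi) * w(phi)**2 / 2 is held constant while the reduced inertia J varies with shaft position phi, then w(phi) = w0 * sqrt(J0 / J(phi)). The sketch below uses an invented position-dependent inertia purely for illustration.

```python
# Angular speed from kinetic-energy conservation: with J(phi)*w(phi)**2/2
# constant, w(phi) = w0 * sqrt(J0 / J(phi)). The inertia function J below
# is invented for illustration.
import math

def omega(phi, w0, J0, J):
    """Shaft angular speed at position phi, by kinetic-energy conservation."""
    return w0 * math.sqrt(J0 / J(phi))

J = lambda phi: 2.0 + 0.5 * math.cos(2 * phi)   # position-dependent inertia
w0, J0 = 100.0, J(0.0)                          # reference speed and inertia
print(round(omega(math.pi / 2, w0, J0, J), 1))  # speed rises where J drops
```

The dynamic coefficient D the abstract introduces would then multiply this kinematic variation to account for forces outside the mechanism and system elasticity.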

  14. Super-computer architecture

    CERN Document Server

    Hockney, R W

    1977-01-01

    This paper examines the design of the top-of-the-range, scientific, number-crunching computers. The market for such computers is not as large as that for smaller machines, but on the other hand it is by no means negligible. The present work-horse machines in this category are the CDC 7600 and IBM 360/195, and over fifty of the former machines have been sold. The types of installation that form the market for such machines are not only the major scientific research laboratories in the major countries-such as Los Alamos, CERN, Rutherford laboratory-but also major universities or university networks. It is also true that, as with sports cars, innovations made to satisfy the top of the market today often become the standard for the medium-scale computer of tomorrow. Hence there is considerable interest in examining present developments in this area. (0 refs).

  15. Quantum Loop Topography for Machine Learning

    Science.gov (United States)

    Zhang, Yi; Kim, Eun-Ah

    2017-05-01

    Despite rapidly growing interest in harnessing machine learning in the study of quantum many-body systems, training neural networks to identify quantum phases is a nontrivial challenge. The key challenge is in efficiently extracting essential information from the many-body Hamiltonian or wave function and turning the information into an image that can be fed into a neural network. When targeting topological phases, this task becomes particularly challenging as topological phases are defined in terms of nonlocal properties. Here, we introduce quantum loop topography (QLT): a procedure of constructing a multidimensional image from the "sample" Hamiltonian or wave function by evaluating two-point operators that form loops at independent Monte Carlo steps. The loop configuration is guided by the characteristic response for defining the phase, which is Hall conductivity for the cases at hand. Feeding QLT to a fully connected neural network with a single hidden layer, we demonstrate that the architecture can be effectively trained to distinguish the Chern insulator and the fractional Chern insulator from trivial insulators with high fidelity. In addition to establishing the first case of obtaining a phase diagram with a topological quantum phase transition with machine learning, the perspective of bridging traditional condensed matter theory with machine learning will be broadly valuable.

  16. Architectures of prototypes and architectural prototyping

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Christensen, Michael; Sandvad, Elmer;

    1998-01-01

    This paper reports from experience obtained through development of a prototype of a global customer service system in a project involving a large shipping company and a university research group. The research group had no previous knowledge of the complex business of shipping and had never worked together as a team, but developed a prototype that more than fulfilled the expectations of the shipping company. The prototype should: - complete the first major phase within 10 weeks, - be highly vertical illustrating future work practice, - continuously live up to new requirements from prototyping sessions with users, - evolve over a long period of time to contain more functionality, - allow for 6-7 developers working intensively in parallel. Explicit focus on the software architecture and letting the architecture evolve with the prototype played a major role in resolving these conflicting......

  17. Architectures of prototypes and architectural prototyping

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Christensen, Michael; Sandvad, Elmer

    1998-01-01

    This paper reports from experience obtained through development of a prototype of a global customer service system in a project involving a large shipping company and a university research group. The research group had no previous knowledge of the complex business of shipping and had never worked...... together as a team, but developed a prototype that more than fulfilled the expectations of the shipping company. The prototype should: - complete the first major phase within 10 weeks, - be highly vertical illustrating future work practice, - continuously live up to new requirements from prototyping...... sessions with users, - evolve over a long period of time to contain more functionality - allow for 6-7 developers working intensively in parallel. Explicit focus on the software architecture and letting the architecture evolve with the prototype played a major role in resolving these conflicting...

  18. Spontaneous evaluative inferences and their relationship to spontaneous trait inferences.

    Science.gov (United States)

    Schneid, Erica D; Carlston, Donal E; Skowronski, John J

    2015-05-01

    Three experiments are reported that explore affectively based spontaneous evaluative impressions (SEIs) of stimulus persons. Experiments 1 and 2 used modified versions of the savings in relearning paradigm (Carlston & Skowronski, 1994) to confirm the occurrence of SEIs, indicating that they are equivalent whether participants are instructed to form trait impressions, evaluative impressions, or neither. These experiments also show that SEIs occur independently of explicit recall for the trait implications of the stimuli. Experiment 3 provides a single dissociation test to distinguish SEIs from spontaneous trait inferences (STIs), showing that disrupting cognitive processing interferes with a trait-based prediction task that presumably reflects STIs, but not with an affectively based social approach task that presumably reflects SEIs. Implications of these findings for the potential independence of spontaneous trait and evaluative inferences, as well as limitations and important steps for future study, are discussed. (c) 2015 APA, all rights reserved.

  19. An Efficient Reconfigurable Architecture for Fingerprint Recognition

    Directory of Open Access Journals (Sweden)

    Satish S. Bhairannawar

    2016-01-01

    Full Text Available Fingerprint identification is an efficient biometric technique to authenticate human beings in real-time Big Data analytics. In this paper, we propose an efficient Finite State Machine (FSM) based reconfigurable architecture for fingerprint recognition. The fingerprint image is resized, and the Compound Linear Binary Pattern (CLBP) is applied to the fingerprint, followed by a histogram, to obtain histogram CLBP features. Discrete Wavelet Transform (DWT) Level 2 features are obtained by the same methodology. The novel matching score of CLBP is computed using histogram CLBP features of the test image and the fingerprint images in the database. Similarly, the DWT matching score is computed using DWT features of the test image and the fingerprint images in the database. Further, the matching scores of CLBP and DWT are fused with an arithmetic equation using an improvement factor. The performance parameters such as TSR (Total Success Rate), FAR (False Acceptance Rate), and FRR (False Rejection Rate) are computed using fusion scores with a correlation matching technique for the FVC2004 DB3 database. The proposed fusion-based VLSI architecture is synthesized on a Virtex xc5vlx30T-3 FPGA board using a Finite State Machine, resulting in optimized parameters.
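The score-fusion step can be sketched as a convex combination of the two normalised matcher scores. The weighting formula below is our own assumption, standing in for the paper's improvement-factor equation, and the score values are invented.

```python
# Score-level fusion of two matchers, loosely following the abstract's
# arithmetic fusion of CLBP and DWT matching scores with an improvement
# factor (the convex-combination form here is our own assumption).
def fuse(score_clbp, score_dwt, improvement=0.6):
    """Convex combination of two normalised matching scores in [0, 1]."""
    return improvement * score_clbp + (1.0 - improvement) * score_dwt

fused = fuse(0.8, 0.5)   # invented CLBP and DWT scores
print(round(fused, 2))
```

A decision threshold on the fused score then yields the accept/reject outcomes from which TSR, FAR and FRR are tabulated.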

  20. Analysis of Architecture Pattern Usage in Legacy System Architecture Documentation

    NARCIS (Netherlands)

    Harrison, Neil B.; Avgeriou, Paris

    2008-01-01

    Architecture patterns are an important tool in architectural design. However, while many architecture patterns have been identified, there is little in-depth understanding of their actual use in software architectures. For instance, there is no overview of how many patterns are used per system or wh