WorldWideScience

Sample records for hybrid approach computational

  1. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through different manifestations. The different chapters highlight the necessity of hybrid soft computing methodology in general, with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG Activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis, (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The elaborate findings of the chapters showcase the hybrid soft computing paradigm in the field of intelligent computing.

  2. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme as an effective approach to medical diagnosis. Getting to know the advantages and disadvantages of each computational intelligence technique in recent years, the time has come for p...

  3. A generalized hybrid transfinite element computational approach for nonlinear/linear unified thermal/structural analysis

    Science.gov (United States)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1987-01-01

    The present paper describes the development of a new hybrid computational approach applicable to nonlinear/linear unified thermal/structural analysis. The proposed transfinite element approach is a hybrid scheme, as it combines the modeling versatility of contemporary finite elements with transform methods and classical Bubnov-Galerkin schemes. Applicability of the proposed formulations to nonlinear analysis is also developed. Several test cases are presented, including nonlinear/linear unified thermal-stress and thermal-stress wave propagation. Comparative results validate the fundamental capabilities of the proposed hybrid transfinite element methodology.

  4. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  5. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme as an effective approach to medical diagnosis. Getting to know the advantages and disadvantages of each computational intelligence technique in recent years, the time has come… diagnoses. The final result is a short but robust rule-based classification scheme, achieving a high degree of classification accuracy (exceeding 90% for most classes) in a meaningful and user-friendly representation form for the medical expert. The domain of application analyzed in the paper… is the well-known Pap-Test problem, corresponding to a numerical database which consists of 450 medical records, 25 diagnostic attributes and 5 different diagnostic classes. Experimental data are divided in two equal parts for the training and testing phase, and 8 mutually dependent rules for diagnosis…

  6. A simplified computational fluid-dynamic approach to the oxidizer injector design in hybrid rockets

    Science.gov (United States)

    Di Martino, Giuseppe D.; Malgieri, Paolo; Carmicino, Carmine; Savino, Raffaele

    2016-12-01

    Fuel regression rate in hybrid rockets is non-negligibly affected by the oxidizer injection pattern. In this paper a simplified computational approach developed in an attempt to optimize the oxidizer injector design is discussed. Numerical simulations of the thermo-fluid-dynamic field in a hybrid rocket are carried out with a commercial solver to investigate several injection configurations, with the aim of increasing the fuel regression rate and minimizing consumption unevenness while still favoring the establishment of flow recirculation at the motor head end; such recirculation, which is generated with an axial nozzle injector, has been demonstrated to promote combustion stability as well as larger efficiency and regression rate. All computations have been performed on the configuration of a lab-scale hybrid rocket motor available at the propulsion laboratory of the University of Naples, with typical operating conditions. After a preliminary comparison between the two baseline limiting cases of an axial subsonic nozzle injector and uniform injection through the prechamber, a parametric analysis has been carried out by varying the oxidizer jet divergence angle, as well as the grain port diameter and the oxidizer mass flux, to study the effect of flow divergence on the heat-transfer distribution over the fuel surface. Some experimental firing-test data are presented and, under the hypothesis that fuel regression rate and surface heat flux are proportional, the measured fuel-consumption axial profiles are compared with the predicted surface heat flux, showing fairly good agreement, which allowed validating the employed design approach. Finally, an optimized injector design is proposed.

  7. A Hybrid Approach for Scheduling and Replication based on Multi-criteria Decision Method in Grid Computing

    Directory of Open Access Journals (Sweden)

    Nadia Hadi

    2012-09-01

    Full Text Available Grid computing environments have emerged following the demand of scientists for very high computing power and storage capacity. Among the challenges imposed by the use of these environments is the performance problem. To improve performance, scheduling and replication techniques are used. In this paper we propose an approach to task scheduling combined with data-replication decisions based on a multi-criteria principle. The goal is to improve performance by reducing the response time of tasks and the load of the system. This hybrid approach is based on a non-hierarchical model that allows scalability.
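As a toy illustration of the multi-criteria decision idea described above, the sketch below scores candidate nodes on a weighted combination of estimated response time, current load, and data-transfer cost, then schedules the task on the best-scoring node and replicates the input data there if needed. All field names, weights and values are hypothetical, not taken from the paper.

```python
# Hypothetical multi-criteria scheduler: each candidate node is scored on
# several criteria (lower is better for all of them), and the task goes to
# the node with the smallest weighted score.

def score(node, weights=(0.5, 0.3, 0.2)):
    w_rt, w_load, w_xfer = weights
    # Transfer cost applies only when the node does not already hold a replica.
    return (w_rt * node["est_response_time"]
            + w_load * node["load"]
            + w_xfer * (0.0 if node["has_replica"] else node["transfer_cost"]))

def schedule(task, nodes):
    best = min(nodes, key=score)
    if not best["has_replica"]:
        best["has_replica"] = True   # replicate the input data to the chosen node
    return best["name"]

nodes = [
    {"name": "n1", "est_response_time": 4.0, "load": 0.9, "has_replica": False, "transfer_cost": 2.0},
    {"name": "n2", "est_response_time": 5.0, "load": 0.2, "has_replica": True,  "transfer_cost": 0.0},
]
print(schedule("t1", nodes))  # n2: slightly slower, but lightly loaded and holds a replica
```

Combining response time and load in one score is what lets the scheduler trade raw speed against system balance, which is the point of the multi-criteria formulation.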

  8. A Hybrid Approach Towards Intrusion Detection Based on Artificial Immune System and Soft Computing

    CERN Document Server

    Sanyal, Sugata

    2012-01-01

    A number of works in the field of intrusion detection have been based on Artificial Immune Systems and Soft Computing. Artificial Immune System based approaches attempt to leverage the adaptability, error tolerance, self-monitoring and distributed nature of the Human Immune System, whereas Soft Computing based approaches are instrumental in developing fuzzy rule based systems for detecting intrusions. The latter are computationally intensive and apply machine learning (both supervised and unsupervised) techniques to detect intrusions in a given system. A combination of these two approaches could provide significant advantages for intrusion detection. In this paper we attempt to leverage the adaptability of Artificial Immune Systems and the computation-intensive nature of Soft Computing to develop a system that can effectively detect intrusions in a given network.

  9. Numerical methodologies for investigation of moderate-velocity flow using a hybrid computational fluid dynamics - molecular dynamics simulation approach

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Soon Heum [Linkoeping University, Linkoeping (Sweden); Kim, Na Yong; Nikitopoulos, Dimitris E.; Moldovan, Dorel [Louisiana State University, Baton Rouge (United States); Jha, Shantenu [Rutgers University, Piscataway (United States)

    2014-01-15

    Numerical approaches are presented to minimize the statistical errors inherently present due to finite sampling and the presence of thermal fluctuations in the molecular region of a hybrid computational fluid dynamics (CFD) - molecular dynamics (MD) flow solution. Near the fluid-solid interface the hybrid CFD-MD simulation approach provides a more accurate solution than traditional continuum-based simulation techniques, especially in the presence of significant molecular-level phenomena. It also involves less computational cost than pure particle-based MD. Despite these advantages, the hybrid CFD-MD methodology has been applied mostly in flow studies at high velocities, mainly because of the higher statistical errors associated with low velocities. As an alternative to the costly increase of the size of the MD region to decrease statistical errors, we investigate a few numerical approaches that reduce the sampling noise of the solution at moderate velocities. These methods are based on sampling of multiple simulation replicas and linear regression of multiple spatial/temporal samples. We discuss the advantages and disadvantages of each technique from the perspective of solution accuracy and computational cost.
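A minimal illustration of the replica-sampling idea (invented numbers, not the authors' solver): when thermal fluctuations dominate a small mean velocity, averaging the means of R independent replicas shrinks the statistical error roughly as 1/sqrt(R).

```python
# Toy demonstration of replica averaging for noise reduction.
import random
import statistics

random.seed(0)
TRUE_VELOCITY = 0.05          # a "moderate" mean flow velocity (invented)
THERMAL_NOISE = 1.0           # thermal fluctuations dominate the signal

def sample_replica(n_samples=1000):
    # One MD-like replica: each measurement is signal plus thermal noise.
    return [TRUE_VELOCITY + random.gauss(0.0, THERMAL_NOISE) for _ in range(n_samples)]

def replica_average(n_replicas):
    # Average the per-replica means, as if running independent simulations.
    means = [statistics.mean(sample_replica()) for _ in range(n_replicas)]
    return statistics.mean(means)

err_1  = abs(replica_average(1)  - TRUE_VELOCITY)
err_64 = abs(replica_average(64) - TRUE_VELOCITY)
print(err_1, err_64)   # the 64-replica estimate is typically far closer to the truth
```

The same total number of samples could instead come from one longer run; the replica route is attractive because the replicas are embarrassingly parallel.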

  10. Numerical approach for solving kinetic equations in two-dimensional case on hybrid computational clusters

    Science.gov (United States)

    Malkov, Ewgenij A.; Poleshkin, Sergey O.; Kudryavtsev, Alexey N.; Shershnev, Anton A.

    2016-10-01

    The paper presents the software implementation of a Boltzmann equation solver based on the deterministic finite-difference method. The solver allows one to carry out parallel computations of rarefied flows on a hybrid computational cluster with an arbitrary number of central processing units (CPU) and graphics processing units (GPU). Employment of GPUs leads to a significant acceleration of the computations, which enables us to simulate two-dimensional flows with high resolution in a reasonable time. The developed numerical code was validated by comparing the obtained solutions with Direct Simulation Monte Carlo (DSMC) data. For this purpose the supersonic flow past a flat plate at zero angle of attack is used as a test case.

  11. Higher Order Modeling in Hybrid Approaches to the Computation of Electromagnetic Fields

    Science.gov (United States)

    Wilton, Donald R.; Fink, Patrick W.; Graglia, Roberto D.

    2000-01-01

    Higher order geometry representations and interpolatory basis functions for computational electromagnetics are reviewed. Two types of vector-valued basis functions are described: curl-conforming bases, used primarily in finite element solutions, and divergence-conforming bases used primarily in integral equation formulations. Both sets satisfy Nedelec constraints, which optimally reduce the number of degrees of freedom required for a given order. Results are presented illustrating the improved accuracy and convergence properties of higher order representations for hybrid integral equation and finite element methods.

  12. Computing membrane-AQP5-phosphatidylserine binding affinities with hybrid steered molecular dynamics approach.

    Science.gov (United States)

    Chen, Liao Y

    2015-01-01

    In order to elucidate how phosphatidylserine (PS6) interacts with AQP5 in a cell membrane, we developed a hybrid steered molecular dynamics (hSMD) method that involves: (1) simultaneously steering two centers of mass of two selected segments of the ligand, and (2) equilibrating the ligand-protein complex with and without biasing the system. Validating hSMD, we first studied vascular endothelial growth factor receptor 1 (VEGFR1) in complex with N-(4-Chlorophenyl)-2-((pyridin-4-ylmethyl)amino)benzamide (8ST), for which the binding energy is known from in vitro experiments. In this study, our computed binding energy agreed well with the experimental value. Knowing the accuracy of this hSMD method, we applied it to the AQP5-lipid-bilayer system to answer an outstanding question relevant to AQP5's physiological function: will PS6, a lipid having a single long hydrocarbon tail that was found in the central pore of the AQP5 tetramer crystal, actually bind to and inhibit AQP5's central pore under near-physiological conditions, namely, when the AQP5 tetramer is embedded in a lipid bilayer? We found, in silico, using the CHARMM 36 force field, that binding PS6 to AQP5 was a factor of 3 million weaker than "binding" it in the lipid bilayer. This suggests that AQP5's central pore will not be inhibited by PS6 or a similar lipid in a physiological environment.

  13. A Hybrid Autonomic Computing-Based Approach to Distributed Constraint Satisfaction Problems

    Directory of Open Access Journals (Sweden)

    Abhishek Bhatia

    2015-03-01

    Full Text Available Distributed constraint satisfaction problems (DisCSPs) are among the problems widely addressed using agent-based simulation. Fernandez et al. formulated the sensor and mobile tracking problem as a DisCSP, known as SensorDCSP. In this paper, we adopt a customized ERE (environment, reactive rules and entities) algorithm for SensorDCSP, which is otherwise proven to be a computationally intractable problem. An amalgamation of the autonomy-oriented computing (AOC) based algorithm (ERE) and a genetic algorithm (GA) provides an early solution of the modeled DisCSP. Incorporation of the GA into ERE facilitates auto-tuning of the simulation parameters, thereby leading to an early solution of constraint satisfaction. This study further contributes a model, built in the NetLogo simulation environment, to infer the efficacy of the proposed approach.

  14. Hybridity in Embedded Computing Systems

    Institute of Scientific and Technical Information of China (English)

    虞慧群; 孙永强

    1996-01-01

    An embedded system is a system in which a computer is used as a component of a larger device. In this paper, we study hybridity in embedded systems and present an interval-based temporal logic to express and reason about hybrid properties of such systems.

  15. Neural networks with multiple general neuron models: a hybrid computational intelligence approach using Genetic Programming.

    Science.gov (United States)

    Barton, Alan J; Valdés, Julio J; Orchard, Robert

    2009-01-01

    Classical neural networks are composed of neurons whose nature is determined by a certain function (the neuron model), usually pre-specified. In this paper, a type of neural network (NN-GP) is presented in which: (i) each neuron may have its own neuron model in the form of a general function, (ii) any layout (i.e., network interconnection) is possible, and (iii) no bias nodes or weights are associated with the connections, neurons or layers. The general functions associated with a neuron are learned by searching a function space. They are not provided a priori, but are rather built as part of an Evolutionary Computation process based on Genetic Programming. The resulting network solutions are evaluated based on a fitness measure, which may, for example, be based on classification or regression errors. Two real-world examples are presented to illustrate the promising behaviour on classification problems via construction of a low-dimensional representation of a high-dimensional parameter space associated with the set of all network solutions.

  16. A Mathematical Approach to Hybridization

    Science.gov (United States)

    Matthews, P. S. C.; Thompson, J. J.

    1975-01-01

    Presents an approach to hybridization which exploits the similarities between the algebra of wave functions and vectors. This method will account satisfactorily for the number of orbitals formed when applied to hybrids involving the s and p orbitals. (GS)

  17. Dynamic modelling of an adsorption storage tank using a hybrid approach combining computational fluid dynamics and process simulation

    Science.gov (United States)

    Mota, J.P.B.; Esteves, I.A.A.C.; Rostam-Abadi, M.

    2004-01-01

    A computational fluid dynamics (CFD) software package has been coupled with the dynamic process simulator of an adsorption storage tank for methane fuelled vehicles. The two solvers run as independent processes and handle non-overlapping portions of the computational domain. The codes exchange data on the boundary interface of the two domains to ensure continuity of the solution and of its gradient. A software interface was developed to dynamically suspend and activate each process as necessary, and be responsible for data exchange and process synchronization. This hybrid computational tool has been successfully employed to accurately simulate the discharge of a new tank design and evaluate its performance. The case study presented here shows that CFD and process simulation are highly complementary computational tools, and that there are clear benefits to be gained from a close integration of the two.
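The coupling pattern described above can be sketched with a toy 1-D heat-conduction problem (entirely invented, not the paper's tank model): two "solvers" own disjoint halves of the domain and exchange interface (ghost-cell) values every step so the solution stays continuous across the boundary.

```python
# Toy domain decomposition: solver A owns the hot left half, solver B the cold
# right half; each explicit step they trade interface values, mimicking the
# boundary-data exchange between the CFD code and the process simulator.

def step(u, left_ghost, right_ghost, alpha=0.25):
    # One explicit diffusion update with ghost cells supplying the boundaries.
    padded = [left_ghost] + u + [right_ghost]
    return [padded[i] + alpha * (padded[i-1] - 2*padded[i] + padded[i+1])
            for i in range(1, len(padded) - 1)]

a = [100.0] * 10      # left subdomain, hot
b = [0.0] * 10        # right subdomain, cold
for _ in range(2000):
    a_new = step(a, a[0], b[0])    # outer boundary insulated (mirror); interface value from B
    b_new = step(b, a[-1], b[-1])  # interface value from A; outer boundary insulated
    a, b = a_new, b_new
print(round(a[-1], 1), round(b[0], 1))  # both interface cells relax toward 50.0
```

Because the interface fluxes seen by the two halves are equal and opposite, total heat is conserved exactly, which is the discrete analogue of the solution-and-gradient continuity the paper enforces.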

  18. Advanced Hybrid Computer Systems. Software Technology.

    Science.gov (United States)

    This software technology final report evaluates advances made in Advanced Hybrid Computer System software technology. The report describes what automatic patching software is available, as well as which analog/hybrid programming languages would be most feasible for the Advanced Hybrid Computer compiler software. The problem of how software would interface with the hybrid system is also presented.

  19. A "Hybrid" Approach for Synthesizing Optimal Controllers of Hybrid Systems

    DEFF Research Database (Denmark)

    Zhao, Hengjun; Zhan, Naijun; Kapur, Deepak

    2012-01-01

    We propose an approach to reduce the optimal controller synthesis problem of hybrid systems to quantifier elimination; furthermore, we also show how to combine quantifier elimination with numerical computation in order to make it more scalable while keeping the errors arising due to discretization manageable and within bounds. A major advantage of our approach is not only that it avoids errors due to numerical computation, but also that it gives a better optimal controller. In order to illustrate our approach, we use the real industrial example of an oil pump provided by the German company HYDAC…

  20. Lyapunov exponents computation for hybrid neurons.

    Science.gov (United States)

    Bizzarri, Federico; Brambilla, Angelo; Gajani, Giancarlo Storti

    2013-10-01

    Lyapunov exponents are a basic and powerful tool to characterise the long-term behaviour of dynamical systems. The computation of Lyapunov exponents for continuous-time dynamical systems is straightforward whenever they are ruled by vector fields that are sufficiently smooth to admit a variational model. Hybrid neurons do not belong to this wide class of systems, since they are intrinsically non-smooth owing to the impact, and sometimes switching, model used to describe the integrate-and-fire (I&F) mechanism. In this paper we show how a variational model can be defined also for this class of neurons by resorting to saltation matrices. This extension allows the computation of the Lyapunov exponent spectrum of hybrid neurons, and of networks made up of them, through a standard numerical approach, even in the case of neurons firing synchronously.
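For background, the standard variational recipe for smooth systems can be sketched on a one-dimensional map: average log|f'(x)| along the orbit. The paper's contribution, extending this to non-smooth hybrid neurons via saltation matrices, is not reproduced here; the logistic-map example below is only the smooth baseline.

```python
# Largest Lyapunov exponent of the logistic map x -> r*x*(1-x), estimated by
# averaging the log of the local stretching factor |f'(x)| along the orbit.
import math

def largest_lyapunov(r=4.0, x0=0.1, n=100000, burn_in=1000):
    x = x0
    for _ in range(burn_in):              # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        acc += math.log(abs(r * (1.0 - 2.0 * x)))   # log|f'(x)| at this point
    return acc / n

print(largest_lyapunov())   # for r = 4 the exact value is ln 2 ≈ 0.693
```

For a hybrid (non-smooth) system the factor |f'(x)| is no longer defined at impact or switching events; the paper's saltation matrices play exactly the role of the missing derivative across those events.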

  1. Interactions among biotic and abiotic factors affect the reliability of tungsten microneedles puncturing in vitro and in vivo peripheral nerves: A hybrid computational approach

    Energy Technology Data Exchange (ETDEWEB)

    Sergi, Pier Nicola, E-mail: p.sergi@sssup.it [Translational Neural Engineering Laboratory, The Biorobotics Institute, Scuola Superiore Sant'Anna, Viale Rinaldo Piaggio 34, Pontedera, 56025 (Italy); Jensen, Winnie [Department of Health Science and Technology, Fredrik Bajers Vej 7, 9220 Aalborg (Denmark); Yoshida, Ken [Department of Biomedical Engineering, Indiana University - Purdue University Indianapolis, 723 W. Michigan St., SL220, Indianapolis, IN 46202 (United States)

    2016-02-01

    Tungsten is a material of choice for producing slender, stiff microneedles able to enter soft tissues and minimize puncture wounds. In particular, tungsten microneedles are used to puncture peripheral nerves and insert neural interfaces, bridging the gap between the nervous system and robotic devices (e.g., hand prostheses). Unfortunately, microneedles fail during the puncture process, and this failure is not dependent on the stiffness or fracture toughness of the constituent material. In addition, the microneedles' performance decreases during in vivo trials with respect to the in vitro ones. This further effect is independent of internal biotic effects, while it seems to be related to external biotic causes. Since the exact synergy of phenomena decreasing the in vivo reliability is still not known, this work explored the connection between the in vitro and in vivo behavior of tungsten microneedles through the study of interactions between biotic and abiotic factors. A hybrid computational approach, simultaneously using theoretical relationships and in silico models of nerves, was implemented to model the change of reliability with varying microneedle diameter, and to predict in vivo performance by using in vitro reliability and local differences between the in vivo and in vitro mechanical response of nerves. - Highlights: • We provide phenomenological Finite Element (FE) models of peripheral nerves to study the interactions with W microneedles. • We provide a general interaction-based approach to model the reliability of slender microneedles. • We evaluate the reliability of W microneedles to puncture in vivo nerves. • We provide a novel synergistic hybrid approach (theory + simulations) involving interactions among biotic and abiotic factors. • We validate the hybrid approach using experimental data from the literature.

  2. Checkpointing for a hybrid computing node

    Energy Technology Data Exchange (ETDEWEB)

    Cher, Chen-Yong

    2016-03-08

    According to an aspect, a method for checkpointing in a hybrid computing node includes executing a task in a processing accelerator of the hybrid computing node. A checkpoint is created in a local memory of the processing accelerator. The checkpoint includes state data needed to restart execution of the task in the processing accelerator upon a restart operation. Execution of the task is resumed in the processing accelerator after creating the checkpoint. The state data of the checkpoint are transferred from the processing accelerator to a main processor of the hybrid computing node while the processing accelerator is executing the task.
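A toy sketch of the pattern described (not the patented method; all names are invented): the "accelerator" snapshots its state locally, resumes work immediately, and the snapshot drains to "host" memory concurrently with the ongoing task.

```python
# Asynchronous checkpointing sketch: snapshot locally, keep computing, and let
# a background thread transfer the snapshot to host memory.
import copy
import queue
import threading

host_checkpoints = queue.Queue()     # stands in for main-processor memory

def drain_to_host(local_snapshot):
    # Runs concurrently with the task, mimicking the background transfer.
    host_checkpoints.put(copy.deepcopy(local_snapshot))

def run_task(n_steps=10, ckpt_every=4):
    state = {"step": 0, "acc": 0}
    transfers = []
    for step in range(1, n_steps + 1):
        state["step"] = step
        state["acc"] += step                     # the "work"
        if step % ckpt_every == 0:
            snapshot = copy.deepcopy(state)      # checkpoint in "local memory"
            t = threading.Thread(target=drain_to_host, args=(snapshot,))
            t.start()                            # execution resumes immediately
            transfers.append(t)
    for t in transfers:
        t.join()                                 # ensure all transfers drained
    return state

final = run_task()
print(final["acc"], host_checkpoints.qsize())    # 55 2 (checkpoints at steps 4 and 8)
```

The deep copy before the transfer starts is the essential trick: the task may keep mutating its state while the older, consistent snapshot is still in flight to the host.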

  3. A Hybrid Approach To Tandem Cylinder Noise

    Science.gov (United States)

    Lockard, David P.

    2004-01-01

    Aeolian tone generation from tandem cylinders is predicted using a hybrid approach. A standard computational fluid dynamics (CFD) code is used to compute the unsteady flow around the cylinders, and the acoustics are calculated using the acoustic analogy. The CFD code is nominally second order in space and time and includes several turbulence models, but the SST k-omega model is used for most of the calculations. Significant variation is observed between laminar and turbulent cases, and with changes in the turbulence model. A two-dimensional implementation of the Ffowcs Williams-Hawkings (FW-H) equation is used to predict the far-field noise.

  4. Reachability computation for hybrid systems with Ariadne

    NARCIS (Netherlands)

    L. Benvenuti; D. Bresolin; A. Casagrande; P.J. Collins (Pieter); A. Ferrari; E. Mazzi; T. Villa; A. Sangiovanni-Vincentelli

    2008-01-01

    Ariadne is an in-progress open environment to design algorithms for computing with hybrid automata, which relies on a rigorous computable analysis theory to represent geometric objects, in order to achieve provable approximation bounds along the computations. In this paper we discuss the

  5. Computer code for intraply hybrid composite design

    Science.gov (United States)

    Chamis, C. C.; Sinclair, J. H.

    1981-01-01

    A computer program has been developed and is described herein for intraply hybrid composite design (INHYD). The program includes several composite micromechanics theories, intraply hybrid composite theories and a hygrothermomechanical theory. These theories provide INHYD with considerable flexibility and capability which the user can exercise through several available options. Key features and capabilities of INHYD are illustrated through selected samples.

  6. Universal blind quantum computation for hybrid system

    Science.gov (United States)

    Huang, He-Liang; Bao, Wan-Su; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Zhang, Hai-Long; Wang, Xiang

    2017-08-01

    As the development of quantum computers continues to advance, first-generation practical quantum computers will be available to ordinary users in the cloud, similar to IBM's Quantum Experience today. Clients can remotely access the quantum servers using simple devices. In such a situation, it is of prime importance to protect the security of the client's information. Blind quantum computation protocols enable a client with limited quantum technology to delegate her quantum computation to a quantum server without leaking any privacy. To date, blind quantum computation has been considered only for individual quantum systems. However, a practical universal quantum computer is likely to be a hybrid system. Here, we take the first step toward constructing a framework of blind quantum computation for hybrid systems, which provides a more feasible way toward scalable blind quantum computation.

  7. An oligonucleotide hybridization approach to DNA sequencing.

    Science.gov (United States)

    Khrapko, K. R.; Lysov, Yu. P.; Khorlyn, A. A.; Shick, V. V.; Florentiev, V. L.; Mirzabekov, A. D.

    1989-10-09

    We have proposed a DNA sequencing method based on hybridization of a DNA fragment to be sequenced with the complete set of fixed-length oligonucleotides (e.g., 4^8 = 65,536 possible 8-mers) immobilized individually as dots of a 2-D matrix [(1989) Dokl. Akad. Nauk SSSR 303, 1508-1511]. It was shown that the list of hybridizing octanucleotides is sufficient for the computer-assisted reconstruction of the structures of 80% of random-sequence fragments up to 200 bases long, based on the analysis of the octanucleotide overlapping. Here a refinement of the method and some experimental data are presented. We have performed hybridizations with oligonucleotides immobilized on a glass plate and obtained their dissociation curves down to heptanucleotides. Other approaches, e.g., additional hybridization of short oligonucleotides which continuously extend duplexes formed between the fragment and immobilized oligonucleotides, should considerably increase either the probability of unambiguous reconstruction or the length of reconstructed sequences, or decrease the size of immobilized oligonucleotides.
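The octanucleotide-overlap reconstruction can be sketched as follows (a simplified toy, assuming every internal 7-mer of the fragment is unique, which mirrors the condition under which the paper's analysis succeeds): chain the hybridizing 8-mers by their 7-base overlaps.

```python
# Toy sequencing-by-hybridization reconstruction: given the unordered set of
# 8-mers that hybridized (the "spectrum"), rebuild the fragment by following
# unique (k-1)-base overlaps.

def reconstruct(spectrum, k=8):
    by_prefix = {s[:k-1]: s for s in spectrum}      # (k-1)-prefix -> k-mer
    suffixes = {s[1:] for s in spectrum}
    # The starting k-mer is the one whose prefix is no other k-mer's suffix.
    start = next(s for s in spectrum if s[:k-1] not in suffixes)
    seq = start
    del by_prefix[start[:k-1]]
    while seq[-(k-1):] in by_prefix:                # extend one base at a time
        nxt = by_prefix.pop(seq[-(k-1):])
        seq += nxt[-1]
    return seq

fragment = "ATGCGTACGTTAGCCGAT"
spectrum = {fragment[i:i+8] for i in range(len(fragment) - 7)}
print(reconstruct(spectrum) == fragment)   # True for this repeat-free fragment
```

When a 7-mer repeats inside the fragment the overlap is ambiguous and this greedy chaining can fail, which is exactly why the paper reports ~80% success on random fragments rather than 100%.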

  8. Interactions among biotic and abiotic factors affect the reliability of tungsten microneedles puncturing in vitro and in vivo peripheral nerves: A hybrid computational approach.

    Science.gov (United States)

    Sergi, Pier Nicola; Jensen, Winnie; Yoshida, Ken

    2016-02-01

    Tungsten is a material of choice for producing slender, stiff microneedles able to enter soft tissues and minimize puncture wounds. In particular, tungsten microneedles are used to puncture peripheral nerves and insert neural interfaces, bridging the gap between the nervous system and robotic devices (e.g., hand prostheses). Unfortunately, microneedles fail during the puncture process, and this failure is not dependent on the stiffness or fracture toughness of the constituent material. In addition, the microneedles' performance decreases during in vivo trials with respect to the in vitro ones. This further effect is independent of internal biotic effects, while it seems to be related to external biotic causes. Since the exact synergy of phenomena decreasing the in vivo reliability is still not known, this work explored the connection between the in vitro and in vivo behavior of tungsten microneedles through the study of interactions between biotic and abiotic factors. A hybrid computational approach, simultaneously using theoretical relationships and in silico models of nerves, was implemented to model the change of reliability with varying microneedle diameter, and to predict in vivo performance by using in vitro reliability and local differences between the in vivo and in vitro mechanical response of nerves.

  9. Hybrid Systems: Computation and Control.

    Science.gov (United States)

    2007-11-02

    …(elbow) and a pinned first joint (shoulder) (see Figure 2); it is termed an underactuated system since it is a mechanical system with fewer actuators than degrees of freedom. … control mechanism and search for optimal combinations of control variables. Besides the nonlinear and hybrid nature of powertrain systems, hardware…

  10. On a Variational Approach to Optimization of Hybrid Mechanical Systems

    Directory of Open Access Journals (Sweden)

    Vadim Azhmyakov

    2010-01-01

    Full Text Available This paper deals with multiobjective optimization techniques for a class of hybrid optimal control problems in mechanical systems. We consider general nonlinear hybrid control systems described by boundary-value problems associated with hybrid-type Euler-Lagrange or Hamilton equations. The variational structure of the corresponding solutions makes it possible to reduce the original “mechanical” problem to an auxiliary multiobjective programming reformulation. This approach motivates possible applications of theoretical and computational results from multiobjective optimization to the original dynamical optimization problem. We consider first-order optimality conditions for optimal control problems governed by hybrid mechanical systems and also discuss some conceptual algorithms.
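Schematically, the hybrid-type Euler-Lagrange boundary-value problems mentioned above take the following textbook form (a generic statement, not the paper's exact notation): on each interval between switching times the trajectory satisfies the Euler-Lagrange equation of the active mode, with continuity (or a prescribed jump) condition at each switch.

```latex
\frac{d}{dt}\,\frac{\partial L_i}{\partial \dot q} \;-\; \frac{\partial L_i}{\partial q} \;=\; 0,
\qquad t \in (t_{i-1},\, t_i),
\qquad q(t_i^-) \;=\; q(t_i^+),
```

where $L_i$ is the Lagrangian active in the $i$-th mode, $q$ the generalized coordinates, and $t_i$ the switching times.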

  11. Adaptation and hybridization in computational intelligence

    CERN Document Server

    Jr, Iztok

    2015-01-01

    This carefully edited book takes a walk through recent advances in adaptation and hybridization in the Computational Intelligence (CI) domain. It consists of ten chapters divided into three parts. The first part provides background information and some theoretical foundations of the CI domain, the second part deals with adaptation in CI algorithms, and the third part focuses on hybridization in CI. The book can serve as a reference for researchers and students of computer science, electrical and civil engineering, economics, and the natural sciences who are confronted with solving optimization, modeling and simulation problems. It covers recent advances in CI that encompass nature-inspired algorithms such as Artificial Neural Networks, Evolutionary Algorithms and Swarm Intelligence based algorithms.

  12. Hybrid Parallel Computation of Integration in GRACE

    CERN Document Server

    Yuasa, F; Kawabata, S; Perret-Gallix, D; Itakura, K; Hotta, Y; Okuda, M; Yuasa, Fukuko; Ishikawa, Tadashi; Kawabata, Setsuya; Perret-Gallix, Denis; Itakura, Kazuhiro; Hotta, Yukihiko; Okuda, Motoi

    2000-01-01

    With the integrated software package {\tt GRACE}, it is possible to generate Feynman diagrams, calculate the total cross section and generate physics events automatically. We outline the hybrid method of parallel computation of the multi-dimensional integration in {\tt GRACE}. We used {\tt MPI} (Message Passing Interface) as the parallel library and, to improve performance, we embedded a dynamic load-balancing mechanism. The resulting reduction in practical execution time was studied.
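    The dynamic load-balancing idea can be illustrated, in miniature, with an on-demand pool of workers pulling sub-regions of a 2-D integral from a shared task list. Threads stand in for the MPI master-worker setup; the integrand and domain decomposition are hypothetical:

```python
# Sketch of dynamically load-balanced multi-dimensional integration, in
# the spirit of the GRACE hybrid parallel scheme (which uses MPI).
# Idle workers pull the next sub-region on demand, so faster workers
# naturally take on more of the work.
from concurrent.futures import ThreadPoolExecutor

def midpoint_2d(f, x0, x1, y0, y1, n=32):
    """Midpoint-rule integral of f over one rectangular sub-region."""
    hx, hy = (x1 - x0) / n, (y1 - y0) / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += f(x0 + (i + 0.5) * hx, y0 + (j + 0.5) * hy)
    return total * hx * hy

def integrate_dynamic(f, splits=8, workers=4):
    # Decompose [0,1]^2 into splits*splits sub-regions (the work units).
    h = 1.0 / splits
    tasks = [(i * h, (i + 1) * h, j * h, (j + 1) * h)
             for i in range(splits) for j in range(splits)]
    # One task per map call acts as an on-demand queue: each idle
    # thread grabs the next sub-region, balancing the load.
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return sum(ex.map(lambda t: midpoint_2d(f, *t), tasks))

result = integrate_dynamic(lambda x, y: x * y)
print(round(result, 6))  # integral of x*y over the unit square is 0.25
```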

  13. Hybrid Nanoelectronics: Future of Computer Technology

    Institute of Scientific and Technical Information of China (English)

    Wei Wang; Ming Liu; Andrew Hsu

    2006-01-01

    Nanotechnology may well prove to be the 21st century's new wave of scientific knowledge that transforms people's lives. Nanotechnology research activities are booming around the globe. This article reviews recent progress in nanoelectronics research in the US and China, and introduces several novel hybrid solutions specifically useful for future computer technology. These exciting new directions will lead to many future inventions and will have a huge impact on research communities and industries.

  14. HYBRID CONTROL APPROACH FOR CONTAINER CRANES

    Institute of Scientific and Technical Information of China (English)

    Wang Xiaojun; Shao Huihe

    2005-01-01

    A hybrid control approach is proposed to achieve the desired performance. First, a robust input shaper is designed to efficiently reduce the transient and residual vibration of the container. Then a simple fuzzy logic controller is designed to eliminate the residual vibration completely, in order to guarantee positioning precision. Such a hybrid approach is simple in structure and readily realizable. Simulation results verify the good performance of this hybrid control approach: it achieves complete elimination of residual vibration and precise positioning of the container load, and it is robust to parameter variations (mainly in cable length) and external disturbances.
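    A common way to realize such an input shaper is the two-impulse zero-vibration (ZV) design. The sketch below is the standard textbook construction, not the paper's exact robust shaper, with the pendulum frequency taken from the cable length:

```python
# Sketch of a zero-vibration (ZV) input shaper for a crane payload.
# Convolving the operator's command with these two impulses cancels the
# residual oscillation at the payload's pendulum frequency.
import math

def zv_shaper(cable_length, damping=0.0, g=9.81):
    """Return (amplitudes, times) of the two-impulse ZV shaper.

    The pendulum frequency follows from the cable length: wn = sqrt(g/L).
    """
    wn = math.sqrt(g / cable_length)
    wd = wn * math.sqrt(1.0 - damping ** 2)        # damped frequency
    K = math.exp(-damping * math.pi / math.sqrt(1.0 - damping ** 2))
    amps = [1.0 / (1.0 + K), K / (1.0 + K)]        # sum to 1: no net gain
    times = [0.0, math.pi / wd]                    # half a damped period apart
    return amps, times

amps, times = zv_shaper(cable_length=10.0)
print([round(a, 3) for a in amps], round(times[1], 3))
```

    Because the shaper depends on the cable length, a robust variant (as in the paper) adds impulses or widens the notch to tolerate length variations.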

  15. A hybrid transfinite element approach for nonlinear transient thermal analysis

    Science.gov (United States)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1987-01-01

    A new computational approach for transient nonlinear thermal analysis of structures is proposed. It is a hybrid approach which combines the modeling versatility of contemporary finite elements in conjunction with transform methods and classical Bubnov-Galerkin schemes. The present study is limited to nonlinearities due to temperature-dependent thermophysical properties. Numerical test cases attest to the basic capabilities and therein validate the transfinite element approach by means of comparisons with conventional finite element schemes and/or available solutions.

  16. A Hybrid Architecture Approach for Quantum Algorithms

    Directory of Open Access Journals (Sweden)

    Mohammad R.S. Aghaei

    2009-01-01

    Full Text Available Problem statement: In this study, a general plan of a hybrid architecture for quantum algorithms is proposed. Approach: Analysis of quantum algorithms shows that they are hybrid, with two parts. First, the relationship between the classical and quantum parts of the hybrid algorithms was extracted. Then a general plan of the hybrid structure was designed. Results: The plan illustrates the hybrid architecture and the relationship between the classical and quantum parts of the algorithms, and was used to increase the implementation performance of quantum algorithms. Conclusion/Recommendations: Moreover, simulation results of quantum algorithms on the hybrid architecture showed that quantum algorithms can be implemented on the general plan as well.

  17. A hybrid computational grid architecture for comparative genomics.

    Science.gov (United States)

    Singh, Aarti; Chen, Chen; Liu, Weiguo; Mitchell, Wayne; Schmidt, Bertil

    2008-03-01

    Comparative genomics provides a powerful tool for studying evolutionary changes among organisms, helping to identify genes that are conserved among species, as well as genes that give each organism its unique characteristics. However, the huge datasets involved make this approach impractical on traditional computer architectures, leading to prohibitively long runtimes. In this paper, we present a new computational grid architecture based on a hybrid computing model to significantly accelerate comparative genomics applications. The hybrid computing model consists of two types of parallelism: coarse-grained and fine-grained. The coarse-grained parallelism uses a volunteer computing infrastructure for job distribution, while the fine-grained parallelism uses commodity computer graphics hardware for fast sequence alignment. We present the deployment and evaluation of this approach on our grid test bed for the all-against-all comparison of microbial genomes. The results of this comparison are then used by the phenotype-genotype explorer (PheGee), a new tool that nominates candidate genes responsible for a given phenotype.
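    The fine-grained tier accelerates pairwise sequence alignment; its core recurrence is the Smith-Waterman local-alignment score, sketched serially below (the scoring parameters are illustrative assumptions):

```python
# Minimal Smith-Waterman local-alignment score: the kind of kernel the
# fine-grained (GPU) tier parallelizes. H[i][j] is the best local
# alignment score ending at a[i-1], b[j-1]; negatives are clipped to 0.
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("GATTACA", "GCATGCU"))
```

    On a GPU the anti-diagonals of H can be computed in parallel, since each depends only on the previous two.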

  18. Hybrid Algorithm for Optimal Load Sharing in Grid Computing

    Directory of Open Access Journals (Sweden)

    A. Krishnan

    2012-01-01

    Full Text Available Problem statement: Grid computing is a fast-growing industry in which the resources of an organization are shared in an effective manner. Resource sharing requires a well-optimized algorithmic structure; otherwise, waiting time and response time increase and resource utilization is reduced. Approach: In order to avoid such reductions in the performance of the grid system, an optimal resource-sharing algorithm is required. In recent years many load-sharing techniques have been proposed; these are feasible, but many critical issues remain. Results: In this study a hybrid algorithm for optimizing load sharing is proposed. The hybrid algorithm contains two components: a Hash Table (HT) and a Distributed Hash Table (DHT). Conclusion: The results of the proposed study show that the hybrid algorithm optimizes task sharing better than existing systems.
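    A typical way to realize the DHT component is a consistent-hash ring that maps tasks to grid nodes. The sketch below is a generic construction, not the paper's specific algorithm; node and task names are hypothetical:

```python
# Consistent-hash ring: tasks map to the first node clockwise from their
# hash, so load spreads evenly and only a small fraction of tasks move
# when nodes join or leave.
import bisect, hashlib

def _h(key):
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class HashRing:
    def __init__(self, nodes, replicas=50):
        # Virtual replicas smooth out the distribution across nodes.
        self._ring = sorted((_h(f"{n}#{r}"), n)
                            for n in nodes for r in range(replicas))
        self._keys = [k for k, _ in self._ring]

    def node_for(self, task_id):
        # First ring position clockwise from the task's hash.
        i = bisect.bisect(self._keys, _h(task_id)) % len(self._ring)
        return self._ring[i][1]

ring = HashRing(["node-a", "node-b", "node-c"])
counts = {}
for t in range(1000):
    n = ring.node_for(f"task-{t}")
    counts[n] = counts.get(n, 0) + 1
print(sorted(counts))  # all three nodes receive a share of the tasks
```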

  19. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPUs) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for the connection, we identified two challenges: (1) an identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors and two IBM QS22 Cell blades, connected with InfiniBand), allowing compute-intensive functions to be seamlessly offloaded to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.

  20. Integrated approach for hybrid rocket technology development

    Science.gov (United States)

    Barato, Francesco; Bellomo, Nicolas; Pavarin, Daniele

    2016-11-01

    Hybrid rocket motors generally tend to be simple from a mechanical point of view but difficult to optimize because of their complex and still not well understood cross-coupled physics. This paper addresses this issue by presenting the integrated approach established at the University of Padua to develop hybrid-rocket-based systems. The methodology tightly combines system analysis and design, numerical modeling from elementary to sophisticated CFD, and experimental testing carried out with an incremental philosophy. As an example of the approach, the paper presents the experience gained in the successful development of a hybrid rocket booster designed for rocket-assisted take-off operations. It is thought that by following the proposed approach and selecting the most promising applications carefully, it is possible to finally exploit the major advantages of hybrid rocket motors: safety, simplicity, low cost and reliability.

  1. Computational approaches to vision

    Science.gov (United States)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  3. Toward feasible and comprehensive computational protocol for simulation of the spectroscopic properties of large molecular systems: the anharmonic infrared spectrum of uracil in the solid state by the reduced dimensionality/hybrid VPT2 approach.

    Science.gov (United States)

    Fornaro, Teresa; Carnimeo, Ivan; Biczysko, Malgorzata

    2015-05-28

    Feasible and comprehensive computational protocols for simulating the spectroscopic properties of large and complex molecular systems are much sought after. Indeed, due to the great variety of intra- and intermolecular interactions that may take place, the interpretation of experimental data becomes more and more difficult as the system under study increases in size or is placed in a complex environment, such as a condensed phase. In this framework, we are actively developing a comprehensive and robust computational protocol aimed at the quantitative reproduction of the spectra of nucleic acid base complexes, with increasing complexity toward condensed phases and monolayers of biomolecules on solid supports. We have resorted to fully anharmonic quantum mechanical computations within the generalized second-order vibrational perturbation theory (GVPT2) approach, combined with the cost-effective B3LYP-D3 method, in conjunction with basis sets of double-ζ plus polarization quality. Such an approach was validated in a previous work (Phys. Chem. Chem. Phys. 2014, 16, 10112-10128) for simulating the IR spectra of the monomers of nucleobases and some of their dimers. In the present contribution we have extended this computational protocol to simulate the spectroscopic properties of a molecular solid, namely polycrystalline uracil. First we selected a realistic molecular model for representing the spectroscopic properties of uracil in the solid state, the uracil heptamer, and then we computed the anharmonic frequencies by combining less demanding approaches: the hybrid B3LYP-D3/DFTBA one, in which the harmonic frequencies are computed at a higher level of theory (B3LYP-D3/N07D) whereas the anharmonic shifts are evaluated at a lower level of theory (DFTBA), and the reduced-dimensionality VPT2 (RD-VPT2) approach, in which only selected vibrational modes are computed anharmonically, along with their couplings with other modes.
    The good agreement between the
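    The hybrid high/low-level recipe amounts to simple arithmetic on computed frequencies: the anharmonic shift obtained with the cheap method is applied to the expensive harmonic value. A sketch with illustrative numbers (not taken from the paper):

```python
# Hybrid high/low-level anharmonic frequency:
#   nu ≈ omega_high + (nu_low - omega_low)
# i.e. a harmonic frequency from the accurate method corrected by an
# anharmonic shift computed with the cheaper method.
def hybrid_frequency(omega_high, omega_low, nu_low):
    """Anharmonic frequency (cm^-1) from the hybrid high/low-level recipe."""
    anharmonic_shift = nu_low - omega_low   # evaluated at the low level
    return omega_high + anharmonic_shift    # applied to the high level

# Hypothetical C=O stretch: high-level harmonic 1790, low-level harmonic
# 1820, low-level anharmonic 1785 -> a shift of -35 applied to 1790.
print(hybrid_frequency(1790.0, 1820.0, 1785.0))  # -> 1755.0
```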

  4. Viability of Hybrid Systems A Controllability Operator Approach

    CERN Document Server

    Labinaz, G

    2012-01-01

    The problem of viability of hybrid systems is considered in this work. A model for a hybrid system is developed that incorporates three forms of uncertainty: transition dynamics, structural uncertainty, and parametric uncertainty. A computational basis for the viability of hybrid systems is developed and applied to three control law classes. An approach to robust viability is developed based on two extensions of the controllability operator. The three-tank example is examined for both the viability problem and the robust viability problem. The theory is applied through simulation to an active magnetic bearing system and to a batch polymerization process, showing that viability can be satisfied in practice. The problem of viable attainability is examined based on the controllability operator approach introduced by Nerode and colleagues. Lastly, properties of the controllability operator are presented.

  5. Detection of cardiovascular anomalies: Hybrid systems approach

    KAUST Repository

    Ledezma, Fernando

    2012-06-06

    In this paper, we propose a hybrid interpretation of the cardiovascular system. Based on a model proposed by Simaan et al. (2009), we study the problem of detecting cardiovascular anomalies that can be caused by variations in some physiological parameters, using an observer-based approach. We present the first numerical results obtained. © 2012 IFAC.

  6. A Hybrid Approach for Correcting Grammatical Errors

    Science.gov (United States)

    Lee, Kiyoung; Kwon, Oh-Woog; Kim, Young-Kil; Lee, Yunkeun

    2015-01-01

    This paper presents a hybrid approach for correcting grammatical errors in sentences uttered by Korean learners of English. The error correction system plays an important role in GenieTutor, a dialogue-based English learning system designed to teach English to Korean students. During a talk with GenieTutor, grammatical error…

  7. Hybrid centralized pre-computing/local distributed optimization of shared disjoint-backup path approach to GMPLS optical mesh network intelligent restoration

    Science.gov (United States)

    Gong, Qian; Xu, Rong; Lin, Jintong

    2004-04-01

    Wavelength Division Multiplexed (WDM) networks that route optical connections using intelligent optical cross-connects (OXCs) are firmly established as the core constituent of next-generation networks. Rapid failure recovery is fundamental to building reliable transport networks. Mesh restoration promises cost-effective failure recovery compared with legacy ring networks, and is now seeing large-scale deployment. Many carriers are migrating away from SONET ring restoration for their core transport networks and replacing it with mesh restoration through "intelligent" O-E-O cross-connects (XCs). Mesh restoration is typically provided via two fiber-disjoint paths: a service path and a restoration path. This scheme can restore any single link or node failure. Moreover, with shared mesh restoration, although every service route is assigned a restoration route, no dedicated capacity needs to be reserved for the restoration route, resulting in capacity savings. The restoration approach we propose combines centralized pre-computing, local distributed optimization, and shared disjoint-backup paths, merging the merits of centralized and distributed solutions. It avoids the scalability issues of centralized solutions by using a distributed control plane for disjoint service path computation and restoration path provisioning. Moreover, if the service routes of two demands are disjoint, no single failure will affect both demands simultaneously. This means that the restoration routes of these two demands can share link capacities, because the two routes will never be activated at the same time. This restoration-capacity-sharing approach therefore achieves low restoration capacity and fast restoration speed, while requiring few control plane changes.
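    A disjoint service/restoration pair can be sketched with a simple two-pass shortest-path computation: route the service path, remove its links, and route again. A real design would prefer Suurballe's algorithm, which guarantees finding a disjoint pair whenever one exists; the topology below is hypothetical:

```python
# Two-pass computation of link-disjoint service and restoration paths.
from collections import deque

def bfs_path(adj, src, dst, banned=frozenset()):
    """Shortest hop-count path from src to dst avoiding 'banned' links."""
    prev, seen = {src: None}, {src}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path, node = [], dst
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for v in adj[u]:
            if v not in seen and frozenset((u, v)) not in banned:
                seen.add(v); prev[v] = u; q.append(v)
    return None  # no path under the link restrictions

adj = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
service = bfs_path(adj, "A", "D")
used = {frozenset(e) for e in zip(service, service[1:])}
restoration = bfs_path(adj, "A", "D", banned=used)
print(service, restoration)  # two link-disjoint paths between A and D
```

    Capacity sharing then follows: restoration paths of demands with disjoint service paths may reserve the same backup capacity, since no single failure activates both.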

  8. CSP: A Multifaceted Hybrid Architecture for Space Computing

    Science.gov (United States)

    Rudolph, Dylan; Wilson, Christopher; Stewart, Jacob; Gauvin, Patrick; George, Alan; Lam, Herman; Crum, Gary Alex; Wirthlin, Mike; Wilson, Alex; Stoddard, Aaron

    2014-01-01

    Research on the CHREC Space Processor (CSP) takes a multifaceted hybrid approach to embedded space computing. Working closely with the NASA Goddard SpaceCube team, researchers at the National Science Foundation (NSF) Center for High-Performance Reconfigurable Computing (CHREC) at the University of Florida and Brigham Young University are developing hybrid space computers that feature an innovative combination of three technologies: commercial-off-the-shelf (COTS) devices, radiation-hardened (RadHard) devices, and fault-tolerant computing. Modern COTS processors provide the utmost in performance and energy-efficiency but are susceptible to ionizing radiation in space, whereas RadHard processors are virtually immune to this radiation but are more expensive, larger, less energy-efficient, and generations behind in speed and functionality. By featuring COTS devices to perform the critical data processing, supported by simpler RadHard devices that monitor and manage the COTS devices, and augmented with novel uses of fault-tolerant hardware, software, information, and networking within and between COTS devices, the resulting system can maximize performance and reliability while minimizing energy consumption and cost. NASA Goddard has adopted the CSP concept and technology with plans underway to feature flight-ready CSP boards on two upcoming space missions.

  9. Hybrid cloud and cluster computing paradigms for life science applications.

    Science.gov (United States)

    Qiu, Judy; Ekanayake, Jaliya; Gunarathne, Thilina; Choi, Jong Youl; Bae, Seung-Hee; Li, Hui; Zhang, Bingjing; Wu, Tak-Lon; Ruan, Yang; Ekanayake, Saliya; Hughes, Adam; Fox, Geoffrey

    2010-12-21

    Clouds and MapReduce have shown themselves to be broadly useful approaches to scientific computing, especially for parallel data-intensive applications. However, they have limited applicability in some areas, such as data mining, because MapReduce performs poorly on problems with the iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI, leading to a hybrid cloud-and-cluster environment. This motivates the design and implementation of an open-source iterative MapReduce system, Twister. Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability in several important non-iterative cases. These are linked to MPI applications for the final stages of the data analysis. Further, we have released the open-source Twister iterative MapReduce system and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life sciences applications. The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment, while Twister promises a uniform programming environment for many life sciences applications. We used the commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data-intensive computing. Several applications were developed in MPI, MapReduce and Twister in these different environments.
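    The iterative structure that Twister targets can be illustrated with a toy 1-D k-means written as repeated map (assignment) and reduce (centroid update) phases. This is a generic sketch of the pattern, not Twister's API:

```python
# Iterative MapReduce pattern: the map and reduce phases repeat until
# convergence, with the (static) point data reused across iterations --
# exactly the case where launching a fresh MapReduce job per iteration,
# as in basic Hadoop, is wasteful.
def kmeans_mapreduce(points, centroids, iters=10):
    for _ in range(iters):
        # Map: emit (nearest-centroid-index, point) pairs.
        pairs = [(min(range(len(centroids)),
                      key=lambda c: abs(p - centroids[c])), p)
                 for p in points]
        # Reduce: per key, average the points to get the new centroid.
        new = []
        for c in range(len(centroids)):
            group = [p for k, p in pairs if k == c]
            new.append(sum(group) / len(group) if group else centroids[c])
        if new == centroids:   # converged
            break
        centroids = new
    return centroids

print(kmeans_mapreduce([1.0, 1.2, 0.8, 9.0, 9.2, 8.8], [0.0, 5.0]))
```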

  10. Approaches to hybrid synthetic devices

    Science.gov (United States)

    Verma, Vivek

    All living creatures are made up of cells that have the ability to replicate themselves in a repetitive process called cell division. As these cells mature and divide in two, there is extensive movement of cellular components. In order to perform this essential task that sustains life, cells have evolved machines composed of proteins. Biological motors, such as kinesin, transport intracellular cargo and position organelles in eukaryotic cells via unidirectional movement along cytoskeletal tracks called microtubules. Biomolecular motor proteins have the potential to be used as 'nano-engines' for switchable devices, directed self-assembly, controlled bioseparations and powering nano- and microelectromechanical systems. However, engineering such systems requires fabrication processes that are compatible with biological materials such as kinesin motor proteins and microtubules. The first objective of the research was to establish biocompatibility between protein systems and nanofabrication. The second objective was to use current micro- and nanofabrication techniques for patterning proteins at specific locations and to study the role of casein in supporting the operation of surface-bound kinesin. The third objective was to link the kinesin-microtubule system to cellulose nanowhiskers. The effects of micro- and nanofabrication processing chemicals and resists on the functionality of casein, kinesin, and microtubule proteins are systematically examined to address the important missing link of the biocompatibility of micro- and nanofabrication processes needed to realize hybrid system fabrication. It was found that both casein, which is used to prevent motor denaturation on surfaces, and kinesin motors are surprisingly tolerant of most of the processing chemicals examined. Microtubules, however, are much more sensitive. Exposure to the processing chemicals leads to depolymerization, which is partially attributed to the pH of the solutions examined. When the chemicals were

  11. Hybrid continuum-atomistic approach to model electrokinetics in nanofluidics

    Energy Technology Data Exchange (ETDEWEB)

    Amani, Ehsan, E-mail: eamani@aut.ac.ir; Movahed, Saeid, E-mail: smovahed@aut.ac.ir

    2016-06-07

    In this study, for the first time, a hybrid continuum-atomistic model is proposed for electrokinetics, electroosmosis and electrophoresis, through nanochannels. Although continuum-based methods are accurate enough to model fluid flow and electric potential in nanofluidics (in dimensions larger than 4 nm), ionic concentration is too low in nanochannels for the continuum assumption to be valid. On the other hand, non-continuum approaches are too time-consuming and are therefore limited to simple geometries in practice. Here, to propose an efficient hybrid continuum-atomistic method of modelling the electrokinetics in nanochannels, the fluid flow and electric potential are computed based on the continuum hypothesis, coupled with an atomistic Lagrangian approach for ionic transport. The results of the model are compared to and validated against the results of the molecular dynamics technique for a couple of case studies. Then, the influences of bulk ionic concentration, external electric field, nanochannel size, and surface electric charge on the electrokinetic flow and ionic mass transfer are investigated carefully. The hybrid continuum-atomistic method is a promising approach to model more complicated geometries and to investigate more details of the electrokinetics in nanofluidics. - Highlights: • A hybrid continuum-atomistic model is proposed for electrokinetics in nanochannels. • The model is validated by molecular dynamics. • This is a promising approach to model more complicated geometries and physics.

  12. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results, such as the unsolvability of the Halting Problem, with a presentation of the positive results of modern programming methodology, including the use of proof rules and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  13. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, and for modelling salt and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t began more than 20 years ago. Since that time, significant advancements have taken place in the requirements for safety assessment as well as in computer hardware. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which requires essentially the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/, whose development had begun in the early nineteen-nineties. However, UG has recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code, d³f++.
    A direct estimation of uncertainties for complex groundwater flow models with the

  14. A Massive Data Parallel Computational Framework for Petascale/Exascale Hybrid Computer Systems

    CERN Document Server

    Blazewicz, Marek; Diener, Peter; Koppelman, David M; Kurowski, Krzysztof; Löffler, Frank; Schnetter, Erik; Tao, Jian

    2012-01-01

    Heterogeneous systems are becoming more common on High Performance Computing (HPC) systems. Even using tools like CUDA and OpenCL, it is a non-trivial task to obtain optimal performance on the GPU. Approaches to simplifying this task include Merge (a library-based framework for heterogeneous multi-core systems), Zippy (a framework for parallel execution of codes on multiple GPUs), BSGP (a new programming language for general-purpose computation on the GPU) and CUDA-lite (an enhancement to CUDA that transforms code based on annotations). In addition, efforts are underway to improve compiler tools for automatic parallelization and optimization of affine loop nests for GPUs and for automatic translation of OpenMP-parallelized codes to CUDA. In this paper we present an alternative approach: a new computational framework for the development of massively data-parallel scientific applications suitable for use on such petascale/exascale hybrid systems, built upon the highly scalable Cactus framework. As the first...

  15. Computational hybrid anthropometric paediatric phantom library for internal radiation dosimetry

    Science.gov (United States)

    Xie, Tianwu; Kuster, Niels; Zaidi, Habib

    2017-04-01

    Hybrid computational phantoms combine voxel-based and simplified equation-based modelling approaches to provide unique advantages and more realism for the construction of anthropomorphic models. In this work, a methodology and C++ code are developed to generate hybrid computational phantoms covering statistical distributions of body morphometry in the paediatric population. The paediatric phantoms of the Virtual Population Series (IT’IS Foundation, Switzerland) were modified to match target anthropometric parameters, including body mass, body length, standing height and sitting height/stature ratio, determined from reference databases of the National Centre for Health Statistics and the National Health and Nutrition Examination Survey. The phantoms were selected as representative anchor phantoms for the newborn, 1, 2, 5, 10 and 15 years-old children, and were subsequently remodelled to create 1100 female and male phantoms with 10th, 25th, 50th, 75th and 90th body morphometries. Evaluation was performed qualitatively using 3D visualization and quantitatively by analysing internal organ masses. Overall, the newly generated phantoms appear very reasonable and representative of the main characteristics of the paediatric population at various ages and for different genders, body sizes and sitting stature ratios. The mass of internal organs increases with height and body mass. The comparison of organ masses of the heart, kidney, liver, lung and spleen with published autopsy and ICRP reference data for children demonstrated that they follow the same trend when correlated with age. The constructed hybrid computational phantom library opens up the prospect of comprehensive radiation dosimetry calculations and risk assessment for the paediatric population of different age groups and diverse anthropometric parameters.

  16. A Review of Hybrid Brain-Computer Interface Systems

    Directory of Open Access Journals (Sweden)

    Setare Amiri

    2013-01-01

    Full Text Available Increasing numbers of research activities and different types of studies in brain-computer interface (BCI) systems show potential in this young research area. Research teams have studied features of different data acquisition techniques, brain activity patterns, feature extraction techniques, methods of classification, and many other aspects of a BCI system. However, conventional BCIs have not become fully practical, owing to low accuracy and reliability, low information transfer rates, and limited user acceptability. A new approach to creating a more reliable BCI is to combine two or more BCI systems with different brain activity patterns or different input signal sources, taking advantage of the strengths of each. This type of BCI, called a hybrid BCI, may reduce the disadvantages of each conventional BCI system. In addition, hybrid BCIs may enable more applications and possibly increase accuracy and the information transfer rate. However, the types of BCIs and their combinations should be considered carefully. In this paper, after introducing several types of BCIs and their combinations, we review and discuss hybrid BCIs, different possibilities for combining them, and their advantages and disadvantages.

  17. Computational Approaches to Interface Design

    Science.gov (United States)

    Corker; Lebacqz, J. Victor (Technical Monitor)

    1997-01-01

    Tools which make use of computational processes - mathematical, algorithmic and/or knowledge-based - to perform portions of the design, evaluation and/or construction of interfaces have become increasingly available and powerful. Nevertheless, there is little agreement as to the appropriate role for a computational tool to play in the interface design process. Current tools fall into broad classes depending on which portions, and how much, of the design process they automate. The purpose of this panel is to review and generalize about computational approaches developed to date, discuss the tasks for which they are suited, and suggest methods to enhance their utility and acceptance. Panel participants represent a wide diversity of application domains and methodologies. This should provide for lively discussion about implementation approaches, accuracy of design decisions, acceptability of representational tradeoffs and the optimal role for a computational tool to play in the interface design process.

  18. Hybrid computing: CPU+GPU co-processing and its application to tomographic reconstruction.

    Science.gov (United States)

    Agulleiro, J I; Vázquez, F; Garzón, E M; Fernández, J J

    2012-04-01

    Modern computers are equipped with powerful computing engines like multicore processors and GPUs. The 3DEM community has rapidly adapted to this scenario and many software packages now make use of high performance computing techniques to exploit these devices. However, the implementations thus far are purely focused on either GPUs or CPUs. This work presents a hybrid approach that collaboratively combines the GPUs and CPUs available in a computer and applies it to the problem of tomographic reconstruction. Proper orchestration of workload in such a heterogeneous system is an issue. Here we use an on-demand strategy whereby the computing devices request a new piece of work to do when idle. Our hybrid approach thus takes advantage of the whole computing power available in modern computers and further reduces the processing time. This CPU+GPU co-processing can be readily extended to other image processing tasks in 3DEM. Copyright © 2012 Elsevier B.V. All rights reserved.
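    The on-demand strategy described above can be sketched with a work queue: each device pulls a new slab of the volume whenever it becomes idle. The thread-based "devices" and slab counts below are illustrative stand-ins; a real implementation would dispatch slabs to CUDA kernels and CPU threads respectively.

```python
# Sketch of the on-demand workload strategy: idle devices request the next
# piece of work from a shared queue, so fast and slow devices balance
# themselves automatically. Device names and workload are hypothetical.
import queue
import threading

def worker(name, tasks, done):
    while True:
        try:
            slab = tasks.get_nowait()   # ask for work when idle
        except queue.Empty:
            return                      # no work left: device finishes
        done.append((name, slab))       # stand-in for reconstructing the slab
        tasks.task_done()

tasks = queue.Queue()
for slab in range(32):                  # 32 slabs of the volume to reconstruct
    tasks.put(slab)

done = []
threads = [threading.Thread(target=worker, args=(dev, tasks, done))
           for dev in ("gpu0", "cpu0", "cpu1")]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every slab is processed exactly once, regardless of how the devices
# interleave their requests.
print(sorted(s for _, s in done))
```

    The appeal of the scheme is that no static partitioning of work between CPU and GPU is needed: load balance emerges from the pull model.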

  19. Computational approaches to energy materials

    CERN Document Server

    Catlow, Richard; Walsh, Aron

    2013-01-01

    The development of materials for clean and efficient energy generation and storage is one of the most rapidly developing, multi-disciplinary areas of contemporary science, driven primarily by concerns over global warming, diminishing fossil-fuel reserves, the need for energy security, and increasing consumer demand for portable electronics. Computational methods are now an integral and indispensable part of the materials characterisation and development process.   Computational Approaches to Energy Materials presents a detailed survey of current computational techniques for the

  20. A hybrid approach to simulating mechanical properties of polymer nanocomposites.

    Science.gov (United States)

    Mccarron, Andy P; Raj, Sharad; Hyers, Robert; Kim, Moon K

    2009-12-01

    Empirical studies indicate that a polymer reinforced with nanoscale particles can exhibit enhanced mechanical properties such as stiffness and toughness. To give insight into how and why this nanoparticle reinforcement is effective, it is necessary to develop computational models that can accurately simulate the effects of nanoparticles on the fracture characteristics of polymer composites. Furthermore, a hybrid model that can account for both continuum and non-continuum effects will hasten the development of not only new hierarchical composite materials but also new theories to explain their behavior. This paper presents a hybrid modeling scheme for simulating fracture of polymer nanocomposites by utilizing an atomistic modeling approach called the Elastic Network Model (ENM) in conjunction with traditional Finite Element Analysis (FEA). The novelty of this hybrid ENM-FEA approach lies in its ability to model less interesting outer domains with FEA while still accounting for areas of interest, such as the crack tip region and the interface between a nanoparticle and the polymer matrix, at the atomic scale with ENM. Various simulation conditions have been tested to determine the feasibility of the proposed hybrid model. For instance, iterative results from uniaxial loading with isotropic properties in an ENM-FEA model show accuracy and convergence to the analytic solution.
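    The ENM ingredient of such a hybrid scheme can be sketched as follows: particles within a cutoff distance are joined by identical harmonic springs, summarized in a Kirchhoff (connectivity) matrix. The coordinates and cutoff below are illustrative, not the paper's model.

```python
# Minimal Elastic Network Model sketch: build the Kirchhoff matrix of a
# spring network from particle coordinates and a distance cutoff. The
# geometry and cutoff value are hypothetical.
import math

def kirchhoff_matrix(coords, cutoff):
    n = len(coords)
    k = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(coords[i], coords[j]) <= cutoff:
                k[i][j] = k[j][i] = -1.0    # harmonic spring between i and j
    for i in range(n):
        k[i][i] = -sum(k[i])                # diagonal = node connectivity
    return k

# A 2x2 grid of particles with unit spacing; a cutoff of 1.5 connects both
# edges (distance 1) and diagonals (distance ~1.414).
coords = [(0, 0), (1, 0), (0, 1), (1, 1)]
K = kirchhoff_matrix(coords, 1.5)
# Each row of a Kirchhoff matrix sums to zero (rigid-body invariance).
print([sum(row) for row in K])
```

    In a full ENM, normal modes of this matrix (or its 3N-dimensional Hessian analogue) describe the network's elastic response, which the hybrid scheme couples to the surrounding FEA domain.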

  1. An Approach with Hybrid Segmental Mechanics.

    Science.gov (United States)

    Mishra, Harsh Ashok; Maurya, Raj Kumar

    2016-06-01

    The present case report provides an insight into hybrid segmental mechanics through the treatment of a 13-year-old male, considering the side effects of sole continuous arch-wire sliding mechanics. The patient was diagnosed with a skeletal class I jaw relationship, low mandibular plane angle, class II molar relation on the right and class I molar relation on the left side, anterior cross bite, and crowding of 12 mm in the upper and 5 mm in the lower arch. He also had upper and lower anteriors proclined by 2 mm, a convex profile and incompetent lips. Total treatment duration was 20 months, during which segmental canine retraction was performed with a TMA (titanium-molybdenum alloy) 'T'-loop retraction spring, followed by consolidation of spaces with continuous arch mechanics. Most of the treatment objectives were met, with good intraoral and facial results within a reasonable time frame. This approach used traditional twin brackets, which offered the versatility to use continuous arch-wire mechanics, segmental mechanics and hybrid sectional mechanics.

  2. Hybrid Computational Model for High-Altitude Aeroassist Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A hybrid continuum/noncontinuum computational model will be developed for analyzing the aerodynamics and heating on aeroassist vehicles. Unique features of this...

  3. Hybrid grid-particle methods and Penalization: A Sherman-Morrison-Woodbury approach to compute 3D viscous flows using FFT

    Science.gov (United States)

    Chatelin, Robin; Poncet, Philippe

    2014-07-01

    Particle methods are very convenient for computing transport equations in fluid mechanics, as their computational cost is linear and they are not limited by convection stability conditions. To achieve large 3D computations the method must be coupled to efficient algorithms for velocity computations, including a good treatment of non-homogeneities and complex moving geometries. The penalization method makes it possible to account for moving-body interactions by adding a term to the conservation of momentum equation. This work introduces a new computational algorithm that solves the penalization term and the Laplace operators implicitly in the same step, since explicit computations are limited by stability issues, especially at low Reynolds number. The algorithm is based on the Sherman-Morrison-Woodbury formula coupled to a GMRES iterative method, reducing the computations to a sequence of Poisson problems: this makes it possible to formulate a penalized Poisson equation as a large perturbation of a standard Poisson problem, by means of algebraic relations. A direct consequence is the possibility of using fast solvers based on Fast Fourier Transforms with good efficiency from both the computational and memory-consumption points of view, since these solvers are recursive and do not perform any matrix assembly. The resulting fluid mechanics computations are very fast and consume a small amount of memory compared to a reference solver or a linear system resolution. The present applications focus mainly on coupling the transport equation with the 3D Stokes equations, for studying the motion of biological organisms in highly viscous flows with variable viscosity.
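    The algebraic core of the method can be illustrated in the simplest rank-1 case of the Sherman-Morrison identity: a system with a perturbed operator is solved using only a fast solver for the unperturbed one. Here a diagonal solve stands in for the FFT-based Poisson solver, and the vectors are illustrative; the paper uses the full Sherman-Morrison-Woodbury identity together with GMRES.

```python
# Rank-1 Sherman-Morrison sketch: solve (A + u v^T) x = b using only a fast
# solver for A. The diagonal A is a hypothetical stand-in for an FFT-based
# Poisson solver; u, v, b are illustrative.

def solve_a(a_diag, rhs):
    """The 'fast solver' for A (diagonal stand-in for an FFT Poisson solve)."""
    return [r / a for r, a in zip(rhs, a_diag)]

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

def sherman_morrison(a_diag, u, v, b):
    """x = A^-1 b - A^-1 u * (v . A^-1 b) / (1 + v . A^-1 u)."""
    ainv_b = solve_a(a_diag, b)
    ainv_u = solve_a(a_diag, u)
    alpha = dot(v, ainv_b) / (1.0 + dot(v, ainv_u))
    return [xb - alpha * xu for xb, xu in zip(ainv_b, ainv_u)]

a = [2.0, 3.0, 4.0]     # diagonal of A
u = [1.0, 0.0, 1.0]
v = [0.5, 1.0, 0.0]
b = [1.0, 2.0, 3.0]
x = sherman_morrison(a, u, v, b)

# Check: applying (A + u v^T) to x reproduces b.
residual = [a[i] * x[i] + u[i] * dot(v, x) - b[i] for i in range(3)]
print(max(abs(r) for r in residual))
```

    The point mirrors the paper's strategy: each application of the formula costs only two fast solves plus a few dot products, never an assembled matrix.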

  4. Digital Potentiometer for Hybrid Computer EAI 680-PDP-8/I

    DEFF Research Database (Denmark)

    Højberg, Kristian Søe; Olsen, Jens V.

    1974-01-01

    In this article a description is given of a 12 bit digital potentiometer for hybrid computer application. The system is composed of standard building blocks. Emphasis is laid on the development problems met and the problem solutions developed.

  5. Hybrid Quantum-Classical Approach to Correlated Materials

    Science.gov (United States)

    Bauer, Bela; Wecker, Dave; Millis, Andrew J.; Hastings, Matthew B.; Troyer, Matthias

    2016-07-01

    Recent improvements in the control of quantum systems make it seem feasible to finally build a quantum computer within a decade. While it has been shown that such a quantum computer can in principle solve certain small electronic structure problems and idealized model Hamiltonians, the highly relevant problem of directly solving a complex correlated material appears to require a prohibitive amount of resources. Here, we show that by using a hybrid quantum-classical algorithm that incorporates the power of a small quantum computer into a framework of classical embedding algorithms, the electronic structure of complex correlated materials can be efficiently tackled using a quantum computer. In our approach, the quantum computer solves a small effective quantum impurity problem that is self-consistently determined via a feedback loop between the quantum and classical computation. Use of a quantum computer enables much larger and more accurate simulations than with any known classical algorithm, and will allow many open questions in quantum materials to be resolved once a small quantum computer with around 100 logical qubits becomes available.

  6. Hybrid system for computing reachable workspaces for redundant manipulators

    Science.gov (United States)

    Alameldin, Tarek K.; Sobh, Tarek M.

    1991-03-01

    An efficient computation of 3D workspaces for redundant manipulators is based on a "hybrid" algorithm between direct kinematics and screw theory. Direct kinematics enjoys low computational cost but needs edge detection algorithms when workspace boundaries are needed. Screw theory has exponential computational cost per workspace point but does not need edge detection. Screw theory allows computing workspace points in prespecified directions while direct kinematics does not. Applications of the algorithm are discussed.
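    The direct-kinematics side of such an algorithm can be sketched by sampling joint angles and collecting the resulting end-effector positions. A planar 2-link arm with hypothetical link lengths is used for brevity; the abstract concerns 3D redundant manipulators.

```python
# Monte Carlo direct-kinematics sketch: sample joint angles, evaluate forward
# kinematics, and collect reachable points. The 2-link planar arm and link
# lengths are illustrative.
import math
import random

def forward_kinematics(theta1, theta2, l1=1.0, l2=0.7):
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

random.seed(0)
points = [forward_kinematics(random.uniform(-math.pi, math.pi),
                             random.uniform(-math.pi, math.pi))
          for _ in range(5000)]

# Every sampled point lies in the annulus |l1 - l2| <= r <= l1 + l2, the
# analytic reachable workspace of a 2-link arm.
radii = [math.hypot(x, y) for x, y in points]
print(min(radii) >= 0.3 - 1e-9 and max(radii) <= 1.7 + 1e-9)
```

    This illustrates the trade-off named in the abstract: the sampling itself is cheap, but turning the point cloud into workspace boundaries requires a separate edge-detection step.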

  7. Using a Hybrid Approach to Facilitate Learning Introductory Programming

    Science.gov (United States)

    Cakiroglu, Unal

    2013-01-01

    In order to facilitate students' understanding in introductory programming courses, different types of teaching approaches have been employed. In this study, a hybrid approach including comment first coding (CFC), analogy and template approaches was used. The goal was to investigate the effect of such a hybrid approach on students' understanding in…

  8. An Adaptive and Hybrid Approach for Revisiting the Visibility Pipeline

    Directory of Open Access Journals (Sweden)

    Ícaro Lins Leitão da Cunha

    2016-04-01

    Full Text Available We revisit the visibility problem, traditionally known in the Computer Graphics and Vision fields as the process of computing a (potentially) visible set of primitives in the computational model of a scene. We propose a hybrid solution that uses a dry structure (in the sense of data reduction), a triangulation of type J1a, to accelerate the task of searching for visible primitives. We came up with a solution that is useful for real-time, on-line, interactive applications such as 3D visualization. In such applications the main goal is to load as few primitives from the scene as possible during the rendering stage. For this purpose, our algorithm performs culling by using a hybrid paradigm based on the viewing frustum, back-face culling and occlusion models. Results have shown substantial improvement over these traditional approaches when applied separately. This novel approach can be used in devices with no dedicated processors or with low processing power, such as cell phones or embedded displays, or to visualize data over the Internet, as in virtual museum applications.
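    One of the culling stages combined above, back-face culling, can be sketched in a few lines: a triangle whose outward normal points away from the eye can never be visible and is discarded before rasterization. The geometry below is illustrative.

```python
# Back-face culling sketch: discard a triangle whose outward normal faces
# away from the viewer. Vertices and eye position are hypothetical.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def is_back_facing(tri, eye):
    """Cull the triangle if its outward normal points away from the eye."""
    v0, v1, v2 = tri
    normal = cross(sub(v1, v0), sub(v2, v0))   # CCW winding gives outward normal
    to_eye = sub(eye, v0)
    return dot(normal, to_eye) <= 0.0

eye = (0.0, 0.0, 5.0)
front = ((0, 0, 0), (1, 0, 0), (0, 1, 0))      # counter-clockwise seen from +z
back = ((0, 0, 0), (0, 1, 0), (1, 0, 0))       # clockwise seen from +z
print(is_back_facing(front, eye), is_back_facing(back, eye))
```

    Frustum and occlusion culling then remove further primitives that pass this cheap per-triangle test.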

  9. A new hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer

    Science.gov (United States)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.

  11. Hybrid computing: CPU+GPU co-processing and its application to tomographic reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Agulleiro, J.I.; Vazquez, F.; Garzon, E.M. [Supercomputing and Algorithms Group, Associated Unit CSIC-UAL, University of Almeria, 04120 Almeria (Spain); Fernandez, J.J., E-mail: JJ.Fernandez@csic.es [National Centre for Biotechnology, National Research Council (CNB-CSIC), Campus UAM, C/Darwin 3, Cantoblanco, 28049 Madrid (Spain)

    2012-04-15

    Modern computers are equipped with powerful computing engines like multicore processors and GPUs. The 3DEM community has rapidly adapted to this scenario and many software packages now make use of high performance computing techniques to exploit these devices. However, the implementations thus far are purely focused on either GPUs or CPUs. This work presents a hybrid approach that collaboratively combines the GPUs and CPUs available in a computer and applies it to the problem of tomographic reconstruction. Proper orchestration of workload in such a heterogeneous system is an issue. Here we use an on-demand strategy whereby the computing devices request a new piece of work to do when idle. Our hybrid approach thus takes advantage of the whole computing power available in modern computers and further reduces the processing time. This CPU+GPU co-processing can be readily extended to other image processing tasks in 3DEM. -- Highlights: ► Hybrid computing allows full exploitation of the power (CPU+GPU) in a computer. ► Proper orchestration of workload is managed by an on-demand strategy. ► The total number of threads running in the system should be limited to the number of CPUs.

  12. Forecasting conditional climate-change using a hybrid approach

    Science.gov (United States)

    Esfahani, Akbar Akbari; Friedel, Michael J.

    2014-01-01

    A novel approach is proposed to forecast the likelihood of climate-change across spatial landscape gradients. This hybrid approach involves reconstructing past precipitation and temperature using the self-organizing map technique; determining quantile trends in the climate-change variables by quantile regression modeling; and computing conditional forecasts of climate-change variables based on self-similarity in quantile trends using the fractionally differenced auto-regressive integrated moving average technique. The proposed modeling approach is applied to states (Arizona, California, Colorado, Nevada, New Mexico, and Utah) in the southwestern U.S., where conditional forecasts of climate-change variables are evaluated against recent (2012) observations, evaluated at a future time period (2030), and evaluated as future trends (2009–2059). These results have broad economic, political, and social implications because they quantify uncertainty in climate-change forecasts affecting various sectors of society. Another benefit of the proposed hybrid approach is that it can be extended to any spatiotemporal scale, provided that self-similarity exists.
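    The quantile-regression step above rests on the pinball (quantile) loss, whose minimizer over a constant is the empirical quantile. A minimal sketch, with illustrative temperature data and without the SOM or ARFIMA components of the full method:

```python
# Pinball-loss sketch: the constant that minimizes the quantile loss at level
# tau is the empirical tau-quantile of the data. The data are hypothetical.

def pinball_loss(tau, constant, data):
    """Average quantile loss of a constant prediction over the data."""
    total = 0.0
    for y in data:
        err = y - constant
        total += tau * err if err >= 0 else (tau - 1.0) * err
    return total / len(data)

def best_constant(tau, data):
    """Brute-force minimizer of the pinball loss over the observed values."""
    return min(data, key=lambda c: pinball_loss(tau, c, data))

temps = [11.2, 12.0, 12.4, 13.1, 13.5, 14.0, 14.8, 15.2, 16.0]
print(best_constant(0.5, temps))   # recovers the median observation
print(best_constant(0.9, temps))   # recovers a high quantile
```

    Replacing the constant with a linear function of time gives the quantile trend fitted in the hybrid approach, with the same loss minimized over slope and intercept.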

  13. Generalised Computability and Applications to Hybrid Systems

    DEFF Research Database (Denmark)

    Korovina, Margarita V.; Kudinov, Oleg V.

    2001-01-01

    We investigate the concept of generalised computability of operators and functionals defined on the set of continuous functions, firstly introduced in [9]. By working in the reals, with equality and without equality, we study properties of generalised computable operators and functionals. Also we...

  14. Stock selection using a hybrid MCDM approach

    Directory of Open Access Journals (Sweden)

    Tea Poklepović

    2014-12-01

    Full Text Available The problem of selecting the right stocks to invest in is of immense interest for investors on both emerging and developed capital markets. Moreover, an investor should take into account all available data regarding stocks on the particular market. This includes fundamental and stock market indicators. The decision making process includes several stocks to invest in and more than one criterion. Therefore, the task of selecting the stocks to invest in can be viewed as a multiple criteria decision making (MCDM) problem. Using several MCDM methods often leads to divergent rankings. The goal of this paper is to resolve these possible divergent results obtained from different MCDM methods using a hybrid MCDM approach based on Spearman’s rank correlation coefficient. Five MCDM methods are selected: COPRAS, linear assignment, PROMETHEE, SAW and TOPSIS. The weights for all criteria are obtained by using the AHP method. Data for this study include information on stock returns and traded volumes from March 2012 to March 2014 for 19 stocks on the Croatian capital market, as well as the most important fundamental and stock market indicators for the selected stocks. Rankings using the five selected MCDM methods in the stock selection problem yield divergent results. However, after applying the proposed approach the final hybrid rankings are obtained. The results show that the worst stocks to invest in are the same whether or not the industry is taken into consideration. However, when the industry is taken into account, the best stocks to invest in are slightly different, because some industries are more profitable than others.
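    The aggregation step can be sketched as follows: compute Spearman's rank correlation between the rankings produced by different MCDM methods, then form a final ranking from each alternative's average rank. The stock labels and rankings below are illustrative, not the paper's data.

```python
# Hybrid-ranking sketch: Spearman's rho between method rankings, then a final
# ranking by average rank. Stocks and rankings are hypothetical.

def spearman_rho(rank_a, rank_b):
    """Spearman's coefficient for two rankings of the same n items (no ties)."""
    n = len(rank_a)
    d2 = sum((rank_a[i] - rank_b[i]) ** 2 for i in range(n))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

# Ranks (1 = best) assigned to five stocks by three hypothetical MCDM methods.
stocks = ["A", "B", "C", "D", "E"]
rankings = {
    "TOPSIS": [1, 2, 3, 4, 5],
    "SAW": [2, 1, 3, 5, 4],
    "PROMETHEE": [1, 3, 2, 4, 5],
}

rho = spearman_rho(rankings["TOPSIS"], rankings["SAW"])
avg_rank = [sum(r[i] for r in rankings.values()) / len(rankings)
            for i in range(len(stocks))]
hybrid = [s for _, s in sorted(zip(avg_rank, stocks))]
print(round(rho, 2), hybrid)
```

    In the paper the pairwise correlations additionally inform how much weight each method's ranking deserves in the final aggregation.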

  15. Computational Approaches to Vestibular Research

    Science.gov (United States)

    Ross, Muriel D.; Wade, Charles E. (Technical Monitor)

    1994-01-01

    The Biocomputation Center at NASA Ames Research Center is dedicated to a union between computational, experimental and theoretical approaches to the study of neuroscience and of life sciences in general. The current emphasis is on computer reconstruction and visualization of vestibular macular architecture in three-dimensions (3-D), and on mathematical modeling and computer simulation of neural activity in the functioning system. Our methods are being used to interpret the influence of spaceflight on mammalian vestibular maculas in a model system, that of the adult Sprague-Dawley rat. More than twenty 3-D reconstructions of type I and type II hair cells and their afferents have been completed by digitization of contours traced from serial sections photographed in a transmission electron microscope. This labor-intensive method has now been replaced by a semiautomated method developed in the Biocomputation Center in which conventional photography is eliminated. All viewing, storage and manipulation of original data is done using Silicon Graphics workstations. Recent improvements to the software include a new mesh generation method for connecting contours. This method will permit the investigator to describe any surface, regardless of complexity, including highly branched structures such as are routinely found in neurons. This same mesh can be used for 3-D, finite volume simulation of synapse activation and voltage spread on neuronal surfaces visualized via the reconstruction process. These simulations help the investigator interpret the relationship between neuroarchitecture and physiology, and are of assistance in determining which experiments will best test theoretical interpretations. Data are also used to develop abstract, 3-D models that dynamically display neuronal activity ongoing in the system. Finally, the same data can be used to visualize the neural tissue in a virtual environment.
Our exhibit will depict capabilities of our computational approaches and

  16. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big pictur

  17. A Fast Hybrid Approach to Air Shower Simulations and Applications

    CERN Document Server

    Drescher, H J; Bleicher, M; Reiter, M; Soff, S; Stöcker, H; Stoecker, Horst

    2003-01-01

    The SENECA model, a new hybrid approach to air shower simulations, is presented. It combines the use of efficient cascade equations in the energy range where a shower can be treated as one-dimensional, with a traditional Monte Carlo method which traces individual particles. This allows one to reproduce natural fluctuations of individual showers as well as the lateral spread of low energy particles. The model is quite efficient in computation time. As an application of the new approach, the influence of the low energy hadronic models on shower properties for AUGER energies is studied. We conclude that these models have a significant impact on the tails of lateral distribution functions, and deserve therefore more attention.

  18. Multilayer Approach for Advanced Hybrid Lithium Battery

    KAUST Repository

    Ming, Jun

    2016-06-06

    Conventional intercalated rechargeable batteries have shown their capacity limit, and the development of an alternative battery system with higher capacity is strongly needed for sustainable electrical vehicles and hand-held devices. Herein, we introduce a feasible and scalable multilayer approach to fabricate a promising hybrid lithium battery with superior capacity and multivoltage plateaus. A sulfur-rich electrode (90 wt % S) is covered by a dual layer of graphite/Li4Ti5O12, where the active materials S and Li4Ti5O12 can both take part in redox reactions and thus deliver a high capacity of 572 mAh g⁻¹ with respect to the total electrode mass (or 1866 mAh g⁻¹ with respect to the mass of sulfur) at 0.1C (with 1C defined as 1675 mA per gram of sulfur). The battery shows unique voltage platforms at 2.35 and 2.1 V, contributed from S, and at 1.55 V from Li4Ti5O12. A high rate capability of 566 mAh g⁻¹ at 0.25C and 376 mAh g⁻¹ at 1C (with respect to the total electrode mass), with durable cycle ability over 100 cycles, can be achieved. Operando Raman and electron microscope analysis confirm that the graphite/Li4Ti5O12 layer slows the dissolution/migration of polysulfides, thereby giving rise to higher sulfur utilization and slower capacity decay. This advanced hybrid battery, with a multilayer concept for marrying different voltage plateaus from various electrode materials, opens a way of providing tunable capacity and multiple voltage platforms for energy device applications. © 2016 American Chemical Society.

  19. Computational approaches for drug discovery.

    Science.gov (United States)

    Hung, Che-Lun; Chen, Chi-Chun

    2014-09-01

    Cellular proteins are the mediators of multiple organism functions, being involved in physiological mechanisms and disease. By discovering lead compounds that affect the function of target proteins, the target diseases or physiological mechanisms can be modulated. Based on knowledge of the ligand-receptor interaction, the chemical structures of leads can be modified to improve efficacy and selectivity and to reduce side effects. One rational drug design technology, which enables drug discovery based on knowledge of target structures, functional properties and mechanisms, is computer-aided drug design (CADD). The application of CADD can be cost-effective, using experiments to compare predicted and actual drug activity, the results from which can be used iteratively to improve compound properties. The two major CADD-based approaches are structure-based drug design, where protein structures are required, and ligand-based drug design, where known ligands and their activities can be used to design compounds interacting with the protein structure. Approaches in structure-based drug design include docking, de novo design, fragment-based drug discovery and structure-based pharmacophore modeling. Approaches in ligand-based drug design include quantitative structure-affinity relationships and pharmacophore modeling based on ligand properties. Based on whether the structure of the receptor and its interaction with the ligand are known, different design strategies can be selected. After lead compounds are generated, the rule of five can be used to assess whether these have drug-like properties. Several quality validation methods, such as cost function analysis, Fisher's cross-validation analysis and the goodness-of-hit test, can be used to estimate the metrics of different drug design strategies. To further improve CADD performance, multiple computers and graphics processing units may be applied to reduce costs.
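    The rule-of-five filter mentioned above is simple enough to sketch directly: a compound is considered drug-like if molecular weight, logP, hydrogen-bond donors and acceptors all stay within Lipinski's limits. The property values below are approximate illustrative figures, not computed from real structures.

```python
# Lipinski rule-of-five sketch: MW <= 500 Da, logP <= 5, H-bond donors <= 5,
# H-bond acceptors <= 10. Candidate property values are illustrative.

def passes_rule_of_five(mol_weight, log_p, hbd, hba):
    """Return True if the compound violates none of Lipinski's criteria."""
    return mol_weight <= 500 and log_p <= 5 and hbd <= 5 and hba <= 10

# Aspirin-like property values (approximate literature figures) pass.
print(passes_rule_of_five(mol_weight=180.2, log_p=1.2, hbd=1, hba=4))
# An oversized, lipophilic hypothetical candidate fails.
print(passes_rule_of_five(mol_weight=720.0, log_p=6.3, hbd=2, hba=8))
```

    In practice the rule is a coarse screen applied after lead generation; compounds that fail it are deprioritized rather than discarded outright.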

  20. Cost Optimization Using Hybrid Evolutionary Algorithm in Cloud Computing

    Directory of Open Access Journals (Sweden)

    B. Kavitha

    2015-07-01

    Full Text Available The main aim of this research is to design a hybrid evolutionary algorithm for minimizing the multiple problems of dynamic resource allocation in cloud computing. Resource allocation is one of the big problems in distributed systems when the client wants to decrease the cost of resource allocation for their task. In order to assign resources to the task, the client must consider both the monetary cost and the computational cost, and allocating resources while accounting for both costs is difficult. To solve this problem, in this study we split the client's main task into many subtasks and allocate resources for each subtask instead of selecting a single resource for the main task. The allocation of resources for each subtask is performed by our proposed hybrid optimization algorithm. Here, we hybridize Binary Particle Swarm Optimization (BPSO) and the Binary Cuckoo Search algorithm (BCSO), considering the monetary and computational costs, which helps to minimize the cost to the client. Finally, experimentation is carried out and our proposed hybrid algorithm is compared with the BPSO and BCSO algorithms. We also demonstrate the efficiency of our proposed hybrid optimization algorithm.
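    The BPSO half of such a hybrid can be sketched as follows: velocities are real-valued, and a sigmoid of each velocity component gives the probability that the corresponding bit is set. The fitness function (one-max) and all parameters below are illustrative stand-ins for the paper's cost model; the cuckoo-search half is omitted.

```python
# Minimal Binary PSO (BPSO) sketch on a toy one-max problem. Parameters,
# problem size and fitness are hypothetical placeholders for the paper's
# monetary/computational cost model.
import math
import random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def fitness(bits):
    return sum(bits)                      # maximize the number of 1-bits

def bpso(n_bits=12, n_particles=20, iters=60, seed=1):
    rng = random.Random(seed)
    pos = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    vel = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = max(pbest, key=fitness)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                # sigmoid(velocity) is the probability the bit is set
                pos[i][d] = 1 if rng.random() < sigmoid(vel[i][d]) else 0
            if fitness(pos[i]) > fitness(pbest[i]):
                pbest[i] = pos[i][:]
                if fitness(pos[i]) > fitness(gbest):
                    gbest = pos[i][:]
    return gbest

best = bpso()
print(fitness(best))
```

    In the paper's setting each bit would encode a resource assignment for a subtask, and the fitness would combine the monetary and computational costs.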

  2. Fuzzy multiple linear regression: A computational approach

    Science.gov (United States)

    Juang, C. H.; Huang, X. H.; Fleming, J. W.

    1992-01-01

    This paper presents a new computational approach for performing fuzzy regression. In contrast to Bardossy's approach, the new approach, while dealing with fuzzy variables, closely follows the conventional regression technique. In this approach, treatment of fuzzy input is more 'computational' than 'symbolic.' The following sections first outline the formulation of the new approach, then deal with the implementation and computational scheme, and this is followed by examples to illustrate the new procedure.
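    One common 'computational' treatment of fuzzy regression, sketched below, represents each fuzzy observation as a symmetric triangular number (center, spread) and applies ordinary least squares separately to centers and spreads. This is a standard simplification for illustration, not necessarily the exact scheme of the paper; all data are hypothetical.

```python
# Hedged sketch of fuzzy linear regression: fit the centers and the spreads
# of triangular fuzzy outputs with separate ordinary-least-squares lines.
# Data and the (center, spread) representation are illustrative assumptions.

def ols_line(xs, ys):
    """Closed-form simple linear regression: returns (slope, intercept)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
centers = [2.1, 4.0, 5.9, 8.1, 10.0]   # fuzzy-number centers
spreads = [0.5, 0.6, 0.7, 0.8, 0.9]    # fuzzy-number half-widths

center_fit = ols_line(xs, centers)
spread_fit = ols_line(xs, spreads)

# Predicted fuzzy output at x = 6 is itself a (center, spread) pair.
x = 6.0
prediction = (center_fit[0] * x + center_fit[1],
              spread_fit[0] * x + spread_fit[1])
print(prediction)
```

    The appeal of this 'computational' route is exactly the one the abstract names: the machinery stays that of conventional regression, applied component-wise to the fuzzy representation.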

  3. Load flow computations in hybrid transmission - distributed power systems

    NARCIS (Netherlands)

    Wobbes, E.D.; Lahaye, D.J.P.

    2013-01-01

    We interconnect transmission and distribution power systems and perform load flow computations in the hybrid network. In the largest example we managed to build, fifty copies of a distribution network consisting of fifteen nodes is connected to the UCTE study model, resulting in a system consisting

  4. Computational approach to Riemann surfaces

    CERN Document Server

    Klein, Christian

    2011-01-01

    This volume offers a well-structured overview of existent computational approaches to Riemann surfaces and those currently in development. The authors of the contributions represent the groups providing publically available numerical codes in this field. Thus this volume illustrates which software tools are available and how they can be used in practice. In addition examples for solutions to partial differential equations and in surface theory are presented. The intended audience of this book is twofold. It can be used as a textbook for a graduate course in numerics of Riemann surfaces, in which case the standard undergraduate background, i.e., calculus and linear algebra, is required. In particular, no knowledge of the theory of Riemann surfaces is expected; the necessary background in this theory is contained in the Introduction chapter. At the same time, this book is also intended for specialists in geometry and mathematical physics applying the theory of Riemann surfaces in their research. It is the first...

  5. WEB CONTENT EXTRACTION USING HYBRID APPROACH

    Directory of Open Access Journals (Sweden)

    K. Nethra

    2014-01-01

    Full Text Available The World Wide Web is a rich source of voluminous and heterogeneous information which continues to expand in size and complexity. Many Web pages are unstructured or semi-structured, so they contain noisy information like advertisements, links, headers, footers etc. This noisy information makes extraction of Web content tedious. Many techniques that were proposed for Web content extraction are based on automatic extraction and hand-crafted rule generation. Automatic extraction is done through Web page segmentation, but it increases the time complexity. Hand-crafted rule generation uses string manipulation functions for rule generation, but generating those rules is very difficult. A hybrid approach is proposed to extract main content from Web pages. An HTML Web page is converted to a DOM tree, features are extracted, and with the extracted features, rules are generated. Decision tree classification and Naïve Bayes classification are the machine learning methods used for rule generation. By using the rules, the noisy part of the Web page is discarded and the informative content is extracted. The performance of both decision tree classification and Naïve Bayes classification is measured with metrics like precision, recall, F-measure and accuracy.
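    The pipeline described above (DOM conversion, feature extraction, rule-based filtering) can be sketched roughly as follows; the tags, thresholds, and the hand-written rule standing in for the learned decision tree are illustrative assumptions, not the paper's trained model.

```python
# Hedged sketch of the idea: walk a toy DOM, compute text-length and
# link-density features per block, and apply a threshold rule standing
# in for a learned decision tree. Tag choices and thresholds are
# illustrative assumptions.
from html.parser import HTMLParser

class BlockFeatures(HTMLParser):
    def __init__(self):
        super().__init__()
        self.blocks = []                  # (text, link_text) per block
        self._text, self._link = [], []
        self._in_a = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("div", "p"):
            self._text, self._link = [], []
        elif tag == "a":
            self._in_a += 1

    def handle_endtag(self, tag):
        if tag == "a" and self._in_a:
            self._in_a -= 1
        elif tag in ("div", "p"):
            self.blocks.append(("".join(self._text), "".join(self._link)))

    def handle_data(self, data):
        self._text.append(data)
        if self._in_a:
            self._link.append(data)

def is_content(text, link_text, min_len=40, max_link_density=0.3):
    """Decision-tree-like rule: long text with few links is content."""
    if len(text.strip()) < min_len:
        return False
    return len(link_text) / max(len(text), 1) <= max_link_density

html = """<div><a href='/'>Home</a> <a href='/x'>More</a></div>
<p>This paragraph carries the informative main content of the page,
long enough to pass the length threshold of the rule.</p>"""
parser = BlockFeatures()
parser.feed(html)
content = [t for t, l in parser.blocks if is_content(t, l)]
print(len(content))
```

    The navigation-heavy `div` is discarded (high link density); the long paragraph survives as content.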

  6. Dry Port Location Problem: A Hybrid Multi-Criteria Approach

    Directory of Open Access Journals (Sweden)

    BENTALEB Fatimazahra

    2016-03-01

    Full Text Available Choosing a location for a dry port is a problem which is becoming more essential and crucial. This study deals with the problem of locating dry ports. A model combining multi-criteria (MACBETH) and mono-criteria (BARYCENTER) methods is proposed to solve the dry port location problem. In the first phase, a systematic literature review was carried out on the dry port location problem and a methodological classification was presented for this research. In the second phase, a hybrid multi-criteria approach was developed in order to determine the best dry port location, taking different criteria into account. A computational experiment and a qualitative analysis from a case study in the Moroccan context are provided. The results show that the optimal location is highly consistent with the geographical region and the government policies.
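    A minimal stand-in for the multi-criteria scoring step might look like the following weighted-sum ranking; the sites, criteria, weights, and scores are hypothetical, not taken from the Moroccan case study (MACBETH itself derives value scores and weights from pairwise judgments rather than assuming them).

```python
# Illustrative weighted-sum ranking standing in for the multi-criteria
# step; sites, criteria, scores and weights below are hypothetical.
criteria_weights = {"rail_access": 0.4, "land_cost": 0.25,
                    "port_distance": 0.35}
# Normalized 0-1 scores per candidate site (higher is better).
sites = {
    "Site A": {"rail_access": 0.9, "land_cost": 0.4, "port_distance": 0.8},
    "Site B": {"rail_access": 0.6, "land_cost": 0.9, "port_distance": 0.5},
}

def score(site):
    """Weighted sum of the site's criterion scores."""
    return sum(criteria_weights[c] * v for c, v in sites[site].items())

best = max(sites, key=score)
print(best, round(score(best), 3))
```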

  7. A hybrid approach to simulate multiple photon scattering in X-ray imaging

    Energy Technology Data Exchange (ETDEWEB)

    Freud, N. [CNDRI, Laboratory of Nondestructive Testing using Ionizing Radiations, INSA-Lyon Scientific and Technical University, Bat. Antoine de Saint-Exupery, 20, avenue Albert Einstein, 69621 Villeurbanne Cedex (France)]. E-mail: nicolas.freud@insa-lyon.fr; Letang, J.-M. [CNDRI, Laboratory of Nondestructive Testing using Ionizing Radiations, INSA-Lyon Scientific and Technical University, Bat. Antoine de Saint-Exupery, 20, avenue Albert Einstein, 69621 Villeurbanne Cedex (France); Babot, D. [CNDRI, Laboratory of Nondestructive Testing using Ionizing Radiations, INSA-Lyon Scientific and Technical University, Bat. Antoine de Saint-Exupery, 20, avenue Albert Einstein, 69621 Villeurbanne Cedex (France)

    2005-01-01

    A hybrid simulation approach is proposed to compute the contribution of scattered radiation in X- or γ-ray imaging. This approach takes advantage of the complementarity between deterministic and probabilistic simulation methods. The proposed hybrid method consists of two stages. Firstly, a set of scattering events occurring in the inspected object is determined by means of classical Monte Carlo simulation. Secondly, this set of scattering events is used as a starting point to compute the energy imparted to the detector, with a deterministic algorithm based on a 'forced detection' scheme. For each scattering event, the probability for the scattered photon to reach each pixel of the detector is calculated using well-known physical models (form factor and incoherent scattering function approximations, in the case of Rayleigh and Compton scattering respectively). The results of the proposed hybrid approach are compared to those obtained with the Monte Carlo method alone (Geant4 code) and found to be in excellent agreement. The convergence of the results as the number of scattering events increases is studied. The proposed hybrid approach makes it possible to simulate the contribution of each type (Compton or Rayleigh) and order of scattering, separately or together, with a single PC, within reasonable computation times (from minutes to hours, depending on the number of pixels of the detector). This constitutes a substantial benefit compared to classical simulation methods (Monte Carlo or deterministic approaches), which usually require a parallel computing architecture to obtain comparable results.
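    The two-stage structure can be sketched in a heavily simplified toy form (2-D geometry, single scattering, no attenuation, isotropic emission), purely to show how a Monte Carlo stage feeds a deterministic 'forced detection' stage; none of the physics constants below come from the paper.

```python
# Toy sketch of the two-stage idea under strong simplifications (2-D,
# single scatter, no attenuation, isotropic emission): Monte Carlo
# draws scattering sites, then a deterministic "forced detection" loop
# spreads each event's contribution over all detector pixels.
import math, random

random.seed(0)
n_events, n_pixels = 200, 8
detector_y, detector_width = 10.0, 8.0

# Stage 1: Monte Carlo - sample scattering sites inside a unit object.
events = [(random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5))
          for _ in range(n_events)]

# Stage 2: deterministic - for each event, add its inverse-square
# weighted probability of reaching each pixel center.
image = [0.0] * n_pixels
for ex, ey in events:
    for p in range(n_pixels):
        px = -detector_width / 2 + (p + 0.5) * detector_width / n_pixels
        r2 = (px - ex) ** 2 + (detector_y - ey) ** 2
        image[p] += 1.0 / (4 * math.pi * r2)

print(len(image), all(v > 0 for v in image))
```

    Every sampled event contributes to every pixel, which is the point of forced detection: no scattered photon is "wasted" as it would be in analog Monte Carlo.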

  8. Use of a hybrid computer in engineering-seismology research

    Science.gov (United States)

    Park, R.B.; Hays, W.W.

    1977-01-01

    A hybrid computer is an important tool in the seismological research conducted by the U.S. Geological Survey in support of the Energy Research and Development Administration nuclear explosion testing program at the Nevada Test Site and the U.S. Geological Survey Earthquake Hazard Reduction Program. The hybrid computer system, which employs both digital and analog computational techniques, facilitates efficient seismic data processing. Standard data processing operations include: (1) preview of dubbed magnetic tapes of data; (2) correction of data for instrument response; (3) derivation of displacement and acceleration time histories from velocity recordings; (4) extraction of peak-amplitude data; (5) digitization of time histories; (6) rotation of instrumental axes; (7) derivation of response spectra; and (8) derivation of relative transfer functions between recording sites. Catalogs of time histories and response spectra of ground motion from nuclear explosions and earthquakes that have been processed by the hybrid computer are used in the Earthquake Hazard Research Program to evaluate the effects of source, propagation path, and site conditions on recorded ground motion; to assess seismic risk; to predict system response; and to solve system design problems.
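    Two of the listed operations, deriving an acceleration time history from a velocity recording (operation 3) and extracting the peak amplitude (operation 4), can be sketched numerically; the sine-wave record and sample rate below are synthetic stand-ins, not Nevada Test Site data.

```python
# Minimal numeric sketch of two of the listed operations: derive an
# acceleration time history from a velocity recording by central
# finite differencing, then extract the peak amplitude.
import math

dt = 0.01                                   # 100 samples/s
velocity = [0.05 * math.sin(2 * math.pi * 2.0 * t * dt)
            for t in range(500)]            # 2 Hz, 0.05 m/s peak

# Central difference: a[i] = (v[i+1] - v[i-1]) / (2*dt)
accel = [(velocity[i + 1] - velocity[i - 1]) / (2 * dt)
         for i in range(1, len(velocity) - 1)]

peak_a = max(abs(a) for a in accel)
# Analytic peak is 2*pi*f*V = 2*pi*2*0.05 ~ 0.628 m/s^2
print(round(peak_a, 2))
```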

  9. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2011-01-01

    The computing world today is in the middle of a revolution: mobile clients and cloud computing have emerged as the dominant paradigms driving programming and hardware innovation today. The Fifth Edition of Computer Architecture focuses on this dramatic shift, exploring the ways in which software and technology in the cloud are accessed by cell phones, tablets, laptops, and other mobile computing devices. Each chapter includes two real-world examples, one mobile and one datacenter, to illustrate this revolutionary change. Updated to cover the mobile computing revolution. Emphasizes the two most im

  10. Special purpose hybrid transfinite elements and unified computational methodology for accurately predicting thermoelastic stress waves

    Science.gov (United States)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper represents an attempt to apply extensions of a hybrid transfinite element computational approach for accurately predicting thermoelastic stress waves. The applicability of the present formulations for capturing the thermal stress waves induced by boundary heating for the well known Danilovskaya problems is demonstrated. A unique feature of the proposed formulations for applicability to the Danilovskaya problem of thermal stress waves in elastic solids lies in the hybrid nature of the unified formulations and the development of special purpose transfinite elements in conjunction with the classical Galerkin techniques and transformation concepts. Numerical test cases validate the applicability and superior capability to capture the thermal stress waves induced due to boundary heating.

  11. Application of Computational Intelligence in Order to Develop Hybrid Orbit Propagation Methods

    Directory of Open Access Journals (Sweden)

    Iván Pérez

    2013-01-01

    Full Text Available We present a new approach in the astrodynamics and celestial mechanics fields, called hybrid perturbation theory. A hybrid perturbation theory combines an integrating technique, general perturbation theory or special perturbation theory or semianalytical method, with a forecasting technique, statistical time series model or computational intelligence method. This combination permits an increase in the accuracy of the integrating technique, through the modeling of higher-order terms and other external forces not considered in the integrating technique. In this paper, neural networks have been used as time series forecasters in order to help two economical general perturbation theories describe the motion of an orbiter only perturbed by the Earth’s oblateness.
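    The hybrid idea, a cheap integrating technique plus a forecaster trained on its residuals, can be sketched with deliberately simple stand-ins: a linear "low-order theory" and a least-squares residual fit in place of the paper's neural network. All models and rates below are hypothetical.

```python
# Sketch of the hybrid idea with simple stand-ins: a coarse "general
# perturbation" propagator (unperturbed linear angle growth) plus a
# forecaster fitted to its past residuals (a least-squares line
# instead of the paper's neural network).
def coarse_angle(t, rate=1.00):
    return rate * t                      # low-order theory

def true_angle(t):
    return 1.02 * t + 0.05               # "truth" with unmodeled terms

# Fit residual = true - coarse on a training window.
train_t = list(range(10))
resid = [true_angle(t) - coarse_angle(t) for t in train_t]
n = len(train_t)
mt = sum(train_t) / n
mr = sum(resid) / n
slope = sum((t - mt) * (r - mr) for t, r in zip(train_t, resid)) / \
        sum((t - mt) ** 2 for t in train_t)
intercept = mr - slope * mt

def hybrid_angle(t):
    """Coarse theory corrected by the forecast residual."""
    return coarse_angle(t) + intercept + slope * t

print(abs(hybrid_angle(50) - true_angle(50)) < 1e-9)
```

    Because the residual here is exactly linear, the correction is exact; with a real force model the forecaster only reduces, not eliminates, the error.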

  12. Solving University Scheduling Problem Using Hybrid Approach

    Directory of Open Access Journals (Sweden)

    Aftab Ahmed Shaikh

    2011-10-01

    Full Text Available In universities, scheduling curriculum activities is an essential job. Primarily, scheduling is a distribution of limited resources under interrelated constraints. The set of hard constraints demands the highest priority and should not be violated at any cost, while maximizing soft-constraint satisfaction raises the quality of the solution. In this research paper, a novel two-phase approach is introduced, comprising a GA (Genetic Algorithm) and backtracking recursive search. The employed technique deals with both hard and soft constraints successively. The first phase focuses on eliminating all hard-constraint violations and produces a partial solution for the subsequent step. The second phase then searches for the best possible solution in the search space. Promising results are obtained by implementation on a real dataset. The key points of the approach are guaranteeing the removal of hard-constraint violations from the dataset and minimizing the GA's computation time by initializing it with a pre-processed set of chromosomes.
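    The two-phase scheme can be sketched on a toy instance: recursive backtracking enforces the hard constraint (a teacher cannot take two classes in one slot), and a simple hill-climbing pass stands in for the GA that improves soft-constraint (preferred-slot) satisfaction. The instance and constraints are invented for illustration.

```python
# Compact sketch of the two-phase idea on a toy instance. Phase 1:
# backtracking satisfies the hard constraint (no two classes sharing a
# teacher in one slot). Phase 2: hill climbing (a stand-in for the GA)
# improves soft-constraint satisfaction over preferred slots.
import itertools

classes = {"C1": "T1", "C2": "T1", "C3": "T2"}   # class -> teacher
slots = [0, 1]
preferred = {"C1": 0, "C2": 1, "C3": 0}          # soft constraint

def feasible(assign, cls, slot):
    return all(not (s == slot and classes[c] == classes[cls])
               for c, s in assign.items())

def backtrack(assign, remaining):
    """Phase 1: produce any hard-constraint-free schedule."""
    if not remaining:
        return dict(assign)
    cls = remaining[0]
    for slot in slots:
        if feasible(assign, cls, slot):
            assign[cls] = slot
            result = backtrack(assign, remaining[1:])
            if result:
                return result
            del assign[cls]
    return None

def soft_score(assign):
    return sum(assign[c] == preferred[c] for c in assign)

schedule = backtrack({}, list(classes))

# Phase 2: hill-climb single-class moves that keep feasibility.
improved = True
while improved:
    improved = False
    for cls, slot in itertools.product(classes, slots):
        trial = dict(schedule)
        del trial[cls]
        if feasible(trial, cls, slot):
            trial[cls] = slot
            if soft_score(trial) > soft_score(schedule):
                schedule, improved = trial, True
print(soft_score(schedule))
```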

  13. A Hybrid Brain-Computer Interface-Based Mail Client

    Directory of Open Access Journals (Sweden)

    Tianyou Yu

    2013-01-01

    Full Text Available Brain-computer interface-based communication plays an important role in brain-computer interface (BCI applications; electronic mail is one of the most common communication tools. In this study, we propose a hybrid BCI-based mail client that implements electronic mail communication by means of real-time classification of multimodal features extracted from scalp electroencephalography (EEG. With this BCI mail client, users can receive, read, write, and attach files to their mail. Using a BCI mouse that utilizes hybrid brain signals, that is, motor imagery and P300 potential, the user can select and activate the function keys and links on the mail client graphical user interface (GUI. An adaptive P300 speller is employed for text input. The system has been tested with 6 subjects, and the experimental results validate the efficacy of the proposed method.

  14. Computational simulation of intermingled-fiber hybrid composite behavior

    Science.gov (United States)

    Mital, Subodh K.; Chamis, Christos C.

    1992-01-01

    Three-dimensional finite-element analysis and a micromechanics based computer code ICAN (Integrated Composite Analyzer) are used to predict the composite properties and microstresses of a unidirectional graphite/epoxy primary composite with varying percentages of S-glass fibers used as hybridizing fibers at a total fiber volume of 0.54. The three-dimensional finite-element model used in the analyses consists of a group of nine fibers, all unidirectional, in a three-by-three unit cell array. There is generally good agreement between the composite properties and microstresses obtained from both methods. The results indicate that the finite-element methods and the micromechanics equations embedded in the ICAN computer code can be used to obtain the properties of intermingled fiber hybrid composites needed for the analysis/design of hybrid composite structures. However, the finite-element model should be big enough to be able to simulate the conditions assumed in the micromechanics equations.

  15. Hybrid perovskites: Approaches towards light-emitting devices

    KAUST Repository

    Alias, Mohd Sharizal

    2016-10-06

    The high optical gain and absorption of organic-inorganic hybrid perovskites have attracted extensive research for photonic device applications. Using the bromide halide as an example, we present key approaches of our work towards realizing efficient perovskite-based light emitters. The approaches involved determination of optical constants for the hybrid perovskite thin films, fabrication of photonic nanostructures in the form of a subwavelength grating reflector patterned directly on the hybrid perovskites as a light manipulation layer, and enhancing the emission property of the hybrid perovskites by using a microcavity structure. Our results provide a platform for the realization of hybrid perovskite-based light-emitting devices for solid-state lighting and display applications. © 2016 IEEE.

  16. Accelerating Climate and Weather Simulations through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark

    2011-01-01

    Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.

  17. Hybrid x-space: a new approach for MPI reconstruction

    Science.gov (United States)

    Tateo, A.; Iurino, A.; Settanni, G.; Andrisani, A.; Stifanelli, P. F.; Larizza, P.; Mazzia, F.; Mininni, R. M.; Tangaro, S.; Bellotti, R.

    2016-06-01

    Magnetic particle imaging (MPI) is a new medical imaging technique capable of recovering the distribution of superparamagnetic particles from their measured induced signals. In the literature there are two main MPI reconstruction techniques: measurement-based (MB) and x-space (XS). The MB method is expensive because it requires a long calibration procedure as well as a reconstruction phase that can be numerically costly. On the other hand, the XS method is simpler than MB, but exact knowledge of the field free point (FFP) motion is essential for its implementation. Our simulation work focuses on the implementation of a new approach for MPI reconstruction: it is called hybrid x-space (HXS), representing a combination of the previous methods. Specifically, our approach is based on XS reconstruction because it requires the knowledge of the FFP position and velocity at each time instant. The difference with respect to the original XS formulation is how the FFP velocity is computed: we estimate it from the experimental measurements of the calibration scans, typical of the MB approach. Moreover, a compressive sensing technique is applied in order to reduce the calibration time, setting a smaller number of sampling positions. Simulations highlight that the HXS and XS methods give similar results. Furthermore, an appropriate use of compressive sensing is crucial for obtaining a good balance between time reduction and reconstructed image quality. Our proposal is suitable for open geometry configurations of human size devices, where incidental factors could make the currents, the fields and the FFP trajectory irregular.
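    The ingredient that distinguishes HXS, estimating the FFP velocity from sampled positions rather than assuming an ideal trajectory, can be sketched with central finite differences on a synthetic trajectory; the sinusoidal track and sampling rate below are stand-ins, not real scanner data.

```python
# Sketch of the HXS ingredient described above: estimate the FFP
# velocity from sampled positions (as would come from calibration
# scans) by central finite differences, instead of assuming an ideal
# analytic trajectory. The trajectory here is synthetic.
import math

dt = 1e-3
ffp_x = [0.01 * math.sin(2 * math.pi * 25 * k * dt) for k in range(100)]

def central_velocity(pos, dt):
    """Central differences in the interior, padded at the endpoints."""
    v = [(pos[i + 1] - pos[i - 1]) / (2 * dt)
         for i in range(1, len(pos) - 1)]
    return [v[0]] + v + [v[-1]]

vx = central_velocity(ffp_x, dt)
# Analytic peak speed is 2*pi*25*0.01 ~ 1.57 m/s
print(len(vx) == len(ffp_x))
```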

  18. Solving Problems in Various Domains by Hybrid Models of High Performance Computations

    Directory of Open Access Journals (Sweden)

    Yurii Rogozhin

    2014-03-01

    Full Text Available This work presents a hybrid model of high performance computations. The model is based on membrane system (P~system where some membranes may contain quantum device that is triggered by the data entering the membrane. This model is supposed to take advantages of both biomolecular and quantum paradigms and to overcome some of their inherent limitations. The proposed approach is demonstrated through two selected problems: SAT, and image retrieving.

  19. Computer Algebra, Instrumentation and the Anthropological Approach

    Science.gov (United States)

    Monaghan, John

    2007-01-01

    This article considers research and scholarship on the use of computer algebra in mathematics education following the instrumentation and the anthropological approaches. It outlines what these approaches are, positions them with regard to other approaches, examines tensions between the two approaches and makes suggestions for how work in this…

  20. Petascale computation performance of lightweight multiscale cardiac models using hybrid programming models.

    Science.gov (United States)

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-01-01

    Future multiscale and multiphysics models must use the power of high performance computing (HPC) systems to enable research into human disease, translational medical science, and treatment. Previously we showed that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message passing processes (e.g. the message passing interface (MPI)) with multithreading (e.g. OpenMP, POSIX pthreads). The objective of this work is to compare the performance of such hybrid programming models when applied to the simulation of a lightweight multiscale cardiac model. Our results show that the hybrid models do not perform favourably when compared to an implementation using only MPI, which is in contrast to our results using complex physiological models. Thus, with regard to lightweight multiscale cardiac models, the user may not need to increase programming complexity by using a hybrid programming approach. However, considering that model complexity will increase, as will HPC system size in both node count and number of cores per node, it is still foreseeable that we will achieve faster than real time multiscale cardiac simulations on these systems using hybrid programming models.
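    The MPI-plus-threads structure can be illustrated with a pure-Python analogue, where an outer loop emulates distributed ranks and each "rank" drives a real thread pool for its intra-node work. This shows only the decomposition structure, not MPI itself, and makes no performance claim.

```python
# Toy analogue of the MPI+OpenMP pattern: the outer loop emulates
# distributed ranks (each owning a subset of cells), while each "rank"
# uses a real thread pool for its intra-node work, followed by an
# "MPI_Reduce"-style combination of partial results.
from concurrent.futures import ThreadPoolExecutor

N_RANKS, THREADS_PER_RANK = 4, 2
cells = list(range(1000))                      # global cell indices

def cell_update(i):
    return i * i                               # stand-in for an ODE step

partials = []
for rank in range(N_RANKS):                    # "MPI" decomposition
    local = cells[rank::N_RANKS]               # this rank's cells
    with ThreadPoolExecutor(THREADS_PER_RANK) as pool:  # "OpenMP" tier
        partials.append(sum(pool.map(cell_update, local)))

total = sum(partials)                          # "MPI_Reduce"
print(total == sum(i * i for i in cells))
```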

  1. Computational approaches for urban environments

    NARCIS (Netherlands)

    Helbich, M; Jokar Arsanjani, J; Leitner, M

    2015-01-01

    This book aims to promote the synergistic usage of advanced computational methodologies in close relationship to geospatial information across cities of different scales. A rich collection of chapters subsumes current research frontiers originating from disciplines such as geography, urban planning,

  2. What is computation : An epistemic approach

    NARCIS (Netherlands)

    Wiedermann, Jiří; van Leeuwen, Jan

    2015-01-01

    Traditionally, computations are seen as processes that transform information. Definitions of computation subsequently concentrate on a description of the mechanisms that lead to such processes. The bottleneck of this approach is twofold. First, it leads to a definition of computation that is too

  4. Evaluation of a Compact Hybrid Brain-Computer Interface System

    Directory of Open Access Journals (Sweden)

    Jaeyoung Shin

    2017-01-01

    Full Text Available We realized a compact hybrid brain-computer interface (BCI system by integrating a portable near-infrared spectroscopy (NIRS device with an economical electroencephalography (EEG system. The NIRS array was located on the subjects’ forehead, covering the prefrontal area. The EEG electrodes were distributed over the frontal, motor/temporal, and parietal areas. The experimental paradigm involved a Stroop word-picture matching test in combination with mental arithmetic (MA and baseline (BL tasks, in which the subjects were asked to perform either MA or BL in response to congruent or incongruent conditions, respectively. We compared the classification accuracies of each of the modalities (NIRS or EEG with that of the hybrid system. We showed that the hybrid system outperforms the unimodal EEG and NIRS systems by 6.2% and 2.5%, respectively. Since the proposed hybrid system is based on portable platforms, it is not confined to a laboratory environment and has the potential to be used in real-life situations, such as in neurorehabilitation.

  5. Energy efficient hybrid computing systems using spin devices

    Science.gov (United States)

    Sharad, Mrigank

    Emerging spin devices like magnetic tunnel junctions (MTJs), spin valves and domain wall magnets (DWM) have opened new avenues for spin-based logic design. This work explored potential computing applications which can exploit such devices for higher energy efficiency and performance. The proposed applications involve hybrid design schemes, where charge-based devices supplement the spin devices, to gain large benefits at the system level. As an example, lateral spin valves (LSV) involve switching of nanomagnets using spin-polarized current injection through a metallic channel such as Cu. Such spin-torque based devices possess several interesting properties that can be exploited for ultra-low power computation. The analog characteristics of spin currents facilitate non-Boolean computation like majority evaluation that can be used to model a neuron. The magneto-metallic neurons can operate at an ultra-low terminal voltage of ˜20mV, thereby resulting in small computation power. Moreover, since nano-magnets inherently act as memory elements, these devices can facilitate integration of logic and memory in interesting ways. The spin-based neurons can be integrated with CMOS and other emerging devices, leading to different classes of neuromorphic/non-Von-Neumann architectures. The spin-based designs involve `mixed-mode' processing and hence can provide very compact and ultra-low energy solutions for complex computation blocks, both digital as well as analog. Such low-power, hybrid designs can be suitable for various data processing applications like cognitive computing, associative memory, and current-mode on-chip global interconnects. Simulation results for these applications based on a device-circuit co-simulation framework predict more than ˜100x improvement in computation energy as compared to state-of-the-art CMOS designs, for optimal spin-device parameters.

  6. Hamiltonian approach to hybrid plasma models

    CERN Document Server

    Tronci, Cesare

    2010-01-01

    The Hamiltonian structures of several hybrid kinetic-fluid models are identified explicitly, upon considering collisionless Vlasov dynamics for the hot particles interacting with a bulk fluid. After presenting different pressure-coupling schemes for an ordinary fluid interacting with a hot gas, the paper extends the treatment to account for a fluid plasma interacting with an energetic ion species. Both current-coupling and pressure-coupling MHD schemes are treated extensively. In particular, pressure-coupling schemes are shown to require a transport-like term in the Vlasov kinetic equation, in order for the Hamiltonian structure to be preserved. The last part of the paper is devoted to studying the more general case of an energetic ion species interacting with a neutralizing electron background (hybrid Hall-MHD). Circulation laws and Casimir functionals are presented explicitly in each case.

  7. Hybrid silicon evanescent approach to optical interconnects

    OpenAIRE

    Liang, Di; Fang, Alexander W.; Chen, Hui-Wen; Sysak, Matthew N; Koch, Brian R.; Lively, Erica; Raday, Omri; Kuo, Ying-hao; Jones, Richard; Bowers, John E

    2009-01-01

    We discuss the recently developed hybrid silicon evanescent platform (HSEP), and its application as a promising candidate for optical interconnects in silicon. A number of key discrete components and a wafer-scale integration process are reviewed. The motivation behind this work is to realize silicon-based photonic integrated circuits possessing unique advantages of III–V materials and silicon-on-insulator waveguides simultaneously through a complementary metal-oxide semiconductor fabrication...

  8. A hybrid neurogenetic approach for stock forecasting.

    Science.gov (United States)

    Kwon, Yung-Keun; Moon, Byung-Ro

    2007-05-01

    In this paper, we propose a hybrid neurogenetic system for stock trading. A recurrent neural network (NN) having one hidden layer is used for the prediction model. The input features are generated from a number of technical indicators being used by financial experts. The genetic algorithm (GA) optimizes the NN's weights under a 2-D encoding and crossover. We devised a context-based ensemble method of NNs which dynamically changes on the basis of the test day's context. To reduce the time in processing mass data, we parallelized the GA on a Linux cluster system using message passing interface. We tested the proposed method with 36 companies in NYSE and NASDAQ for 13 years from 1992 to 2004. The neurogenetic hybrid showed notable improvement on the average over the buy-and-hold strategy and the context-based ensemble further improved the results. We also observed that some companies were more predictable than others, which implies that the proposed neurogenetic hybrid can be used for financial portfolio construction.

  9. Antenna arrays a computational approach

    CERN Document Server

    Haupt, Randy L

    2010-01-01

    This book covers a wide range of antenna array topics that are becoming increasingly important in wireless applications, particularly in design and computer modeling. Signal processing and numerical modeling algorithms are explored, and MATLAB computer codes are provided for many of the design examples. Pictures of antenna arrays and components provided by industry and government sources are presented with explanations of how they work. Antenna Arrays is a valuable reference for practicing engineers and scientists in wireless communications, radar, and remote sensing, and an excellent textbook for advanced antenna courses.

  10. A simulation approach to sizing hybrid photovoltaic and wind systems

    Science.gov (United States)

    Anderson, L. A.

    1983-12-01

    A simulation approach to sizing hybrid photovoltaic and wind systems provides a combination of components to realize zero downtime and minimum initial or life-cycle cost. Using Dayton, OH as a test site for weather data, cost advantages of roughly a factor of four are predicted for a hybrid system with battery storage when compared to a wind-energy-only system for the same electrical load.
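    The sizing-by-simulation idea can be sketched as follows: step through a synthetic hourly year, track battery state of charge, and grow the system until no hour of downtime remains. The resource and load profiles below are toy sinusoids, not Dayton weather data, and the sizing loop is a deliberately crude stand-in for a cost-optimizing search.

```python
# Hedged sketch of sizing by simulation: simulate a synthetic hourly
# year with a battery state-of-charge model, count hours of downtime,
# and grow the hybrid system until downtime reaches zero.
import math

def downtime_hours(pv_kw, wind_kw, batt_kwh, hours=24 * 365):
    soc, down = batt_kwh, 0
    for h in range(hours):
        solar = pv_kw * max(0.0, math.sin(math.pi * (h % 24) / 24))
        wind = wind_kw * (0.5 + 0.5 * math.sin(2 * math.pi * h / 73))
        load = 1.0                              # constant 1 kW load
        soc = min(batt_kwh, soc + solar + wind - load)
        if soc < 0:                             # unserved load this hour
            down += 1
            soc = 0.0
    return down

# Grow the hybrid system until zero downtime.
size = 0.5
while downtime_hours(pv_kw=size, wind_kw=size, batt_kwh=10.0) > 0:
    size += 0.5
print(downtime_hours(size, size, 10.0))
```

    A real study would sweep PV, wind, and battery sizes independently and pick the zero-downtime combination with the lowest life-cycle cost.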

  11. GRID COMPUTING AND CHECKPOINT APPROACH

    Directory of Open Access Journals (Sweden)

    Pankaj gupta

    2011-05-01

    Full Text Available Grid computing is a means of allocating the computational power of a large number of computers to a complex, difficult computation or problem. Grid computing is a distributed computing paradigm that differs from traditional distributed computing in that it is aimed toward large-scale systems that even span organizational boundaries. In this paper we investigate the different techniques of fault tolerance which are used in many real-time distributed systems. The main focus is on the types of faults occurring in the system, fault detection techniques, and the recovery techniques used. A fault, whether caused by link failure, resource failure, or any other reason, must be tolerated for the system to keep working smoothly and accurately. These faults can be detected and recovered by many techniques, applied accordingly. An appropriate fault detector can avoid loss due to a system crash, and a reliable fault tolerance technique can save the system from failure. This paper shows how these methods are applied to detect and tolerate faults in various real-time distributed systems. The advantages of utilizing the checkpointing functionality are obvious; however, so far the Grid community has not developed a widely accepted standard that would allow the Grid environment to consciously utilize low-level checkpointing packages. Therefore, such a standard, named the Grid Checkpointing Architecture, is being designed. The fault tolerance mechanism used here sets the job checkpoints based on the resource failure rate. If a resource failure occurs, the job is restarted from its last successful state using a checkpoint file from another grid resource. A critical aspect for an automatic recovery is the availability of checkpoint files; a strategy to increase the availability of checkpoints is replication. A grid is a form of distributed computing used mainly to virtualize and utilize geographically distributed idle resources. A grid is a distributed computational and storage environment often composed of
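    The checkpoint/restart mechanism described above can be sketched minimally: a long job periodically saves its state to a file, and after a simulated failure the restarted job resumes from the last successful checkpoint rather than from scratch. The file format and checkpoint interval are illustrative choices.

```python
# Minimal sketch of checkpoint/restart: a long job periodically saves
# its state; after a simulated failure, the restarted job resumes from
# the last successful checkpoint instead of from scratch.
import json, os, tempfile

ckpt = os.path.join(tempfile.mkdtemp(), "job.ckpt")

def run_job(total_steps, fail_at=None):
    # Recover from the last checkpoint if one exists.
    state = {"step": 0, "acc": 0}
    if os.path.exists(ckpt):
        with open(ckpt) as f:
            state = json.load(f)
    while state["step"] < total_steps:
        if fail_at is not None and state["step"] == fail_at:
            raise RuntimeError("simulated node failure")
        state["acc"] += state["step"]
        state["step"] += 1
        if state["step"] % 10 == 0:          # checkpoint interval
            with open(ckpt, "w") as f:
                json.dump(state, f)
    return state["acc"]

try:
    run_job(100, fail_at=57)                 # first attempt crashes
except RuntimeError:
    pass
result = run_job(100)                        # restart resumes at step 50
print(result == sum(range(100)))
```

    In a grid setting the checkpoint file would be replicated to other resources, so the restart can happen on a different node than the one that failed.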

  12. Immune based computer virus detection approaches

    Institute of Scientific and Technical Information of China (English)

    TAN Ying; ZHANG Pengtao

    2013-01-01

    The computer virus is considered one of the most horrifying threats to the security of computer systems worldwide. The rapid development of evasion techniques used in viruses renders signature-based computer virus detection techniques ineffective. Many novel computer virus detection approaches have been proposed in the past to cope with this ineffectiveness, mainly classified into three categories: static, dynamic, and heuristic techniques. Motivated by the natural similarities between the biological immune system (BIS) and computer security systems (CSS), the artificial immune system (AIS) was developed as a new prototype in the community of anti-virus research. The immune mechanisms in the BIS provide the opportunity to construct computer virus detection models that are robust and adaptive, with the ability to detect unseen viruses. In this paper, a variety of classic computer virus detection approaches are introduced and reviewed against the background of computer virus history. Next, a variety of immune-based computer virus detection approaches are discussed in detail. Promising experimental results suggest that the immune-based computer virus detection approaches are able to detect new variants and unseen viruses at lower false positive rates, which has paved a new way for anti-virus research.
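
    As a concrete flavour of the immune-based family surveyed here, the sketch below implements the classic negative-selection idea: random detectors are censored against a "self" (benign) set, so any surviving detector only matches non-self patterns. The bit strings, the r-contiguous-bits rule parameters, and all data are illustrative assumptions, not the detectors of any real anti-virus system.

    ```python
    import random

    def matches(detector, sample, r=4):
        # r-contiguous-bits matching rule on aligned windows
        return any(detector[i:i + r] == sample[i:i + r]
                   for i in range(len(sample) - r + 1))

    def generate_detectors(self_set, n, length=12, r=4, seed=1):
        # negative selection: keep only candidates matching no self sample
        rng = random.Random(seed)
        detectors = []
        while len(detectors) < n:
            cand = "".join(rng.choice("01") for _ in range(length))
            if not any(matches(cand, s, r) for s in self_set):
                detectors.append(cand)
        return detectors

    self_set = ["000000000000", "000000111111"]   # synthetic "benign" patterns
    detectors = generate_detectors(self_set, 20)

    # by construction, no detector ever flags a self sample
    print(all(not matches(d, s) for d in detectors for s in self_set))  # → True
    ```

    Anything a surviving detector does match is flagged as non-self and hence potentially viral, which is what gives such systems the ability to react to patterns never seen during training.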

  13. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    Directory of Open Access Journals (Sweden)

    Hyunjoo Kim

    2011-01-01

    Full Text Available In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its ability, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.

  14. Hybrid silicon evanescent approach to optical interconnects

    Science.gov (United States)

    Liang, Di; Fang, Alexander W.; Chen, Hui-Wen; Sysak, Matthew N.; Koch, Brian R.; Lively, Erica; Raday, Omri; Kuo, Ying-Hao; Jones, Richard; Bowers, John E.

    2009-06-01

    We discuss the recently developed hybrid silicon evanescent platform (HSEP), and its application as a promising candidate for optical interconnects in silicon. A number of key discrete components and a wafer-scale integration process are reviewed. The motivation behind this work is to realize silicon-based photonic integrated circuits possessing unique advantages of III-V materials and silicon-on-insulator waveguides simultaneously through a complementary metal-oxide semiconductor fabrication process. Electrically pumped hybrid silicon distributed feedback and distributed Bragg reflector lasers with integrated hybrid silicon photodetectors are demonstrated coupled to SOI waveguides, serving as the reliable on-chip single-frequency light sources. For the external signal processing, Mach-Zehnder interferometer modulators are demonstrated, showing a resistance-capacitance-limited, 3 dB electrical bandwidth up to 8 GHz and a modulation efficiency of 1.5 V mm. The successful implementation of quantum well intermixing technique opens up the possibility to realize multiple III-V bandgaps in this platform. Sampled grating DBR devices integrated with electroabsorption modulators (EAM) are fabricated, where the bandgaps in gain, mirror, and EAM regions are 1520, 1440 and 1480 nm, respectively. The high-temperature operation characteristics of the HSEP are studied experimentally and theoretically. An overall characteristic temperature (T0) of 51°C, an above threshold characteristic temperature (T1) of 100°C, and a thermal impedance (ZT) of 41.8°C/W, which agrees with the theoretical prediction of 43.5°C/W, are extracted from the Fabry-Perot devices. Scaling this platform to larger dimensions is demonstrated up to 150 mm wafer diameter. A vertical outgassing channel design is developed to accomplish high-quality III-V epitaxial transfer to silicon in a timely and dimension-independent fashion.

  15. Infectious disease modeling a hybrid system approach

    CERN Document Server

    Liu, Xinzhi

    2017-01-01

    This volume presents infectious diseases modeled mathematically, taking seasonality and changes in population behavior into account, using a switched and hybrid systems framework. The scope of coverage includes background on mathematical epidemiology, including classical formulations and results; a motivation for seasonal effects and changes in population behavior, an investigation into term-time forced epidemic models with switching parameters, and a detailed account of several different control strategies. The main goal is to study these models theoretically and to establish conditions under which eradication or persistence of the disease is guaranteed. In doing so, the long-term behavior of the models is determined through mathematical techniques from switched systems theory. Numerical simulations are also given to augment and illustrate the theoretical results and to help study the efficacy of the control schemes.
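
    A minimal numerical flavour of the switched models treated in the book: an SIR system whose contact rate switches between a term-time and a vacation value, integrated with forward Euler. All parameter values below are illustrative, not taken from the book.

    ```python
    def switched_sir(days=365, dt=0.1, beta_term=0.5, beta_vac=0.2,
                     gamma=0.1, block=90):
        # forward-Euler integration of a switched SIR model: the contact
        # rate beta alternates between term-time and vacation values in
        # 90-day blocks (the switching signal of the hybrid system)
        S, I, R = 0.99, 0.01, 0.0
        for k in range(int(days / dt)):
            t = k * dt
            beta = beta_term if (int(t) // block) % 2 == 0 else beta_vac
            dS = -beta * S * I
            dI = beta * S * I - gamma * I
            dR = gamma * I
            S += dS * dt
            I += dI * dt
            R += dR * dt
        return S, I, R

    S, I, R = switched_sir()
    print(round(S + I + R, 6))   # → 1.0 (the population is conserved)
    ```

    The eradication/persistence questions studied in the book concern the long-term behaviour of exactly this kind of system as the switching rule and parameters vary.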

  16. Sentiment Analysis Using Hybrid Approach: A Survey

    Directory of Open Access Journals (Sweden)

    Chauhan Ashish P

    2015-01-01

    Full Text Available Sentiment analysis is the process of identifying people’s attitudes and emotional states from language. The main objective is realized by identifying a set of potential features in the review and extracting opinion expressions about those features by exploiting their associations. Opinion mining, also known as sentiment analysis, plays an important role in this process. It is the study of emotions, i.e. sentiments and expressions, that are stated in natural language. Natural language techniques are applied to extract emotions from unstructured data. There are several techniques which can be used to analyze such data. Here, we categorize these techniques broadly as ”supervised learning”, ”unsupervised learning” and ”hybrid techniques”. The objective of this paper is to provide an overview of sentiment analysis, its challenges, and a comparative analysis of its techniques in the field of Natural Language Processing.

  17. Computational approaches for systems metabolomics.

    Science.gov (United States)

    Krumsiek, Jan; Bartel, Jörg; Theis, Fabian J

    2016-06-01

    Systems genetics is defined as the simultaneous assessment and analysis of multi-omics datasets. In the past few years, metabolomics has been established as a robust tool describing an important functional layer in this approach. The metabolome of a biological system represents an integrated state of genetic and environmental factors and has been referred to as a 'link between genotype and phenotype'. In this review, we summarize recent progress in statistical analysis methods for metabolomics data in combination with other omics layers. We put a special focus on complex, multivariate statistical approaches as well as pathway-based and network-based analysis methods. Moreover, we outline current challenges and pitfalls of metabolomics-focused multi-omics analyses and discuss future steps for the field.

  18. Hybrid NN/SVM Computational System for Optimizing Designs

    Science.gov (United States)

    Rai, Man Mohan

    2009-01-01

    A computational method and system based on a hybrid of an artificial neural network (NN) and a support vector machine (SVM) (see figure) has been conceived as a means of maximizing or minimizing an objective function, optionally subject to one or more constraints. Such maximization or minimization could be performed, for example, to solve a data-regression or data-classification problem or to optimize a design associated with a response function. A response function can be considered as a subset of a response surface, which is a surface in a vector space of design and performance parameters. A typical example of a design problem that the method and system can be used to solve is that of an airfoil, for which a response function could be the spatial distribution of pressure over the airfoil. In this example, the response surface would describe the pressure distribution as a function of the operating conditions and the geometric parameters of the airfoil. The use of NNs to analyze physical objects in order to optimize their responses under specified physical conditions is well known. NN analysis is suitable for multidimensional interpolation of data that lack structure and enables the representation and optimization of a succession of numerical solutions of increasing complexity or increasing fidelity to the real world. NN analysis is especially useful in helping to satisfy multiple design objectives. Feedforward NNs can be used to make estimates based on nonlinear mathematical models. One difficulty associated with use of a feedforward NN arises from the need for nonlinear optimization to determine connection weights among input, intermediate, and output variables. It can be very expensive to train an NN in cases in which it is necessary to model large amounts of information. Less widely known (in comparison with NNs) are support vector machines (SVMs), which were originally applied in statistical learning theory. In terms that are necessarily
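
    The surrogate-based idea in this record can be illustrated with a toy example: a small feedforward network, trained with plain per-sample gradient descent, models a cheap stand-in response function, and the design variable is then optimized on the surrogate rather than the "expensive" response. The response f, network size, and training schedule are all assumptions for the sketch, and the SVM half of the hybrid is omitted.

    ```python
    import math
    import random

    def f(x):                        # stand-in "expensive" response function
        return (x - 0.3) ** 2

    random.seed(0)
    H = 8                            # hidden units
    w1 = [random.uniform(-1, 1) for _ in range(H)]
    b1 = [random.uniform(-1, 1) for _ in range(H)]
    w2 = [random.uniform(-1, 1) for _ in range(H)]
    b2 = 0.0

    def predict(x):                  # one-hidden-layer tanh network
        h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
        return sum(w2[j] * h[j] for j in range(H)) + b2

    xs = [i / 20 for i in range(21)]           # sampled designs in [0, 1]
    ys = [f(x) for x in xs]

    def mse():
        return sum((predict(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

    mse_before = mse()
    lr = 0.02
    for _ in range(1000):                      # per-sample gradient descent
        for x, y in zip(xs, ys):
            h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
            e = sum(w2[j] * h[j] for j in range(H)) + b2 - y
            b2 -= lr * e
            for j in range(H):
                g = e * w2[j] * (1 - h[j] ** 2)   # backprop through tanh
                w2[j] -= lr * e * h[j]
                w1[j] -= lr * g * x
                b1[j] -= lr * g
    mse_after = mse()

    # the surrogate is cheap to evaluate densely, unlike f in a real design task
    best_x = min((i / 1000 for i in range(1001)), key=predict)
    print(mse_after < mse_before)    # → True: training reduced surrogate error
    ```

    In the patented hybrid, an SVM would take over parts of this regression/optimization task; here the sketch only shows the surrogate-modeling half of the idea.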

  19. Learning and geometry computational approaches

    CERN Document Server

    Smith, Carl

    1996-01-01

    The field of computational learning theory arose out of the desire to formally understand the process of learning. As potential applications to artificial intelligence became apparent, the new field grew rapidly. The learning of geometric objects became a natural area of study. The possibility of using learning techniques to compensate for unsolvability provided an attraction for individuals with an immediate need to solve such difficult problems. Researchers at the Center for Night Vision were interested in solving the problem of interpreting data produced by a variety of sensors. Current vision techniques, which have a strong geometric component, can be used to extract features. However, these techniques fall short of useful recognition of the sensed objects. One potential solution is to incorporate learning techniques into the geometric manipulation of sensor data. As a first step toward realizing such a solution, the Systems Research Center at the University of Maryland, in conjunction with the C...

  20. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing an

  1. Recent advances on hybrid approaches for designing intelligent systems

    CERN Document Server

    Melin, Patricia; Pedrycz, Witold; Kacprzyk, Janusz

    2014-01-01

    This book describes recent advances on hybrid intelligent systems using soft computing techniques for diverse areas of application, such as intelligent control and robotics, pattern recognition, time series prediction, and optimization of complex problems. Soft Computing (SC) consists of several intelligent computing paradigms, including fuzzy logic, neural networks, and bio-inspired optimization algorithms, which can be used to produce powerful hybrid intelligent systems. The book is organized in five main parts, which contain a group of papers around a similar subject. The first part consists of papers with the main theme of type-2 fuzzy logic, which basically consists of papers that propose new models and applications for type-2 fuzzy systems. The second part contains papers with the main theme of bio-inspired optimization algorithms, which are basically papers using nature-inspired techniques to achieve optimization of complex optimization problems in diverse areas of application. The third part contains pape...

  2. A Hybrid Optimization Approach for SRM FINOCYL Grain Design

    Institute of Scientific and Technical Information of China (English)

    Khurram Nisar; Liang Guozhu; Qasim Zeeshan

    2008-01-01

    This article presents a method to design and optimize 3D FINOCYL grain (FCG) configurations for solid rocket motors (SRMs). The design process of an FCG configuration involves mathematical modeling of the geometry and parametric evaluation of various independent geometric variables that define the complex configuration. Virtually infinite combinations of these variables will satisfy the requirements of mass of propellant, thrust, and burning time in addition to satisfying basic needs for volumetric loading fraction and web fraction. In order to ensure that the best possible design is acquired, a sound design and optimization approach is essential. To meet this need, a method is introduced to acquire the finest possible performance. A series of computations are carried out to formulate the grain geometry in terms of various combinations of key shapes including the ellipsoid, cone, cylinder, sphere, torus, and inclined plane. A hybrid optimization (HO) technique is established by associating a genetic algorithm (GA) for global solution convergence with sequential quadratic programming (SQP) for further local convergence of the solution, thus achieving the final optimal design. A comparison of the optimal design results derived from the SQP, GA, and HO algorithms is presented. By using the HO technique, the propellant mass is minimized while meeting the required level of thrust, staying within the constrained burning time, nozzle and propellant parameters, and a fixed length and outer diameter of the grain.
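
    The global-then-local pattern described here (GA for basin finding, SQP for final convergence) can be sketched on a toy one-dimensional objective. The function below is invented, and a shrinking-step descent stands in for SQP; the real objective is the parametric grain geometry model.

    ```python
    import random

    def g(x):                        # toy multimodal objective (invented)
        return (x * x - 1) ** 2 + 0.1 * x

    rng = random.Random(42)

    # global stage: a minimal genetic algorithm over [-3, 3]
    pop = [rng.uniform(-3, 3) for _ in range(30)]
    for _ in range(40):
        pop.sort(key=g)
        parents = pop[:10]                            # truncation selection
        children = []
        while len(children) < 20:
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b) + rng.gauss(0, 0.2)  # crossover + mutation
            children.append(child)
        pop = parents + children
    best = min(pop, key=g)

    # local stage: shrinking-step descent standing in for SQP
    x, step = best, 0.1
    for _ in range(200):
        x = min((x - step, x, x + step), key=g)
        step *= 0.95

    print(round(x, 2))   # a local minimum refined from the GA's best candidate
    ```

    The division of labour mirrors the article's HO scheme: the population search is cheap per evaluation and escapes poor basins, while the local stage delivers the tight final convergence a GA alone cannot.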

  3. Hybrid Quantum-Classical Approach to Quantum Optimal Control.

    Science.gov (United States)

    Li, Jun; Yang, Xiaodong; Peng, Xinhua; Sun, Chang-Pu

    2017-04-14

    A central challenge in quantum computing is to identify more computational problems for which utilization of quantum resources can offer significant speedup. Here, we propose a hybrid quantum-classical scheme to tackle the quantum optimal control problem. We show that the most computationally demanding part of gradient-based algorithms, namely, computing the fitness function and its gradient for a control input, can be accomplished by the process of evolution and measurement on a quantum simulator. By posing queries to and receiving answers from the quantum simulator, classical computing devices update the control parameters until an optimal control solution is found. To demonstrate the quantum-classical scheme in experiment, we use a seven-qubit nuclear magnetic resonance system, on which we have succeeded in optimizing state preparation without involving classical computation of the large Hilbert space evolution.
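
    The query-and-update loop of the hybrid scheme can be caricatured classically: a stand-in "simulator" returns a fidelity for each control parameter, and the classical side performs finite-difference gradient ascent on it. The fidelity function, learning rate, and peak location are invented for the sketch; in the experiment, each query is an evolution-plus-measurement run on the NMR system.

    ```python
    import math

    def simulator_fidelity(theta):
        # stand-in for state evolution + measurement on the quantum simulator
        return math.cos(theta - 1.2) ** 2      # invented fidelity, peak at 1.2

    theta, lr, eps = 0.0, 0.5, 1e-4
    for _ in range(200):
        # two "queries" to the simulator give a finite-difference gradient
        g = (simulator_fidelity(theta + eps)
             - simulator_fidelity(theta - eps)) / (2 * eps)
        theta += lr * g                        # classical parameter update
    print(round(theta, 3))                     # → 1.2, the fidelity peak
    ```

    The point of the hybrid split is that the expensive part, evaluating the fitness over a large Hilbert space, never touches the classical machine; only the scalar fitness values do.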

  4. A Hybrid Approach to the Optimization of Multiechelon Systems

    Directory of Open Access Journals (Sweden)

    Paweł Sitek

    2015-01-01

    Full Text Available In freight transportation there are two main distribution strategies: direct shipping and multiechelon distribution. In direct shipping, vehicles, starting from a depot, bring their freight directly to the destination, while in multiechelon systems, freight is delivered from the depot to the customers through intermediate points. Multiechelon systems are particularly useful for logistic issues in a competitive environment. The paper presents a concept and application of a hybrid approach to modeling and optimization of the Multi-Echelon Capacitated Vehicle Routing Problem. Two paradigms, mathematical programming (MP) and constraint logic programming (CLP), are integrated in one environment. MP and CLP, in which constraints are treated in different ways and implemented by different methods, are combined to exploit the strengths of both. The proposed approach is particularly important for discrete decision models with an objective function and many discrete decision variables added up in multiple constraints. An implementation of the hybrid approach in the ECLiPSe system using the Eplex library is presented. The Two-Echelon Capacitated Vehicle Routing Problem (2E-CVRP) and its variants are shown as an illustrative example of the hybrid approach. The presented hybrid approach is compared with classical mathematical programming on the same benchmark data sets.

  5. A Hybrid Approach for Co-Channel Speech Segregation based on CASA, HMM Multipitch Tracking, and Medium Frame Harmonic Model

    Directory of Open Access Journals (Sweden)

    Ashraf M. Mohy Eldin

    2013-08-01

    Full Text Available This paper proposes a hybrid approach for co-channel speech segregation. An HMM (hidden Markov model) is used to track the pitches of two talkers. The resulting pitch tracks are then enriched with the prominent pitch. The enriched tracks are correctly grouped using pitch continuity. Medium frame harmonics are used to extract the second pitch for frames with only one pitch deduced using the previous steps. Finally, the pitch tracks are input to CASA (computational auditory scene analysis) to segregate the mixed speech. The center frequency range of the gammatone filter banks is maximized to reduce the overlap between the filtered channels for better segregation. Experiments were conducted using this hybrid approach on the speech separation challenge database and compared to the single (non-hybrid) approaches, i.e. signal processing and CASA. Results show that the hybrid approach outperforms the single approaches.

  6. A Linear Approach for Depth and Colour Camera Calibration Using Hybrid Parameters

    Institute of Scientific and Technical Information of China (English)

    Ke-Li Cheng; Xuan Ju; Ruo-Feng Tong; Min Tang; Jian Chang; Jian-Jun Zhang

    2016-01-01

    Many recent applications of computer graphics and human computer interaction have adopted both colour cameras and depth cameras as input devices. Therefore, an effective calibration of both types of hardware taking different colour and depth inputs is required. Our approach removes the numerical difficulties of using non-linear optimization in previous methods which explicitly resolve camera intrinsics as well as the transformation between depth and colour cameras. A matrix of hybrid parameters is introduced to linearize our optimization. The hybrid parameters offer a transformation from a depth parametric space (depth camera image) to a colour parametric space (colour camera image) by combining the intrinsic parameters of depth camera and a rotation transformation from depth camera to colour camera. Both the rotation transformation and intrinsic parameters can be explicitly calculated from our hybrid parameters with the help of a standard QR factorisation. We test our algorithm with both synthesized data and real-world data where ground-truth depth information is captured by Microsoft Kinect. The experiments show that our approach can provide comparable accuracy of calibration with the state-of-the-art algorithms while taking much less computation time (1/50 of Herrera’s method and 1/10 of Raposo’s method) due to the advantage of using hybrid parameters.
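
    The factorisation step mentioned above (recovering upper-triangular intrinsics and a rotation from the hybrid parameter matrix via a QR-type decomposition) can be sketched for a synthetic 3×3 case. The numbers are invented, and the RQ routine below is a generic Gram-Schmidt construction, not the authors' implementation.

    ```python
    import math

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def axpy(a, b, s):                    # a - s*b, elementwise
        return [x - s * y for x, y in zip(a, b)]

    def normalize(a):
        n = math.sqrt(dot(a, a))
        return [x / n for x in a]

    def rq3(H):
        # Gram-Schmidt on the rows, last row first, gives H = K R with
        # K upper triangular and R having orthonormal rows
        h1, h2, h3 = H
        r3 = normalize(h3)
        r2 = normalize(axpy(h2, r3, dot(h2, r3)))
        r1 = normalize(axpy(axpy(h1, r3, dot(h1, r3)), r2, dot(h1, r2)))
        R = [r1, r2, r3]
        K = [[dot(hi, rj) for rj in R] for hi in H]
        return K, R

    # synthetic ground truth: invented intrinsics and a rotation about z
    co, si = math.cos(0.3), math.sin(0.3)
    K_true = [[500.0, 2.0, 320.0], [0.0, 480.0, 240.0], [0.0, 0.0, 1.0]]
    R_true = [[co, -si, 0.0], [si, co, 0.0], [0.0, 0.0, 1.0]]
    H = [[sum(K_true[i][l] * R_true[l][j] for l in range(3))
          for j in range(3)] for i in range(3)]

    K, R = rq3(H)
    print(round(K[0][0]), round(K[1][1]))   # → 500 480
    ```

    Because the factorisation is closed-form, recovering intrinsics and rotation from the hybrid matrix avoids the non-linear optimization that earlier calibration methods relied on, which is the speed advantage the record reports.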

  7. An Efficient Approach for Computing Silhouette Coefficients

    Directory of Open Access Journals (Sweden)

    Moh'd B. Al-Zoubi

    2008-01-01

    Full Text Available One popular approach for finding the best number of clusters (K) in a data set is through computing the silhouette coefficients. The silhouette coefficients for different values of K are first found, and then the maximum of these coefficients is chosen. However, computing the silhouette coefficient for different values of K is a very time consuming process, due to the amount of CPU time spent on distance calculations. An approach to compute the silhouette coefficients quickly is proposed, based on decreasing the number of addition operations when computing distances. The results were efficient: a saving of more than 50% of the CPU time was achieved when the approach was applied to different data sets.
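
    For reference, the quantity being accelerated can be computed directly on a toy data set. The sketch below evaluates the average silhouette coefficient, s = (b - a) / max(a, b) with a the mean within-cluster distance and b the smallest mean distance to another cluster, for two candidate partitions of the same synthetic 1D points (K=2 versus an over-split K=3) and prefers the higher score. Data and partitions are invented.

    ```python
    def mean_dist(x, pts):
        return sum(abs(x - p) for p in pts) / len(pts)

    def silhouette_avg(clusters):
        scores = []
        for i, cluster in enumerate(clusters):
            for j, x in enumerate(cluster):
                own = cluster[:j] + cluster[j + 1:]
                if not own:                  # singleton cluster convention
                    scores.append(0.0)
                    continue
                a = mean_dist(x, own)        # cohesion
                b = min(mean_dist(x, other)  # separation
                        for k, other in enumerate(clusters) if k != i)
                scores.append((b - a) / max(a, b))
        return sum(scores) / len(scores)

    data_k2 = [[1.0, 1.2, 0.8], [7.9, 8.1, 8.3]]      # K = 2
    data_k3 = [[1.0, 1.2], [0.8], [7.9, 8.1, 8.3]]    # K = 3 (over-split)
    avg_k2 = silhouette_avg(data_k2)
    avg_k3 = silhouette_avg(data_k3)
    print(round(avg_k2, 3), round(avg_k3, 3))         # K = 2 scores higher
    ```

    The cost the paper attacks is visible even here: every score needs distances from each point to every other point, which is exactly where the proposed reduction in addition operations pays off.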

  8. A novel Monte Carlo approach to hybrid local volatility models

    NARCIS (Netherlands)

    A.W. van der Stoep (Anton); L.A. Grzelak (Lech Aleksander); C.W. Oosterlee (Cornelis)

    2017-01-01

    We present, in a Monte Carlo simulation framework, a novel approach for the evaluation of hybrid local volatility [Risk, 1994, 7, 18–20], [Int. J. Theor. Appl. Finance, 1998, 1, 61–110] models. In particular, we consider the stochastic local volatility model—see e.g. Lipton et al. [Quant.
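
    As generic background (not the particular scheme of this paper), a local volatility model dS = σ(S,t)·S·dW can be simulated with a plain Euler scheme. The CEV-style σ, zero rates, and all parameters below are assumptions for the sketch.

    ```python
    import math
    import random

    def sigma_loc(S, t):                   # illustrative CEV-type local vol
        return 0.2 * S ** (-0.3)

    rng = random.Random(7)
    S0, T, n_steps, n_paths = 1.0, 1.0, 50, 10000
    K = 1.0                                # strike of a vanilla call
    dt = T / n_steps
    sum_S = 0.0
    sum_payoff = 0.0
    for _ in range(n_paths):
        S = S0
        for k in range(n_steps):
            dW = rng.gauss(0.0, math.sqrt(dt))
            S += sigma_loc(S, k * dt) * S * dW   # Euler step, zero rates
            S = max(S, 1e-8)                     # keep the scheme positive
        sum_S += S
        sum_payoff += max(S - K, 0.0)

    mean_S = sum_S / n_paths
    call = sum_payoff / n_paths
    print(round(mean_S, 2))   # zero-rate martingale: mean stays near S0
    ```

    The hybrid SLV models of the paper add a stochastic factor on top of such a local volatility surface, which is what makes their efficient Monte Carlo evaluation non-trivial.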

  9. A hybrid generative-discriminative approach to speaker diarization

    NARCIS (Netherlands)

    Noulas, A.K.; van Kasteren, T.; Kröse, B.J.A.

    2008-01-01

    In this paper we present a sound probabilistic approach to speaker diarization. We use a hybrid framework where a distribution over the number of speakers at each point of a multimodal stream is estimated with a discriminative model. The output of this process is used as input in a generative model

  10. A hybrid approach to analyse a beam-soil structure under a moving random load

    Science.gov (United States)

    Si, L. T.; Zhao, Y.; Zhang, Y. H.; Kennedy, D.

    2016-11-01

    To study the stochastic response of a beam-soil structure under a moving random load, a hybrid approach based on the pseudo-excitation method and the wavelet method is proposed. Using the pseudo-excitation method, the non-stationary random vibration analysis is transformed into a conventional moving harmonic load problem. Analytical solutions of the power spectral density and standard deviation of vertical displacement are derived in an integral form. However, the integrand is singular and highly oscillatory, and the computational time is an important consideration because a large number of frequency points must be computed. To calculate the response accurately and efficiently, a wavelet approach is introduced. Numerical results show that the frequency band which brings the most significant response is dependent on the load velocity. The hybrid method provides a useful tool to estimate the ground vibration caused by traffic loads.
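
    The pseudo-excitation step can be illustrated on a single-degree-of-freedom oscillator: the random input with spectrum S_xx(ω) is replaced by the deterministic pseudo-excitation √S_xx·e^{iωt}, and the response PSD is the squared magnitude of the pseudo-response, which matches the classical |H(ω)|²·S_xx(ω). Parameter values are invented for the sketch; the paper applies the idea to a beam-soil structure, not this toy system.

    ```python
    import cmath

    m, c, k = 1.0, 0.4, 100.0             # illustrative mass, damping, stiffness

    def H(w):                             # frequency response of the oscillator
        return 1.0 / complex(k - m * w * w, c * w)

    def S_xx(w):                          # input load spectrum (flat, illustrative)
        return 1.0

    checks = []
    for w in (5.0, 10.0, 15.0):
        # pseudo-excitation: deterministic harmonic input sqrt(S_xx) * e^{iwt};
        # the response PSD is the squared magnitude of the pseudo-response
        y_tilde = H(w) * cmath.sqrt(S_xx(w))
        checks.append((abs(y_tilde) ** 2, abs(H(w)) ** 2 * S_xx(w)))

    print(all(abs(p - d) < 1e-12 for p, d in checks))  # → True
    ```

    The practical gain is that each frequency point becomes an ordinary harmonic analysis; the wavelet stage in the paper then tames the singular, highly oscillatory frequency integral that follows.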

  11. Toward exascale computing through neuromorphic approaches.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.

    2010-09-01

    While individual neurons function at relatively low firing rates, naturally-occurring nervous systems not only surpass manmade systems in computing power, but accomplish this feat using relatively little energy. It is asserted that the next major breakthrough in computing power will be achieved through application of neuromorphic approaches that mimic the mechanisms by which neural systems integrate and store massive quantities of data for real-time decision making. The proposed LDRD provides a conceptual foundation for SNL to make unique advances toward exascale computing. First, a team consisting of experts from the HPC, MESA, cognitive and biological sciences and nanotechnology domains will be coordinated to conduct an exercise with the outcome being a concept for applying neuromorphic computing to achieve exascale computing. It is anticipated that this concept will involve innovative extension and integration of SNL capabilities in MicroFab, material sciences, high-performance computing, and modeling and simulation of neural processes/systems.

  12. Approaches to Low Fuel Regression Rate in Hybrid Rocket Engines

    OpenAIRE

    Dario Pastrone

    2012-01-01

    Hybrid rocket engines are promising propulsion systems which present appealing features such as safety, low cost, and environmental friendliness. On the other hand, certain issues hamper the hoped-for development. The present paper discusses approaches addressing improvements to one of the most important of these issues: low fuel regression rate. To highlight the consequences of this issue and to better understand the concepts proposed, fundamentals are summarized. Two approaches are pre...

  13. Maze learning by a hybrid brain-computer system

    Science.gov (United States)

    Wu, Zhaohui; Zheng, Nenggan; Zhang, Shaowu; Zheng, Xiaoxiang; Gao, Liqiang; Su, Lijuan

    2016-09-01

    The combination of biological and artificial intelligence is particularly driven by two major strands of research: one involves the control of mechanical, usually prosthetic, devices by conscious biological subjects, whereas the other involves the control of animal behaviour by stimulating nervous systems electrically or optically. However, to our knowledge, no study has demonstrated that spatial learning in a computer-based system can affect the learning and decision making behaviour of the biological component, namely a rat, when these two types of intelligence are wired together to form a new intelligent entity. Here, we show how rule operations conducted by the computing component enable a novel hybrid brain-computer system, the ratbot, to exhibit superior learning abilities in a maze learning task, even when the rat's vision and whisker sensation are blocked. We anticipate that our study will encourage other researchers to investigate combinations of various rule operations and other artificial intelligence algorithms with the learning and memory processes of organic brains to develop more powerful cyborg intelligence systems. Our results potentially have profound implications for a variety of applications in intelligent systems and neural rehabilitation.

  14. Computational fluid dynamics challenges for hybrid air vehicle applications

    Science.gov (United States)

    Carrin, M.; Biava, M.; Steijl, R.; Barakos, G. N.; Stewart, D.

    2017-06-01

    This paper begins by comparing turbulence models for the prediction of hybrid air vehicle (HAV) flows. A 6 : 1 prolate spheroid is employed for validation of the computational fluid dynamics (CFD) method. An analysis of turbulent quantities is presented and the Shear Stress Transport (SST) k-ω model is compared against a k-ω Explicit Algebraic Stress Model (EASM) within the unsteady Reynolds-Averaged Navier-Stokes (RANS) framework. Further comparisons involve Scale-Adaptive Simulation models and a local transition transport model. The results show that the flow around the vehicle at low pitch angles is sensitive to transition effects. At high pitch angles, the vortices generated on the suction side provide substantial lift augmentation and are better resolved by EASMs. The validated CFD method is employed for the flow around a shape similar to the Airlander aircraft of Hybrid Air Vehicles Ltd. The sensitivity of the transition location to the Reynolds number is demonstrated and the role of each of the vehicle's components is analyzed. It was found that the fins contributed the most to increasing the lift and drag.

  15. Universal quantum computation using all-optical hybrid encoding

    Institute of Scientific and Technical Information of China (English)

    郭奇; 程留永; 王洪福; 张寿

    2015-01-01

    By employing displacement operations, single-photon subtractions, and weak cross-Kerr nonlinearity, we propose an alternative way of implementing several universal quantum logic gates for all-optical hybrid qubits encoded in both single-photon polarization states and coherent states. Since these schemes can be straightforwardly implemented using only local operations without a teleportation procedure, fewer physical resources and simpler operations are required than in the existing schemes. With the help of displacement operations, a large phase shift of the coherent state can be obtained via currently available tiny cross-Kerr nonlinearity. Thus, all of these schemes are nearly deterministic and feasible under current technology conditions, which makes them suitable for large-scale quantum computing.

  16. "Hybrids" and the Gendering of Computing Jobs in Australia

    Directory of Open Access Journals (Sweden)

    Gillian Whitehouse

    2005-05-01

    Full Text Available This paper presents recent Australian evidence on the extent to which women are entering “hybrid” computing jobs combining technical and communication or “people management” skills, and the way these skill combinations are valued at organisational level. We draw on a survey of detailed occupational roles in large IT firms to examine the representation of women in a range of jobs consistent with the notion of “hybrid”, and analyse the discourse around these sorts of skills in a set of organisational case studies. Our research shows a traditional picture of labour market segmentation, with limited representation of women in high status jobs, and their relatively greater prevalence in more routine areas of the industry. While our case studies highlight perceptions of the need for hybrid roles and assumptions about the suitability of women for such jobs, the ongoing masculinity of core development functions appears untouched by this discourse.

  17. A hybrid approach for probabilistic forecasting of electricity price

    DEFF Research Database (Denmark)

    Wan, Can; Xu, Zhao; Wang, Yelei

    2014-01-01

    The electricity market plays a key role in realizing the economic prophecy of smart grids. Accurate and reliable electricity market price forecasting is essential to facilitate various decision making activities of market participants in the future smart grid environment. However, due...... to probabilistic interval forecasts can be of great importance to quantify the uncertainties of potential forecasts, thus effectively supporting the decision making activities against uncertainties and risks ahead. This paper proposes a hybrid approach to construct prediction intervals of MCPs with a two...... electricity price forecasting is proposed in this paper. The effectiveness of the proposed hybrid method has been validated through comprehensive tests using real price data from Australian electricity market....

  18. Body Fat Percentage Prediction Using Intelligent Hybrid Approaches

    Directory of Open Access Journals (Sweden)

    Yuehjen E. Shao

    2014-01-01

    Full Text Available Excess body fat often leads to obesity. Obesity is typically associated with serious medical conditions, such as cancer, heart disease, and diabetes. Accordingly, knowing one's body fat is an extremely important issue, since it affects everyone's health. Although there are several ways to measure the body fat percentage (BFP), the accurate methods are often associated with hassle and/or high costs. Traditional single-stage approaches may use certain body measurements or explanatory variables to predict the BFP. Diverging from existing approaches, this study proposes new intelligent hybrid approaches that require fewer explanatory variables, and the proposed forecasting models are able to effectively predict the BFP. The proposed hybrid models consist of multiple regression (MR), artificial neural network (ANN), multivariate adaptive regression splines (MARS), and support vector regression (SVR) techniques. The first stage of the modeling uses MR and MARS to obtain a smaller but more important set of explanatory variables. In the second stage, the retained important variables serve as inputs for the other forecasting methods. A real dataset was used to demonstrate the development of the proposed hybrid models. The prediction results revealed that the proposed hybrid schemes outperformed the typical, single-stage forecasting models.
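The two-stage idea can be sketched on synthetic data: a screening stage discards weak explanatory variables, then the final model is fit on the survivors only. Plain correlation screening and least squares stand in here for the paper's MR/MARS and ANN/SVR components, and the "body measurements" are random toy features:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))                    # 10 toy "body measurements"
# True BFP depends on only two of them, plus noise.
bfp = 25 + 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(0, 0.5, n)

# Stage 1: keep variables whose |correlation| with BFP is substantial.
corr = np.array([abs(np.corrcoef(X[:, j], bfp)[0, 1]) for j in range(p)])
keep = np.where(corr > 0.3)[0]

# Stage 2: fit the final predictor on the reduced variable set.
A = np.column_stack([np.ones(n), X[:, keep]])
coef, *_ = np.linalg.lstsq(A, bfp, rcond=None)
pred = A @ coef
rmse = np.sqrt(np.mean((pred - bfp) ** 2))
print(sorted(keep.tolist()), round(rmse, 2))
```

On this toy data the screening stage recovers exactly the two informative variables, which is the point of the first stage: the second-stage model sees fewer, more relevant inputs.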

  19. Viscous QCD matter in a hybrid hydrodynamic+Boltzmann approach

    CERN Document Server

    Song, Huichao; Heinz, Ulrich W

    2010-01-01

    A hybrid transport approach for the bulk evolution of viscous QCD matter produced in ultra-relativistic heavy-ion collisions is presented. The expansion of the dense deconfined phase of the reaction is modeled with viscous hydrodynamics while the dilute late hadron gas stage is described microscopically by the Boltzmann equation. The advantages of such a hybrid approach lie in the improved capability of handling large dissipative corrections in the late dilute phase of the reaction, including a realistic treatment of the non-equilibrium hadronic chemistry and kinetic freeze-out. By varying the switching temperature at which the hydrodynamic output is converted to particles for further propagation with the Boltzmann cascade we test the ability of the macroscopic hydrodynamic approach to emulate the microscopic evolution during the hadronic stage and extract the temperature dependence of the effective shear viscosity of the hadron resonance gas produced in the collision. We find that the extracted values depend...

  20. A Big Data Approach to Computational Creativity

    CERN Document Server

    Varshney, Lav R; Varshney, Kush R; Bhattacharjya, Debarun; Schoergendorfer, Angela; Chee, Yi-Min

    2013-01-01

    Computational creativity is an emerging branch of artificial intelligence that places computers in the center of the creative process. Broadly, creativity involves a generative step to produce many ideas and a selective step to determine the ones that are the best. Many previous attempts at computational creativity, however, have not been able to achieve a valid selective step. This work shows how bringing data sources from the creative domain and from hedonic psychophysics together with big data analytics techniques can overcome this shortcoming to yield a system that can produce novel and high-quality creative artifacts. Our data-driven approach is demonstrated through a computational creativity system for culinary recipes and menus we developed and deployed, which can operate either autonomously or semi-autonomously with human interaction. We also comment on the volume, velocity, variety, and veracity of data in computational creativity.

  1. Towards Lagrangian approach to quantum computations

    CERN Document Server

    Vlasov, A Yu

    2003-01-01

    This work discusses the possibility and practicality of a Lagrangian approach to quantum computation. The finite-dimensional Hilbert spaces used in this area pose some challenges for such a treatment. The model discussed here can be considered an analogue of Weyl quantization of field theory via path integrals in L. D. Faddeev's approach. Weyl quantization can also be used in the finite-dimensional case, and some formulas may simply be rewritten by replacing integrals with finite sums. On the other hand, there are specific difficulties in the finite case. This work has some connections with the phase-space models of quantum computation developed recently by different authors.

  2. A hybrid clustering approach to recognition of protein families in 114 microbial genomes

    Directory of Open Access Journals (Sweden)

    Gogarten J Peter

    2004-04-01

    Full Text Available Abstract Background Grouping proteins into sequence-based clusters is a fundamental step in many bioinformatic analyses (e.g., homology-based prediction of structure or function). Standard clustering methods such as single-linkage clustering capture a history of cluster topologies as a function of threshold, but in practice their usefulness is limited because unrelated sequences join clusters before biologically meaningful families are fully constituted, e.g. as the result of matches to so-called promiscuous domains. Use of the Markov Cluster algorithm avoids this non-specificity, but does not preserve topological or threshold information about protein families. Results We describe a hybrid approach to sequence-based clustering of proteins that combines the advantages of standard and Markov clustering. We have implemented this hybrid approach over a relational database environment, and describe its application to clustering a large subset of PDB, and to 328,577 proteins from 114 fully sequenced microbial genomes. To demonstrate utility with difficult problems, we show that hybrid clustering allows us to constitute the paralogous family of ATP synthase F1 rotary motor subunits into a single, biologically interpretable hierarchical grouping that was not accessible using either single-linkage or Markov clustering alone. We describe validation of this method by hybrid clustering of PDB and mapping SCOP families and domains onto the resulting clusters. Conclusion Hybrid (Markov followed by single-linkage) clustering combines the advantages of the Markov Cluster algorithm (avoidance of non-specific clusters resulting from matches to promiscuous domains) and single-linkage clustering (preservation of topological information as a function of threshold). Within the individual Markov clusters, single-linkage clustering is a more-precise instrument, discerning sub-clusters of biological relevance. Our hybrid approach thus provides a computationally efficient
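The two-stage structure is easy to illustrate. In this sketch, connected components of a strict similarity graph stand in for Markov clustering (coarse, non-specific-match-resistant families), and scipy's single-linkage then recovers the threshold-dependent hierarchy *within* each family. Toy 2-D points replace protein sequence similarities:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)
# Three well-separated toy "families" of 20 members each.
pts = np.vstack([rng.normal(c, 0.3, size=(20, 2)) for c in ((0, 0), (5, 0), (5, 5))])

# Stage 1 (Markov-clustering stand-in): coarse families as connected
# components of a strict similarity graph.
d = squareform(pdist(pts))
adj = csr_matrix(d < 2.0)
n_grp, grp = connected_components(adj, directed=False)

# Stage 2: full single-linkage hierarchy within each coarse family,
# preserving topology as a function of threshold.
fine = {}
for g in range(n_grp):
    members = np.where(grp == g)[0]
    if len(members) > 1:
        fine[g] = linkage(pts[members], method="single")

print(n_grp)  # three coarse families, each with its own sub-hierarchy
```

The design point mirrors the paper's conclusion: the coarse pass stops unrelated items from chaining together, while the within-family linkage keeps the threshold information that a flat partition would discard.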

  3. Computer networking a top-down approach

    CERN Document Server

    Kurose, James

    2017-01-01

    Unique among computer networking texts, the Seventh Edition of the popular Computer Networking: A Top Down Approach builds on the author’s long tradition of teaching this complex subject through a layered approach in a “top-down manner.” The text works its way from the application layer down toward the physical layer, motivating readers by exposing them to important concepts early in their study of networking. Focusing on the Internet and the fundamentally important issues of networking, this text provides an excellent foundation for readers interested in computer science and electrical engineering, without requiring extensive knowledge of programming or mathematics. The Seventh Edition has been updated to reflect the most important and exciting recent advances in networking.

  4. A Hybrid Strong/Weak Coupling Approach to Jet Quenching

    CERN Document Server

    Casalderrey-Solana, Jorge; Milhano, José Guilherme; Pablos, Daniel; Rajagopal, Krishna

    2014-01-01

    We propose and explore a new hybrid approach to jet quenching in a strongly coupled medium. The basis of this phenomenological approach is to treat physics processes at different energy scales differently. The high-$Q^2$ processes associated with the QCD evolution of the jet from production as a single hard parton through its fragmentation, up to but not including hadronization, are treated perturbatively. The interactions between the partons in the shower and the deconfined matter within which they find themselves lead to energy loss. The momentum scales associated with the medium (of the order of the temperature) and with typical interactions between partons in the shower and the medium are sufficiently soft that strongly coupled physics plays an important role in energy loss. We model these interactions using qualitative insights from holographic calculations of the energy loss of energetic light quarks and gluons in a strongly coupled plasma, obtained via gauge/gravity duality. We embed this hybrid model ...

  5. Stochastic Computational Approach for Complex Nonlinear Ordinary Differential Equations

    Institute of Scientific and Technical Information of China (English)

    Junaid Ali Khan; Muhammad Asif Zahoor Raja; Ijaz Mansoor Qureshi

    2011-01-01

    We present an evolutionary computational approach for the solution of nonlinear ordinary differential equations (NLODEs). The mathematical modeling is performed by a feed-forward artificial neural network that defines an unsupervised error. The training of these networks is achieved by a hybrid intelligent algorithm, a combination of global search with a genetic algorithm and local search by a pattern search technique. The applicability of this approach ranges from single-order NLODEs to systems of coupled differential equations. We illustrate the method by solving a variety of model problems and present comparisons with solutions obtained by exact methods and classical numerical methods. The solution is provided on a continuous finite time interval, unlike other numerical techniques of comparable accuracy. With the advent of neuroprocessors and digital signal processors, the method becomes particularly interesting due to the expected essential gains in execution speed.
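A toy version of the scheme for y' = -y, y(0) = 1: a neural-form trial solution enforces the initial condition by construction and defines an unsupervised residual, which is minimized by a global search followed by a local direct search. SciPy's differential evolution and Nelder-Mead stand in here for the paper's genetic algorithm and pattern search:

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

# Trial solution y_hat(t) = 1 + t * N(t, w) satisfies y(0) = 1 automatically;
# N is a tiny one-hidden-layer tanh network with 3 units (9 weights).
def net(t, w):
    a, b, v = w[:3], w[3:6], w[6:9]
    return np.tanh(np.outer(t, a) + b) @ v

def residual(w, t):
    # Unsupervised error: mean squared defect of y' + y = 0 on collocation points.
    eps = 1e-4
    y = 1 + t * net(t, w)
    y_eps = 1 + (t + eps) * net(t + eps, w)
    yp = (y_eps - y) / eps                  # forward-difference derivative
    return np.mean((yp + y) ** 2)

t = np.linspace(0, 1, 25)
# Global search (GA stand-in), then local direct-search refinement
# (stand-in for the paper's pattern search).
res = differential_evolution(residual, [(-3, 3)] * 9, args=(t,), seed=0, maxiter=200)
res = minimize(residual, res.x, args=(t,), method="Nelder-Mead")
y_hat = 1 + t * net(t, res.x)
err = float(np.max(np.abs(y_hat - np.exp(-t))))
print(err)  # small deviation from the exact solution exp(-t)
```

Note how the answer comes out as a continuous function of t (evaluate the trial solution anywhere on the interval), which is the property the abstract contrasts with grid-based numerical schemes.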

  6. An efficient hybrid causative event-based approach for deriving the annual flood frequency distribution

    Science.gov (United States)

    Thyer, Mark; Li, Jing; Lambert, Martin; Kuczera, George; Metcalfe, Andrew

    2015-04-01

    Flood extremes are driven by highly variable and complex climatic and hydrological processes. Derived flood frequency methods are often used to predict the flood frequency distribution (FFD) because they can provide predictions in ungauged catchments and evaluate the impact of land-use or climate change. This study presents recent work on the development of a new derived flood frequency method called the hybrid causative events (HCE) approach. The advantage of the HCE approach is that it combines the accuracy of the continuous simulation approach with the computational efficiency of event-based approaches. Derived flood frequency methods can be divided into two classes. Event-based approaches provide fast estimation, but can also lead to prediction bias due to the limitations of the inherent assumptions required for obtaining input information (rainfall and catchment wetness) for events that cause large floods. Continuous simulation produces more accurate predictions, however, at the cost of massive computational time. The HCE method uses a short continuous simulation to provide inputs for a rainfall-runoff model running in an event-based fashion. A proof-of-concept pilot study showed that the HCE produces estimates of the flood frequency distribution with similar accuracy to continuous simulation, but with dramatically reduced computation time. Recent work incorporated seasonality into the HCE approach and evaluated it with a more realistic set of eight sites from a wide range of climate zones, typical of Australia, using a virtual catchment approach. The seasonal hybrid-CE provided accurate predictions of the FFD for all sites. Comparison with the existing non-seasonal hybrid-CE showed that for some sites the non-seasonal hybrid-CE significantly over-predicted the FFD. Analysis of whether a site had a high, low or no need for seasonality found that the underlying cause was a combination of factors that were difficult to predict a priori. Hence it is recommended

  7. A hybrid model for the computationally-efficient simulation of the cerebellar granular layer

    Directory of Open Access Journals (Sweden)

    Anna eCattani

    2016-04-01

    Full Text Available The aim of the present paper is to efficiently describe the membrane potential dynamics of neural populations formed by species having a high density difference in specific brain areas. We propose a hybrid model whose main ingredients are a conductance-based model (ODE system) and its continuous counterpart (PDE system), obtained through a limit process in which the number of neurons confined in a bounded region of the brain tissue is sent to infinity. Specifically, in the discrete model, each cell is described by a set of time-dependent variables, whereas in the continuum model, cells are grouped into populations that are described by a set of continuous variables. Communications between populations, which translate into interactions among the discrete and the continuous models, are the essence of the hybrid model we present here. The cerebellum and cerebellum-like structures show in their granular layer a large difference in the relative density of neuronal species, making them a natural testing ground for our hybrid model. By reconstructing the ensemble activity of the cerebellar granular layer network and by comparing our results to a more realistic computational network, we demonstrate that our description of the network activity, even though it is not biophysically detailed, is still capable of reproducing salient features of neural network dynamics. Our modeling approach yields a significant computational cost reduction by increasing the simulation speed at least 270 times. The hybrid model reproduces interesting dynamics such as local microcircuit synchronization, traveling waves, center-surround and time-windowing.

  8. Diagnosing Hybrid Systems: a Bayesian Model Selection Approach

    Science.gov (United States)

    McIlraith, Sheila A.

    2005-01-01

    In this paper we examine the problem of monitoring and diagnosing noisy complex dynamical systems that are modeled as hybrid systems: models of continuous behavior, interleaved by discrete transitions. In particular, we examine continuous systems with embedded supervisory controllers that experience abrupt, partial or full failure of component devices. Building on our previous work in this area (MBCG99; MBCG00), our specific focus in this paper is on the mathematical formulation of the hybrid monitoring and diagnosis task as a Bayesian model tracking algorithm. The nonlinear dynamics of many hybrid systems present challenges to probabilistic tracking. Further, probabilistic tracking of a system for the purposes of diagnosis is problematic because the models of the system corresponding to failure modes are numerous and generally very unlikely. To focus tracking on these unlikely models and to reduce the number of potential models under consideration, we exploit logic-based techniques for qualitative model-based diagnosis to conjecture a limited initial set of consistent candidate models. In this paper we discuss alternative tracking techniques that are relevant to different classes of hybrid systems, focusing specifically on a method for tracking multiple models of nonlinear behavior simultaneously using factored sampling and conditional density propagation. To illustrate and motivate the approach described in this paper we examine the problem of monitoring and diagnosing NASA's Sprint AERCam, a small spherical robotic camera unit with 12 thrusters that enable both linear and rotational motion.

  9. A Hybrid Data Mining Approach for Intrusion Detection on Imbalanced NSL-KDD Dataset

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Parsaei

    2016-06-01

    Full Text Available Intrusion detection systems aim to detect malicious activity in computer and network traffic, which is not possible using a common firewall. Most intrusion detection systems are developed based on machine learning techniques. Since the datasets used in intrusion detection are imbalanced, previous methods achieve lower accuracy in detecting the two attack classes R2L and U2R than for the normal class and the other attack classes. In order to overcome this issue, this study employs a hybrid approach: a combination of the synthetic minority oversampling technique (SMOTE) and cluster center and nearest neighbor (CANN). Important features are selected using the leave-one-out method (LOO). Moreover, this study employs the NSL-KDD dataset. Results indicate that the proposed method improves the accuracy of detecting U2R and R2L attacks in comparison to the baseline paper by 94% and 50%, respectively.
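The oversampling half of the pipeline is easy to sketch. Below is a minimal numpy SMOTE that synthesizes minority-class samples (e.g. U2R records) by interpolating each sample towards one of its nearest minority-class neighbours; the feature vectors are random stand-ins for NSL-KDD features:

```python
import numpy as np

def smote(X_min, k=3, n_new=20, rng=None):
    # Minimal SMOTE sketch: each synthetic sample lies on the segment between
    # a minority sample and one of its k nearest minority neighbours.
    rng = np.random.default_rng(rng)
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=2)
    np.fill_diagonal(d, np.inf)                # exclude self-distance
    nn = np.argsort(d, axis=1)[:, :k]          # k nearest neighbours per sample
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        j = nn[i, rng.integers(k)]
        gap = rng.random()                     # interpolation factor in [0, 1)
        out.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(out)

rng = np.random.default_rng(0)
X_min = rng.normal(size=(10, 4))               # toy minority-class records
X_syn = smote(X_min, rng=1)
print(X_syn.shape)
```

After oversampling, the balanced data would feed a classifier such as the CANN stage described in the abstract; that stage is not reproduced here.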

  10. Secured Authorized Data Using Hybrid Encryption in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dinesh Shinde

    2017-03-01

    Full Text Available In today's world, securing a public network such as a cloud network is a difficult task; at the same time, the cost of providing security can be reduced by using cryptographic techniques to delegate the bulk of the decryption task to the cloud servers. As a result, attribute-based encryption with delegation emerges. Still, there are caveats and questions remaining in the previous relevant works. The cloud servers could tamper with or replace the delegated ciphertext and respond with a forged computing result with malicious intent. They may also cheat eligible users by responding that they are ineligible, for the purpose of cost saving. Furthermore, during encryption, the access policies may not be flexible enough. Since a policy for general circuits enables the strongest form of access control, a construction realizing circuit ciphertext-policy attribute-based hybrid encryption with verifiable delegation has been considered in our work. In such a system, combined with verifiable computation and an encrypt-then-MAC mechanism, data confidentiality, fine-grained access control and the correctness of the delegated computing results are guaranteed at the same time. Besides, our scheme achieves security against chosen-plaintext attacks under the k-multilinear Decisional Diffie-Hellman assumption. Moreover, an extensive simulation campaign confirms the feasibility and efficiency of the proposed solution. There are two complementary forms of attribute-based encryption: one is key-policy attribute-based encryption (KP-ABE) [8], [9], [10], and the other is ciphertext-policy attribute-based encryption. In a KP-ABE system, the decision on the access policy is made by the key distributor instead of the encryptor, which limits the practicability and usability of the system in practical applications; the access policy for general circuits could be
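The encrypt-then-MAC composition mentioned in the abstract can be illustrated in isolation. This is a toy sketch only: a SHA-256 counter keystream stands in for a real cipher, and none of the attribute-based machinery is modeled. The point is the order of operations: the receiver verifies the tag *before* decrypting, so a tampered delegated ciphertext is rejected:

```python
import hashlib
import hmac
import os

def keystream(key, nonce, n):
    # Toy stream cipher: SHA-256 in counter mode (illustrative only, not secure).
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt_then_mac(enc_key, mac_key, nonce, msg):
    ct = bytes(a ^ b for a, b in zip(msg, keystream(enc_key, nonce, len(msg))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return ct, tag

def decrypt(enc_key, mac_key, nonce, ct, tag):
    # Verify the MAC first; reject any ciphertext the server has tampered with.
    expect = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("MAC check failed: ciphertext was tampered with")
    return bytes(a ^ b for a, b in zip(ct, keystream(enc_key, nonce, len(ct))))

ek, mk, nonce = os.urandom(32), os.urandom(32), os.urandom(16)
ct, tag = encrypt_then_mac(ek, mk, nonce, b"delegated result")
assert decrypt(ek, mk, nonce, ct, tag) == b"delegated result"
```

Using independent encryption and MAC keys, and authenticating the nonce together with the ciphertext, are the standard design choices for this composition.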

  11. Handbook of computational approaches to counterterrorism

    CERN Document Server

    Subrahmanian, VS

    2012-01-01

    Terrorist groups throughout the world have been studied primarily through the use of social science methods. However, major advances in IT during the past decade have led to significant new ways of studying terrorist groups, making forecasts, learning models of their behaviour, and shaping policies about their behaviour. Handbook of Computational Approaches to Counterterrorism provides the first in-depth look at how advanced mathematics and modern computing technology are shaping the study of terrorist groups. This book includes contributions from world experts in the field, and presents extens

  12. A Hybrid Segmentation Framework for Computer-Assisted Dental Procedures

    Science.gov (United States)

    Hosntalab, Mohammad; Aghaeizadeh Zoroofi, Reza; Abbaspour Tehrani-Fard, Ali; Shirani, Gholamreza; Reza Asharif, Mohammad

    Teeth segmentation in computed tomography (CT) images is a major and challenging task for various computer-assisted procedures. In this paper, we introduce a hybrid method for quantification of teeth in CT volumetric datasets, inspired by our previous experience and anatomical knowledge of teeth and jaws. In this regard, we propose a novel segmentation technique using adaptive thresholding, morphological operations, panoramic re-sampling and a variational level set algorithm. The proposed method consists of several steps as follows: first, we determine the operation region in the CT slices. Second, the bony tissues are separated from other tissues by utilizing an adaptive thresholding technique based on 3D pulse-coupled neural networks (PCNN). Third, teeth tissue is classified from other bony tissues by employing panorex lines and anatomical knowledge of teeth in the jaws. In this case, the panorex lines are estimated using Otsu thresholding and mathematical morphology operators. The method then calculates the orthogonal lines corresponding to the panorex lines and panoramically re-samples the dataset. Separation of the upper and lower jaws and initial segmentation of teeth are performed by employing the integral projections of the panoramic dataset. Based on the above-mentioned procedures, an initial mask for each tooth is obtained. Finally, we utilize the initial mask of the teeth and apply a variational level set to refine the initial teeth boundaries to the final contour. In the last step, a surface rendering algorithm known as marching cubes (MC) is applied for volumetric visualization. The proposed algorithm was evaluated on 30 cases. Segmented images were compared with manually outlined contours. We compared the performance of the segmentation method using ROC analysis against thresholding, watershed and our previous works. The proposed method performed best. Also, our algorithm has the advantage of high speed compared to our previous works.
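Of the building blocks listed, Otsu thresholding is the most self-contained. A numpy sketch on a synthetic bimodal intensity distribution (random toy values, not CT data) shows the idea of picking the cut that maximizes between-class variance:

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    # Otsu's method: choose the threshold maximizing between-class variance.
    hist, edges = np.histogram(img.ravel(), bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                 # weight of the "below-cut" class
    mu = np.cumsum(p * centers)       # cumulative mean
    mu_t = mu[-1]
    w1 = 1 - w0
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros_like(w0)
    between[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(between)]

# Bimodal toy "scan": dark soft-tissue-like mode plus bright bone-like mode.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(50, 5, 5000), rng.normal(200, 10, 2000)])
t = otsu_threshold(img)
print(round(float(t), 1))  # t falls in the empty region between the two modes
```

In the paper's pipeline a threshold like this is only one ingredient; it is combined with morphology and anatomical constraints before the level-set refinement.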

  13. Computational analysis on plug-in hybrid electric motorcycle chassis

    Science.gov (United States)

    Teoh, S. J.; Bakar, R. A.; Gan, L. M.

    2013-12-01

    Plug-in hybrid electric motorcycles (PHEM) are an alternative that promotes sustainability and lower emissions. However, the PHEM overall system packaging is constrained by the limited space in a motorcycle chassis. In this paper, a chassis applying the concept of a Chopper is analysed for application in a PHEM. The chassis three-dimensional (3D) model is built with CAD software. The PHEM power-train components and drive-train mechanisms are integrated into the 3D model to ensure the chassis provides sufficient space. Besides that, a human dummy model is built into the 3D model to ensure the rider's ergonomics and comfort. The chassis 3D model then undergoes stress-strain simulation. The simulation predicts the stress distribution, displacement and factor of safety (FOS). The data are used to identify the critical points, thus indicating whether the chassis design is applicable or needs to be redesigned/modified to meet the required strength. Critical points are the locations of highest stress, which might cause the chassis to fail. For a motorcycle chassis, these points occur at the joints at the triple tree and the rear absorber bracket. As a conclusion, computational analysis predicts the stress distribution and provides a guideline to develop a safe prototype chassis.

  14. Photo-Ionization of Noble Gases: A Demonstration of Hybrid Coupled Channels Approach

    Directory of Open Access Journals (Sweden)

    Vinay Pramod Majety

    2015-01-01

    Full Text Available We present here an application of the recently developed hybrid coupled channels approach to study photo-ionization of the noble gas atoms Neon and Argon. We first compute multi-photon ionization rates and cross-sections for these inert gas atoms with our approach and compare them with reliable data available from R-matrix Floquet theory. The good agreement between the coupled channels approach and R-matrix Floquet theory shows that our method treats multi-electron systems on par with the well-established R-matrix theory. We then apply the time-dependent surface flux (tSURFF) method with our approach to compute total and angle-resolved photo-electron spectra from Argon with linearly and circularly polarized 12 nm wavelength laser fields, a typical wavelength available from Free Electron Lasers (FELs).

  15. Comparing Hybrid Learning with Traditional Approaches on Learning the Microsoft Office Power Point 2003 Program in Tertiary Education

    Science.gov (United States)

    Vernadakis, Nikolaos; Antoniou, Panagiotis; Giannousi, Maria; Zetou, Eleni; Kioumourtzoglou, Efthimis

    2011-01-01

    The purpose of this study was to determine the effectiveness of a hybrid learning approach to deliver a computer science course concerning the Microsoft office PowerPoint 2003 program in comparison to delivering the same course content in the form of traditional lectures. A hundred and seventy-two first year university students were randomly…

  16. Two-dimensional magnetic modeling of ferromagnetic materials by using a neural networks based hybrid approach

    Energy Technology Data Exchange (ETDEWEB)

    Cardelli, E.; Faba, A. [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Laudani, A.; Lozito, G.M.; Riganti Fulginei, F.; Salvini, A. [Department of Engineering, Roma Tre University, Via V. Volterra 62, 00146 Rome (Italy)

    2016-04-01

    This paper presents a hybrid neural network approach to model magnetic hysteresis at the macro-magnetic scale. The approach is intended to be coupled with numerical treatments of magnetic hysteresis, such as FEM solvers of Maxwell's equations in the time domain, as in the non-linear dynamic analysis of electrical machines and other similar devices, allowing a complete computer simulation with acceptable run times. The proposed Hybrid Neural System takes four inputs representing the magnetic induction and magnetic field components at each time step, and is trained by 2D and scalar measurements performed on the magnetic material to be modeled. The magnetic induction B is assumed as the entry point, and the output of the Hybrid Neural System returns the predicted value of the field H at the same time step. Within the Hybrid Neural System, a suitably trained neural network is used for predicting the hysteretic behavior of the material to be modeled. Validations with experimental tests and simulations for symmetric, non-symmetric and minor loops are presented.

  17. Modelling hybrid stars in quark-hadron approaches

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, S. [FIAS, Frankfurt am Main (Germany); Dexheimer, V. [Kent State University, Department of Physics, Kent, OH (United States); Negreiros, R. [Federal Fluminense University, Gragoata, Niteroi (Brazil)

    2016-01-15

    The density in the core of neutron stars can reach values of about 5 to 10 times nuclear matter saturation density. It is, therefore, a natural assumption that hadrons may have dissolved into quarks under such conditions, forming a hybrid star. This star will have an outer region of hadronic matter and a core of quark matter or even a mixed state of hadrons and quarks. In order to investigate such phases, we discuss different model approaches that can be used in the study of compact stars as well as being applicable to a wider range of temperatures and densities. One major model ingredient, the role of quark interactions in the stability of massive hybrid stars, is discussed. In this context, possible conflicts with lattice QCD simulations are investigated. (orig.)

  18. Novel computational approaches characterizing knee physiotherapy

    OpenAIRE

    Wangdo Kim; Veloso, Antonio P; Duarte Araujo; Kohles, Sean S.

    2014-01-01

    A knee joint’s longevity depends on the proper integration of structural components in an axial alignment. If just one of the components is abnormally off-axis, the biomechanical system fails, resulting in arthritis. The complexity of various failures in the knee joint has led orthopedic surgeons to select total knee replacement as a primary treatment. In many cases, this means sacrificing much of an otherwise normal joint. Here, we review novel computational approaches to describe knee physi...

  19. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  20. Hybrid simulation of scatter intensity in industrial cone-beam computed tomography

    Science.gov (United States)

    Thierry, R.; Miceli, A.; Hofmann, J.; Flisch, A.; Sennhauser, U.

    2009-01-01

    A cone-beam computed tomography (CT) system using a 450 kV X-ray tube has been developed to address the three-dimensional imaging of automotive parts in short acquisition times. Because the probability of detecting scattered photons is high for this energy range and area of detection, a scattering correction becomes mandatory for generating reliable images with enhanced contrast detectability. In this paper, we present a hybrid simulator for the fast and accurate calculation of the scattering intensity distribution. The full acquisition chain, from the generation of a polyenergetic photon beam, through its interaction with the scanned object, to the energy deposit in the detector, is simulated. Object phantoms can be spatially described in the form of voxels, mathematical primitives or CAD models. Uncollided radiation is treated with a ray-tracing method, and scattered radiation is split into single and multiple scattering. The single scattering is calculated with a deterministic approach accelerated with a forced-detection method. The residual noisy signal is subsequently deconvolved with the iterative Richardson-Lucy method. Finally, the multiple scattering is addressed with a coarse Monte Carlo (MC) simulation. The proposed hybrid method has been validated on aluminium phantoms of varying size and object-to-detector distance, and found to be in good agreement with the MC code Geant4. The acceleration achieved by the hybrid method over standard MC on a single projection is approximately three orders of magnitude.
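The Richardson-Lucy deconvolution step can be sketched in 1-D (toy signal and point-spread function; the paper applies it to 2-D scatter intensity distributions):

```python
import numpy as np

def richardson_lucy(observed, psf, iters=50):
    # Iterative Richardson-Lucy deconvolution, 1-D numpy-only sketch.
    psf_m = psf[::-1]                              # mirrored PSF
    est = np.full_like(observed, observed.mean())  # flat positive start
    for _ in range(iters):
        conv = np.convolve(est, psf, mode="same")
        ratio = observed / np.maximum(conv, 1e-12)
        est *= np.convolve(ratio, psf_m, mode="same")
    return est

# Toy test: blur a single sharp peak with a known PSF, then recover it.
x = np.zeros(64)
x[30] = 1.0
psf = np.array([0.05, 0.25, 0.4, 0.25, 0.05])      # normalized blur kernel
blurred = np.convolve(x, psf, mode="same")
restored = richardson_lucy(blurred, psf, iters=200)
print(int(np.argmax(restored)))
```

The multiplicative update keeps the estimate non-negative, which suits intensity data; in practice the iteration count is limited because Richardson-Lucy amplifies noise if run too long.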

  1. Mixed model approaches for the identification of QTLs within a maize hybrid breeding program.

    Science.gov (United States)

    van Eeuwijk, Fred A; Boer, Martin; Totir, L Radu; Bink, Marco; Wright, Deanne; Winkler, Christopher R; Podlich, Dean; Boldman, Keith; Baumgarten, Andy; Smalley, Matt; Arbelbide, Martin; ter Braak, Cajo J F; Cooper, Mark

    2010-01-01

    Two outlines for mixed model based approaches to quantitative trait locus (QTL) mapping in existing maize hybrid selection programs are presented: a restricted maximum likelihood (REML) and a Bayesian Markov Chain Monte Carlo (MCMC) approach. The methods use the in-silico-mapping procedure developed by Parisseaux and Bernardo (2004) as a starting point. The original single-point approach is extended to a multi-point approach that facilitates interval mapping procedures. For computational and conceptual reasons, we partition the full set of relationships from founders to parents of hybrids into two types of relations by defining so-called intermediate founders. QTL effects are defined in terms of those intermediate founders. Marker based identity by descent relationships between intermediate founders define structuring matrices for the QTL effects that change along the genome. The dimension of the vector of QTL effects is reduced by the fact that there are fewer intermediate founders than parents. Furthermore, additional reduction in the number of QTL effects follows from the identification of founder groups by various algorithms. As a result, we obtain a powerful mixed model based statistical framework to identify QTLs in genetic backgrounds relevant to the elite germplasm of a commercial breeding program. The identification of such QTLs will provide the foundation for effective marker assisted and genome wide selection strategies. Analyses of an example data set show that QTLs are primarily identified in different heterotic groups and point to complementation of additive QTL effects as an important factor in hybrid performance.

  2. Broadband ground motion simulation using a paralleled hybrid approach of Frequency Wavenumber and Finite Difference method

    Science.gov (United States)

    Chen, M.; Wei, S.

    2016-12-01

    The serious damage to Mexico City caused by the 1985 Michoacan earthquake, 400 km away, indicates that urban areas may be affected by remote earthquakes. To assess the earthquake risk imposed on urban areas by distant earthquakes, we developed a hybrid Frequency-Wavenumber (FK) and Finite Difference (FD) code implemented with MPI, since computing seismic wave propagation from a distant earthquake using a single numerical method (e.g. Finite Difference, Finite Element or Spectral Element) is very expensive. In our approach, we compute the incident wave field (ud) at the boundaries of the excitation box, which surrounds the local structure, using a parallelized FK method (Zhu and Rivera, 2002), and compute the total wave field (u) within the excitation box using a parallelized 2D FD method. We apply a perfectly matched layer (PML) absorbing condition to the diffracted wave field (u-ud). Compared to previous hybrids of Generalized Ray Theory and Finite Difference (Wen and Helmberger, 1998), Frequency Wavenumber and Spectral Element (Tong et al., 2014), and the Direct Solution Method and Spectral Element (Monteiller et al., 2013), our absorbing boundary condition dramatically suppresses the numerical noise. The MPI implementation of our method greatly speeds up the calculation. Besides, our hybrid method also has potential use in high-resolution array imaging similar to Tong et al. (2014).

  3. Computational Approaches to Nucleic Acid Origami.

    Science.gov (United States)

    Jabbari, Hosna; Aminpour, Maral; Montemagno, Carlo

    2015-10-12

    Recent advances in experimental DNA origami have dramatically expanded the horizon of DNA nanotechnology. Complex 3D suprastructures have been designed and developed using DNA origami with applications in biomaterial science, nanomedicine, nanorobotics, and molecular computation. Ribonucleic acid (RNA) origami has recently been realized as a new approach. Similar to DNA, RNA molecules can be designed to form complex 3D structures through complementary base pairings. RNA origami structures are, however, more compact and more thermodynamically stable due to RNA's non-canonical base pairing and tertiary interactions. Despite these advantages, the development of RNA origami lags behind that of DNA origami by a large gap. Furthermore, although computational methods have proven effective in designing and evaluating DNA and RNA origami structures, advances in computational nucleic acid origami are even more limited. In this paper, we review major milestones in experimental and computational DNA and RNA origami and present current challenges in these fields. We believe collaboration between experimental nanotechnologists and computer scientists is critical for advancing these new research paradigms.

  4. Computer-aided diagnosis system: a Bayesian hybrid classification method.

    Science.gov (United States)

    Calle-Alonso, F; Pérez, C J; Arias-Nicolás, J P; Martín, J

    2013-10-01

    A novel method to classify multi-class biomedical objects is presented. The method is based on a hybrid approach which combines pairwise comparison, Bayesian regression and the k-nearest neighbor technique. It can be applied in a fully automatic way or in a relevance feedback framework. In the latter case, the information obtained from both an expert and the automatic classification is iteratively used to improve the results until a certain accuracy level is achieved; then the learning process is finished and new classifications can be performed automatically. The method has been applied in two biomedical contexts by following the same cross-validation schemes as in the original studies. The first one refers to cancer diagnosis, leading to an accuracy of 77.35% versus the 66.37% obtained originally. The second one considers the diagnosis of pathologies of the vertebral column. The original method achieves accuracies ranging from 76.5% to 96.7%, and from 82.3% to 97.1%, in two different cross-validation schemes. Even with no supervision, the proposed method reaches 96.71% and 97.32% in these two cases. By using a supervised framework, the achieved accuracy is 97.74%. Furthermore, all abnormal cases were correctly classified.

  5. A semiclassical hybrid approach to many particle quantum dynamics

    Science.gov (United States)

    Grossmann, Frank

    2006-07-01

    We analytically derive a correlated approach for a mixed semiclassical many particle dynamics, treating a fraction of the degrees of freedom by the multitrajectory semiclassical initial value method of Herman and Kluk [Chem. Phys. 91, 27 (1984)] while approximately treating the dynamics of the remaining degrees of freedom with fixed initial phase space variables, analogously to the thawed Gaussian wave packet dynamics of Heller [J. Chem. Phys. 62, 1544 (1975)]. A first application of this hybrid approach to the well studied Secrest-Johnson [J. Chem. Phys. 45, 4556 (1966)] model of atom-diatomic collisions is promising. Results close to the quantum ones for correlation functions as well as scattering probabilities could be gained with considerably reduced numerical effort as compared to the full semiclassical Herman-Kluk approach. Furthermore, the harmonic nature of the different degrees of freedom can be determined a posteriori by comparing results with and without the additional approximation.

  6. Approaches to Low Fuel Regression Rate in Hybrid Rocket Engines

    Directory of Open Access Journals (Sweden)

    Dario Pastrone

    2012-01-01

    Full Text Available

    Hybrid rocket engines are promising propulsion systems which present appealing features such as safety, low cost, and environmental friendliness. On the other hand, certain issues hamper their hoped-for development. The present paper discusses approaches addressing improvements to one of the most important of these issues: the low fuel regression rate. To highlight the consequences of this issue and to better explain the concepts proposed, fundamentals are first summarized. Two approaches are presented (multiport grain and high mixture ratio) which aim at reducing the negative effects without enhancing the regression rate. Furthermore, fuel material changes and nonconventional geometries of grain and/or injector are presented as methods to increase the fuel regression rate. Although most of these approaches are still at the laboratory or concept scale, many of them are promising.

  7. Indoor Wireless Localization-hybrid and Unconstrained Nonlinear Optimization Approach

    Directory of Open Access Journals (Sweden)

    R. Jayabharathy

    2013-07-01

    Full Text Available

    In this study, a hybrid TOA/RSSI wireless localization is proposed for accurate positioning in indoor UWB systems. The major problem in indoor localization is the effect of Non-Line-of-Sight (NLOS) propagation. To mitigate NLOS effects, an unconstrained nonlinear optimization approach is utilized to process Time-of-Arrival (TOA) and Received Signal Strength (RSS) measurements in the location system. TOA range measurements and a path-loss model are used to discriminate LOS and NLOS conditions. Weighting factors assigned by hypothesis testing are used for solving the objective function in the proposed approach, describing the credibility of each TOA range measurement. Performance of the proposed technique is evaluated by MATLAB simulation. The results show that the proposed technique performs well and achieves improved positioning under severe NLOS conditions.
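
    The objective in such approaches is typically a weighted range-residual least-squares problem, with the hypothesis-test weights downweighting suspected NLOS anchors. An illustrative plain-Python sketch (hypothetical, not the paper's code):

```python
import math

def locate(anchors, ranges, weights, steps=3000, lr=0.05):
    """Minimize sum_i w_i * (||x - a_i|| - r_i)^2 by gradient descent."""
    # start from the centroid of the anchors
    x = sum(a[0] for a in anchors) / len(anchors)
    y = sum(a[1] for a in anchors) / len(anchors)
    for _ in range(steps):
        gx = gy = 0.0
        for (ax, ay), r, w in zip(anchors, ranges, weights):
            d = math.hypot(x - ax, y - ay) or 1e-12  # guard zero distance
            g = 2.0 * w * (d - r) / d
            gx += g * (x - ax)
            gy += g * (y - ay)
        x -= lr * gx
        y -= lr * gy
    return x, y
```

    With clean LOS ranges the estimate converges to the true position; lowering the weight of a biased NLOS range limits its pull on the solution.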

  8. A hybrid approach for efficient anomaly detection using metaheuristic methods

    Directory of Open Access Journals (Sweden)

    Tamer F. Ghanem

    2015-07-01

    Full Text Available

    Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated by a multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative-selection-based detector generation. The evaluation of this approach is performed on the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors with an accuracy of 96.1% compared to competing machine learning algorithms.
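
    For intuition, negative-selection detector generation can be sketched as rejection sampling: random candidate detectors that fall too close to any normal ("self") sample are discarded. A simplified sketch under our own assumptions (Euclidean matching in the unit square, not the paper's algorithm):

```python
import random

def generate_detectors(self_samples, n_detectors, radius, dim, seed=0):
    """Negative selection: keep random candidates far from all self points."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        cand = [rng.random() for _ in range(dim)]
        # accept only if the candidate is outside the self radius everywhere
        if all(sum((c - s) ** 2 for c, s in zip(cand, sv)) ** 0.5 > radius
               for sv in self_samples):
            detectors.append(cand)
    return detectors

def is_anomalous(x, detectors, radius):
    """A point matched by any detector is flagged as anomalous."""
    return any(sum((c - xi) ** 2 for c, xi in zip(d, x)) ** 0.5 <= radius
               for d in detectors)
```

    By construction a self sample is never within the matching radius of an accepted detector, so normal traffic is never flagged; coverage of the non-self region grows with the number of detectors.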

  9. ANIBAL - a Hybrid Computer Language for EAI 680-PDP 8/I, FPP 12

    DEFF Research Database (Denmark)

    Højberg, Kristian Søe

    1974-01-01

    A hybrid programming language, ANIBAL, has been developed for use in an open-shop computing centre with an EAI-680 analog computer, a PDP-8/I digital computer, and an FPP-12 floating-point processor. An 8K core memory and 812k disk memory are included. The new language consists of standard FORTRAN IV...

  10. A Hybrid Data Association Approach for SLAM in Dynamic Environments

    Directory of Open Access Journals (Sweden)

    Baifan Chen

    2013-02-01

    Full Text Available

    Data association is critical for Simultaneous Localization and Mapping (SLAM). In a real environment, dynamic obstacles lead to false data associations which compromise SLAM results. This paper presents a simple and effective data association method for SLAM in dynamic environments. First, a hybrid data association approach based on local maps, combining the ICNN and JCBB algorithms, is used. Second, we set a judging condition for outlier features in association assumptions, and static and dynamic features are detected according to spatial and temporal differences. Finally, association assumptions are updated by filtering out the dynamic features. Simulations and experimental results show that this method is feasible.

  11. Hybrid closure of atrial septal defect: A modified approach

    Directory of Open Access Journals (Sweden)

    Kshitij Sheth

    2015-01-01

    Full Text Available

    A 3.5-year-old girl underwent transcatheter closure of a patent ductus arteriosus in early infancy, during which time her secundum atrial septal defect (ASD) was left alone. When she presented for elective closure of the ASD, she was found to have bilaterally blocked femoral veins. The defect was successfully closed with an Amplatzer septal occluder (ASO; St. Jude Medical, Plymouth, MN, USA) using a hybrid approach via a sub-mammary mini-thoracotomy incision without cardiopulmonary bypass. At the end of 1-year follow-up, the child is asymptomatic with the device in a stable position and without any residual shunt.

  12. Hybrid-system approach to fault-tolerant quantum communication

    Science.gov (United States)

    Stephens, Ashley M.; Huang, Jingjing; Nemoto, Kae; Munro, William J.

    2013-05-01

    We present a layered hybrid-system approach to quantum communication that involves the distribution of a topological cluster state throughout a quantum network. Photon loss and other errors are suppressed by optical multiplexing and entanglement purification. The scheme is scalable to large distances, achieving an end-to-end rate of 1 kHz with around 50 qubits per node. We suggest a potentially suitable implementation of an individual node composed of erbium spins (single atom or ensemble) coupled via flux qubits to a microwave resonator, allowing for deterministic local gates, stable quantum memories, and emission of photons in the telecom regime.

  13. [A hybrid approach to surgery for thoracic aortic aneurysm]

    DEFF Research Database (Denmark)

    L., de la Motte; Baekgaard, N.; Jensen, L.P.

    2009-01-01

    A 57-year-old male, previously treated surgically with insertion of grafts for type A and B aortic dissection, presented with a pulsatile mass in the jugular fossa. Further examination verified a pseudoaneurysm, the inlet of which was located at the proximal anastomotic site of the descending aortic graft, and a newly developed aneurysm of the aortic arch. Using a left lateral thoracotomy to avoid manipulation of the pseudoaneurysm, we adopted a hybrid approach by first debranching the subclavian and carotid arteries from the descending aorta, followed by endoluminal grafting of the aortic arch...

  14. Computer Forensics Education - the Open Source Approach

    Science.gov (United States)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of the open source software tools in computer forensics education at tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain in-depth understanding and appreciation of the computer forensic process as opposed to familiarity with one software product, however complex and multi-functional. With the access to all source programs the students become more than just the consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that without exception more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain confidence to use a variety of tools, not just a single product they are familiar with.

  15. A hybrid approach for integrated healthcare cooperative purchasing and supply chain configuration.

    Science.gov (United States)

    Rego, Nazaré; Claro, João; Pinho de Sousa, Jorge

    2014-12-01

    This paper presents an innovative and flexible approach for recommending the number, size and composition of purchasing groups for a set of hospitals willing to cooperate, while minimising their shared supply chain costs. This approach makes the financial impact of the various cooperation alternatives transparent to the group and the individual participants, opening the way to a negotiation process concerning the allocation of the cooperation costs and gains. The approach was developed around a hybrid Variable Neighbourhood Search (VNS)/Tabu Search metaheuristic, resulting in a flexible tool that can be applied to purchasing groups with different characteristics, namely different operative and market circumstances, and to supply chains with different topologies and atypical cost characteristics. Preliminary computational results show the potential of the approach in solving a broad range of problems.

  16. A Hybrid Reduction Approach for Enhancing Cancer Classification of Microarray Data

    Directory of Open Access Journals (Sweden)

    Abeer M. Mahmoud

    2014-10-01

    Full Text Available

    This paper presents a novel hybrid machine learning (ML) reduction approach to enhance the cancer classification accuracy of microarray data, based on two ML gene-ranking techniques: the T-test and Class Separability (CS). The proposed approach is integrated with two ML classifiers, K-nearest neighbor (KNN) and support vector machine (SVM), for mining microarray gene expression profiles. Four public cancer microarray databases are used for evaluating the proposed approach and successfully accomplish the mining process: Lymphoma, Leukemia, SRBCT, and Lung Cancer. Genes are selected only from the training samples, totally excluding the testing samples from the classifier-building process, for more accurate and validated results. The computational experiments are illustrated in detail and comprehensively compared with related results from the literature. The results show that the proposed reduction approach reached promising results both in the number of genes supplied to the classifiers and in the classification accuracy.
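
    For intuition, T-test gene ranking scores each gene by how well it separates the two classes. A minimal plain-Python sketch (illustrative only, not the paper's implementation; `samples` holds expression rows, `labels` the class codes 0/1):

```python
import math

def t_scores(samples, labels):
    """Two-sample t-statistic magnitude per feature (genes in columns)."""
    n_feat = len(samples[0])
    scores = []
    for j in range(n_feat):
        a = [row[j] for row, y in zip(samples, labels) if y == 0]
        b = [row[j] for row, y in zip(samples, labels) if y == 1]
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variances
        vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
        denom = math.sqrt(va / len(a) + vb / len(b)) or 1e-12
        scores.append(abs(ma - mb) / denom)
    return scores

def top_genes(samples, labels, k):
    """Indices of the k highest-scoring genes."""
    s = t_scores(samples, labels)
    return sorted(range(len(s)), key=lambda j: -s[j])[:k]
```

    The selected gene subset would then feed a downstream classifier such as KNN or SVM, trained only on the training samples as the abstract stresses.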

  17. Computational approaches to analogical reasoning current trends

    CERN Document Server

    Richard, Gilles

    2014-01-01

    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it has been early studied in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...

  18. An Approach to Ad hoc Cloud Computing

    CERN Document Server

    Kirby, Graham; Macdonald, Angus; Fernandes, Alvaro

    2010-01-01

    We consider how underused computing resources within an enterprise may be harnessed to improve utilization and create an elastic computing infrastructure. Most current cloud provision involves a data center model, in which clusters of machines are dedicated to running cloud infrastructure software. We propose an additional model, the ad hoc cloud, in which infrastructure software is distributed over resources harvested from machines already in existence within an enterprise. In contrast to the data center cloud model, resource levels are not established a priori, nor are resources dedicated exclusively to the cloud while in use. A participating machine is not dedicated to the cloud, but has some other primary purpose such as running interactive processes for a particular user. We outline the major implementation challenges and one approach to tackling them.

  19. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  20. Hybrid Neural Network Approach Based Tool for the Modelling of Photovoltaic Panels

    Directory of Open Access Journals (Sweden)

    Antonino Laudani

    2015-01-01

    Full Text Available

    A hybrid neural network approach based tool for identifying the photovoltaic one-diode model is presented. The generalization capabilities of neural networks are used together with the robustness of the reduced form of the one-diode model. Indeed, from studies performed by the authors and works present in the literature, it was found that a direct computation of the five parameters via a multiple-input, multiple-output neural network is a very difficult task. The reduced form consists of a series of explicit formulae supporting the neural network which, in our case, is aimed at predicting just two of the five parameters identifying the model: the other three parameters are computed from the reduced form. The present hybrid approach is efficient from the computational-cost point of view and accurate in the estimation of the five parameters. It constitutes a complete and extremely easy tool suitable for implementation in a microcontroller-based architecture. Validations are made on about 10000 PV panels belonging to the California Energy Commission database.
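
    For context, the one-diode model relates terminal voltage and current implicitly; once the five parameters are identified, the current at a given voltage can be recovered by Newton iteration. A plain-Python sketch with illustrative parameter names and values of our own choosing (not the authors' reduced-form code):

```python
import math

def diode_current(v, iph, i0, rs, rsh, n_vt, iters=60):
    """Solve the implicit one-diode equation for current by Newton's method.

    f(i) = iph - i0*(exp((v + i*rs)/n_vt) - 1) - (v + i*rs)/rsh - i = 0
    where iph is the photocurrent, i0 the saturation current, rs/rsh the
    series/shunt resistances and n_vt the modified thermal voltage.
    """
    i = iph  # the photocurrent is a good starting guess
    for _ in range(iters):
        e = math.exp((v + i * rs) / n_vt)
        f = iph - i0 * (e - 1.0) - (v + i * rs) / rsh - i
        df = -i0 * e * rs / n_vt - rs / rsh - 1.0  # f'(i), always negative
        i -= f / df
    return i
```

    Sweeping v from 0 to the open-circuit voltage with such a solver traces the familiar I-V curve of a cell.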

  1. Modelling the World Wool Market: A Hybrid Approach

    OpenAIRE

    2007-01-01

    We present a model of the world wool market that merges two modelling traditions: the partial-equilibrium commodity-specific approach and the computable general-equilibrium approach. The model captures the multistage nature of the wool production system and the heterogeneous nature of raw wool, processed wool and wool garments. It also captures the important wool producing and consuming regions of the world. We illustrate the utility of the model by estimating the effects of tariff barriers o...

  2. A hybrid multi-scale computational scheme for advection-diffusion-reaction equation

    Science.gov (United States)

    Karimi, S.; Nakshatrala, K. B.

    2016-12-01

    Simulation of transport and reaction processes in porous media and subsurface science has become more vital than ever. Over the past few decades, a variety of mathematical models and numerical methodologies for porous media simulations have been developed. As the demand for higher accuracy and validity of the models grows, the issue of disparate temporal and spatial scales becomes more problematic. The variety of reaction processes and the complexity of pore geometry pose a huge computational burden in real-world or reservoir-scale simulation. Meanwhile, methods based on averaging or upscaling techniques do not provide reliable estimates of pore-scale processes. To overcome this problem, the development of hybrid and multi-scale computational techniques is considered a promising approach. In these methods, pore-scale and continuum-scale models are combined; hence, a more reliable estimate of pore-scale processes is obtained without the tremendous computational overhead of pore-scale methods. In this presentation, we propose a computational framework that allows coupling of the lattice Boltzmann method (for pore-scale simulation) and the finite element method (for continuum-scale simulation) for advection-diffusion-reaction equations. To capture events disparate in time and length scales, non-matching grids and time-steps are allowed. Apart from the application of this method to benchmark problems, a multi-scale simulation of chemical reactions in porous media is also showcased.

  3. Many-Body Approach to Mesons, Hybrids and Glueballs

    CERN Document Server

    Cotanch, S R; Cotanch, Stephen R.; Llanes-Estrada, Felipe J.

    2000-01-01

    We represent QCD at the hadronic scale by means of an effective Hamiltonian, H, formulated in the Coulomb gauge. As in the Nambu-Jona-Lasinio model, chiral symmetry is dynamically broken, however our approach is renormalizable and also includes confinement through a linear potential with slope specified by lattice gauge theory. We perform a comparative study of alternative many-body techniques for approximately diagonalizing H: BCS for the vacuum ground state; TDA and RPA for the excited hadron states. We adequately describe the experimental meson and lattice glueball spectra and perform the first relativistic, three quasiparticle calculation for hybrid mesons. In general agreement with alternative theoretical approaches, we predict the lightest hybrid states near but above 2 GeV, indicating the two recently observed $J^{PC} = 1^{-+}$ exotics at 1.4 and 1.6 GeV are of a different, perhaps four quark, structure. We also detail a new isospin dependent interaction from $q\\bar{q}$ color octet annihilation (analog...

  4. A Hybrid Motion Compensation De-interlacing Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Motion-compensated de-interlacing is expected to outperform linear techniques, but all block-based motion-compensated de-interlacing methods cause block artifacts. The algorithm proposed in this paper reduces the deficiencies of motion-compensated interpolation by using adaptive hybrid de-interlacing methods. A spatio-temporal tensor-based approach is used to obtain a more accurate motion field for de-interlacing. A motion vector is assigned to each position with pixel precision, so the block artifact is reduced significantly. To deal with the artifacts introduced by motion compensation when the motion estimation is incorrect, linear techniques are incorporated by adaptive weighting. Furthermore, a directional filter is applied to preserve details, and edge discontinuities can be greatly reduced. Our approach is robust to incorrect motion vector estimation.

  5. Automatic Facial Expression Recognition Based on Hybrid Approach

    Directory of Open Access Journals (Sweden)

    Ali K. K. Bermani

    2012-12-01

    Full Text Available

    The topic of automatic recognition of facial expressions attracted many researchers in the late last century and has drawn greatly increased interest in the past few years. Several techniques have emerged to improve the efficiency of recognition by addressing problems in face detection and feature extraction for recognizing expressions. This paper proposes an automatic system for facial expression recognition which uses a hybrid approach in the feature extraction phase, a combination of holistic and analytic approaches, extracting 307 facial expression features (19 geometric features, 288 appearance features). Expression recognition is performed by a radial basis function (RBF) artificial neural network to recognize the six basic emotions (anger, fear, disgust, happiness, surprise, sadness) in addition to the neutral expression. The system achieved a recognition rate of 97.08% on a person-dependent database and 93.98% on a person-independent one.

  6. Proposal: A Hybrid Dictionary Modelling Approach for Malay Tweet Normalization

    Science.gov (United States)

    Muhamad, Nor Azlizawati Binti; Idris, Norisma; Arshi Saloot, Mohammad

    2017-02-01

    Malay Twitter messages present a special deviation from the original language. Malay Tweets are currently widely used by Twitter users, especially in the Malay archipelago. Thus, it is important to build a normalization system which can translate Malay Tweet language into the standard Malay language. Research in natural language processing has mainly focused on normalizing English Twitter messages, while few studies have been done to normalize Malay Tweets. This paper proposes an approach to normalize Malay Twitter messages based on hybrid dictionary modelling methods. This approach normalizes noisy Malay Twitter messages, such as colloquial language, novel words, and interjections, into standard Malay. The research uses a language model and an N-gram model.

  7. Hybrid Analysis Approach for Stochastic Response of Offshore Jacket Platforms

    Institute of Scientific and Technical Information of China (English)

    金伟良; 郑忠双; 李海波; 张立

    2000-01-01

    The dynamic response of offshore platforms is more serious in a hostile sea environment than in shallow seas. In this paper, a hybrid solution combining analytical and numerical methods is proposed to compute the stochastic response of fixed offshore platforms to random waves, considering wave-structure interaction and the non-linear drag force. The simulation program includes two steps: the first is the eigenanalysis of the structure and the second is response estimation based on spectral equations. The eigenanalysis can be done conveniently through the conventional finite element method, yielding the natural frequencies and mode shapes. In the second part of the process, the offshore structural response is obtained by iteration of a series of coupled spectral equations. Considering the third-order term in the drag force, evaluation of the three-fold convolution is required for the nonlinear stochastic response analysis. To demonstrate this method, a numerical analysis is carried out for both linear and non-linear platform motions. The final response spectra have the typical two peaks, in agreement with reality, indicating that the hybrid method is effective and can be applied in offshore engineering.

  8. Chimera: A hybrid approach to numerical loop quantum cosmology

    CERN Document Server

    Diener, Peter; Singh, Parampreet

    2013-01-01

    The existence of a quantum bounce in isotropic spacetimes is a key result in loop quantum cosmology (LQC), which has been demonstrated to arise in all the models studied so far. In most of the models, the bounce has been studied using numerical simulations involving states which are sharply peaked and which bounce at volumes much larger than the Planck volume. An important issue is to confirm the existence of the bounce for states which have a wide spread, or which bounce closer to the Planck volume. Numerical simulations with such states demand large computational domains, making them very expensive and practically infeasible with the techniques which have been implemented so far. To overcome these difficulties, we present an efficient hybrid numerical scheme using the property that at the small spacetime curvature, the quantum Hamiltonian constraint in LQC, which is a difference equation with uniform discretization in volume, can be approximated by a Wheeler-DeWitt differential equation. By carefully choosi...

  9. Hybrid Heuristic Approaches for Tactical Berth Allocation Problem

    DEFF Research Database (Denmark)

    Iris, Cagatay; Larsen, Allan; Pacino, Dario;

    Tactical berth allocation problem deals with: the berth allocation (assigns and schedules vessels to berth-positions), and the quay crane (QC) assignment (finds the number of QCs that will serve each vessel). In this work, we strengthen the current mathematical models (MM) with novel lower bounds and valid inequalities, and we propose a hybrid heuristic which combines MM with greedy and search heuristics. Results show that the problem can be solved efficiently with respect to optimality and computational time.
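    The greedy component of such a hybrid heuristic can be sketched as a constructive rule: process vessels in order of arrival and place each at the berth that can start serving it earliest. The function name, data layout, and instance below are illustrative assumptions, not the authors' algorithm.

```python
# Greedy constructive heuristic for a simplified berth allocation:
# vessels sorted by arrival; each goes to the berth with the earliest
# feasible start time. A search heuristic or the MM would then refine
# this initial schedule.

def greedy_berth_allocation(vessels, num_berths):
    """vessels: list of (name, arrival_time, handling_time)."""
    berth_free_at = [0] * num_berths          # next free time per berth
    schedule = {}
    for name, arrival, handling in sorted(vessels, key=lambda v: v[1]):
        # pick the berth minimizing the achievable start time
        b = min(range(num_berths), key=lambda i: max(berth_free_at[i], arrival))
        start = max(berth_free_at[b], arrival)
        berth_free_at[b] = start + handling
        schedule[name] = (b, start, start + handling)
    return schedule

if __name__ == "__main__":
    vessels = [("A", 0, 4), ("B", 1, 3), ("C", 2, 2)]
    print(greedy_berth_allocation(vessels, 2))
```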

  10. Design and performance evaluation of dynamic wavelength scheduled hybrid WDM/TDM PON for distributed computing applications.

    Science.gov (United States)

    Zhu, Min; Guo, Wei; Xiao, Shilin; Dong, Yi; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2009-01-19

    This paper investigates the design and implementation of distributed computing applications in a local area network. We propose a novel Dynamic Wavelength Scheduled Hybrid WDM/TDM Passive Optical Network, termed DWS-HPON. The system is implemented using spectrum-slicing techniques of a broadband light source and an overlay broadcast-signaling scheme. The Time-Wavelength Co-Allocation (TWCA) problem is defined, and an effective greedy approach to this problem is presented for aggregating large files in distributed computing applications. The simulations demonstrate that the performance is improved significantly compared with the conventional TDM-over-WDM PON.
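    A greedy co-allocation rule in the spirit of TWCA can be sketched as a longest-processing-time assignment: always give the next largest file transfer to the currently least-loaded wavelength, so the overall aggregation finishes as early as possible. All names and numbers below are assumptions for illustration, not the paper's algorithm.

```python
# LPT-style greedy co-allocation: largest file first, onto the
# least-loaded wavelength. Returns the plan and the makespan
# (the time at which file aggregation completes).

def greedy_twca(file_sizes, num_wavelengths):
    load = [0.0] * num_wavelengths
    assignment = []
    for size in sorted(file_sizes, reverse=True):
        w = load.index(min(load))     # least-loaded wavelength
        assignment.append((size, w))
        load[w] += size
    return assignment, max(load)

files = [7, 5, 4, 3, 1]               # illustrative transfer sizes
plan, makespan = greedy_twca(files, 2)
print(plan, makespan)
```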

  11. Non-adaptive and adaptive hybrid approaches for enhancing water quality management

    Science.gov (United States)

    Kalwij, Ineke M.; Peralta, Richard C.

    2008-09-01

    parameter values for a new optimization problem can be time consuming. For comparison, AGA, AGCT, and GC are applied to optimize pumping rates for assumed well locations of a complex large-scale contaminant transport and remediation optimization problem at the Blaine Naval Ammunition Depot (NAD). Both hybrid approaches converged more closely to the optimal solution than the non-hybrid AGA. GC averaged 18.79% better convergence than AGCT and 31.9% better than AGA within the same computation time (12.5 days). AGCT averaged 13.1% better convergence than AGA. GC can significantly reduce the burden of employing computationally intensive hydrologic simulation models within a limited time period and for real-world optimization problems. Although demonstrated for a groundwater quality problem, it is also applicable to other arenas, such as managing salt water intrusion and surface water contaminant loading.

  12. Reduced density matrix hybrid approach: application to electronic energy transfer.

    Science.gov (United States)

    Berkelbach, Timothy C; Markland, Thomas E; Reichman, David R

    2012-02-28

    Electronic energy transfer in the condensed phase, such as that occurring in photosynthetic complexes, frequently occurs in regimes where the energy scales of the system and environment are similar. This situation provides a challenge to theoretical investigation since most approaches are accurate only when a certain energetic parameter is small compared to others in the problem. Here we show that in these difficult regimes, the Ehrenfest approach provides a good starting point for a dynamical description of the energy transfer process due to its ability to accurately treat coupling to slow environmental modes. To further improve on the accuracy of the Ehrenfest approach, we use our reduced density matrix hybrid framework to treat the faster environmental modes quantum mechanically, at the level of a perturbative master equation. This combined approach is shown to provide an efficient and quantitative description of electronic energy transfer in a model dimer and the Fenna-Matthews-Olson complex and is used to investigate the effect of environmental preparation on the resulting dynamics.

  13. Model-Invariant Hybrid Computations of Separated Flows for RCA Standard Test Cases

    Science.gov (United States)

    Woodruff, Stephen

    2016-01-01

    NASA's Revolutionary Computational Aerosciences (RCA) subproject has identified several smooth-body separated flows as standard test cases to emphasize the challenge these flows present for computational methods and their importance to the aerospace community. Results of computations of two of these test cases, the NASA hump and the FAITH experiment, are presented. The computations were performed with the model-invariant hybrid LES-RANS formulation, implemented in the NASA code VULCAN-CFD. The model-invariant formulation employs gradual LES-RANS transitions and compensation for model variation to provide more accurate and efficient hybrid computations. Comparisons revealed that the LES-RANS transitions employed in these computations were sufficiently gradual that the compensating terms were unnecessary. Agreement with experiment was achieved only after reducing the turbulent viscosity to mitigate the effect of numerical dissipation. The stream-wise evolution of peak Reynolds shear stress was employed as a measure of turbulence dynamics in separated flows useful for evaluating computations.

  14. A Hybrid Satellite-Terrestrial Approach to Aeronautical Communication Networks

    Science.gov (United States)

    Kerczewski, Robert J.; Chomos, Gerald J.; Griner, James H.; Mainger, Steven W.; Martzaklis, Konstantinos S.; Kachmar, Brian A.

    2000-01-01

    Rapid growth in air travel has been projected to continue for the foreseeable future. To maintain a safe and efficient national and global aviation system, significant advances in communications systems supporting aviation are required. Satellites will increasingly play a critical role in the aeronautical communications network. At the same time, current ground-based communications links, primarily very high frequency (VHF), will continue to be employed due to cost advantages and legacy issues. Hence a hybrid satellite-terrestrial network, or group of networks, will emerge. The increased complexity of future aeronautical communications networks dictates that system-level modeling be employed to obtain an optimal system fulfilling a majority of user needs. The NASA Glenn Research Center is investigating the current and potential future state of aeronautical communications, and is developing a simulation and modeling program to research future communications architectures for national and global aeronautical needs. This paper describes the primary requirements, the current infrastructure, and emerging trends of aeronautical communications, including a growing role for satellite communications. The need for a hybrid communications system architecture approach including both satellite and ground-based communications links is explained. Future aeronautical communication network topologies and key issues in simulation and modeling of future aeronautical communications systems are described.

  15. A hybrid optimization approach in non-isothermal glass molding

    Science.gov (United States)

    Vu, Anh-Tuan; Kreilkamp, Holger; Krishnamoorthi, Bharathwaj Janaki; Dambon, Olaf; Klocke, Fritz

    2016-10-01

    Intensively growing demands for complex yet low-cost precision glass optics in today's photonic market motivate the development of an efficient and economically viable manufacturing technology for complex-shaped optics. Compared with the state-of-the-art replication-based methods, Non-isothermal Glass Molding turns out to be a promising innovative technology for cost-efficient manufacturing because of increased mold lifetime, lower energy consumption and high throughput from a fast process chain. However, the selection of parameters for the molding process usually requires a huge effort to satisfy the precise requirements of the molded optics and to avoid negative effects on the expensive tool molds. Therefore, to reduce experimental work at the beginning, a coupled CFD/FEM numerical model was developed to study the molding process. This research focuses on the development of a hybrid optimization approach in Non-isothermal glass molding. To this end, an optimal configuration with two optimization stages for multiple quality characteristics of the glass optics is addressed. A hybrid Back-Propagation Neural Network (BPNN)-Genetic Algorithm (GA) optimization is first carried out to identify the optimal process parameters and ensure the stability of the process. The second stage continues with the optimization of the glass preform using those optimal parameters to guarantee the accuracy of the molded optics. Experiments are performed to evaluate the effectiveness and feasibility of the model for process development in Non-isothermal glass molding.

  16. Adaptation of hybrid human-computer interaction systems using EEG error-related potentials.

    Science.gov (United States)

    Chavarriaga, Ricardo; Biasiucci, Andrea; Forster, Killian; Roggen, Daniel; Troster, Gerhard; Millan, Jose Del R

    2010-01-01

    Performance improvement in both humans and artificial systems strongly relies on the ability to recognize erroneous behavior or decisions. This paper, which builds upon previous studies on EEG error-related signals, presents a hybrid approach for human-computer interaction that uses human gestures to send commands to a computer and exploits brain activity to provide implicit feedback about the recognition of such commands. Using a simple computer game as a case study, we show that EEG activity evoked by erroneous gesture recognition can be classified in single trials above random levels. Automatic artifact rejection techniques are used, taking into account that subjects are allowed to move during the experiment. Moreover, we present a simple adaptation mechanism that uses the EEG signal to label newly acquired samples and can be used to re-calibrate the gesture recognition system in a supervised manner. Offline analyses show that, although the achieved EEG decoding accuracy is far from perfect, these signals convey sufficient information to significantly improve the overall system performance.

  17. Punjabi to Hindi Transliteration System for Proper Nouns Using Hybrid Approach

    Directory of Open Access Journals (Sweden)

    Er. Sahil Malhan

    2015-11-01

    Full Text Available Language is an effective medium of communication that conveys the ideas and expressions of the human mind. There are more than 5000 languages in the world used for communication, and knowing all of them is not a feasible way to overcome the language barrier. In this multilingual world, with huge amounts of information exchanged between various regions in different languages in digitized format, it has become necessary to find an automated process to convert text from one language to another. Natural Language Processing (NLP) is one of the active areas of research that explores how computers can be utilized to understand and manipulate natural language text or speech. In the proposed system, a hybrid approach to transliterate proper nouns from Punjabi to Hindi is developed. The hybrid approach is a combination of Direct Mapping, a Rule-based approach and a Statistical Machine Translation (SMT) approach. The proposed system is tested on various proper nouns from different domains, and its accuracy is very good.
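    The staged pipeline described above can be sketched as: try a direct mapping table first, then fall back to character-level rules, with an SMT stage as the final fallback. The toy rule below exploits the fact that the Gurmukhi (U+0A00) and Devanagari (U+0900) Unicode blocks are largely parallel, so many letters map by a fixed codepoint offset; a real system needs rules for the exceptions (e.g. the tippi nasalization sign) plus the SMT stage. The lexicon entry and all examples are invented for illustration.

```python
# Hybrid transliteration sketch: direct mapping -> rule-based mapping.
# Stage 1: lexicon lookup for known proper nouns.
# Stage 2: per-character Gurmukhi->Devanagari codepoint-offset rule.
# (Stage 3, an SMT fallback, is omitted here.)

DIRECT = {  # toy lexicon: Gurmukhi "Punjab" -> Devanagari "Punjab"
    "\u0a2a\u0a70\u0a1c\u0a3e\u0a2c": "\u092a\u0902\u091c\u093e\u092c",
}

def transliterate(word):
    if word in DIRECT:                      # stage 1: direct mapping
        return DIRECT[word]
    out = []
    for ch in word:                         # stage 2: rule-based mapping
        cp = ord(ch)
        if 0x0A00 <= cp <= 0x0A7F:          # Gurmukhi block -> shift down
            out.append(chr(cp - 0x100))
        else:
            out.append(ch)                  # pass through anything else
    return "".join(out)

print(transliterate("\u0a15\u0a32\u0a3e"))  # a word not in the lexicon
```

Note the lexicon is consulted first precisely because the offset rule fails for characters like the tippi (U+0A70), which does not sit at the anusvara's offset; that interplay is what makes the approach "hybrid".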

  18. Intraply Hybrid Composite Design

    Science.gov (United States)

    Chamis, C. C.; Sinclair, J. H.

    1986-01-01

    Several theoretical approaches combined in program. Intraply hybrid composites investigated theoretically and experimentally at Lewis Research Center. Theories developed during investigations and corroborated by attendant experiments used to develop computer program identified as INHYD (Intraply Hybrid Composite Design). INHYD includes several composites micromechanics theories, intraply hybrid composite theories, and integrated hygrothermomechanical theory. Equations from theories used by program as appropriate for user's specific applications.

  19. Novel computational approaches characterizing knee physiotherapy

    Directory of Open Access Journals (Sweden)

    Wangdo Kim

    2014-01-01

    Full Text Available A knee joint’s longevity depends on the proper integration of structural components in an axial alignment. If just one of the components is abnormally off-axis, the biomechanical system fails, resulting in arthritis. The complexity of various failures in the knee joint has led orthopedic surgeons to select total knee replacement as a primary treatment. In many cases, this means sacrificing much of an otherwise normal joint. Here, we review novel computational approaches to describe knee physiotherapy by introducing a new dimension of foot loading to the knee axis alignment producing an improved functional status of the patient. New physiotherapeutic applications are then possible by aligning foot loading with the functional axis of the knee joint during the treatment of patients with osteoarthritis.

  20. Music Genre Classification Systems - A Computational Approach

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast to systems which use e.g. a symbolic representation or textual information about the music. The approach to music genre classification systems has here been system-oriented. In other words, all the different aspects of the systems have been considered and it is emphasized that the systems should...

  1. A hybrid CPU-GPGPU approach for real-time elastography.

    Science.gov (United States)

    Yang, Xu; Deka, Sthiti; Righetti, Raffaella

    2011-12-01

    Ultrasound elastography is becoming a widely available clinical imaging tool. In recent years, several real-time elastography algorithms have been proposed; however, most of these algorithms achieve real-time frame rates through compromises in elastographic image quality. Cross-correlation-based elastographic techniques are known to provide high-quality elastographic estimates, but they are computationally intense and usually not suitable for real-time clinical applications. Recently, the use of massively parallel general purpose graphics processing units (GPGPUs) for accelerating computationally intense operations in biomedical applications has received great interest. In this study, we investigate the use of the GPGPU to speed up generation of cross-correlation-based elastograms and achieve real-time frame rates while preserving elastographic image quality. We propose and statistically analyze performance of a new hybrid model of computation suitable for elastography applications in which sequential code is executed on the CPU and parallel code is executed on the GPGPU. Our results indicate that the proposed hybrid approach yields optimal results and adequately addresses the trade-off between speed and quality.
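    The computational core that such CPU/GPGPU pipelines parallelize is displacement estimation: locating the cross-correlation peak between windowed pre- and post-compression RF echoes. The NumPy sketch below is a CPU stand-in for one such per-window kernel; the signals and window parameters are illustrative, not from the paper.

```python
# Cross-correlation displacement estimation for one RF window:
# the lag at the correlation peak is the local tissue displacement
# in samples. In a GPGPU implementation, thousands of such windows
# are processed in parallel.
import numpy as np

def estimate_shift(pre, post):
    """Return the lag (in samples) at the peak of the full cross-correlation."""
    c = np.correlate(post, pre, mode="full")
    return int(np.argmax(c)) - (len(pre) - 1)

t = np.arange(128.0)
pre = np.exp(-0.5 * ((t - 40.0) / 4.0) ** 2)   # echo before compression
post = np.exp(-0.5 * ((t - 45.0) / 4.0) ** 2)  # same echo, shifted 5 samples
print(estimate_shift(pre, post))
```

In practice the integer lag is refined by sub-sample interpolation and normalized correlation is used for robustness; those refinements are omitted here for brevity.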

  2. OPTIMIZATION APPROACH FOR HYBRID ELECTRIC VEHICLE POWERTRAIN DESIGN

    Institute of Scientific and Technical Information of China (English)

    Zhu Zhengli; Zhang Jianwu; Yin Chengliang

    2005-01-01

    According to bench test results of fuel economy and engine emissions for the real powertrain system of the EQ7200HEV car, a 3-D performance-map-oriented quasi-linear model is developed for the powertrain components, such as the internal combustion engine, traction electric motor, transmission, main retarder and energy storage unit. A genetic-algorithm-based optimization procedure is proposed and applied to the parametric optimization of the key components, taking into account the requirements of several driving cycles. Comparison of the numerical results obtained by the genetic algorithm with those of traditional optimization methods shows that the present approach is effective and efficient for emission reduction and fuel economy in the design of a hybrid electric car powertrain.
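    The parametric-optimization step can be sketched as a small genetic algorithm: a population of candidate parameter vectors is evolved under selection, crossover and mutation against a cost function. The two parameters, the surrogate objective and every constant below are invented for illustration; the paper's objective is built from the quasi-linear powertrain model and driving-cycle requirements.

```python
# Toy GA: evolve two hypothetical powertrain parameters (a gear ratio
# and a motor scaling factor) against a made-up fuel-cost surrogate.
import random

def fuel_cost(x):                      # surrogate with minimum at (3.5, 1.2)
    g, m = x
    return (g - 3.5) ** 2 + 2.0 * (m - 1.2) ** 2

def ga(obj, bounds, pop_size=30, gens=60, rng=random.Random(1)):
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=obj)
        elite = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # averaging crossover
            i = rng.randrange(len(child))                  # mutate one gene
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = elite + children
    return min(pop, key=obj)

best = ga(fuel_cost, bounds=[(1.0, 6.0), (0.5, 2.0)])
print(best, fuel_cost(best))
```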

  3. A Hybrid Ensemble Learning Approach to Star-Galaxy Classification

    CERN Document Server

    Kim, Edward J; Kind, Matias Carrasco

    2015-01-01

    There exist a variety of star-galaxy classification techniques, each with their own strengths and weaknesses. In this paper, we present a novel meta-classification framework that combines and fully exploits different techniques to produce a more robust star-galaxy classification. To demonstrate this hybrid, ensemble approach, we combine a purely morphological classifier, a supervised machine learning method based on random forest, an unsupervised machine learning method based on self-organizing maps, and a hierarchical Bayesian template fitting method. Using data from the CFHTLenS survey, we consider different scenarios: when a high-quality training set is available with spectroscopic labels from DEEP2, SDSS, VIPERS, and VVDS, and when the demographics of sources in a low-quality training set do not match the demographics of objects in the test data set. We demonstrate that our Bayesian combination technique improves the overall performance over any individual classification method in these scenarios. Thus, s...
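    The core idea of combining heterogeneous classifiers can be sketched as fusing each base method's posterior into one: below, a weighted geometric mean of the per-classifier probabilities (a simple product-of-experts stand-in for the paper's Bayesian combination). The weights and probabilities are illustrative assumptions.

```python
# Fuse per-classifier P(star) estimates into a single posterior via a
# weighted geometric mean of the "star" and "galaxy" hypotheses.
# Inputs must lie strictly in (0, 1).
import math

def combine(probs, weights):
    """probs: P(star) from each base classifier; returns fused P(star)."""
    log_star = sum(w * math.log(p) for p, w in zip(probs, weights))
    log_gal = sum(w * math.log(1.0 - p) for p, w in zip(probs, weights))
    return 1.0 / (1.0 + math.exp(log_gal - log_star))

# hypothetical outputs: morphological, random forest, SOM, template fitting
print(combine([0.9, 0.8, 0.6, 0.7], weights=[1, 1, 1, 1]))
```

A meta-classifier would learn the weights (e.g. by cross-validation on the spectroscopic training set) rather than fixing them to one.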

  4. A computational approach to negative priming

    Science.gov (United States)

    Schrobsdorff, H.; Ihrke, M.; Kabisch, B.; Behrendt, J.; Hasselhorn, M.; Herrmann, J. Michael

    2007-09-01

    Priming is characterized by a sensitivity of reaction times to the sequence of stimuli in psychophysical experiments. The reduction of the reaction time observed in positive priming is well-known and experimentally understood (Scarborough et al., J. Exp. Psychol.: Hum. Percept. Perform., 3, pp. 1-17, 1977). Negative priming—the opposite effect—is experimentally less tangible (Fox, Psychonom. Bull. Rev., 2, pp. 145-173, 1995). The dependence on subtle parameter changes (such as the response-stimulus interval) varies between studies. The sensitivity of the negative priming effect bears great potential for applications in research in fields such as memory, selective attention, and ageing effects. We develop and analyse a computational realization, CISAM, of a recent psychological model for action decision making, the ISAM (Kabisch, PhD thesis, Friedrich-Schiller-Universitat, 2003), which is sensitive to priming conditions. With the dynamical systems approach of the CISAM, we show that a single adaptive threshold mechanism is sufficient to explain both positive and negative priming effects. This is achieved by comparing results obtained by the computational modelling with experimental data from our laboratory. The implementation provides a rich base from which testable predictions can be derived, e.g. with respect to hitherto untested stimulus combinations (e.g. single-object trials).

  5. Detecting awareness in patients with disorders of consciousness using a hybrid brain-computer interface

    Science.gov (United States)

    Pan, Jiahui; Xie, Qiuyou; He, Yanbin; Wang, Fei; Di, Haibo; Laureys, Steven; Yu, Ronghao; Li, Yuanqing

    2014-10-01

    Objective. The bedside detection of potential awareness in patients with disorders of consciousness (DOC) currently relies only on behavioral observations and tests; however, the misdiagnosis rates in this patient group are historically relatively high. In this study, we proposed a visual hybrid brain-computer interface (BCI) combining P300 and steady-state evoked potential (SSVEP) responses to detect awareness in severely brain injured patients. Approach. Four healthy subjects, seven DOC patients who were in a vegetative state (VS, n = 4) or minimally conscious state (MCS, n = 3), and one locked-in syndrome (LIS) patient attempted a command-following experiment. In each experimental trial, two photos were presented to each patient; one was the patient's own photo, and the other photo was unfamiliar. The patients were instructed to focus on their own or the unfamiliar photos. The BCI system determined which photo the patient focused on with both P300 and SSVEP detections. Main results. Four healthy subjects, one of the 4 VS, one of the 3 MCS, and the LIS patient were able to selectively attend to their own or the unfamiliar photos (classification accuracy, 66-100%). Two additional patients (one VS and one MCS) failed to attend the unfamiliar photo (50-52%) but achieved significant accuracies for their own photo (64-68%). All other patients failed to show any significant response to commands (46-55%). Significance. Through the hybrid BCI system, command following was detected in four healthy subjects, two of 7 DOC patients, and one LIS patient. We suggest that the hybrid BCI system could be used as a supportive bedside tool to detect awareness in patients with DOC.

  6. Reducing the Digital Divide among Children Who Received Desktop or Hybrid Computers for the Home

    Directory of Open Access Journals (Sweden)

    Gila Cohen Zilka

    2016-06-01

    Full Text Available Researchers and policy makers have been exploring ways to reduce the digital divide. Parameters commonly used to examine the digital divide worldwide, as well as in this study, are: (a) the digital divide in the accessibility and mobility of the ICT infrastructure and of the content infrastructure (e.g., sites used in school); and (b) the digital divide in literacy skills. In the present study we examined the degree of effectiveness of receiving a desktop or hybrid computer for the home in reducing the digital divide among children of low socio-economic status aged 8-12 from various localities across Israel. The sample consisted of 1,248 respondents assessed in two measurements. As part of the mixed-method study, 128 children were also interviewed. Findings indicate that after the children received desktop or hybrid computers, changes occurred in their frequency of access, mobility, and computer literacy. Differences were found between the groups: hybrid computers reduce disparities and promote work with the computer and surfing the Internet more than do desktop computers. Narrowing the digital divide for this age group has many implications for the acquisition of skills and study habits, and consequently, for the realization of individual potential. The children spoke about self improvement as a result of exposure to the digital environment, about a sense of empowerment and of improvement in their advantage in the social fabric. Many children expressed a desire to continue their education and expand their knowledge of computer applications, the use of software, of games, and more. Therefore, if there is no computer in the home and it is necessary to decide between a desktop and a hybrid computer, a hybrid computer is preferable.

  7. Blueprinting Approach in Support of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Willem-Jan van den Heuvel

    2012-03-01

    Full Text Available Current cloud service offerings, i.e., Software-as-a-service (SaaS), Platform-as-a-service (PaaS) and Infrastructure-as-a-service (IaaS) offerings, are often provided as monolithic, one-size-fits-all solutions and give little or no room for customization. This limits the ability of Service-based Application (SBA) developers to configure and syndicate offerings from multiple SaaS, PaaS, and IaaS providers to address their application requirements. Furthermore, combining different independent cloud services necessitates a uniform description format that facilitates the design, customization, and composition. Cloud Blueprinting is a novel approach that allows SBA developers to easily design, configure and deploy virtual SBA payloads on virtual machines and resource pools on the cloud. We propose the Blueprint concept as a uniform abstract description for cloud service offerings that may cross different cloud computing layers, i.e., SaaS, PaaS and IaaS. To support developers with the SBA design and development in the cloud, this paper introduces a formal Blueprint Template for unambiguously describing a blueprint, as well as a Blueprint Lifecycle that guides developers through the manipulation, composition and deployment of different blueprints for an SBA. Finally, the empirical evaluation of the blueprinting approach within an EC’s FP7 project is reported and an associated blueprint prototype implementation is presented.

  8. A hybrid modelling approach to simulating foot-and-mouth disease outbreaks in Australian livestock

    Directory of Open Access Journals (Sweden)

    Richard A Bradhurst

    2015-03-01

    Full Text Available Foot-and-mouth disease (FMD) is a highly contagious and economically important viral disease of cloven-hoofed animals. Australia's freedom from FMD underpins a valuable trade in live animals and animal products. An outbreak of FMD would result in the loss of export markets and cause severe disruption to domestic markets. The prevention of, and contingency planning for, FMD are of key importance to government, industry, producers and the community. The spread and control of FMD is complex and dynamic due to a highly contagious multi-host pathogen operating in a heterogeneous environment across multiple jurisdictions. Epidemiological modelling is increasingly being recognized as a valuable tool for investigating the spread of disease under different conditions and the effectiveness of control strategies. Models of infectious disease can be broadly classified as: population-based models that are formulated from the top-down and employ population-level relationships to describe individual-level behaviour, individual-based models that are formulated from the bottom-up and aggregate individual-level behaviour to reveal population-level relationships, or hybrid models which combine the two approaches into a single model. The Australian Animal Disease Spread (AADIS) hybrid model employs a deterministic equation-based model (EBM) to model within-herd spread of FMD, and a stochastic, spatially-explicit agent-based model (ABM) to model between-herd spread and control. The EBM provides concise and computationally efficient predictions of herd prevalence and clinical signs over time. The ABM captures the complex, stochastic and heterogeneous environment in which an FMD epidemic operates. The AADIS event-driven hybrid EBM/ABM architecture is a flexible, efficient and extensible framework for modelling the spread and control of disease in livestock on a national scale. We present an overview of the AADIS hybrid approach and a description of the model.
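    The hybrid EBM/ABM idea can be condensed to two alternating layers: a deterministic SIR-style step advances infection within each herd, while a stochastic step spreads infection between herds. The rates, contact probability and seeding rule below are invented for illustration; the AADIS model itself is far richer (spatially explicit, event-driven, with control measures).

```python
# Hybrid EBM/ABM sketch: deterministic within-herd SIR dynamics plus
# stochastic between-herd transmission. Herd state = (S, I, R) fractions.
import random

def sir_step(state, beta=0.6, gamma=0.2, dt=1.0):
    s, i, r = state
    new_inf = beta * s * i * dt       # new infections within the herd
    new_rec = gamma * i * dt          # recoveries
    return (s - new_inf, i + new_inf - new_rec, r + new_rec)

def simulate(n_herds=5, days=30, p_contact=0.1, rng=random.Random(7)):
    herds = [(0.99, 0.01, 0.0)] + [(1.0, 0.0, 0.0)] * (n_herds - 1)
    for _ in range(days):
        herds = [sir_step(h) for h in herds]                  # EBM layer
        for j, h in enumerate(herds):                         # ABM layer
            # an uninfected herd may be seeded if any herd is prevalent
            if h[1] == 0.0 and any(src[1] > 0.05 for src in herds):
                if rng.random() < p_contact:
                    herds[j] = (h[0] - 0.01, 0.01, h[2])
    return herds

herds = simulate()
print(sum(1 for h in herds if h[1] > 0 or h[2] > 0), "herds ever infected")
```

The split mirrors the paper's rationale: the EBM is cheap and smooth where aggregate dynamics suffice, while stochastic events are reserved for the between-herd layer where heterogeneity matters.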

  9. Hybrid hierarchical bio-based materials: Development and characterization through experimentation and computational simulations

    Science.gov (United States)

    Haq, Mahmoodul

    Environmentally friendly bio-based composites with improved properties can be obtained by harnessing the synergy offered by hybrid constituents such as multiscale (nano- and micro-scale) reinforcement in bio-based resins composed of blends of synthetic and natural resins. Bio-based composites have recently gained much attention due to their low cost, environmental appeal and their potential to compete with synthetic composites. The advantage of multiscale reinforcement is that it offers synergy at various length scales, and when combined with bio-based resins provides stiffness-toughness balance, improved thermal and barrier properties, and increased environmental appeal to the resulting composites. Moreover, these hybrid materials are tailorable in performance and in environmental impact. While the use of different concepts of multiscale reinforcement has been studied for synthetic composites, the study of multiphase/multiscale reinforcements for developing new types of sustainable materials is limited. The research summarized in this dissertation focused on the development of multiscale-reinforced bio-based composites and the effort to understand and exploit the synergy of their constituents through experimental characterization and computational simulations. Bio-based composites consisting of a petroleum-based resin (unsaturated polyester), natural or bio-resin (epoxidized soybean and linseed oils), natural fibers (industrial hemp), and nanosilicate (nanoclay) inclusions were developed. The work followed the "materials by design" philosophy by incorporating an integrated experimental and computational approach to strategically explore the design possibilities and limits. Experiments demonstrated that the drawbacks of bio-resin addition, which lowers stiffness and strength and increases permeability, can be counter-balanced through nanoclay reinforcement. Bio-resin addition yields benefits in impact strength and ductility; conversely, nanoclay enhances stiffness.

  10. Hybrid Computational Simulation and Study of Terahertz Pulsed Photoconductive Antennas

    Science.gov (United States)

    Emadi, R.; Barani, N.; Safian, R.; Nezhad, A. Zeidaabadi

    2016-08-01

    A photoconductive antenna (PCA) has been numerically investigated in the terahertz (THz) frequency band based on a hybrid simulation method. This hybrid method utilizes an optoelectronic solver, Silvaco TCAD, and a full-wave electromagnetic solver, CST. The optoelectronic solver is used to find the accurate THz photocurrent by considering realistic material parameters. Performance of photoconductive antennas and temporal behavior of the excited photocurrent for various active region geometries such as bare-gap electrode, interdigitated electrodes, and tip-to-tip rectangular electrodes are investigated. Moreover, investigations have been done on the center of the laser illumination on the substrate, substrate carrier lifetime, and diffusion photocurrent associated with the carriers temperature, to achieve efficient and accurate photocurrent. Finally, using the full-wave electromagnetic solver and the calculated photocurrent obtained from the optoelectronic solver, electromagnetic radiation of the antenna and its associated detected THz signal are calculated and compared with a measurement reference for verification.

  11. Performance Comparison of Hybrid Signed Digit Arithmetic in Efficient Computing

    Directory of Open Access Journals (Sweden)

    VISHAL AWASTHI

    2011-10-01

    Full Text Available In redundant representations, addition can be carried out in a constant time independent of the word length of the operands. The adder forms a fundamental building block in the majority of VLSI designs. A hybrid adder can add an unsigned number to a signed-digit number, and hence its performance greatly determines the quality of the final output of the concerned circuit. In this paper we design and compare the speed of adders, reducing the carry propagation time through the combined effect of improved adder architectures and signed-digit representations of number systems. The key idea is to strike a compromise between the execution time of the fast adding process and the available area, which is often very limited. We also verify various algorithms for signed-digit and hybrid signed-digit adders.
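    The constant-time property rests on carry-free addition of signed-digit numbers. For radix-2 digits in {-1, 0, 1}, each position chooses a transfer digit t and interim sum w from the local digit sum (peeking only one position down), and the final digit s_i = w_i + t_{i-1} never needs a carry chain. The sketch below is a textbook-style software illustration of that rule, not the authors' circuit; digit lists are least-significant first.

```python
# Carry-free addition of radix-2 signed-digit numbers (digits -1/0/1).
# Per position: x_i + y_i = 2*t_{i+1} + w_i, with (t, w) chosen so that
# s_i = w_i + t_{i-1} always stays in {-1, 0, 1}.

def sd_add(x, y):
    n = max(len(x), len(y))
    x = x + [0] * (n - len(x))
    y = y + [0] * (n - len(y))
    p = [x[i] + y[i] for i in range(n)]           # position sums in -2..2
    t, w = [0] * (n + 1), [0] * n
    for i in range(n):
        lower_nonneg = (i == 0) or p[i - 1] >= 0  # peek one position down
        if p[i] == 2:    t[i + 1], w[i] = 1, 0
        elif p[i] == 1:  t[i + 1], w[i] = (1, -1) if lower_nonneg else (0, 1)
        elif p[i] == -1: t[i + 1], w[i] = (0, -1) if lower_nonneg else (-1, 1)
        elif p[i] == -2: t[i + 1], w[i] = -1, 0
    return [w[i] + t[i] for i in range(n)] + [t[n]]

def sd_value(digits):
    return sum(d * (1 << i) for i, d in enumerate(digits))

a, b = [1, 0, -1, 1], [1, 1, 0, -1]   # 5 and -5 in signed-digit form
s = sd_add(a, b)
print(s, sd_value(s))
```

Because each sum digit depends only on its own and the next-lower position, all positions can be computed in parallel, which is exactly what makes hardware signed-digit adders word-length independent.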

  12. Hybrid Computational Simulation and Study of Terahertz Pulsed Photoconductive Antennas

    Science.gov (United States)

    Emadi, R.; Barani, N.; Safian, R.; Nezhad, A. Zeidaabadi

    2016-11-01

    A photoconductive antenna (PCA) has been numerically investigated in the terahertz (THz) frequency band using a hybrid simulation method. This hybrid method couples an optoelectronic solver, Silvaco TCAD, with a full-wave electromagnetic solver, CST. The optoelectronic solver is used to compute an accurate THz photocurrent by considering realistic material parameters. The performance of photoconductive antennas and the temporal behavior of the excited photocurrent are investigated for various active-region geometries, such as bare-gap, interdigitated, and tip-to-tip rectangular electrodes. Moreover, the position of the laser illumination on the substrate, the substrate carrier lifetime, and the diffusion photocurrent associated with the carrier temperature are investigated to achieve an efficient and accurate photocurrent. Finally, using the full-wave electromagnetic solver and the photocurrent obtained from the optoelectronic solver, the electromagnetic radiation of the antenna and its associated detected THz signal are calculated and compared with a measurement reference for verification.

  13. Computational and experimental study of air hybrid engine concepts

    OpenAIRE

    Lee, Cho-Yu

    2011-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University The air hybrid engine absorbs the vehicle kinetic energy during braking, stores it in an air tank in the form of compressed air, and reuses it to start the engine and to propel a vehicle during cruising and acceleration. Capturing, storing and reusing this braking energy to achieve stop-start operation and to give additional power can therefore improve fuel economy, particularly in cities and ...

  14. Hybrid computer techniques for solving partial differential equations

    Science.gov (United States)

    Hammond, J. L., Jr.; Odowd, W. M.

    1971-01-01

    The techniques overcome equipment limitations that restrict other computer techniques to solving trivial cases. The use of curve fitting by quadratic interpolation greatly reduces the required digital storage space.
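
    The storage-saving idea can be illustrated with a small sketch (an assumed interpretation of the abstract, not the original hybrid-computer code): only every k-th sample of a smooth function is stored, and intermediate values are regenerated on demand by three-point quadratic interpolation.

```python
# Storing only every `stride`-th sample of a smooth function and
# regenerating intermediate values by three-point quadratic (Lagrange)
# interpolation -- the trade of digital storage for a little arithmetic
# described in the abstract.  Names are illustrative.

def quad_interp(x0, y0, x1, y1, x2, y2, x):
    """Second-order Lagrange interpolation through three points."""
    l0 = (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
    l1 = (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
    l2 = (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1))
    return y0 * l0 + y1 * l1 + y2 * l2

def compress(xs, ys, stride):
    """Keep every `stride`-th sample (plus the last one)."""
    idx = list(range(0, len(xs), stride))
    if idx[-1] != len(xs) - 1:
        idx.append(len(xs) - 1)
    return [xs[i] for i in idx], [ys[i] for i in idx]

def lookup(cx, cy, x):
    """Evaluate the compressed table at x via quadratic interpolation."""
    # pick the stored node nearest to x and use it with its neighbours
    j = min(range(len(cx)), key=lambda i: abs(cx[i] - x))
    j = max(1, min(j, len(cx) - 2))
    return quad_interp(cx[j-1], cy[j-1], cx[j], cy[j], cx[j+1], cy[j+1], x)
```

    Since quadratic interpolation is exact for polynomials up to degree two, a table compressed this way reproduces such functions exactly; for general smooth functions the error shrinks rapidly with the node spacing.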

  15. When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System

    KAUST Repository

    Liu, Xiao

    2017-03-21

    Privacy risks of recommender systems have attracted increasing attention. Users' private data is often collected by a potentially untrusted recommender system in order to provide high-quality recommendations. Meanwhile, malicious attackers may use recommendation results to make inferences about other users' private data. Existing approaches focus either on keeping users' private data protected during recommendation computation or on preventing the inference of any single user's data from the recommendation result. However, none is designed both to hide users' private data and to prevent privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems that combines differential privacy (DP) with randomized perturbation (RP). We theoretically show that the noise added by RP has limited effect on recommendation accuracy and that the noise added by DP can be well controlled based on the sensitivity analysis of functions on the perturbed data. Extensive experiments on three large-scale real-world datasets show that the hybrid approach generally provides more privacy protection with acceptable loss of recommendation accuracy, and sometimes even achieves better privacy without sacrificing accuracy, validating its feasibility in practice.
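
    The two building blocks can be sketched generically (these are the textbook DP and RP mechanisms, not the paper's algorithm; all names are illustrative): randomized perturbation hides raw ratings on the client side, while the Laplace mechanism of differential privacy protects an aggregate computed from the perturbed data.

```python
# Minimal sketches of randomized perturbation (RP) and the Laplace
# mechanism of differential privacy (DP).  Illustrative only.

import math
import random

def rp_perturb(ratings, scale=0.5, rng=random):
    """Client-side randomized perturbation: add bounded zero-mean noise."""
    return [r + rng.uniform(-scale, scale) for r in ratings]

def laplace_noise(sensitivity, epsilon, rng=random):
    """Sample from Laplace(0, sensitivity/epsilon) via the inverse CDF."""
    u = rng.random() - 0.5
    b = sensitivity / epsilon
    return -b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_average(values, lo, hi, epsilon, rng=random):
    """epsilon-DP average of values clipped to [lo, hi].

    Clipping bounds each value's influence, so the sensitivity of the
    average is (hi - lo) / n, which calibrates the Laplace noise.
    """
    n = len(values)
    clipped = [min(max(v, lo), hi) for v in values]
    return sum(clipped) / n + laplace_noise((hi - lo) / n, epsilon, rng)
```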

  16. CAPACITATED LOT SIZING AND SCHEDULING PROBLEMS USING HYBRID GA/TS APPROACHES

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The capacitated lot sizing and scheduling problem involves determining the production amounts and release dates for several items over a given planning horizon so as to meet dynamic order demand without incurring backlogging. The problem is studied here with overtime capacity taken into account. The mathematical model is presented, and a genetic algorithm (GA) approach is developed to solve the problem. Initial solutions are generated by a heuristic method, and a capacity-balancing procedure is employed to ensure the feasibility of the solutions. In addition, a technique based on Tabu search (TS) is embedded in the genetic algorithm to handle the scheduled overtime and to help the algorithm converge. Computational simulation is conducted to test the efficiency of the proposed hybrid approach, which turns out to improve both solution quality and execution speed.

  17. Broadband ground-motion simulation using a hybrid approach

    Science.gov (United States)

    Graves, R.W.; Pitarka, A.

    2010-01-01

    This paper describes refinements to the hybrid broadband ground-motion simulation methodology of Graves and Pitarka (2004), which combines a deterministic approach at low frequencies (f < 1 Hz) with a semistochastic approach at high frequencies (f > 1 Hz). In our approach, fault rupture is represented kinematically and incorporates spatial heterogeneity in slip, rupture speed, and rise time. The prescribed slip distribution is constrained to follow an inverse wavenumber-squared fall-off and the average rupture speed is set at 80% of the local shear-wave velocity, which is then adjusted such that the rupture propagates faster in regions of high slip and slower in regions of low slip. We use a Kostrov-like slip-rate function having a rise time proportional to the square root of slip, with the average rise time across the entire fault constrained empirically. Recent observations from large surface rupturing earthquakes indicate a reduction of rupture propagation speed and lengthening of rise time in the near surface, which we model by applying a 70% reduction of the rupture speed and increasing the rise time by a factor of 2 in a zone extending from the surface to a depth of 5 km. We demonstrate the fidelity of the technique by modeling the strong-motion recordings from the Imperial Valley, Loma Prieta, Landers, and Northridge earthquakes.
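
    The merge at the crossover frequency can be sketched with a complementary filter pair (a simplified stand-in for the matched filters used in practice; `alpha` plays the role of the roughly 1 Hz crossover for a given sampling rate): low frequencies are taken from the deterministic synthetic, highs from the stochastic one, and when both inputs agree the original signal is recovered exactly.

```python
# Illustrative low/high-frequency merge of two seismograms using a
# complementary filter pair.  Not the authors' code.

def lowpass(x, alpha):
    """Simple one-pole low-pass (exponential smoothing)."""
    y, out = x[0], []
    for v in x:
        y += alpha * (v - y)
        out.append(y)
    return out

def hybrid_merge(deterministic, stochastic, alpha=0.2):
    """Low frequencies from `deterministic`, highs from `stochastic`.

    The high-pass is defined as (signal - low-pass of signal), so the
    two branches are exactly complementary at every frequency.
    """
    lo = lowpass(deterministic, alpha)
    hi = [s - l for s, l in zip(stochastic, lowpass(stochastic, alpha))]
    return [a + b for a, b in zip(lo, hi)]
```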

  18. A Hybrid Circular Queue Method for Iterative Stencil Computations on GPUs

    Institute of Scientific and Technical Information of China (English)

    Yang Yang; Hui-Min Cui; Xiao-Bing Feng; Jing-Ling Xue

    2012-01-01

    In this paper, we present a hybrid circular queue method that can significantly boost the performance of stencil computations on GPU by carefully balancing the usage of registers and shared memory. Unlike earlier methods that rely on circular queues predominantly implemented using indirectly addressable shared memory, our hybrid method exploits a new reuse pattern spanning across the multiple time steps in stencil computations so that circular queues can be implemented by both shared memory and registers effectively in a balanced manner. We describe a framework that automatically finds the best placement of data in registers and shared memory in order to maximize the performance of stencil computations. Validation using four different types of stencils on three different GPU platforms shows that our hybrid method achieves speedups of up to 2.93X over methods that use circular queues implemented with shared memory only.
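
    The circular-queue idea itself is independent of the GPU details and can be shown on the CPU (a deliberately simplified sketch, nothing like the paper's register/shared-memory placement): time-stepping a 3-point stencil while keeping only two rows in a small rotating buffer instead of allocating a new grid per step.

```python
# A CPU-side sketch of a circular queue for stencil time stepping.
# The paper's contribution is placing such queues in GPU registers and
# shared memory; here we only show the rotation logic.

def stencil_naive(u, steps):
    """Reference implementation: build a full new list every time step."""
    for _ in range(steps):
        u = [u[0]] + [(u[i-1] + u[i] + u[i+1]) / 3.0
                      for i in range(1, len(u) - 1)] + [u[-1]]
    return u

def stencil_circular(u, steps):
    """Same stencil using a two-slot circular queue of rows."""
    buf = [list(u), [0.0] * len(u)]
    cur = 0
    for _ in range(steps):
        nxt = 1 - cur
        buf[nxt][0], buf[nxt][-1] = buf[cur][0], buf[cur][-1]  # boundaries
        for i in range(1, len(u) - 1):
            buf[nxt][i] = (buf[cur][i-1] + buf[cur][i] + buf[cur][i+1]) / 3.0
        cur = nxt            # rotate the queue instead of reallocating
    return buf[cur]
```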

  19. Effective hybrid evolutionary computational algorithms for global optimization and applied to construct prion AGAAAAGA fibril models

    CERN Document Server

    Zhang, Jiapu

    2010-01-01

    Evolutionary algorithms are parallel computing algorithms, whereas simulated annealing is a sequential computing algorithm. This paper inserts simulated annealing into evolutionary computation and successfully develops a hybrid self-adaptive evolutionary strategy ($\mu+\lambda$) method and a hybrid self-adaptive classical evolutionary programming method. Numerical results on more than 40 benchmark test problems of global optimization show that the hybrid methods presented in this paper are very effective. Lennard-Jones potential energy minimization is another benchmark for testing new global optimization algorithms; it is studied here through amyloid fibril constructions. To date, there is little molecular structural data available on the AGAAAAGA palindrome in the hydrophobic region (113-120) of prion proteins. This region belongs to the N-terminal unstructured region (1-123) of prion proteins, the structure of which has proved hard to determine using NMR spectroscopy or X-ray crystallography ...
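
    One plausible reading of the hybridization (an assumed interpretation for illustration, not the author's code) is a ($\mu+\lambda$) evolution strategy whose offspring pool admits worse children with a simulated-annealing acceptance probability that decays over the generations:

```python
# Toy hybrid of a (mu+lambda) evolution strategy with simulated-
# annealing acceptance, demonstrated on the sphere function.

import math
import random

def sphere(x):
    return sum(v * v for v in x)

def hybrid_es_sa(f, dim=3, mu=10, lam=20, gens=150, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-3, 3) for _ in range(dim)] for _ in range(mu)]
    best = min(pop, key=f)
    for g in range(gens):
        temp = 1.0 * 0.95 ** g       # annealing temperature
        sigma = 0.5 * 0.98 ** g      # shrinking mutation step
        offspring = []
        for _ in range(lam):
            parent = rng.choice(pop)
            child = [v + rng.gauss(0.0, sigma) for v in parent]
            delta = f(child) - f(parent)
            # SA acceptance: a worse child survives with Boltzmann
            # probability exp(-delta / temp), which shrinks over time.
            if delta <= 0 or rng.random() < math.exp(-delta / max(temp, 1e-12)):
                offspring.append(child)
        pop = sorted(pop + offspring, key=f)[:mu]   # (mu+lambda) selection
        if f(pop[0]) < f(best):
            best = pop[0]
    return best
```

    Early on, the high temperature lets the population escape local basins; later, the elitist ($\mu+\lambda$) selection dominates and the search converges.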

  20. Multi-level and hybrid modelling approaches for systems biology.

    Science.gov (United States)

    Bardini, R; Politano, G; Benso, A; Di Carlo, S

    2017-01-01

    During the last decades, high-throughput techniques have allowed the extraction of huge amounts of data from biological systems, unveiling more of their underlying complexity. Biological systems encompass a wide range of space and time scales, functioning according to flexible hierarchies of mechanisms that form an intertwined and dynamic interplay of regulations. This becomes particularly evident in processes such as ontogenesis, where regulative assets change according to process context and timing, and where structural phenotype and architectural complexities emerge from a single cell through local interactions. The information collected from biological systems is naturally organized according to the functional levels composing the system itself. In systems biology, biological information often comes from overlapping but different scientific domains, each having its own way of representing the phenomena under study; that is, the different parts of the system to be modelled may be described with different formalisms. For a model to be more accurate and to form a good knowledge base, it should comprise different system levels and suitably handle the relative formalisms. Models that are both multi-level and hybrid satisfy both these requirements, making them very useful tools in computational systems biology. This paper reviews some of the main contributions in this field.

  1. Hybrid genetic algorithm approach for selective harmonic control

    Energy Technology Data Exchange (ETDEWEB)

    Dahidah, Mohamed S.A. [Faculty of Engineering, Multimedia University, 63100, Jalan Multimedia-Cyberjaya, Selangor (Malaysia); Agelidis, Vassilios G. [School of Electrical and Information Engineering, The University of Sydney, NSW (Australia); Rao, Machavaram V. [Faculty of Engineering and Technology, Multimedia University, 75450, Jalan Ayer Keroh Lama-Melaka (Malaysia)

    2008-02-15

    The paper presents an optimal solution for a selective harmonic elimination pulse-width modulated (SHE-PWM) technique suitable for a high-power inverter used in constant-frequency utility applications. The main challenge in solving the associated non-linear equations, which are transcendental in nature and therefore have multiple solutions, is convergence; an initial point selected considerably close to the exact solution is therefore required. The paper discusses an efficient hybrid real-coded genetic algorithm (HRCGA) that significantly reduces the computational burden, resulting in fast convergence. An objective function describing a measure of the effectiveness of eliminating selected orders of harmonics while controlling the fundamental, namely a weighted total harmonic distortion (WTHD), is derived, and a comparison of different operating points is reported. It is observed that the method was able to find the optimal solution for a modulation index higher than unity. The theoretical considerations reported in this paper are verified through simulation and experimentally on a low-power laboratory prototype. (author)

  2. A Hybrid Approach to Spatial Multiplexing in Multiuser MIMO Downlinks

    Directory of Open Access Journals (Sweden)

    Spencer Quentin H

    2004-01-01

    Full Text Available In the downlink of a multiuser multiple-input multiple-output (MIMO) communication system, simultaneous transmission to several users requires joint optimization of the transmitted signals. Allowing all users to have multiple antennas adds an additional degree of complexity to the problem. In this paper, we examine the case where a single base station transmits to multiple users using linear processing (beamforming) at each of the antenna arrays. We propose generalizations of several previous iterative algorithms for multiuser transmit beamforming that allow multiple antennas and multiple data streams for each user, and that take into account imperfect channel estimates at the transmitter. We then present a new hybrid algorithm that is based on coordinated transmit-receive beamforming and combines the strengths of nonorthogonal iterative solutions with zero-forcing solutions. The problem of distributing power among the subchannels is solved by using standard bit-loading algorithms combined with the subchannel gains resulting from the zero-forcing solution. The result is a significant performance improvement over equal power distribution. At the same time, the number of iterations required to compute the final solution is reduced.
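
    The zero-forcing ingredient of the hybrid scheme can be shown in its simplest form (an illustrative special case, not the paper's multi-stream algorithm): with a 2-antenna base station and two single-antenna users, precoding with the channel inverse makes each user see only its own data stream.

```python
# Zero-forcing precoding for a 2x2 complex downlink channel: the
# effective channel H @ W becomes diagonal, so inter-user interference
# vanishes.  Minimal illustrative case.

def zf_precoder(H):
    """Inverse of a 2x2 complex channel matrix H (rows = users)."""
    (a, b), (c, d) = H
    det = a * d - b * c
    return [[d / det, -b / det],
            [-c / det, a / det]]

def matmul2(A, B):
    """2x2 complex matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]
```

    In practice the columns of the precoder are rescaled to meet the power budget, and the resulting per-user subchannel gains feed the bit-loading step described in the abstract.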

  3. Hybrid computational phantoms of the male and female newborn patient: NURBS-based whole-body models

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Choonsik [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL 32611 (United States); Lodwick, Daniel [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL 32611 (United States); Hasenauer, Deanna [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL 32611 (United States); Williams, Jonathan L [Department of Radiology, University of Florida, Gainesville, FL 32611 (United States); Lee, Choonik [MD Anderson Cancer Center-Orlando, Orlando, FL 32806 (United States); Bolch, Wesley E [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL 32611 (United States)

    2007-07-21

    phantom is performed in three steps: polygonization of the voxel phantom, organ modeling via NURBS surfaces and phantom voxelization. Two 3D graphic tools, 3D-DOCTOR(TM) and Rhinoceros(TM), were utilized to polygonize the newborn voxel phantom and generate NURBS surfaces, while an in-house MATLAB(TM) code was used to voxelize the resulting NURBS model into a final computational phantom ready for use in Monte Carlo radiation transport calculations. A total of 126 anatomical organ and tissue models, including 38 skeletal sites and 31 cartilage sites, were described within the hybrid phantom using either NURBS or polygon surfaces. A male hybrid newborn phantom was constructed following the development of the female phantom through the replacement of female-specific organs with male-specific organs. The outer body contour and internal anatomy of the NURBS-based phantoms were adjusted to match anthropometric and reference newborn data reported by the International Commission on Radiological Protection in their Publication 89. The voxelization process was designed to accurately convert NURBS models to a voxel phantom with minimum volumetric change. A sensitivity study was additionally performed to better understand how the meshing tolerance and voxel resolution would affect volumetric changes between the hybrid-NURBS and hybrid-voxel phantoms. The male and female hybrid-NURBS phantoms were constructed in a manner so that all internal organs approached their ICRP reference masses to within 1%, with the exception of the skin (-6.5% relative error) and brain (-15.4% relative error). Both hybrid-voxel phantoms were constructed with an isotropic voxel resolution of 0.663 mm, equivalent to the ICRP 89 reference thickness of the newborn skin (dermis and epidermis). Hybrid-NURBS phantoms used to create their voxel counterpart retain the non-uniform scalability of stylized phantoms, while maintaining the anatomic realism of segmented voxel phantoms with respect to organ shape, depth and

  4. Hybrid computational phantoms of the male and female newborn patient: NURBS-based whole-body models

    Science.gov (United States)

    Lee, Choonsik; Lodwick, Daniel; Hasenauer, Deanna; Williams, Jonathan L.; Lee, Choonik; Bolch, Wesley E.

    2007-07-01

    phantom is performed in three steps: polygonization of the voxel phantom, organ modeling via NURBS surfaces and phantom voxelization. Two 3D graphic tools, 3D-DOCTOR™ and Rhinoceros™, were utilized to polygonize the newborn voxel phantom and generate NURBS surfaces, while an in-house MATLAB™ code was used to voxelize the resulting NURBS model into a final computational phantom ready for use in Monte Carlo radiation transport calculations. A total of 126 anatomical organ and tissue models, including 38 skeletal sites and 31 cartilage sites, were described within the hybrid phantom using either NURBS or polygon surfaces. A male hybrid newborn phantom was constructed following the development of the female phantom through the replacement of female-specific organs with male-specific organs. The outer body contour and internal anatomy of the NURBS-based phantoms were adjusted to match anthropometric and reference newborn data reported by the International Commission on Radiological Protection in their Publication 89. The voxelization process was designed to accurately convert NURBS models to a voxel phantom with minimum volumetric change. A sensitivity study was additionally performed to better understand how the meshing tolerance and voxel resolution would affect volumetric changes between the hybrid-NURBS and hybrid-voxel phantoms. The male and female hybrid-NURBS phantoms were constructed in a manner so that all internal organs approached their ICRP reference masses to within 1%, with the exception of the skin (-6.5% relative error) and brain (-15.4% relative error). Both hybrid-voxel phantoms were constructed with an isotropic voxel resolution of 0.663 mm—equivalent to the ICRP 89 reference thickness of the newborn skin (dermis and epidermis). Hybrid-NURBS phantoms used to create their voxel counterpart retain the non-uniform scalability of stylized phantoms, while maintaining the anatomic realism of segmented voxel phantoms with respect to organ shape, depth and

  5. Fast and accurate earthquake location within complex medium using a hybrid global-local inversion approach

    Institute of Scientific and Technical Information of China (English)

    Chaoying Bai; Rui Zhao; Stewart Greenhalgh

    2009-01-01

    A novel hybrid approach for earthquake location is proposed which uses a combined coarse global search and fine local inversion with a minimum search routine, plus an examination of the root mean squares (RMS) error distribution. The method exploits the advantages of network ray tracing and robust formulation of the Frechet derivatives to simultaneously update all possible initial source parameters around most local minima (including the global minimum) in the solution space, and finally to determine the likely global solution. Several synthetic examples involving a 3-D complex velocity model and a challenging source-receiver layout are used to demonstrate the capability of the newly developed method. This new global-local hybrid solution technique not only incorporates the significant benefits of our recently published hypocenter determination procedure for multiple earthquake parameters, but also offers the attractive features of globally optimal searching in the RMS travel-time error distribution. Unlike a traditional global search method such as the Monte Carlo approach, where millions of tests have to be done to find the final global solution, the new method conducts only a matrix-inversion-type local search, but does so multiple times simultaneously throughout the model volume to seek a global solution. The search is aided by inspection of the RMS error distribution. Benchmark tests against two popular approaches, the direct grid search method and the oct-tree importance sampling method, indicate that the hybrid global-local inversion yields comparable location accuracy and is not sensitive to modest levels of noise in the data; more importantly, it offers a two-order-of-magnitude speed-up in computational effort. Such an improvement, combined with high accuracy, makes it a promising hypocenter determination scheme for earthquake early warning, tsunami early warning, rapid hazard assessment and emergency response after a strong earthquake.
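
    The coarse-global plus fine-local structure can be illustrated with a toy 2-D example (constant velocity and a shrinking-step local search stand in for the paper's ray tracing and Frechet-derivative inversion): a grid scan of the RMS travel-time residual supplies the starting point, and a local search polishes it.

```python
# Toy hybrid global-local hypocenter search in a 2-D constant-velocity
# medium.  Illustrative only -- the paper uses network ray tracing in a
# 3-D heterogeneous model.

import math

def rms(src, stations, t_obs, v=3.0):
    """RMS of travel-time residuals for a trial source location."""
    res = [math.dist(src, s) / v - t for s, t in zip(stations, t_obs)]
    return math.sqrt(sum(r * r for r in res) / len(res))

def locate(stations, t_obs):
    # 1) coarse global grid search over a 10 km x 10 km region
    best = min(((x, y) for x in range(11) for y in range(11)),
               key=lambda p: rms(p, stations, t_obs))
    best = [float(best[0]), float(best[1])]
    # 2) fine local refinement by shrinking-step coordinate search
    step = 1.0
    while step > 1e-6:
        improved = False
        for i in (0, 1):
            for d in (step, -step):
                cand = list(best)
                cand[i] += d
                if rms(cand, stations, t_obs) < rms(best, stations, t_obs):
                    best, improved = cand, True
        if not improved:
            step /= 2
    return tuple(best)
```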

  6. High-fidelity quantum memory using nitrogen-vacancy center ensemble for hybrid quantum computation

    CERN Document Server

    Yang, W L; Hu, Y; Feng, M; Du, J F

    2011-01-01

    We study a hybrid quantum computing system using a nitrogen-vacancy center ensemble (NVE) as quantum memory, a current-biased Josephson junction (CBJJ) superconducting qubit fabricated in a transmission line resonator (TLR) as the quantum computing processor, and the microwave photons in the TLR as the quantum data bus. The storage process is treated rigorously by considering all relevant decoherence mechanisms. Such a hybrid quantum device can also be used to create multi-qubit W states of NVEs through a common CBJJ. The experimental feasibility and challenges are assessed using currently available technology.

  7. Hybrid Computational Model for High-Altitude Aeroassist Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort addresses a need for accurate computational models to support aeroassist and entry vehicle system design over a broad range of flight conditions...

  8. Hybrid PSO-MOBA for Profit Maximization in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dr. Salu George

    2015-02-01

    Full Text Available Cloud service providers, infrastructure vendors and clients/cloud users are the main actors in any cloud enterprise, such as Amazon Web Services' cloud or Google's cloud. These enterprises take care of infrastructure deployment and cloud services management (IaaS/PaaS/SaaS). Cloud users need to specify the correct amount of services needed and the characteristics of their workload in order to avoid over-provisioning of resources, which is an important pricing factor. Cloud service providers need to manage, and also optimize, the resources to maximize profit. To manage the profit we consider the M/M/m queueing model, which manages the queue of jobs and provides the average execution time. Resource scheduling is one of the main concerns in profit maximization, for which we adopt HYBRID PSO-MOBA as it resolves the global convergence problem, offers faster convergence, has fewer parameters to tune, searches very large problem spaces more easily, and locates the right resource. In HYBRID PSO-MOBA we combine the features of PSO and MOBA to achieve the benefits of both PSO and MOBA and greater compatibility.
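
    The M/M/m quantities the profit model relies on are standard queueing formulas (sketched below from textbook results, not from the paper's code): the Erlang-C probability of waiting, and the mean response time as queueing delay plus service time.

```python
# Standard M/M/m (Erlang-C) formulas: probability that an arriving job
# must wait, and the mean time a job spends in the system.

import math

def erlang_c(m, lam, mu):
    """P(wait) for an M/M/m queue: arrival rate lam, service rate mu."""
    rho = lam / (m * mu)
    assert rho < 1, "queue must be stable (lam < m * mu)"
    a = lam / mu                                  # offered load in Erlangs
    summ = sum(a ** k / math.factorial(k) for k in range(m))
    top = a ** m / (math.factorial(m) * (1 - rho))
    return top / (summ + top)

def mean_response_time(m, lam, mu):
    """Average time in system: expected queueing delay + service time."""
    pw = erlang_c(m, lam, mu)
    return pw / (m * mu - lam) + 1.0 / mu
```

    For m = 1 this reduces to the familiar M/M/1 result 1/(mu - lam), a convenient sanity check.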

  9. A semantic-web approach for modeling computing infrastructures

    NARCIS (Netherlands)

    M. Ghijsen; J. van der Ham; P. Grosso; C. Dumitru; H. Zhu; Z. Zhao; C. de Laat

    2013-01-01

    This paper describes our approach to modeling computing infrastructures. Our main contribution is the Infrastructure and Network Description Language (INDL) ontology. The aim of INDL is to provide technology independent descriptions of computing infrastructures, including the physical resources as w

  10. A Hybrid Approach for Segmentation and Tracking of Myxococcus Xanthus Swarms.

    Science.gov (United States)

    Chen, Jianxu; Alber, Mark S; Chen, Danny Z

    2016-09-01

    Cell segmentation and motion tracking in time-lapse images are fundamental problems in computer vision, and are also crucial for various biomedical studies. Myxococcus xanthus is a rod-like bacterium whose cells exhibit highly coordinated motion. The segmentation and tracking of M. xanthus are challenging, because cells may touch tightly and form dense swarms that are difficult to identify individually in an accurate manner. The known cell tracking approaches mainly fall into two frameworks, detection association and model evolution, each having its own advantages and disadvantages. In this paper, we propose a new hybrid framework combining these two frameworks into one and leveraging their complementary advantages. Also, we propose an active contour model based on the Ribbon Snake, which is seamlessly integrated with our hybrid framework. Evaluated on 10 different datasets, our approach achieves considerable improvement over the state-of-the-art cell tracking algorithms in identifying complete cell trajectories, and higher segmentation accuracy than performing segmentation in individual 2D images.
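
    The detection-association half of such a framework can be sketched in its most basic form (a deliberately simple greedy linker with a gating radius; the paper's method is far more sophisticated): detected cell centroids in consecutive frames are linked by nearest distance, and links beyond the gate are refused.

```python
# Minimal greedy detection-association sketch: link centroids in frame
# t to centroids in frame t+1 by increasing distance, one link per cell,
# rejecting links longer than a gating radius.

import math

def associate(prev, curr, gate=5.0):
    """Return a list of (i, j) links from prev[i] to curr[j]."""
    pairs = sorted(
        ((math.dist(p, c), i, j)
         for i, p in enumerate(prev) for j, c in enumerate(curr)),
        key=lambda t: t[0])
    used_i, used_j, links = set(), set(), []
    for d, i, j in pairs:
        if d <= gate and i not in used_i and j not in used_j:
            links.append((i, j))
            used_i.add(i)
            used_j.add(j)
    return links
```

    Unmatched detections in the new frame would start new tracks, and unmatched old tracks would be terminated or handed to the model-evolution stage.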

  11. Mobile phone use while driving: a hybrid modeling approach.

    Science.gov (United States)

    Márquez, Luis; Cantillo, Víctor; Arellana, Julián

    2015-05-01

    The analysis of the effects that mobile phone use produces while driving is a topic of great interest for the scientific community. There is consensus that using a mobile phone while driving increases the risk of exposure to traffic accidents. The purpose of this research is to evaluate the drivers' behavior when they decide whether or not to use a mobile phone while driving. For that, a hybrid modeling approach that integrates a choice model with the latent variable "risk perception" was used. It was found that workers and individuals with the highest education level are more prone to use a mobile phone while driving than others. Also, "risk perception" is higher among individuals who have been previously fined and people who have been in an accident or almost been in an accident. It was also found that the tendency to use mobile phones while driving increases when the traffic speed reduces, but it decreases when the fine increases. Even though the urgency of the phone call is the most important explanatory variable in the choice model, the cost of the fine is an important attribute in order to control mobile phone use while driving. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Agricultural Tractor Selection: A Hybrid and Multi-Attribute Approach

    Directory of Open Access Journals (Sweden)

    Jorge L. García-Alcaraz

    2016-02-01

    Full Text Available Usually, agricultural tractor investments are assessed using traditional economic techniques that only involve financial attributes, resulting in reductionist evaluations. However, tractors have qualitative and quantitative attributes that must be simultaneously integrated into the evaluation process. This article reports a hybrid, multi-attribute approach to assessing a set of agricultural tractors based on AHP-TOPSIS. To identify the attributes in the model, a survey covering eighteen attributes was given to agricultural machinery salesmen and farmers to determine their importance. The list of attributes was presented to a decision group for a case study, and their importance was estimated using AHP and integrated into the TOPSIS technique. In this case, one tractor was selected from a set of six alternatives, integrating six attributes in the model: initial cost, annual maintenance cost, liters of diesel per hour, operator safety, maintainability and after-sale customer service offered by the supplier. Based on the results obtained, the model can be considered easy to apply, with good acceptance among farmers and salesmen, as there are no special software requirements for its application.
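
    The TOPSIS step is compact enough to sketch (the generic method, not the article's exact implementation; the AHP step would supply the `weights` vector): normalize the decision matrix, weight it, and rank alternatives by relative closeness to the ideal solution.

```python
# Generic TOPSIS ranking.  `benefit[j]` is True when criterion j is
# better-when-larger (e.g. safety) and False when better-when-smaller
# (e.g. initial cost).

import math

def topsis(matrix, weights, benefit):
    ncol = len(weights)
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncol)]
    v = [[row[j] / norms[j] * weights[j] for j in range(ncol)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col)
             for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)    # distance to ideal solution
        d_neg = math.dist(row, worst)    # distance to anti-ideal
        scores.append(d_neg / (d_pos + d_neg))
    return scores
```

    An alternative that is best on every criterion necessarily coincides with the ideal point and scores exactly 1.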

  13. Enhancement of Hyperspectral Real World Images Using Hybrid Domain Approach

    Directory of Open Access Journals (Sweden)

    Shyam Lal

    2013-04-01

    Full Text Available This paper presents enhancement of hyperspectral real-world images using a hybrid domain approach. The proposed method consists of three phases: in the first phase, the discrete wavelet transform is applied and the approximation coefficient is selected. In the second phase, the approximation coefficient of the discrete wavelet transform of the image is processed by an automatic contrast adjustment technique. In the third phase, the logarithm of the output of the second phase is taken, and then adaptive filtering is applied for image enhancement in the frequency domain. To judge the superiority of the proposed method, image quality parameters such as the measure of enhancement (EME) and the measure of enhancement factor (EMF) are evaluated; better values of EME and EMF imply that the visual quality of the enhanced image is good. Simulation results indicate that the proposed method provides better results compared to other state-of-the-art contrast enhancement algorithms for hyperspectral real-world images. The proposed method is an efficient and very effective method for contrast enhancement of hyperspectral real-world images, and can also be used in different applications where images suffer from contrast problems.
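
    The first phase rests on the wavelet split being perfectly invertible, which a single-level Haar transform shows in a few lines (an illustrative 1-D stand-in; the paper works on 2-D hyperspectral bands and then contrast-adjusts the approximation band):

```python
# Single-level orthonormal Haar wavelet split of a 1-D signal into
# approximation (low-pass) and detail (high-pass) coefficients, plus the
# exact inverse.  The approximation band is what phase two would
# contrast-adjust before reconstruction.

import math

def haar_forward(x):
    """One-level Haar DWT of an even-length signal."""
    s = math.sqrt(2.0)
    approx = [(x[2*i] + x[2*i+1]) / s for i in range(len(x) // 2)]
    detail = [(x[2*i] - x[2*i+1]) / s for i in range(len(x) // 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact reconstruction from approximation and detail bands."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / s)
        out.append((a - d) / s)
    return out
```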

  14. A hybrid ensemble learning approach to star-galaxy classification

    Science.gov (United States)

    Kim, Edward J.; Brunner, Robert J.; Carrasco Kind, Matias

    2015-10-01

    There exist a variety of star-galaxy classification techniques, each with their own strengths and weaknesses. In this paper, we present a novel meta-classification framework that combines and fully exploits different techniques to produce a more robust star-galaxy classification. To demonstrate this hybrid, ensemble approach, we combine a purely morphological classifier, a supervised machine learning method based on random forest, an unsupervised machine learning method based on self-organizing maps, and a hierarchical Bayesian template-fitting method. Using data from the CFHTLenS survey (Canada-France-Hawaii Telescope Lensing Survey), we consider different scenarios: when a high-quality training set is available with spectroscopic labels from DEEP2 (Deep Extragalactic Evolutionary Probe Phase 2 ), SDSS (Sloan Digital Sky Survey), VIPERS (VIMOS Public Extragalactic Redshift Survey), and VVDS (VIMOS VLT Deep Survey), and when the demographics of sources in a low-quality training set do not match the demographics of objects in the test data set. We demonstrate that our Bayesian combination technique improves the overall performance over any individual classification method in these scenarios. Thus, strategies that combine the predictions of different classifiers may prove to be optimal in currently ongoing and forthcoming photometric surveys, such as the Dark Energy Survey and the Large Synoptic Survey Telescope.

  15. Soft computing applications: the advent of hybrid systems

    Science.gov (United States)

    Bonissone, Piero P.

    1998-10-01

    Soft computing is a new field of computer science that deals with the integration of problem-solving technologies such as fuzzy logic (FL), probabilistic reasoning, neural networks (NNs), and genetic algorithms (GAs). Each of these technologies provides us with complementary reasoning and searching methods to solve complex, real-world problems. We will analyze some of the most synergistic combinations of soft computing technologies, with an emphasis on the development of smart algorithm-controllers, such as the use of FL to control GA and NN parameters. We will also discuss the application of GAs to evolve NNs or tune FL controllers, and the implementation of FL controllers as NNs tuned by backpropagation-type algorithms. We will conclude with a detailed description of a GA-tuned fuzzy controller to implement train handling control.

  16. Control approach for comfortable power shifting in hybrid transmissions - ML 450 hybrid

    Energy Technology Data Exchange (ETDEWEB)

    Saenger Zetina, Siegfried; Neiss, Konstantin [Daimler AG, Hybrid Development Center, Troy, MI (United States)

    2008-07-01

    Comfortable shift control in a luxury-class vehicle is extremely important, given the competition from automatic transmissions with torque converters, clutch-automated manual transmissions, and dual-clutch transmissions. Hybrid transmissions play a key role in comfort and performance enhancement while at the same time being fuel efficient with the aid of electric machines and battery packs. Here, the power-split hybrid transmission, an alternative to conventional add-on hybrid power-head transmissions, is studied. As a practical example, the Two Mode transmission of the Hybrid Development Center is used within the ML450 Hybrid. Achieving smooth shifting requires model-based algorithms. The Vibration Dose Value (VDV) is used as an objective measure to evaluate shifting. (orig.)
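    The VDV metric mentioned above has a standard definition (e.g. in ISO 2631-1): the fourth root of the time integral of the fourth power of the frequency-weighted acceleration. A minimal discrete sketch (the sample values are illustrative):

```python
def vdv(accel, dt):
    """Vibration Dose Value for sampled (frequency-weighted) acceleration:
    VDV = (integral of a(t)^4 dt)^(1/4), here as a discrete sum.
    accel: samples in m/s^2; dt: sample spacing in seconds."""
    return (sum(a ** 4 for a in accel) * dt) ** 0.25

# A constant 2 m/s^2 held for 1 s gives VDV = 2 * 1^(1/4) = 2.0
example = vdv([2.0] * 100, dt=0.01)
```

    Because of the fourth power, VDV is dominated by short acceleration peaks, which is why it is a useful objective measure of shift jerk.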

  17. Carbon nanotube reinforced hybrid composites: Computational modeling of environmental fatigue and usability for wind blades

    DEFF Research Database (Denmark)

    Dai, Gaoming; Mishnaevsky, Leon

    2015-01-01

    The potential of advanced carbon/glass hybrid reinforced composites with secondary carbon nanotube reinforcement for wind energy applications is investigated here with the use of computational experiments. Fatigue behavior of hybrid as well as glass and carbon fiber reinforced composites...... with the secondary CNT reinforcements (especially, aligned tubes) present superior fatigue performances than those without reinforcements, also under combined environmental and cyclic mechanical loading. This effect is stronger for carbon composites, than for hybrid and glass composites....... automatically using the Python based code. 3D computational studies of environment and fatigue analyses of multiscale composites with secondary nano-scale reinforcement in different material phases and different CNTs arrangements are carried out systematically in this paper. It was demonstrated that composites...

  18. Hybrid and hierarchical nanoreinforced polymer composites: Computational modelling of structure–properties relationships

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Dai, Gaoming

    2014-01-01

    Hybrid and hierarchical polymer composites represent a promising group of materials for engineering applications. In this paper, computational studies of the strength and damage resistance of hybrid and hierarchical composites are reviewed. The reserves of the composite improvement are explored...... by using computational micromechanical models. It is shown that while glass/carbon fibers hybrid composites clearly demonstrate higher stiffness and lower weight with increasing the carbon content, they can have lower strength as compared with usual glass fiber polymer composites. Secondary...... nanoreinforcement can drastically increase the fatigue lifetime of composites. Especially, composites with the nanoplatelets localized in the fiber/matrix interface layer (fiber sizing) ensure much higher fatigue lifetime than those with the nanoplatelets in the matrix....

  19. Hybrid Computation Model for Intelligent System Design by Synergism of Modified EFC with Neural Network

    OpenAIRE

    2015-01-01

    In the recent past, it has been seen in many applications that the synergism of computational intelligence techniques outperforms any individual technique. This paper proposes a new hybrid computation model which is a novel synergism of modified evolutionary fuzzy clustering with associated neural networks. It consists of two modules: fuzzy distribution and neural classifier. In the first module, mean patterns are distributed into the number of clusters based on the modified evolutionary fuzzy cluste...

  20. Towards scalable quantum communication and computation: Novel approaches and realizations

    Science.gov (United States)

    Jiang, Liang

    Quantum information science involves exploration of fundamental laws of quantum mechanics for information processing tasks. This thesis presents several new approaches towards scalable quantum information processing. First, we consider a hybrid approach to scalable quantum computation, based on an optically connected network of few-qubit quantum registers. Specifically, we develop a novel scheme for scalable quantum computation that is robust against various imperfections. To justify that nitrogen-vacancy (NV) color centers in diamond can be a promising realization of the few-qubit quantum register, we show how to isolate a few proximal nuclear spins from the rest of the environment and use them for the quantum register. We also demonstrate experimentally that the nuclear spin coherence is only weakly perturbed under optical illumination, which allows us to implement quantum logical operations that use the nuclear spins to assist the repetitive-readout of the electronic spin. Using this technique, we demonstrate more than two-fold improvement in signal-to-noise ratio. Apart from direct application to enhance the sensitivity of the NV-based nano-magnetometer, this experiment represents an important step towards the realization of robust quantum information processors using electronic and nuclear spin qubits. We then study realizations of quantum repeaters for long distance quantum communication. Specifically, we develop an efficient scheme for quantum repeaters based on atomic ensembles. We use dynamic programming to optimize various quantum repeater protocols. In addition, we propose a new protocol of quantum repeater with encoding, which efficiently uses local resources (about 100 qubits) to identify and correct errors, to achieve fast one-way quantum communication over long distances. Finally, we explore quantum systems with topological order. Such systems can exhibit remarkable phenomena such as quasiparticles with anyonic statistics and have been proposed as

  1. Analysis on potential approaches to utilize genic male sterility in plant hybrid breeding

    Institute of Scientific and Technical Information of China (English)

    Li Xinqi; Yuan Longping; Xiao Jinhua; Xie fangming

    2005-01-01

    The exploitation of plant heterosis is an effective approach to increasing food production. Heterotic hybrid varieties in major crops such as rice, cotton, and wheat can show more than a 20% yield advantage over the best conventional varieties under the same cultivation conditions. The difficulties in breeding elite male sterile lines and the inconveniences for commercial hybrid seed production are hampering the development of hybrid crop breeding.

  2. Hybrid computing using a neural network with dynamic external memory.

    Science.gov (United States)

    Graves, Alex; Wayne, Greg; Reynolds, Malcolm; Harley, Tim; Danihelka, Ivo; Grabska-Barwińska, Agnieszka; Colmenarejo, Sergio Gómez; Grefenstette, Edward; Ramalho, Tiago; Agapiou, John; Badia, Adrià Puigdomènech; Hermann, Karl Moritz; Zwols, Yori; Ostrovski, Georg; Cain, Adam; King, Helen; Summerfield, Christopher; Blunsom, Phil; Kavukcuoglu, Koray; Hassabis, Demis

    2016-10-27

    Artificial neural networks are remarkably adept at sensory processing, sequence learning and reinforcement learning, but are limited in their ability to represent variables and data structures and to store data over long timescales, owing to the lack of an external memory. Here we introduce a machine learning model called a differentiable neural computer (DNC), which consists of a neural network that can read from and write to an external memory matrix, analogous to the random-access memory in a conventional computer. Like a conventional computer, it can use its memory to represent and manipulate complex data structures, but, like a neural network, it can learn to do so from data. When trained with supervised learning, we demonstrate that a DNC can successfully answer synthetic questions designed to emulate reasoning and inference problems in natural language. We show that it can learn tasks such as finding the shortest path between specified points and inferring the missing links in randomly generated graphs, and then generalize these tasks to specific graphs such as transport networks and family trees. When trained with reinforcement learning, a DNC can complete a moving blocks puzzle in which changing goals are specified by sequences of symbols. Taken together, our results demonstrate that DNCs have the capacity to solve complex, structured tasks that are inaccessible to neural networks without external read-write memory.
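    The content-based addressing at the heart of the DNC's memory reads can be sketched in a few lines: attention weights are a softmax over scaled cosine similarity between a read key and each memory row, and the read vector is the weighted sum of rows. (Toy sizes and values; the full model also has usage-based write allocation and temporal link matrices, omitted here.)

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def content_read(memory, key, beta=10.0):
    """Read by content: softmax over beta-scaled cosine similarity gives
    the read weighting; the result is the weighted sum of memory rows."""
    scores = [beta * cosine(row, key) for row in memory]
    peak = max(scores)                              # for numerical stability
    exps = [math.exp(s - peak) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    return [sum(w * row[j] for w, row in zip(weights, memory))
            for j in range(len(memory[0]))]

memory = [[1.0, 0.0],    # two memory slots holding orthogonal patterns
          [0.0, 1.0]]
read_vec = content_read(memory, [1.0, 0.1])   # key close to the first slot
```

    With a key near the first slot, nearly all of the read weight lands there, so the read vector approximates the stored pattern even though the key is noisy.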

  3. Computer networks ISE a systems approach

    CERN Document Server

    Peterson, Larry L

    2007-01-01

    Computer Networks, 4E is the only introductory computer networking book written by authors who have had first-hand experience with many of the protocols discussed in the book, who have actually designed some of them as well, and who are still actively designing the computer networks today. This newly revised edition continues to provide an enduring, practical understanding of networks and their building blocks through rich, example-based instruction. The authors' focus is on the why of network design, not just the specifications comprising today's systems but how key technologies and p

  4. Human Computer Interaction: An intellectual approach

    Directory of Open Access Journals (Sweden)

    Kuntal Saroha

    2011-08-01

    Full Text Available This paper discusses the research that has been done in the field of Human Computer Interaction (HCI) relating to human psychology. Human-computer interaction (HCI) is the study of how people design, implement, and use interactive computer systems and how computers affect individuals, organizations, and society. This encompasses not only ease of use but also new interaction techniques for supporting user tasks, providing better access to information, and creating more powerful forms of communication. It involves input and output devices and the interaction techniques that use them; how information is presented and requested; how the computer's actions are controlled and monitored; all forms of help, documentation, and training; the tools used to design, build, test, and evaluate user interfaces; and the processes that developers follow when creating interfaces.

  5. Computer science approach to quantum control

    Energy Technology Data Exchange (ETDEWEB)

    Janzing, D.

    2006-07-01

    Whereas it is obvious that every computation process is a physical process, it has hardly been recognized that many complex physical processes bear similarities to computation processes. This is in particular true for the control of physical systems on the nanoscopic level: usually the system can only be accessed via a rather limited set of elementary control operations, and for many purposes only a concatenation of a large number of these basic operations will implement the desired process. This concatenation is in many cases quite similar to building complex programs from elementary steps, and principles for designing algorithms may thus be a paradigm for designing control processes. For instance, one can decrease the temperature of one part of a molecule by transferring its heat to the remaining part, where it is then dissipated to the environment. But the implementation of such a process involves a complex sequence of electromagnetic pulses. This work considers several hypothetical control processes on the nanoscopic level and shows their analogy to computation processes. We show that measuring certain types of quantum observables is such a complex task that every instrument able to perform it would necessarily be an extremely powerful computer. Likewise, the implementation of a heat engine on the nanoscale requires processing the heat in a way that is similar to information processing, and it can be shown that heat engines with maximal efficiency would be powerful computers, too. In the same way as problems in computer science can be classified by complexity classes, we can also classify control problems according to their complexity. Moreover, we directly relate these complexity classes for control problems to the classes in computer science. Unifying notions of complexity in computer science and physics therefore has two aspects: on the one hand, computer science methods help to analyze the complexity of physical processes. On the other hand, reasonable

  6. A Hybrid Lifetime Extended Directional Approach for WBANs.

    Science.gov (United States)

    Li, Changle; Yuan, Xiaoming; Yang, Li; Song, Yueyang

    2015-11-05

    Wireless Body Area Networks (WBANs) can provide real-time and reliable health monitoring, owing to their human-centered and sensor-interoperability properties. WBANs have become a key component of the ubiquitous eHealth (electronic health) revolution that prospers on the basis of information and communication technologies. The prime consideration in a WBAN is how to maximize the network lifetime with battery-powered sensor nodes under energy constraints. Novel solutions in Medium Access Control (MAC) protocols are imperative to satisfy the particular BAN scenario and the need for excellent energy efficiency in healthcare applications. In this paper, we propose a hybrid Lifetime Extended Directional Approach (LEDA) MAC protocol based on IEEE 802.15.6 to reduce energy consumption and prolong network lifetime. The LEDA MAC protocol takes full advantage of the energy-saving superiority of directional antennas, employing multi-beam directional mode in Carrier Sense Multiple Access/Collision Avoidance (CSMA/CA) and single-beam directional mode in Time Division Multiple Access (TDMA) as alternatives for data reservation and transmission according to traffic variations. Moreover, the impact of some inherent problems of directional antennas, such as deafness and the hidden terminal problem, can be reduced because each node generates an individual beam according to its designated user priority. Furthermore, LEDA MAC employs a Dynamic Polled Allocation Period (DPAP) for burst data transmissions to increase network reliability and adaptability. Extensive analysis and simulation results show that the proposed LEDA MAC protocol achieves extended network lifetime with improved performance compared with IEEE 802.15.6.

  7. Computational dynamics for robotics systems using a non-strict computational approach

    Science.gov (United States)

    Orin, David E.; Wong, Ho-Cheung; Sadayappan, P.

    1989-01-01

    A Non-Strict computational approach for real-time robotics control computations is proposed. In contrast to the traditional approach to scheduling such computations, based strictly on task dependence relations, the proposed approach relaxes precedence constraints and scheduling is guided instead by the relative sensitivity of the outputs with respect to the various paths in the task graph. An example of the computation of the Inverse Dynamics of a simple inverted pendulum is used to demonstrate the reduction in effective computational latency through use of the Non-Strict approach. A speedup of 5 has been obtained when the processes of the task graph are scheduled to reduce the latency along the crucial path of the computation. While error is introduced by the relaxation of precedence constraints, the Non-Strict approach has a smaller error than the conventional Strict approach for a wide range of input conditions.
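    The Strict versus Non-Strict idea can be illustrated on the inverted-pendulum inverse dynamics, tau = m*l^2*theta_dd + m*g*l*sin(theta): the Non-Strict version relaxes a precedence constraint by reusing the previous sample's gravity term instead of waiting for the current one, trading a small error for lower latency. (The parameters and trajectory below are illustrative, not taken from the paper.)

```python
import math

# Model parameters (illustrative): mass, length, gravity
m, l, g = 1.0, 1.0, 9.81

def torque(theta_dd, gravity_term):
    # tau = m*l^2*theta_dd + m*g*l*sin(theta); the gravity term is passed
    # in so either a current or a stale value can be used for it
    return m * l * l * theta_dd + gravity_term

thetas    = [0.00, 0.01, 0.02, 0.03]   # slowly varying joint trajectory
theta_dds = [0.5, 0.5, 0.5, 0.5]

strict, nonstrict = [], []
stale = m * g * l * math.sin(thetas[0])     # value from the "previous" step
for th, thdd in zip(thetas, theta_dds):
    current = m * g * l * math.sin(th)
    strict.append(torque(thdd, current))    # waits for the current value
    nonstrict.append(torque(thdd, stale))   # relaxed precedence: stale value
    stale = current

errors = [abs(s - n) for s, n in zip(strict, nonstrict)]
```

    For smooth trajectories the stale term changes little between samples, so the error stays small, which is the sensitivity argument behind scheduling by output sensitivity rather than strict dependence.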

  8. Hybrid slime mould-based system for unconventional computing

    Science.gov (United States)

    Berzina, T.; Dimonte, A.; Cifarelli, A.; Erokhin, V.

    2015-04-01

    Physarum polycephalum is considered to be promising for the realization of unconventional computational systems. In this work, we present results of three slime mould-based systems. We have demonstrated the possibility of transporting biocompatible microparticles using attractors, repellents and a DEFLECTOR. The latter is an external tool that makes it possible to steer Physarum motion. We also present interactions between slime mould and conducting polymers, resulting in a variation of their colour and conductivity. Finally, incorporation of the Physarum into an organic memristive device resulted in a variation of its electrical characteristics due to the slime mould's internal activity.

  9. Drell-Yan production at forward rapidities: a hybrid factorization approach

    CERN Document Server

    Schäfer, Wolfgang

    2016-01-01

    We discuss the Drell-Yan production of dileptons at high energies in the forward rapidity region of proton-proton collisions in a hybrid high-energy approach. This approach uses unintegrated gluon distributions in one proton and collinear quark/antiquark distributions in the second proton. We compute various distributions for the case of low-mass dilepton production and compare to the LHCb and ATLAS experimental data on dilepton mass distributions. In distinction to dipole approaches, we include four Drell-Yan structure functions as well as cuts at the level of lepton kinematics. The impact of the interference structure functions is rather small for typical experimental cuts. We find that both side contributions ($g q/\\bar q$ and $q/\\bar q g$) have to be included even for the LHCb rapidity coverage which is in contradiction with what is usually done in the dipole approach. We present results for different unintegrated gluon distributions from the literature. Some of them include saturation effects, but we see...

  10. All-optical quantum computing with a hybrid solid-state processing unit

    CERN Document Server

    Pei, Pei; Li, Chong

    2011-01-01

    We develop an architecture of a hybrid quantum solid-state processing unit for universal quantum computing. The architecture allows distant and nonidentical solid-state qubits in distinct physical systems to interact and work collaboratively. All the quantum computing procedures are controlled by optical methods using classical fields and cavity QED. Our methods have the prominent advantage of insensitivity to dissipation processes due to the virtual excitation of subsystems. Moreover, QND measurements and state transfer for the solid-state qubits are proposed. The architecture opens promising perspectives for implementing scalable quantum computation in a broader sense, in that different solid-state systems can merge and be integrated into one quantum processor.

  11. Syntactic and Sentence Feature Based Hybrid Approach for Text Summarization

    Directory of Open Access Journals (Sweden)

    D.Y. Sakhare

    2014-02-01

    Full Text Available Recently, there has been significant research in automatic text summarization using feature-based techniques, most of which utilize one of the soft computing techniques. However, making use of the syntactic structure of sentences for text summarization has not been widely applied, due to the difficulty of handling it in the summarization process. On the other hand, feature-based techniques available in the literature have shown efficient results. Combining syntactic structure with feature-based techniques can therefore smooth the summarization process in a way that achieves efficiency. With the intention of combining the two techniques, we present an approach to text summarization that combines features and the syntactic structure of sentences. Here, two neural networks are trained based on the feature score and the syntactic structure of sentences. Finally, the two neural networks are combined with a weighted average to find the sentence score. The experimentation is carried out using the DUC 2002 dataset for various compression ratios. The results show that the proposed approach achieved an F-measure of 80% for a compression ratio of 50%, demonstrating better results compared with existing techniques.
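    The final combination step, a weighted average of the two network outputs followed by selecting the top sentences, can be sketched as follows (the weight alpha, scores, and keep-ratio semantics are illustrative assumptions):

```python
def sentence_score(feature_score, syntax_score, alpha=0.6):
    # Weighted average of the two network outputs (alpha is illustrative)
    return alpha * feature_score + (1.0 - alpha) * syntax_score

def summarize(scores, ratio=0.5):
    """Keep the best-scoring sentences up to the given ratio,
    returned in document order."""
    k = max(1, int(len(scores) * ratio))
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return sorted(ranked[:k])

pairs = [(0.9, 0.4), (0.2, 0.8), (0.5, 0.6)]      # (feature, syntax) per sentence
combined = [sentence_score(f, s) for f, s in pairs]
picked = summarize(combined, ratio=0.5)           # indices of kept sentences
```

    Returning indices in document order preserves the narrative flow of the extracted summary.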

  12. A Hybrid Approach to Solve a Model of Closed-Loop Supply Chain

    Directory of Open Access Journals (Sweden)

    Nafiseh Tokhmehchi

    2015-01-01

    Full Text Available This paper investigates a closed-loop supply chain network, including plants, demand centers, collection centers, and disposal centers. In the forward flow, products are sent directly to demand centers after being produced by plants; in the reverse flow, used products are returned to collection centers and, after inspection, are partly sent to disposal centers while the rest is resent to plants for remanufacturing. The proposed mathematical model is based on mixed-integer programming and helps minimize the total cost. Total costs include the expenditure of establishing new centers, producing new products, cargo transport in the network, and disposal. The model aims to answer two questions: (1) How many plants, collection centers, and disposal centers will be constructed, and where? (2) What amount of products will flow in each segment of the chain, in order to minimize the total cost? Four types of tuned metaheuristic algorithms were used, which are hybrid forms of genetic and firefly algorithms. Finally, an adequate number of instances are generated to analyse the behavior of the proposed algorithms. Computational results reveal that the iterative sequentialization hybrid provides better solutions compared with the other approaches for large problem sizes.
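    The objective described above, fixed costs for opened facilities plus flow-proportional costs, can be evaluated for a candidate solution with a small sketch (facility names, costs, and flows are illustrative, not the paper's instances):

```python
def total_cost(opened, flows, fixed_cost, unit_cost):
    """Fixed cost of opened facilities plus flow-proportional costs
    (production/transport/disposal folded into per-arc unit costs)."""
    cost = sum(fixed_cost[f] for f in opened)
    cost += sum(qty * unit_cost[arc] for arc, qty in flows.items())
    return cost

opened = ["plant1", "collection1"]                     # facilities to build
fixed_cost = {"plant1": 1000.0, "collection1": 400.0}
unit_cost = {("plant1", "demand1"): 2.0,               # forward flow
             ("demand1", "collection1"): 1.0}          # reverse flow
flows = {("plant1", "demand1"): 100,
         ("demand1", "collection1"): 30}
cost = total_cost(opened, flows, fixed_cost, unit_cost)
```

    A metaheuristic such as the paper's GA/firefly hybrids would search over the `opened` set and the flow quantities, using an evaluation like this as the fitness function.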

  13. A Hybrid 3D Learning-and-Interaction-based Segmentation Approach Applied on CT Liver Volumes

    Directory of Open Access Journals (Sweden)

    M. Danciu

    2013-04-01

    Full Text Available Medical volume segmentation in various imaging modalities using real 3D approaches (in contrast to slice-by-slice segmentation) represents an actual trend. The increase in acquisition resolution leads to large amounts of data, requiring solutions to reduce the dimensionality of the segmentation problem. In this context, real-time interaction with the large medical data volume represents another milestone. This paper addresses the twofold problem of 3D segmentation applied to large data sets and also describes an intuitive neuro-fuzzy trained interaction method. We present a new hybrid semi-supervised 3D segmentation for liver volumes obtained from computer tomography scans. This is a challenging medical volume segmentation task, due to the acquisition and inter-patient variability of the liver parenchyma. The proposed solution combines a learning-based segmentation stage (employing the 3D discrete cosine transform and a probabilistic support vector machine classifier) with a post-processing stage (automatic and manual segmentation refinement). Optionally, an optimization of the segmentation can be achieved by level sets, using the segmentation provided by the learning-based solution as initialization. The supervised segmentation is applied on elementary cubes into which the CT volume is decomposed by tiling, thus ensuring a significant reduction of the data to be classified by the support vector machine into liver/not liver. On real volumes, the proposed approach provides good segmentation accuracy, with a significant reduction in computational complexity.

  14. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows researchers to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  15. Human brain mapping: Experimental and computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Wood, C.C.; George, J.S.; Schmidt, D.M.; Aine, C.J. [Los Alamos National Lab., NM (US); Sanders, J. [Albuquerque VA Medical Center, NM (US); Belliveau, J. [Massachusetts General Hospital, Boston, MA (US)

    1998-11-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project combined Los Alamos' and collaborators' strengths in noninvasive brain imaging and high performance computing to develop potential contributions to the multi-agency Human Brain Project led by the National Institute of Mental Health. The experimental component of the project emphasized the optimization of spatial and temporal resolution of functional brain imaging by combining: (a) structural MRI measurements of brain anatomy; (b) functional MRI measurements of blood flow and oxygenation; and (c) MEG measurements of time-resolved neuronal population currents. The computational component of the project emphasized development of a high-resolution 3-D volumetric model of the brain based on anatomical MRI, in which structural and functional information from multiple imaging modalities can be integrated into a single computational framework for modeling, visualization, and database representation.

  17. Computational Models of Spreadsheet Development: Basis for Educational Approaches

    CERN Document Server

    Hodnigg, Karin; Mittermeir, Roland T

    2008-01-01

    Among the multiple causes of high error rates in spreadsheets, lack of proper training and of deep understanding of the computational model upon which spreadsheet computations rest might not be the least issue. The paper addresses this problem by presenting a didactical model focussing on cell interaction, thus exceeding the atomicity of cell computations. The approach is motivated by an investigation of how different spreadsheet systems handle certain computational issues implied by moving cells, copy-paste operations, or recursion.
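    The cell-interaction view discussed above, computation propagating through a dependency graph rather than isolated cell formulas, can be modeled in a few lines (the sheet contents are illustrative):

```python
def evaluate(sheet):
    """Resolve every cell, following formula dependencies on demand."""
    cache = {}
    def value(name):
        if name not in cache:
            cell = sheet[name]
            # A cell is either a constant or a formula over other cells
            cache[name] = cell if isinstance(cell, (int, float)) else cell(value)
        return cache[name]
    return {name: value(name) for name in sheet}

sheet = {
    "A1": 2,
    "A2": 3,
    "B1": lambda v: v("A1") + v("A2"),   # =A1+A2
    "C1": lambda v: v("B1") * 10,        # =B1*10
}
values = evaluate(sheet)
```

    Even this toy model makes the didactical point: changing A1 invalidates B1 and C1 transitively, so a spreadsheet is a dataflow program, not a grid of independent values.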

  18. Generator maintenance scheduling in power systems using metaheuristic-based hybrid approaches

    Energy Technology Data Exchange (ETDEWEB)

    Dahal, Keshav P. [School of Informatics, University of Bradford, Bradford (United Kingdom); Chakpitak, Nopasit [College of Arts, Media and Technology, Chiang Mai University, Chiang Mai (Thailand)

    2007-05-15

    The effective maintenance scheduling of power system generators is very important for the economical and reliable operation of a power system. This represents a tough scheduling problem which continues to present a challenge for efficient optimization solution techniques. This paper presents the application of metaheuristic approaches, such as a genetic algorithm (GA), simulated annealing (SA) and their hybrid for generator maintenance scheduling (GMS) in power systems using an integer representation. This paper mainly focuses on the application of GA/SA and GA/SA/heuristic hybrid approaches. GA/SA hybrid uses the probabilistic acceptance criterion of SA within the GA framework. GA/SA/heuristic hybrid combines heuristic approaches within the GA/SA hybrid to seed the initial population. A case study is formulated in this paper as an integer programming problem using a reliability-based objective function and typical problem constraints. The implementation and performance of the metaheuristic approaches and their hybrid for the test case study are discussed. The results obtained are promising and show that the hybrid approaches are less sensitive to the variations of technique parameters and offer an effective alternative for solving the generator maintenance scheduling problem. (author)
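    The GA/SA hybrid described here embeds the SA (Metropolis) acceptance test into the GA's replacement step: a worse offspring may still replace its parent with probability exp(-delta/T). A minimal sketch of that test (the costs, temperatures, and trial counts are illustrative):

```python
import math
import random

def sa_accept(parent_cost, child_cost, temperature, rng):
    """SA acceptance criterion used inside the GA replacement step."""
    if child_cost <= parent_cost:        # improvement: always accept
        return True
    delta = child_cost - parent_cost
    return rng.random() < math.exp(-delta / temperature)

rng = random.Random(0)
# At high temperature, worse offspring are often accepted...
hot  = sum(sa_accept(10.0, 11.0, temperature=10.0, rng=rng) for _ in range(1000))
# ...at low temperature, almost never.
cold = sum(sa_accept(10.0, 11.0, temperature=0.1, rng=rng) for _ in range(1000))
```

    Cooling the temperature over generations shifts the GA from exploration toward exploitation, which is one reason such hybrids are less sensitive to parameter settings.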

  19. Heterogeneous Computing in Economics: A Simplified Approach

    DEFF Research Database (Denmark)

    Dziubinski, Matt P.; Grassi, Stefano

    This paper shows the potential of heterogeneous computing in solving dynamic equilibrium models in economics. We illustrate the power and simplicity of the C++ Accelerated Massive Parallelism recently introduced by Microsoft. Starting from the same exercise as Aldrich et al. (2011) we document a ...

  20. Molecular electromagnetism a computational chemistry approach

    CERN Document Server

    Sauer, Stephan P A

    2011-01-01

    A textbook for a one-semester course for students in chemistry, physics and nanotechnology, this book examines the interaction of molecules with electric and magnetic fields as, for example, in light. The book provides the necessary background knowledge for simulating these interactions on computers with modern quantum chemical software.

  1. Hybrid VLSI/QCA Architecture for Computing FFTs

    Science.gov (United States)

    Fijany, Amir; Toomarian, Nikzad; Modarres, Katayoon; Spotnitz, Matthew

    2003-01-01

    A data-processor architecture that would incorporate elements of both conventional very-large-scale integrated (VLSI) circuitry and quantum-dot cellular automata (QCA) has been proposed to enable the highly parallel and systolic computation of fast Fourier transforms (FFTs). The proposed circuit would complement the QCA-based circuits described in several prior NASA Tech Briefs articles, namely Implementing Permutation Matrices by Use of Quantum Dots (NPO-20801), Vol. 25, No. 10 (October 2001), page 42; Compact Interconnection Networks Based on Quantum Dots (NPO-20855) Vol. 27, No. 1 (January 2003), page 32; and Bit-Serial Adder Based on Quantum Dots (NPO-20869), Vol. 27, No. 1 (January 2003), page 35. The cited prior articles described the limitations of very-large-scale integrated (VLSI) circuitry and the major potential advantage afforded by QCA. To recapitulate: In a VLSI circuit, signal paths that are required not to interact with each other must not cross in the same plane. In contrast, for reasons too complex to describe in the limited space available for this article, suitably designed and operated QCA-based signal paths that are required not to interact with each other can nevertheless be allowed to cross each other in the same plane without adverse effect. In principle, this characteristic could be exploited to design compact, coplanar, simple (relative to VLSI) QCA-based networks to implement complex, advanced interconnection schemes.

  2. Secure Data Sharing in Cloud Computing using Hybrid cloud

    Directory of Open Access Journals (Sweden)

    Er. Inderdeep Singh

    2015-06-01

    Full Text Available Cloud computing is a fast-growing technology that enables users to store and access their data remotely. Using cloud services, users can enjoy the benefits of on-demand cloud applications and data with only limited local infrastructure. When accessing data from the cloud, different users may be related to one another through shared attributes, so sharing data while preserving user privacy and data security becomes important for effective results. Most research to date has addressed data authentication, so that users do not lose private data stored on the public cloud, but secure data sharing remains a significant hurdle. Research is ongoing to provide secure data sharing with enhanced user privacy and access security. In this paper, the research and challenges in this area are discussed in detail, to help cloud users understand the topic and researchers develop methods to overcome these challenges.

  3. A Systemic Approach Integrating Driving Cycles for the Design of Hybrid Locomotives

    OpenAIRE

    Jaafar, Amine; Sareni, Bruno; Roboam, Xavier

    2013-01-01

    Driving cycles are an essential input to hybrid locomotive design, since they condition the locomotive's size and performance. This paper introduces a new systemic approach to hybrid locomotive design that takes real-world driving cycles into account. The proposed approach first exploits clustering analysis to identify classes corresponding to particular sets of driving cycles. Then, a synthesis process of a reduced and representative profile from each class of driving cycles is pr...

  4. A Hybrid Analytical/Simulation Modeling Approach for Planning and Optimizing Mass Tactical Airborne Operations

    Science.gov (United States)

    1995-05-01

    A Hybrid Analytical/Simulation Modeling Approach for Planning and Optimizing Mass Tactical Airborne Operations, by David Douglas Briggs, M.S.B.A. (technical report thesis, May 1995). ... Thus, simulation modeling presents itself as an excellent alternate tool for planning because it allows for the modeling of highly complex...

  5. Fatigue of hybrid glass/carbon composites: 3D computational studies

    DEFF Research Database (Denmark)

    Dai, Gaoming; Mishnaevsky, Leon

    2014-01-01

    3D computational simulations of the fatigue of hybrid carbon/glass fiber reinforced composites are carried out using X-FEM and multifiber unit cell models. A new software code for the automatic generation of unit cell multifiber models of composites with randomly misaligned fibers of various properties and geometrical parameters is developed. With the use of this program code and the X-FEM method, systematic investigations of the effect of the microstructure of hybrid composites (fraction of carbon versus glass fibers, misalignment, and interface strength) and the loading conditions (tensile versus compression cyclic loading effects) on the fatigue behavior of the materials are carried out. It was demonstrated that a higher fraction of carbon fibers in hybrid composites is beneficial for the fatigue lifetime of the composites under tension-tension cyclic loading, but might have a negative effect on the lifetime...

  6. Decoding of four movement directions using hybrid NIRS-EEG brain-computer interface

    Directory of Open Access Journals (Sweden)

    M. Jawad Khan

    2014-04-01

    Full Text Available The hybrid brain-computer interface's (BCI's) multimodal technology enables precision brain-signal classification that can be used in the formulation of control commands. In the present study, an experimental hybrid near-infrared spectroscopy-electroencephalography (NIRS-EEG) technique was used to extract and decode four different types of brain signals. The NIRS setup was positioned over the prefrontal brain region, and the EEG over the left and right motor cortex regions. Twelve subjects participating in the experiment were shown four direction symbols, namely, forward, backward, left and right. The control commands for forward and backward movement were estimated by performing arithmetic mental tasks related to oxy-hemoglobin (HbO) changes. The left and right direction commands were associated with right and left hand tapping, respectively. The high classification accuracies achieved show that the four different control signals can be accurately estimated using the hybrid NIRS-EEG technology.

  7. Carbon nanotube reinforced hybrid composites: Computational modeling of environmental fatigue and usability for wind blades

    DEFF Research Database (Denmark)

    Dai, Gaoming; Mishnaevsky, Leon

    2015-01-01

    The potential of advanced carbon/glass hybrid reinforced composites with secondary carbon nanotube reinforcement for wind energy applications is investigated here with the use of computational experiments. The fatigue behavior of hybrid as well as glass and carbon fiber reinforced composites with and without secondary CNT reinforcement is simulated using multiscale 3D unit cells. The materials' behavior under both mechanical cyclic loading and combined mechanical and environmental loading (with phase properties degraded due to moisture effects) is studied. The multiscale unit cells are generated ... Composites with the secondary CNT reinforcements (especially aligned tubes) present superior fatigue performance to those without reinforcement, also under combined environmental and cyclic mechanical loading. This effect is stronger for carbon composites than for hybrid and glass composites.

  8. Computational Approach To Understanding Autism Spectrum Disorders

    Directory of Open Access Journals (Sweden)

    Włodzisław Duch

    2012-01-01

    Full Text Available Every year the prevalence of Autism Spectrum Disorders (ASD) is rising. Is there a unifying mechanism of the various ASD cases at the genetic, molecular, cellular, or systems level? The hypothesis advanced in this paper focuses on neural dysfunctions that lead to problems with attention in autistic people. Simulations of attractor neural networks performing cognitive functions help to assess long-term system neurodynamics. The Fuzzy Symbolic Dynamics (FSD) technique is used for the visualization of attractors in the semantic layer of the neural model of reading. Large-scale simulations of brain structures characterized by a high order of complexity require enormous computational power, especially if biologically motivated neuron models are used to investigate the influence of cellular structure dysfunctions on network dynamics. Such simulations have to be implemented on computer clusters in grid-based architectures.

  9. A Hybrid Prognostic Approach for Remaining Useful Life Prediction of Lithium-Ion Batteries

    Directory of Open Access Journals (Sweden)

    Wen-An Yang

    2016-01-01

    Full Text Available The lithium-ion battery is a core component of many systems, such as satellites, spacecraft, and electric vehicles, and its failure can lead to reduced capability, downtime, and even catastrophic breakdowns. Remaining useful life (RUL) prediction of lithium-ion batteries before the future failure event occurs is crucial for proactive maintenance and safety actions. This study proposes a hybrid prognostic approach that predicts the RUL of degraded lithium-ion batteries using physical laws and data-driven modeling simultaneously. In this hybrid approach, the relevance vectors obtained with a selective kernel ensemble-based relevance vector machine (RVM) learning algorithm are fitted to the physical degradation model, which is then extrapolated to the failure threshold to estimate the RUL of the lithium-ion battery of interest. The experimental results indicate that the proposed hybrid prognostic approach can accurately predict the RUL of degraded lithium-ion batteries. Empirical comparisons show that it performs better than hybrid prognostic approaches using popular learning algorithms such as feedforward artificial neural networks (ANNs) with the conventional backpropagation (BP) algorithm, and support vector machines (SVMs). In addition, an investigation is conducted to identify the effects of the RVM learning algorithm on the proposed hybrid prognostic approach.
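
The extrapolation step of such a prognostic scheme can be sketched with a toy exponential capacity-fade model: fit the fade curve to the observed capacities, then solve for the time at which it crosses the failure threshold. All model choices and numbers below are hypothetical, not taken from the study:

```python
import math

def fit_exponential_fade(times, capacities):
    """Least-squares fit of C(t) = C0 * exp(-k*t) via linear
    regression on log C(t) = log C0 - k*t."""
    n = len(times)
    ys = [math.log(c) for c in capacities]
    tbar = sum(times) / n
    ybar = sum(ys) / n
    slope = sum((t - tbar) * (y - ybar) for t, y in zip(times, ys)) / \
            sum((t - tbar) ** 2 for t in times)
    k = -slope
    c0 = math.exp(ybar + k * tbar)
    return c0, k

def remaining_useful_life(times, capacities, threshold_fraction=0.8):
    """Extrapolate the fitted fade curve to the capacity threshold
    (here 80% of initial capacity, a common convention) and report
    the time remaining after the last observation."""
    c0, k = fit_exponential_fade(times, capacities)
    t_fail = math.log(1.0 / threshold_fraction) / k  # C(t_fail) = thr * C0
    return t_fail - times[-1]
```

In the paper's approach an RVM, not a plain least-squares fit, supplies the points fitted to the physical model; only the threshold-crossing logic is illustrated here.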

  10. A Hybrid Computational Model to Explore the Topological Characteristics of Epithelial Tissues.

    Science.gov (United States)

    González-Valverde, Ismael; García Aznar, José Manuel

    2017-03-01

    Epithelial tissues show a particular topology where cells resemble a polygon-like shape, but some biological processes can alter this tissue topology. During cell proliferation, mitotic cell dilation deforms the tissue and modifies the tissue topology. Additionally, cells are reorganized in the epithelial layer and these rearrangements also alter the polygon distribution. We present here a computer-based hybrid framework focused on the simulation of epithelial layer dynamics that combines discrete and continuum numerical models. In this framework, we consider topological and mechanical aspects of the epithelial tissue. Individual cells in the tissue are simulated by an off-lattice agent-based model, which keeps the information of each cell. In addition, we model the cell-cell interaction forces and the cell cycle. Separately, we simulate the passive mechanical behaviour of the cell monolayer using a material that approximates the mechanical properties of the cell. This continuum approach is solved by the finite element method, which uses a dynamic mesh generated by the triangulation of cell polygons. Forces generated by cell-cell interaction in the agent-based model are also applied on the finite element mesh. Cell movement in the agent-based model is driven by the displacements obtained from the deformed finite element mesh of the continuum mechanical approach. We successfully compare the results of our simulations with experiments on the topology of proliferating epithelial tissues in Drosophila. Our framework is able to model the emergent behaviour of the cell monolayer that is due to local cell-cell interactions, which have a direct influence on the dynamics of the epithelial tissue.

  11. Music Genre Classification Systems - A Computational Approach

    OpenAIRE

    Ahrendt, Peter; Hansen, Lars Kai

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular...

  12. Prediction of “Aggregation-Prone” Peptides with Hybrid Classification Approach

    Directory of Open Access Journals (Sweden)

    Bo Liu

    2015-01-01

    Full Text Available Protein aggregation is a biological phenomenon caused by the aggregation of misfolded proteins and is associated with a wide variety of diseases, such as Alzheimer's, Parkinson's, and prion diseases. Many studies indicate that protein aggregation is mediated by short "aggregation-prone" peptide segments. Thus, the prediction of aggregation-prone sites plays a crucial role in the research of drug targets. Compared with labor-intensive and time-consuming experimental approaches, the computational prediction of aggregation-prone sites is much more desirable due to its convenience and high efficiency. In this study, we introduce two computational approaches, Aggre_Easy and Aggre_Balance, for predicting aggregation residues from sequence information; here, the protein samples are represented by the composition of k-spaced amino acid pairs (CKSAAP). We use a hybrid classification approach to predict aggregation-prone residues, which integrates naïve Bayes classification to reduce the number of features, and two undersampling approaches, EasyEnsemble and BalanceCascade, to deal with the sample imbalance problem. Aggre_Easy achieves a promising performance with a sensitivity of 79.47%, a specificity of 80.70%, and an MCC of 0.42; the sensitivity, specificity, and MCC of Aggre_Balance reach 70.32%, 80.70%, and 0.42. Experimental results show that the Aggre_Easy and Aggre_Balance predictors perform better than several other state-of-the-art predictors. A user-friendly web server for the prediction of aggregation-prone residues is built and is freely accessible to the public.
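
The CKSAAP encoding mentioned above counts ordered residue pairs separated by exactly k positions. A minimal sketch of one plausible reading of the descriptor (not the authors' implementation):

```python
from itertools import product

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"   # the 20 standard residues

def cksaap(sequence, k=0):
    """Composition of k-spaced amino acid pairs: for each of the 400
    ordered pairs (a, b), the frequency of positions i such that
    sequence[i] == a and sequence[i + k + 1] == b. Assumes the
    sequence uses only the 20 standard residues and is long enough."""
    pairs = ["".join(p) for p in product(AMINO_ACIDS, repeat=2)]
    counts = dict.fromkeys(pairs, 0)
    windows = len(sequence) - k - 1
    for i in range(windows):
        counts[sequence[i] + sequence[i + k + 1]] += 1
    return [counts[p] / windows for p in pairs]
```

Each value of k yields a 400-dimensional frequency vector; concatenating the vectors for k = 0..K gives the full feature representation typically used with this descriptor.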

  13. Applications integration in a hybrid cloud computing environment: modelling and platform

    Science.gov (United States)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds along with their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds with intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed, to improve the feasibility of ISs under hybrid cloud computing environments.

  14. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness of the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to Computing in Danish High Schools based on a conceptual framework derived from ideas related to computational thinking...

  15. Hybrid-impulsive second order sliding mode control: Lyapunov approach

    NARCIS (Netherlands)

    Shtessel, Y.; Glumineau, A.; Plestan, F.; Weiss, M.

    2013-01-01

    A perturbed nonlinear system of relative degree two controlled by discontinuous-impulsive feedbacks is studied. The hybrid-impulsive terms serve to drive the system trajectories instantaneously to the origin or to a small vicinity of it. In particular, impulsive-twisting control exhibits a uniform exact...

  17. Hybrid Engine Powered City Car: Fuzzy Controlled Approach

    Science.gov (United States)

    Rahman, Ataur; Mohiuddin, AKM; Hawlader, MNA; Ihsan, Sany

    2017-03-01

    This study describes a fuzzy-controlled, hybrid-engine-powered car. The car is powered by a lithium-ion battery with a capacity of 1000 Wh, charged by the 50 cc hybrid engine and a power-regenerative mode. The engine is operated with a lean mixture at 3000 rpm to charge the battery. The regenerative mode connected to the engine generates 500-600 W of electrical power during deceleration of the car from 90 km/h to 20 km/h. The regenerated electrical power is used to run the air-conditioning system and to meet other electrical loads; the battery power alone propels the car. The regenerative mode was also found to charge the battery for longer operation, about 40 minutes or more. The design flexibility of this vehicle starts with whole-vehicle integration based on radical lightweighting, drag reduction, and accessory efficiency. The energy-efficient hybrid engine cuts carbon dioxide (CO2) and nitrous oxide (N2O) emissions by about 70-80%, as loads on the crankshaft such as the cam-follower and its associated rotating components are replaced by electromagnetic systems, and the flywheel, alternator, and starter motor are replaced by a motor-generator. The vehicle was tested and found able to travel 70 km per litre on hybrid engine power.
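
The quoted regeneration figures can be sanity-checked from the kinetic energy released in the 90 to 20 km/h deceleration. The vehicle mass and regeneration efficiency below are hypothetical placeholders, not values from the study:

```python
def regen_energy_wh(mass_kg, v1_kmh, v2_kmh, efficiency=0.5):
    """Kinetic energy released when decelerating from v1 to v2,
    scaled by an assumed regeneration efficiency, in watt-hours."""
    v1 = v1_kmh / 3.6   # km/h -> m/s
    v2 = v2_kmh / 3.6
    joules = 0.5 * mass_kg * (v1 ** 2 - v2 ** 2) * efficiency
    return joules / 3600.0

# Hypothetical 400 kg city car, braking from 90 km/h to 20 km/h:
energy = regen_energy_wh(400.0, 90.0, 20.0)
```

An energy budget of this order (tens of watt-hours), spread over a deceleration lasting tens of seconds, corresponds to an average power of some hundreds of watts, broadly in the range quoted above; the exact figure depends on the mass, efficiency, and braking duration assumed.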

  18. Using a Hybrid Approach for a Leadership Cohort Program

    Science.gov (United States)

    Norman, Maxine A.

    2013-01-01

    Because information technology continues to change rapidly, Extension is challenged with learning and using technology appropriately. We assert Extension cannot shy away from the challenges but must embrace technology because audiences and external forces demand it. A hybrid, or blended, format of a leadership cohort program was offered to public…

  19. Hybrid empirical--theoretical approach to modeling uranium adsorption

    Energy Technology Data Exchange (ETDEWEB)

    Hull, Larry C.; Grossman, Christopher; Fjeld, Robert A.; Coates, John T.; Elzerman, Alan W

    2004-05-01

    An estimated 330 metric tons of U are buried in the radioactive waste Subsurface Disposal Area (SDA) at the Idaho National Engineering and Environmental Laboratory (INEEL). An assessment of U transport parameters is being performed to decrease the uncertainty in risk and dose predictions derived from computer simulations of U fate and transport to the underlying Snake River Plain Aquifer. Uranium adsorption isotherms were measured for 14 sediment samples collected from sedimentary interbeds underlying the SDA. The adsorption data were fit with a Freundlich isotherm. The Freundlich n parameter is statistically identical for all 14 sediment samples and the Freundlich K_f parameter is correlated to sediment surface area (r² = 0.80). These findings suggest an efficient approach to material characterization and implementation of a spatially variable reactive transport model that requires only the measurement of sediment surface area. To expand the potential applicability of the measured isotherms, a model is derived from the empirical observations by incorporating concepts from surface complexation theory to account for the effects of solution chemistry. The resulting model is then used to predict the range of adsorption conditions to be expected in the vadose zone at the SDA based on the range in measured pore water chemistry. Adsorption in the deep vadose zone is predicted to be stronger than in near-surface sediments because the total dissolved carbonate decreases with depth.
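
A Freundlich fit of the kind described reduces to linear regression in log-log space, since log q = log K_f + n log C. A minimal sketch (illustrative, not the authors' fitting code):

```python
import math

def fit_freundlich(conc, sorbed):
    """Fit the Freundlich isotherm q = Kf * C^n by linear regression
    on log q = log Kf + n * log C (conc and sorbed must be positive)."""
    xs = [math.log(c) for c in conc]
    ys = [math.log(q) for q in sorbed]
    m = len(xs)
    xbar = sum(xs) / m
    ybar = sum(ys) / m
    n = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    kf = math.exp(ybar - n * xbar)
    return kf, n
```

With K_f regressed against surface area, as in the study, only the surface area of a new sediment sample need be measured to parameterize its isotherm.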

  20. Transuranic Hybrid Materials: Crystallographic and Computational Metrics of Supramolecular Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Surbella, Robert G.; Ducati, Lucas C.; Pellegrini, Kristi L. (Pacific Northwest National Laboratory, 902 Battelle Boulevard, Richland, Washington 99354, United States); McNamara, Bruce K.; Autschbach, Jochen; Schwantes, Jon M.; Cahill, Christopher L.

    2017-07-26

    A family of twelve supramolecular [AnO2Cl4]2- (An = U, Np, Pu) containing compounds assembled via hydrogen and halogen bonds donated by substituted 4-X-pyridinium cations (X = H, Cl, Br, I) is reported. These materials were prepared from a room-temperature synthesis wherein crystallization of unhydrolyzed and valence-pure [An(VI)O2Cl4]2- (An = U, Np, Pu) tectons is the norm. We present a hierarchy of assembly criteria based on crystallographic observations, and subsequently quantify the strengths of the non-covalent interactions using Kohn-Sham density functional calculations. We provide, for the first time, a detailed description of the electrostatic potentials (ESPs) of the actinyl tetrahalide dianions and reconcile crystallographically observed structural motifs with non-covalent interaction (NCI) acceptor-donor pairings. Our findings indicate that the average electrostatic potential across the halogen ligands (the acceptors) changes by only ~2 kJ mol-1 across the AnO22+ series, indicating that the magnitude of the potential is independent of the metal center. The role of the cation is therefore critical in directing structural motifs and dictating the resulting hydrogen and halogen bond strengths, the former being stronger due to the positive charge centralized on the pyridyl nitrogen N-H+. Subsequent analyses using the quantum theory of atoms in molecules (QTAIM) and natural bond orbital (NBO) approaches support this conclusion and highlight the structure-directing role of the cations. Whereas one can infer that the Coulombic attraction is the driver for assembly, the contribution of the non-covalent interactions is to direct the molecular-level arrangement (or disposition) of the tectons.

  1. A Game-Theoretic approach to Fault Diagnosis of Hybrid Systems

    Directory of Open Access Journals (Sweden)

    Davide Bresolin

    2011-06-01

    Full Text Available Physical systems can fail. For this reason the problem of identifying and reacting to faults has received large attention in the control and computer science communities. In this paper we study the fault diagnosis problem for hybrid systems from a game-theoretical point of view. A hybrid system is a system mixing continuous and discrete behaviours that cannot be faithfully modeled either by a formalism with continuous dynamics only or by a formalism including only discrete dynamics. We use the well-known framework of hybrid automata for modeling hybrid systems, and we define a Fault Diagnosis Game on them with two players: the environment and the diagnoser. The environment controls the evolution of the system and chooses whether and when a fault occurs. The diagnoser observes the external behaviour of the system and announces whether a fault has occurred or not. Existence of a winning strategy for the diagnoser implies that faults can be detected correctly, while computing such a winning strategy corresponds to implementing a diagnoser for the system. We show how to determine the existence of a winning strategy, and how to compute it, for some decidable classes of hybrid automata such as o-minimal hybrid automata.
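
For the purely discrete part of such a system, the diagnoser can be viewed as tracking every (state, fault-flag) pair consistent with the observations, and announcing a fault only when all consistent runs contain one. A toy sketch over a finite transition relation (the automaton in the test is hypothetical; the continuous dynamics of hybrid automata are not modeled here):

```python
def diagnose(transitions, initial, observations):
    """Track every (state, fault_seen) pair consistent with the observed
    events. transitions maps (state, event) to a list of
    (next_state, is_fault_transition) pairs; nondeterminism models the
    diagnoser's uncertainty about the system's internal moves."""
    belief = {(initial, False)}
    for event in observations:
        belief = {(nxt, seen or fault)
                  for (state, seen) in belief
                  for (nxt, fault) in transitions.get((state, event), [])}
    flags = {seen for _, seen in belief}
    if flags == {True}:
        return "fault"      # every consistent run contains a fault
    if flags == {False}:
        return "no-fault"   # no consistent run contains a fault
    return "unsure"         # ambiguous (or no consistent run at all)
```

A winning diagnoser strategy in the paper's sense corresponds to the guarantee that "unsure" is always eventually resolved after a fault actually occurs.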

  2. Introduction and evaluation of a novel hybrid brattice for improved dust control in underground mining faces:A computational study

    Institute of Scientific and Technical Information of China (English)

    Kurnia Jundika C.; Sasmito Agus P.; Hassani Ferri P.; Mujumdar Arun S.

    2015-01-01

    A proper control and management of dust dispersion is essential to ensure a safe and productive underground working environment. Brattice installation to direct the flow from the main shaft to the mining face was found to be the most effective method to disperse dust particles away from the mining face. However, it limits the movement and disturbs the flexibility of the mining fleets and operators in the tunnel. This study proposes a hybrid brattice system, a combination of a physical brattice together with suitably directed and flexibly located air curtains, to mitigate dust dispersion from the mining face and reduce dust concentration to a safe level for the working operators. A validated three-dimensional computational fluid dynamics model utilizing the Eulerian-Lagrangian approach is employed to track the dispersion of dust particles. Several possible hybrid brattice scenarios are evaluated with the objective of improving dust management in underground mines. The results suggest that implementation of a hybrid brattice is beneficial for the mining operation: up to three times lower dust concentration is achieved compared to that of a physical brattice without air curtains.

  3. Hybrid grammar-based approach to nonlinear dynamical system identification from biological time series

    Science.gov (United States)

    McKinney, B. A.; Crowe, J. E., Jr.; Voss, H. U.; Crooke, P. S.; Barney, N.; Moore, J. H.

    2006-02-01

    We introduce a grammar-based hybrid approach to reverse engineering nonlinear ordinary differential equation models from observed time series. This hybrid approach combines a genetic algorithm to search the space of model architectures with a Kalman filter to estimate the model parameters. Domain-specific knowledge is used in a context-free grammar to restrict the search space for the functional form of the target model. We find that the hybrid approach outperforms a pure evolutionary algorithm method, and we observe features in the evolution of the dynamical models that correspond with the emergence of favorable model components. We apply the hybrid method to both artificially generated time series and experimentally observed protein levels from subjects who received the smallpox vaccine. From the observed data, we infer a cytokine protein interaction network for an individual’s response to the smallpox vaccine.
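
The grammar-restricted search can be illustrated by randomly expanding a small context-free grammar into candidate right-hand-side expressions; in the full method a genetic algorithm evolves such candidates and a Kalman filter estimates the coefficients. The grammar and symbol names below are hypothetical, not the paper's:

```python
import random

# Hypothetical grammar for candidate ODE right-hand sides in two
# state variables x, y with unknown coefficients k1, k2.
GRAMMAR = {
    "<expr>": [["<expr>", "+", "<expr>"], ["<coef>", "*", "<term>"], ["<term>"]],
    "<term>": [["x"], ["y"], ["x", "*", "y"]],
    "<coef>": [["k1"], ["k2"]],
}

def expand(symbol, rng, depth=0, max_depth=4):
    """Randomly expand a grammar symbol into a token list, suppressing
    recursive rules once max_depth is reached so expansion terminates."""
    if symbol not in GRAMMAR:
        return [symbol]                          # terminal token
    rules = GRAMMAR[symbol]
    if depth >= max_depth:
        nonrecursive = [r for r in rules if "<expr>" not in r]
        rules = nonrecursive or rules
    tokens = []
    for sym in rng.choice(rules):
        tokens.extend(expand(sym, rng, depth + 1, max_depth))
    return tokens

rng = random.Random(0)
candidate = " ".join(expand("<expr>", rng))  # a candidate term, e.g. "k1 * x + y"
```

Restricting the grammar this way is what encodes the domain knowledge: only expressions the grammar can derive are ever proposed to the evolutionary search.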

  4. Acoustic gravity waves: A computational approach

    Science.gov (United States)

    Hariharan, S. I.; Dutt, P. K.

    1987-01-01

    This paper discusses numerical solutions of a hyperbolic initial boundary value problem that arises from acoustic wave propagation in the atmosphere. Field equations are derived from the atmospheric fluid flow governed by the Euler equations. The resulting original problem is nonlinear; a first-order linearized version of the problem is used for computational purposes. The main difficulty in the problem, as with any open-boundary problem, is in obtaining stable boundary conditions. Approximate boundary conditions are derived and shown to be stable. Numerical results are presented to verify the effectiveness of these boundary conditions.
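
The open-boundary difficulty can be illustrated on the simplest hyperbolic model problem, u_t + c u_x = 0: with first-order upwinding, a zero-inflow condition on the left and pure upwinding at the right (outflow) boundary let a pulse leave the domain without reflection. A toy sketch, not the paper's scheme:

```python
def advect(u, c, dx, dt, steps):
    """First-order upwind scheme for u_t + c u_x = 0 with c > 0:
    zero inflow at the left boundary; the right boundary is
    transparent (outflow), which upwinding provides for free."""
    lam = c * dt / dx          # CFL number; stable for lam <= 1
    u = list(u)
    for _ in range(steps):
        new = [0.0] * len(u)   # new[0] = 0: nothing enters from the left
        for i in range(1, len(u)):
            new[i] = u[i] - lam * (u[i] - u[i - 1])
        u = new
    return u
```

Because each update only looks upwind, no information propagates back from the right edge, so the pulse exits cleanly; designing comparably stable, non-reflecting conditions for the full linearized Euler system is the substance of the paper.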

  5. Global computational algebraic topology approach for diffusion

    Science.gov (United States)

    Auclair-Fortier, Marie-Flavie; Ziou, Djemel; Allili, Madjid

    2004-05-01

    One physical process involved in many computer vision problems is heat diffusion. Such partial differential equations are continuous and have to be discretized by some technique, mostly mathematical processes like finite differences or finite elements. The continuous domain is subdivided into sub-domains in which there is only one value. The diffusion equation comes from energy conservation, and is therefore valid over a whole domain. We use this global equation directly, instead of discretizing the PDE obtained by a limit process on it. To encode these physical global values over pixels of different dimensions, we use a computational algebraic topology (CAT)-based image model. This model has been proposed by Ziou and Allili and used for the deformation of curves and optical flow. It represents the image support as a decomposition in terms of points, edges, surfaces, volumes, etc. Images of any dimension can then be handled. After decomposing the physical principles of heat transfer into basic laws, we recall the CAT-based image model and use it to encode the basic laws. We then present experimental results for nonlinear gray-level diffusion for denoising, ensuring preservation of thin features.
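
The conservation argument can be seen in a standard explicit discretization, where each update only moves flux between neighbouring cells, so the global balance holds by construction. This is a sketch of plain finite differences, not of the CAT formulation:

```python
def diffuse(u, alpha, steps):
    """Explicit heat diffusion on a 1D chain with insulated ends.
    alpha = D*dt/dx^2 must be <= 0.5 for stability. Each step moves
    flux between neighbours only, so the total heat content is
    conserved exactly, mirroring the global conservation law."""
    u = list(u)
    for _ in range(steps):
        flux = [alpha * (u[i + 1] - u[i]) for i in range(len(u) - 1)]
        for i, f in enumerate(flux):
            u[i] += f        # exchange with the right neighbour...
            u[i + 1] -= f    # ...matched exactly, so the sum is invariant
    return u
```

Working with per-edge fluxes rather than per-cell PDE residuals is, loosely, the finite-difference analogue of encoding the global balance law on the cell complex, as the CAT model does.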

  6. A complex network approach to cloud computing

    CERN Document Server

    Travieso, Gonzalo; Bruno, Odemir Martinez; Costa, Luciano da Fontoura

    2015-01-01

    Cloud computing has become an important means of speeding up computing. One problem heavily influencing the performance of such systems is the choice of the nodes that act as servers responsible for executing the users' tasks. In this article we report how complex networks can be used to model this problem. More specifically, we investigate the performance of processing in cloud systems underlain by Erdos-Renyi (ER) and Barabasi-Albert (BA) topologies containing two servers. Cloud networks involving two communities, not necessarily of the same size, are also considered in our analysis. The performance of each configuration is quantified in terms of two indices: the cost of communication between the user and the nearest server, and the balance of the distribution of tasks between the two servers. Regarding the latter index, the ER topology provides better performance than the BA case for smaller average degrees, and the opposite behavior for larger average degrees. With respect to the cost, smaller values are found in the BA ...
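
Both indices can be computed on any graph with breadth-first search: assign each node to its nearest server, then take the mean distance (communication cost) and the share of nodes on the busier server (balance). A sketch on a toy ring graph in the test (hypothetical, not the ER/BA ensembles of the paper):

```python
from collections import deque

def bfs_distances(adj, source):
    """Hop distances from source to every reachable node."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nb in adj[node]:
            if nb not in dist:
                dist[nb] = dist[node] + 1
                queue.append(nb)
    return dist

def server_metrics(adj, s1, s2):
    """Mean distance to the nearest of two servers (cost), and the
    fraction of nodes assigned to the busier server (balance;
    0.5 would be a perfectly even split)."""
    d1, d2 = bfs_distances(adj, s1), bfs_distances(adj, s2)
    nearest = [min(d1[v], d2[v]) for v in adj]
    load1 = sum(1 for v in adj if d1[v] <= d2[v])  # ties go to server 1
    balance = max(load1, len(adj) - load1) / len(adj)
    return sum(nearest) / len(adj), balance
```

Averaging these two indices over many random ER or BA graphs at a given mean degree reproduces the kind of comparison the article reports.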

  7. Computational approaches to homogeneous gold catalysis.

    Science.gov (United States)

    Faza, Olalla Nieto; López, Carlos Silva

    2015-01-01

    Homogeneous gold catalysis has been expanding at an outstanding pace for the last decade. The best-described reactivity of Au(I) and Au(III) species is based on gold's properties as a soft Lewis acid, but new reactivity patterns have recently emerged which further expand the range of transformations achievable using gold catalysis, with examples of dual gold activation, hydrogenation reactions, or Au(I)/Au(III) catalytic cycles. In this scenario, to fully develop all these new possibilities, the use of computational tools to understand at an atomistic level of detail the complete role of gold as a catalyst is unavoidable. In this work we aim to provide a comprehensive review of the available benchmark works on methodological options for studying homogeneous gold catalysis, in the hope that this effort can help guide the choice of method in future mechanistic studies involving gold complexes. This is relevant because a representative number of current mechanistic studies still use methods which have been reported as inappropriate and dangerously inaccurate for this chemistry. Together with this, we describe a number of recent mechanistic studies where computational chemistry has provided relevant insights into non-conventional reaction paths, unexpected selectivities or novel reactivity, which illustrate the complexity behind gold-mediated organic chemistry.

  8. Evaluation of hybrid fusion 2+ approach for providing air-to-air situational awareness and threat assessment

    Science.gov (United States)

    Lee, Kangjin David; Wiesenfeld, Eric; Colony, Mike

    2006-05-01

    Modern combat aircraft pilots increasingly rely on high-level fusion models (JDL Levels 2/3) to provide real-time engagement support in hostile situations. These models provide both Situational Awareness (SA) and Threat Assessment (TA) based on data and the relationships between the data. This information represents two distinct classes of uncertainty: vagueness and ambiguity. To address the needs associated with modeling both of these types of data uncertainty, an innovative hybrid approach was recently introduced, combining probability theory and possibility theory into a unified computational framework. The goal of this research is to qualitatively and quantitatively address the advantages and disadvantages of adopting this hybrid framework, as well as to identify instances in which the combined model outperforms or is more appropriate than more classical inference approaches. To accomplish this task, domain-specific models will be developed using different theoretical approaches and conventions, and then evaluated against situational ground truth to determine their accuracy and fidelity. Additionally, the performance tradeoff between accuracy and complexity will be examined in terms of computational cost to determine the advantages and disadvantages of each approach.

  9. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network

    OpenAIRE

    Lukas Falat; Dusan Marcek; Maria Durisova

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models in the financial domain, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is intended to enhance the outputs of the network using the error part of the original neural network. The authors test the sug...
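The idea of correcting a model's outputs with a moving average of its past errors can be sketched as follows; the trailing-window correction scheme is an illustrative reconstruction, not the authors' exact formulation:

```python
def moving_average(xs, window):
    """Trailing moving average; a shorter prefix is averaged as-is."""
    out = []
    for i in range(len(xs)):
        lo = max(0, i - window + 1)
        out.append(sum(xs[lo:i + 1]) / (i + 1 - lo))
    return out

def ma_corrected(predictions, actuals, window=3):
    """Add the moving average of past residuals to each prediction.
    The correction at step i uses only residuals up to step i-1,
    so no future information leaks into the forecast."""
    corrected, past_residuals = [], []
    for i, p in enumerate(predictions):
        bias = (moving_average(past_residuals, window)[-1]
                if past_residuals else 0.0)
        corrected.append(p + bias)
        past_residuals.append(actuals[i] - p)
    return corrected
```

For a network that consistently underpredicts by 2 units, the correction converges to adding 2 after the first step.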

  10. Performance of hybrid programming models for multiscale cardiac simulations: preparing for petascale computation.

    Science.gov (United States)

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-10-01

    Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than were used in this study, we believe that faster-than-real-time multiscale cardiac simulations can be achieved on these systems.

  11. Q-P Wave traveltime computation by an iterative approach

    KAUST Repository

    Ma, Xuxin

    2013-01-01

    In this work, we present a new approach to compute anisotropic traveltime based on solving successively elliptical isotropic traveltimes. The method shows good accuracy and is very simple to implement.

  12. The fundamentals of computational intelligence system approach

    CERN Document Server

    Zgurovsky, Mikhail Z

    2017-01-01

    This monograph is dedicated to the systematic presentation of the main trends, technologies and methods of computational intelligence (CI). The book pays particular attention to important novel CI technologies: fuzzy logic (FL) systems and fuzzy neural networks (FNN). Different FNNs, including a new class of FNN, cascade neo-fuzzy neural networks, are considered, and their training algorithms are described and analyzed. The applications of FNNs to forecasting in macroeconomics and at stock markets are examined. The book presents the problem of portfolio optimization under uncertainty, a novel theory of fuzzy portfolio optimization free of the drawbacks of the classical Markowitz model, as well as an application to portfolio optimization at Ukrainian, Russian and American stock exchanges. The book also presents the problem of corporate bankruptcy risk forecasting under incomplete and fuzzy information, as well as new methods based on fuzzy set theory and fuzzy neural networks and the results of their application for bankruptcy ris...

  13. A polyhedral approach to computing border bases

    CERN Document Server

    Braun, Gábor

    2009-01-01

    Border bases can be considered the natural extension of Gröbner bases, with several advantages. Unfortunately, to date the classical border basis algorithm relies on (degree-compatible) term orderings and implicitly on reduced Gröbner bases. We adapt the classical border basis algorithm to allow for calculating border bases for arbitrary degree-compatible order ideals, independent of term orderings. Moreover, the algorithm also supports calculating degree-compatible order ideals with preference on contained elements, even though finding a preferred order ideal is NP-hard. Effectively we retain degree-compatibility only to successively extend our computation degree by degree. The adaptation is based on our polyhedral characterization: order ideals that support a border basis correspond one-to-one to integral points of the order ideal polytope. This establishes a crucial connection between the ideal and the combinatorial structure of the associated factor spaces.

  14. Computationally efficient double hybrid density functional theory using dual basis methods

    CERN Document Server

    Byrd, Jason N

    2015-01-01

    We examine the application of the recently developed dual basis methods of Head-Gordon and co-workers to double hybrid density functional computations. Using the B2-PLYP, B2GP-PLYP, DSD-BLYP and DSD-PBEP86 density functionals, we assess the performance of dual basis methods for the calculation of conformational energy changes in C4–C7 alkanes and for the S22 set of noncovalent interaction energies. The dual basis methods, combined with resolution-of-the-identity second-order Møller–Plesset theory, are shown to give results in excellent agreement with conventional methods at a much reduced computational cost.

  15. A hybrid system approach to airspeed, angle of attack and sideslip estimation in Unmanned Aerial Vehicles

    KAUST Repository

    Shaqura, Mohammad

    2015-06-01

    Fixed wing Unmanned Aerial Vehicles (UAVs) are an increasingly common sensing platform, owing to their key advantages: speed, endurance and the ability to explore remote areas. While these platforms are highly efficient, they cannot easily be equipped with the air data sensors commonly found on their larger-scale manned counterparts. Indeed, such sensors are bulky, expensive and severely reduce the payload capability of the UAVs. In consequence, UAV controllers (humans or autopilots) have little information on the actual mode of operation of the wing (normal, stalled, spin), which can cause catastrophic losses of control when flying in turbulent weather conditions. In this article, we propose an air parameter estimation scheme that can run in real time on commercial, low-power autopilots. The computational method is based on a hybrid decomposition of the modes of operation of the UAV. A Bayesian approach is taken for estimation, in which the estimated airspeed, angle of attack and sideslip are described statistically. An implementation on a UAV is presented, and the performance and computational efficiency of this method are validated using hardware-in-the-loop (HIL) simulation and experimental flight data, and compared with classical Extended Kalman Filter estimation. Our benchmark tests show that this method is faster than the EKF by up to two orders of magnitude. © 2015 IEEE.
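The hybrid-mode estimation can be illustrated with a minimal discrete Bayes filter over the wing's operating modes. The mode names match the abstract, but the prior and likelihood values below are made up for the sketch:

```python
def bayes_update(prior, likelihood):
    """One step of a discrete Bayes filter over flight modes:
    multiply prior by measurement likelihood, then renormalize."""
    post = {m: prior[m] * likelihood[m] for m in prior}
    z = sum(post.values())
    return {m: p / z for m, p in post.items()}

# initial belief over the hybrid modes of operation
belief = {"normal": 0.8, "stalled": 0.15, "spin": 0.05}
# hypothetical likelihoods derived from airspeed/IMU residuals
meas = {"normal": 0.1, "stalled": 0.7, "spin": 0.2}
belief = bayes_update(belief, meas)
```

In the full scheme, each mode carries its own aerodynamic model, and the airspeed, angle of attack and sideslip estimates are conditioned on the inferred mode.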

  16. Automated classification of tropical shrub species: a hybrid of leaf shape and machine learning approach.

    Science.gov (United States)

    Murat, Miraemiliana; Chang, Siow-Wee; Abu, Arpah; Yap, Hwa Jen; Yong, Kien-Thai

    2017-01-01

    Plants play a crucial role in foodstuff, medicine, industry, and environmental protection. The skill of recognising plants is very important in some applications, including conservation of endangered species and rehabilitation of lands after mining activities. However, identifying plant species is a difficult task because it requires specialized knowledge. Developing an automated classification system for plant species is necessary and valuable, since it can help specialists as well as the public identify plant species easily. Shape descriptors were applied to the myDAUN dataset, which contains 45 tropical shrub species collected from the University of Malaya (UM), Malaysia. Based on a literature review, this is the first study to develop a tropical shrub species image dataset and classification using a hybrid of leaf shape and machine learning approach. Four types of shape descriptors were used in this study, namely morphological shape descriptors (MSD), Histogram of Oriented Gradients (HOG), Hu invariant moments (Hu) and Zernike moments (ZM). Single descriptors, as well as hybrid combinations of descriptors, were tested and compared. The tropical shrub species are classified using six different classifiers: artificial neural network (ANN), random forest (RF), support vector machine (SVM), k-nearest neighbour (k-NN), linear discriminant analysis (LDA) and directed acyclic graph multiclass least squares twin support vector machine (DAG MLSTSVM). In addition, three feature selection methods were tested on the myDAUN dataset: Relief, correlation-based feature selection (CFS) and Pearson's correlation coefficient (PCC). The well-known Flavia and Swedish Leaf datasets were used as validation datasets for the proposed methods. The results showed that the hybrid of all descriptors with ANN outperformed the other classifiers, with an average classification accuracy of 98.23% for the myDAUN dataset, 95.25% for the Flavia dataset and 99
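As a flavor of what morphological shape descriptors compute, here is a minimal sketch for a leaf outline given as a polygon. These three descriptors (shoelace area, perimeter, circularity) are generic examples, not necessarily the MSD set used in the study:

```python
import math

def shape_descriptors(polygon):
    """Simple morphological descriptors for an outline given as a
    list of (x, y) vertices: area via the shoelace formula,
    perimeter, and circularity = 4*pi*area / perimeter^2
    (1.0 for a circle, smaller for elongated shapes)."""
    n = len(polygon)
    area = 0.0
    perim = 0.0
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        area += x1 * y2 - x2 * y1
        perim += math.hypot(x2 - x1, y2 - y1)
    area = abs(area) / 2.0
    return {"area": area, "perimeter": perim,
            "circularity": 4 * math.pi * area / perim ** 2}
```

Descriptor vectors like this (or HOG/Hu/Zernike features) would then be fed to the classifiers listed above.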

  17. Biologically motivated computationally intensive approaches to image pattern recognition

    NARCIS (Netherlands)

    Petkov, Nikolay

    1995-01-01

    This paper presents some of the research activities of the research group in vision as a grand challenge problem whose solution is estimated to need the power of Tflop/s computers and for which computational methods have yet to be developed. The concerned approaches are biologically motivated, in th

  18. An Approach to Dynamic Provisioning of Social and Computational Services

    NARCIS (Netherlands)

    Bonino da Silva Santos, Luiz Olavo; Sorathia, Vikram; Ferreira Pires, Luis; Sinderen, van Marten

    2010-01-01

    Service-Oriented Computing (SOC) builds upon the intuitive notion of service already known and used in our society for a long time. SOC-related approaches are based on computer-executable functional units that often represent automation of services that exist at the social level, i.e., services at t

  19. Facile approach to prepare Pt decorated SWNT/graphene hybrid catalytic ink

    Energy Technology Data Exchange (ETDEWEB)

    Mayavan, Sundar, E-mail: sundarmayavan@cecri.res.in [Centre for Innovation in Energy Research, CSIR–Central Electrochemical Research Institute, Karaikudi 630006, Tamil Nadu (India); Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon, 305-701 (Korea, Republic of); Mandalam, Aditya; Balasubramanian, M. [Centre for Innovation in Energy Research, CSIR–Central Electrochemical Research Institute, Karaikudi 630006, Tamil Nadu (India); Sim, Jun-Bo [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon, 305-701 (Korea, Republic of); Choi, Sung-Min, E-mail: sungmin@kaist.ac.kr [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon, 305-701 (Korea, Republic of)

    2015-07-15

    Highlights: • Pt NPs were in situ synthesized onto CNT–graphene support in aqueous solution. • The as-prepared material was used directly as a catalyst ink without further treatment. • Catalyst ink is active toward methanol oxidation. • This approach realizes both scalable and greener production of hybrid catalysts. - Abstract: Platinum nanoparticles were in situ synthesized onto hybrid support involving graphene and single walled carbon nanotube in aqueous solution. We investigate the reduction of graphene oxide, and platinum nanoparticle functionalization on hybrid support by X-ray photoelectron spectroscopy, Raman spectroscopy, X-ray diffraction, scanning electron microscopy and transmission electron microscopy. The as-prepared platinum on hybrid support was used directly as a catalyst ink without further treatment and is active toward methanol oxidation. This work realizes both scalable and greener production of highly efficient hybrid catalysts, and would be valuable for practical applications of graphene based fuel cell catalysts.

  20. Strongly Interacting Matter at Finite Chemical Potential: Hybrid Model Approach

    Science.gov (United States)

    Srivastava, P. K.; Singh, C. P.

    2013-06-01

    The search for a proper and realistic equation of state (EOS) for strongly interacting matter, used in the study of the QCD phase diagram, remains a challenging problem. Recently, we constructed a hybrid model description for the quark-gluon plasma (QGP) as well as hadron gas (HG) phases, in which we used an excluded volume model for the HG and a thermodynamically consistent quasiparticle model for the QGP phase. The hybrid model suitably describes the recent lattice results for various thermodynamical as well as transport properties of QCD matter at zero baryon chemical potential (μB). In this paper, we extend our investigations to the properties of QCD matter at finite μB and compare our results with the most recent lattice QCD calculations.

  1. Active diagnosis of hybrid systems - A model predictive approach

    OpenAIRE

    2009-01-01

    A method for active diagnosis of hybrid systems is proposed. The main idea is to predict the future output of both normal and faulty model of the system; then at each time step an optimization problem is solved with the objective of maximizing the difference between the predicted normal and faulty outputs constrained by tolerable performance requirements. As in standard model predictive control, the first element of the optimal input is applied to the system and the whole procedure is repeate...

  2. Resource-Efficient, Hierarchical Auto-Tuning of a Hybrid Lattice Boltzmann Computation on the Cray XT4

    Energy Technology Data Exchange (ETDEWEB)

    Computational Research Division, Lawrence Berkeley National Laboratory; NERSC, Lawrence Berkeley National Laboratory; Computer Science Department, University of California, Berkeley; Williams, Samuel; Carter, Jonathan; Oliker, Leonid; Shalf, John; Yelick, Katherine

    2009-05-04

    We apply auto-tuning to a hybrid MPI-pthreads lattice Boltzmann computation running on the Cray XT4 at National Energy Research Scientific Computing Center (NERSC). Previous work showed that multicore-specific auto-tuning can improve the performance of lattice Boltzmann magnetohydrodynamics (LBMHD) by a factor of 4x when running on dual- and quad-core Opteron dual-socket SMPs. We extend these studies to the distributed memory arena via a hybrid MPI/pthreads implementation. In addition to conventional auto-tuning at the local SMP node, we tune at the message-passing level to determine the optimal aspect ratio as well as the correct balance between MPI tasks and threads per MPI task. Our study presents a detailed performance analysis when moving along an isocurve of constant hardware usage: fixed total memory, total cores, and total nodes. Overall, our work points to approaches for improving intra- and inter-node efficiency on large-scale multicore systems for demanding scientific applications.
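The message-passing-level tuning of the balance between MPI tasks and threads per task can be sketched as a simple exhaustive search. Here `run` is a hypothetical user-supplied benchmark returning wall time; the real auto-tuner in the paper is hierarchical and also tunes the aspect ratio:

```python
import itertools

def autotune(run, mpi_tasks_options, threads_options, total_cores):
    """Exhaustively try MPI-task/thread combinations whose product
    equals the available cores and keep the fastest configuration.
    Returns (best_time, tasks, threads)."""
    best = None
    for tasks, threads in itertools.product(mpi_tasks_options,
                                            threads_options):
        if tasks * threads != total_cores:
            continue  # only consider fully subscribed configurations
        t = run(tasks, threads)
        if best is None or t < best[0]:
            best = (t, tasks, threads)
    return best
```

In practice each candidate configuration would launch the LBMHD benchmark and time it; the constraint `tasks * threads == total_cores` corresponds to moving along the isocurve of constant hardware usage.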

  3. Identifying New Candidate Genes and Chemicals Related to Prostate Cancer Using a Hybrid Network and Shortest Path Approach

    Science.gov (United States)

    Yuan, Fei; Zhou, You; Wang, Meng; Yang, Jing; Wu, Kai; Lu, Changhong; Kong, Xiangyin; Cai, Yu-Dong

    2015-01-01

    Prostate cancer is a type of cancer that occurs in the male prostate, a gland in the male reproductive system. Because prostate cancer cells may spread to other parts of the body and can influence human reproduction, understanding the mechanisms underlying this disease is critical for designing effective treatments. The identification of as many genes and chemicals related to prostate cancer as possible will enhance our understanding of this disease. In this study, we proposed a computational method to identify new candidate genes and chemicals based on currently known genes and chemicals related to prostate cancer by applying a shortest path approach in a hybrid network. The hybrid network was constructed according to information concerning chemical-chemical interactions, chemical-protein interactions, and protein-protein interactions. Many of the obtained genes and chemicals are associated with prostate cancer. PMID:26504486
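A shortest path search over such a weighted interaction network can be sketched with Dijkstra's algorithm. The toy graph and edge weights below are illustrative, not data from the study, where weights would encode interaction confidence between chemicals and proteins:

```python
import heapq

def shortest_path(graph, source, target):
    """Dijkstra over a weighted network given as
    {node: {neighbor: weight}}; returns (path, distance)."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    seen = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in seen:
            continue
        seen.add(u)
        if u == target:
            break
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # walk predecessors back from the target to recover the path
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return path[::-1], dist[target]
```

Candidate genes and chemicals would then be the intermediate nodes on shortest paths between known prostate-cancer-related nodes.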

  4. A 3D hybrid grid generation technique and a multigrid/parallel algorithm based on anisotropic agglomeration approach

    Institute of Scientific and Technical Information of China (English)

    Zhang Laiping; Zhao Zhong; Chang Xinghua; He Xin

    2013-01-01

    A hybrid grid generation technique and a multigrid/parallel algorithm are presented in this paper for turbulent flow simulations over three-dimensional (3D) complex geometries. The hybrid grid generation technique is based on an agglomeration method for anisotropic tetrahedrons. First, the complex computational domain is covered by pure tetrahedral grids, in which anisotropic tetrahedrons are adopted to discretize the boundary layer and isotropic tetrahedrons the outer field. Then, the anisotropic tetrahedrons in the boundary layer are agglomerated to generate prismatic grids. The agglomeration method improves the grid quality in the boundary layer and reduces the grid count, enhancing numerical accuracy and efficiency. To accelerate the convergence history, a multigrid/parallel algorithm is also developed based on the anisotropic agglomeration approach. The numerical results demonstrate the excellent accelerating capability of this multigrid method.

  5. General approaches in ensemble quantum computing

    Indian Academy of Sciences (India)

    V Vimalan; N Chandrakumar

    2008-01-01

    We have developed methodology for NMR quantum computing focusing on enhancing the efficiency of initialization, of logic gate implementation and of readout. Our general strategy involves the application of rotating-frame pulse sequences to prepare pseudopure states and to perform logic operations. We demonstrate our methodology experimentally for both homonuclear and heteronuclear spin ensembles. On model two-spin systems, the initialization time of one of our sequences is three-fourths (in the heteronuclear case) or one-fourth (in the homonuclear case) of that of typical pulsed free precession sequences, attaining the same initialization efficiency. We have implemented the logical SWAP operation in homonuclear AMX spin systems using selective isotropic mixing, reducing the duration to a third of that of the standard refocused INEPT-type sequence. We introduce the 1D version for readout of the rotating-frame SWAP operation, in an attempt to reduce readout time. We further demonstrate the Hadamard mode of 1D SWAP, which offers a 2N-fold reduction in experiment time for a system with N working bits, attaining the same sensitivity as the standard 1D version.

  6. Delay Computation Using Fuzzy Logic Approach

    Directory of Open Access Journals (Sweden)

    Ramasesh G. R.

    2012-10-01

    The paper presents a practical application of fuzzy set and system theory in predicting, with reasonable accuracy, delay arising from a wide range of factors pertaining to construction projects. In this paper we use fuzzy logic to predict delays on account of delayed supplies and labor shortage. It is observed that project scheduling software uses either deterministic or probabilistic methods for computing schedule durations, delays, lags and other parameters. In other words, these methods use only quantitative inputs, leaving out the qualitative aspects associated with individual activities of work. A qualitative aspect, viz. the expertise of the mason or the lack of experience, can have a significant impact on the assessed duration. Such qualitative aspects do not find adequate representation in project scheduling software. A realistic project is considered, for which a PERT chart has been prepared showing all the major activities in reasonable detail. This project was periodically updated until its completion. It is observed that some of the activities are delayed due to extraneous factors, resulting in overall delay of the project. The software has the capability to calculate the overall delay through the Critical Path Method (CPM) when each of the activity delays is reported. We demonstrate that by using fuzzy logic, these delays could have been predicted well in advance.
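A toy version of fuzzy delay prediction might look like the following. The membership ranges and rule consequents are invented for illustration, not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c], peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def predicted_delay(supply_delay_days, labor_shortage_pct):
    """Toy two-rule fuzzy inference with weighted-average
    defuzzification: 'IF supply delay is high THEN project delay
    is ~10 days' and 'IF labor shortage is high THEN ~15 days'.
    All membership parameters and consequents are illustrative."""
    high_supply = tri(supply_delay_days, 2, 7, 12)
    high_labor = tri(labor_shortage_pct, 10, 30, 50)
    rules = [(high_supply, 10.0), (high_labor, 15.0)]
    num = sum(strength * delay for strength, delay in rules)
    den = sum(strength for strength, _ in rules)
    return num / den if den else 0.0
```

A full system would attach such rules to each activity and propagate the resulting durations through the CPM network.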

  7. A low-cost EEG system-based hybrid brain-computer interface for humanoid robot navigation and recognition.

    Science.gov (United States)

    Choi, Bongjae; Jo, Sungho

    2013-01-01

    This paper describes a hybrid brain-computer interface (BCI) technique that combines the P300 potential, the steady-state visually evoked potential (SSVEP), and event-related de-synchronization (ERD) to solve a complicated multi-task problem consisting of humanoid robot navigation and control along with object recognition, using a low-cost BCI system. Our approach enables subjects to control the navigation and exploration of a humanoid robot and recognize a desired object among candidates. This study aims to demonstrate the possibility of a hybrid BCI based on a low-cost system for a realistic and complex task. It also shows that the use of a simple image processing technique, combined with BCI, can further aid in making these complex tasks simpler. An experimental scenario is proposed in which a subject remotely controls a humanoid robot in a properly sized maze. The subject sees what the surrogate robot sees through visual feedback and can navigate the surrogate robot. While navigating, the robot encounters objects located in the maze. It then recognizes whether the encountered object is of interest to the subject. The subject communicates with the robot through SSVEP- and ERD-based BCIs to navigate and explore with the robot, and a P300-based BCI to allow the surrogate robot to recognize their favorites. Using several evaluation metrics, the performance of five subjects navigating the robot was quite comparable to manual keyboard control. During object recognition mode, favorite objects were successfully selected from two to four choices. Subjects conducted humanoid navigation and recognition tasks as if they embodied the robot. Analysis of the data supports the potential usefulness of the proposed hybrid BCI system for extended applications. An important implication for future work is that hybridizing simple BCI protocols provides extended controllability to carry out complicated tasks even with a low-cost system.

  8. A hybrid three-class brain-computer interface system utilizing SSSEPs and transient ERPs

    Science.gov (United States)

    Breitwieser, Christian; Pokorny, Christoph; Müller-Putz, Gernot R.

    2016-12-01

    Objective. This paper investigates the fusion of steady-state somatosensory evoked potentials (SSSEPs) and transient event-related potentials (tERPs), evoked through tactile stimulation of the left- and right-hand fingertips, in a three-class EEG-based hybrid brain-computer interface. It was hypothesized that fusing the input signals leads to higher classification rates than classifying tERP and SSSEP individually. Approach. Fourteen subjects participated in the studies, consisting of a screening paradigm to determine person-dependent resonance-like frequencies and a subsequent online paradigm. The whole setup of the BCI system was based on open interfaces, following suggestions for a common implementation platform. During the online experiment, subjects were instructed to focus their attention on the stimulated fingertips as indicated by a visual cue. The recorded data were classified during runtime using a multi-class shrinkage LDA classifier and the outputs were fused together applying a posterior-probability-based fusion. Data were further analyzed offline, involving a combined classification of SSSEP and tERP features as a second fusion principle. The final results were tested for statistical significance applying a repeated-measures ANOVA. Main results. A significant classification increase was achieved when fusing the results with a combined classification compared to performing an individual classification. Furthermore, the SSSEP classifier was significantly better at detecting a non-control state, whereas the tERP classifier was significantly better at detecting control states. Subjects who had a higher relative band power increase during the screening session also achieved significantly higher classification results than subjects with lower relative band power increase. Significance. It could be shown that utilizing SSSEP and tERP for hBCIs increases the classification accuracy and also that tERP and SSSEP are not classifying control- and non
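Posterior-probability-based fusion of two classifiers can be sketched as an elementwise product of their posteriors followed by renormalization, one common fusion rule; the paper's exact rule may differ, and the class names and numbers here are illustrative:

```python
def fuse_posteriors(p1, p2):
    """Fuse two classifiers' posterior probabilities over the same
    classes by elementwise product and renormalization
    (a naive-Bayes-style combination rule)."""
    fused = {c: p1[c] * p2[c] for c in p1}
    z = sum(fused.values())
    return {c: v / z for c, v in fused.items()}

# hypothetical per-trial posteriors from the two classifiers
sssep = {"left": 0.5, "right": 0.2, "rest": 0.3}
terp = {"left": 0.6, "right": 0.3, "rest": 0.1}
fused = fuse_posteriors(sssep, terp)
```

The product rule rewards classes on which both classifiers agree, which matches the finding that fusion outperforms either classifier alone.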

  9. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, etc., and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks, such as DDoS attacks and network scanning.
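The correlation-matrix anomaly idea can be sketched in pure Python: compute a Pearson correlation matrix for a baseline window and for a current window of traffic features, and flag large deviations between the two. The deviation score below is an illustrative assumption, not the paper's exact method:

```python
import math

def corr_matrix(rows):
    """Pearson correlation matrix of the feature columns of `rows`
    (each row is one observation of k features)."""
    cols = list(zip(*rows))
    n = len(rows)
    means = [sum(c) / n for c in cols]
    def corr(i, j):
        num = sum((a - means[i]) * (b - means[j])
                  for a, b in zip(cols[i], cols[j]))
        di = math.sqrt(sum((a - means[i]) ** 2 for a in cols[i]))
        dj = math.sqrt(sum((b - means[j]) ** 2 for b in cols[j]))
        return num / (di * dj)
    k = len(cols)
    return [[corr(i, j) for j in range(k)] for i in range(k)]

def anomaly_score(baseline, window):
    """Sum of absolute entrywise deviations between the baseline and
    current correlation matrices; a large score flags traffic whose
    feature correlations have changed (e.g. during a scan)."""
    b, w = corr_matrix(baseline), corr_matrix(window)
    return sum(abs(b[i][j] - w[i][j])
               for i in range(len(b)) for j in range(len(b)))
```

A threshold on the score (calibrated on attack-free traffic) would then separate normal from anomalous windows.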

  10. 16th International Conference on Hybrid Intelligent Systems and the 8th World Congress on Nature and Biologically Inspired Computing

    CERN Document Server

    Haqiq, Abdelkrim; Alimi, Adel; Mezzour, Ghita; Rokbani, Nizar; Muda, Azah

    2017-01-01

    This book presents the latest research in hybrid intelligent systems. It includes 57 carefully selected papers from the 16th International Conference on Hybrid Intelligent Systems (HIS 2016) and the 8th World Congress on Nature and Biologically Inspired Computing (NaBIC 2016), held on November 21–23, 2016 in Marrakech, Morocco. HIS - NaBIC 2016 was jointly organized by the Machine Intelligence Research Labs (MIR Labs), USA; Hassan 1st University, Settat, Morocco and University of Sfax, Tunisia. Hybridization of intelligent systems is a promising research field in modern artificial/computational intelligence and is concerned with the development of the next generation of intelligent systems. The conference's main aim is to inspire further exploration of the intriguing potential of hybrid intelligent systems and bio-inspired computing. As such, the book is a valuable resource for practicing engineers/scientists and researchers working in the field of computational intelligence and artificial intelligence.

  11. Multi-objective approach for energy-aware workflow scheduling in cloud computing environments.

    Science.gov (United States)

    Yassa, Sonia; Chelouah, Rachid; Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, the cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on the optimization constrained by time or cost without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds, and present the hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate in different voltage supply levels by sacrificing clock frequencies. This multiple voltage involves a compromise between the quality of schedules and energy. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.
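The DVFS tradeoff the abstract mentions can be illustrated with the standard dynamic-power model: energy per task scales with V² per cycle while execution time scales with 1/f. The capacitance constant and the voltage/frequency levels below are made-up values:

```python
def dvfs_tradeoff(cycles, levels):
    """Energy and time for each DVFS level. Dynamic energy is modeled
    as E = C * V^2 * cycles (switched capacitance C folded into one
    constant); execution time is cycles / f. Illustrative only."""
    C = 1e-9  # hypothetical effective switched capacitance
    out = []
    for volt, freq in levels:
        energy = C * volt ** 2 * cycles
        duration = cycles / freq
        out.append((volt, freq, energy, duration))
    return out

# hypothetical (voltage V, frequency Hz) operating points
levels = [(1.2, 2.0e9), (1.0, 1.5e9), (0.8, 1.0e9)]
table = dvfs_tradeoff(1e10, levels)
```

This is exactly the compromise the scheduler's PSO search navigates: lower voltage levels save energy but lengthen each task, which can violate QoS deadlines.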

  12. Application of soft computing based hybrid models in hydrological variables modeling: a comprehensive review

    Science.gov (United States)

    Fahimi, Farzad; Yaseen, Zaher Mundher; El-shafie, Ahmed

    2017-05-01

    Since the middle of the twentieth century, artificial intelligence (AI) models have been used widely in engineering and science problems. Water resource variable modeling and prediction are among the most challenging issues in water engineering. The artificial neural network (ANN) is a common approach used to tackle this problem by using viable and efficient models. Numerous ANN models have been successfully developed to achieve more accurate results. In the current review, different ANN models in water resource applications and hydrological variable predictions are reviewed and outlined. In addition, recent hybrid models and their structures, input preprocessing, and optimization techniques are discussed and the results are compared with similar previous studies. Moreover, to achieve a comprehensive view of the literature, many articles that applied ANN models together with other techniques are included. Consequently, coupling procedures, model evaluation, and performance comparison of hybrid models with conventional ANN models are assessed, as well as the taxonomy and structures of hybrid ANN models. Finally, current challenges and recommendations for future research are indicated and new hybrid approaches are proposed.

  14. Aluminium in Biological Environments: A Computational Approach

    Science.gov (United States)

    Mujika, Jon I; Rezabal, Elixabete; Mercero, Jose M; Ruipérez, Fernando; Costa, Dominique; Ugalde, Jesus M; Lopez, Xabier

    2014-01-01

    The increased availability of aluminium in biological environments, due to human intervention in the last century, raises concerns on the effects that this so far “excluded from biology” metal might have on living organisms. Consequently, the bioinorganic chemistry of aluminium has emerged as a very active field of research. This review will focus on our contributions to this field, based on computational studies that can yield an understanding of aluminium biochemistry at a molecular level. Aluminium can interact and be stabilized in biological environments by complexing with both low molecular mass chelants and high molecular mass peptides. The speciation of the metal is, nonetheless, dictated by the hydrolytic species dominant in each case, which vary according to the pH condition of the medium. In blood, citrate and serum transferrin are identified as the main low molecular mass and high molecular mass molecules interacting with aluminium. The complexation of aluminium to citrate and the subsequent changes exerted on the deprotonation pathways of its titratable groups will be discussed along with the mechanisms for the intake and release of aluminium in serum transferrin at two pH conditions, physiological neutral and endosomal acidic. Aluminium can substitute other metals, in particular magnesium, in protein buried sites and trigger conformational disorder and alteration of the protonation states of the protein's side chains. A detailed account of the interaction of aluminium with protein side chains will be given. Finally, it will be described how aluminium can exert oxidative stress by stabilizing superoxide radicals either as mononuclear aluminium or clustered in boehmite. The possibility of promotion of the Fenton reaction, and production of hydroxyl radicals, will also be discussed. PMID:24757505

  15. Computer-Aided Design of Drugs on Emerging Hybrid High Performance Computers

    Science.gov (United States)

    2013-09-01

    Clustering using MapReduce , Workshop on Trends in High-Performance Distributed Computing, Vrije Universiteit, Amsterdam, NL. (Invited Talk) [25] February...and middleware packages for polarizable force fields on multi-core and GPU systems, supported by the MapReduce paradigm. NSF MRI #0922657, $451,051...High-throughput Molecular Datasets for Scalable Clustering using MapReduce , Workshop on Trends in High-Performance Distributed Computing, Vrije

  16. Hybrid Brain–Computer Interface Techniques for Improved Classification Accuracy and Increased Number of Commands: A Review

    Directory of Open Access Journals (Sweden)

    Keum-Shik Hong

    2017-07-01

    Full Text Available In this article, non-invasive hybrid brain–computer interface (hBCI) technologies for improving classification accuracy and increasing the number of commands are reviewed. Hybridization combining two or more modalities is a new trend in brain imaging and prosthesis control. Electroencephalography (EEG), due to its easy use and fast temporal resolution, is most widely utilized in combination with other brain/non-brain signal acquisition modalities, for instance, functional near-infrared spectroscopy (fNIRS), electromyography (EMG), electrooculography (EOG), and eye trackers. Three main purposes of hybridization are to increase the number of control commands, improve classification accuracy, and reduce the signal detection time. Currently, the combinations EEG + fNIRS and EEG + EOG are most commonly employed. Four principal components (i.e., hardware, paradigm, classifiers, and features) relevant to accuracy improvement are discussed. In the case of brain signals, motor imagination/movement tasks are combined with cognitive tasks to increase active brain–computer interface (BCI) accuracy. Active and reactive tasks sometimes are combined: motor imagination with steady-state visual evoked potentials (SSVEP) and motor imagination with P300. In the case of reactive tasks, SSVEP is most widely combined with P300 to increase the number of commands. Passive BCIs, however, are rare. After discussing the hardware and strategies involved in the development of hBCI, the second part examines the approaches used to increase the number of control commands and to enhance classification accuracy. The future prospects and the extension of hBCI in real-time applications for daily life scenarios are also discussed.

  17. Analyzing Dynamic Task-Based Applications on Hybrid Platforms: An Agile Scripting Approach

    OpenAIRE

    Garcia Pinto, Vinicius; Stanisic, Luka; Legrand, Arnaud; Mello Schnorr, Lucas; Thibault, Samuel; Danjean, Vincent

    2016-01-01

    In this paper, we present visual analysis techniques to evaluate the performance of HPC task-based applications on hybrid architectures. Our approach is based on composing modern data analysis tools (pjdump, R, ggplot2, plotly), enabling an agile and flexible scripting framework with minor development cost. We validate our proposal by analyzing traces from the full-fledged implementation of the Cholesky decomposition available in the MORSE library running on a hybrid (CPU/GPU) platform. The a...

  18. Fuzzy hybrid MCDM approach for selection of wind turbine service technicians

    OpenAIRE

    Goutam Kumar Bose; Nikhil Chandra Chatterjee

    2016-01-01

    This research paper presents a fuzzy hybrid multi-criteria decision making (MCDM) methodology for selecting employees. The study takes a tactical viewpoint, applying hybrid fuzzy MCDM techniques to support the recruitment process for wind turbine service technicians. The methodology is based on the application of fuzzy ARAS (Additive Ratio Assessment) and fuzzy MOORA (Multi-Objective Optimization on the basis of Ratio Analysis), which are integrated...

  19. Control and fault diagnosis based sliding mode observer of a multicellular converter: Hybrid approach

    KAUST Repository

    Benzineb, Omar

    2013-01-01

    In this article, the diagnosis of a three cell converter is developed. The hybrid nature of the system represented by the presence of continuous and discrete dynamics is taken into account in the control design. The idea is based on using a hybrid control and an observer-type sliding mode to generate residuals from the observation errors of the system. The simulation results are presented at the end to illustrate the performance of the proposed approach. © 2013 FEI STU.

  20. Mobile Cloud Computing: A Review on Smartphone Augmentation Approaches

    CERN Document Server

    Abolfazli, Saeid; Gani, Abdullah

    2012-01-01

    Smartphones have recently gained significant popularity in heavy mobile processing while users are increasing their expectations toward rich computing experiences. However, resource limitations and current mobile computing advancements hinder this vision. Therefore, resource-intensive application execution remains a challenging task in mobile computing that necessitates device augmentation. In this article, smartphone augmentation approaches are reviewed and classified into two main groups, namely hardware and software. Generating high-end hardware is a subset of hardware augmentation approaches, whereas conserving local resources and reducing resource requirements are grouped under software augmentation methods. Our study advocates that conserving smartphones' native resources, which is mainly done via task offloading, is more appropriate for already-developed applications than new ones, due to the costly re-development process. Cloud computing has recently obtained momentous ground as one of the major co...

  1. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
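One plausible formalization of such variation and progress rates (an assumption for illustration; the paper's exact definitions may differ) can be computed over a toy population-based search, here a per-individual accept-if-better random search on f(x) = x²:

```python
import random

def evolve(pop, rng):
    """Toy population-based search on f(x) = x^2: each solution takes a
    Gaussian step and keeps it only if the objective improves."""
    out = []
    for x in pop:
        cand = x + rng.gauss(0, 0.3)
        out.append(cand if cand * cand < x * x else x)
    return out

rng = random.Random(5)
pop = [rng.uniform(-10, 10) for _ in range(50)]
history = []
for gen in range(100):
    new = evolve(pop, rng)
    # variation rate: fraction of solutions that changed this generation
    variation = sum(a != b for a, b in zip(pop, new)) / len(pop)
    # progress rate: improvement of the best objective value
    progress = min(x * x for x in pop) - min(x * x for x in new)
    history.append((variation, progress))
    pop = new

print(min(x * x for x in pop))
```

Tracking both indices over the run shows the trade-off the abstract alludes to: variation stays high while the population is diverse, and progress decays as the search converges.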

  2. What is intrinsic motivation? A typology of computational approaches

    Directory of Open Access Journals (Sweden)

    Pierre-Yves Oudeyer

    2009-11-01

    Full Text Available Intrinsic motivation, the causal mechanism for spontaneous exploration and curiosity, is a central concept in developmental psychology. It has been argued to be a crucial mechanism for open-ended cognitive development in humans, and as such has gathered growing interest from developmental roboticists in recent years. The goal of this paper is threefold. First, it provides a synthesis of the different approaches to intrinsic motivation in psychology. Second, by interpreting these approaches in a computational reinforcement learning framework, we argue that they are not operational and are even sometimes inconsistent. Third, we set the ground for a systematic operational study of intrinsic motivation by presenting a formal typology of possible computational approaches. This typology is partly based on existing computational models, but also presents new ways of conceptualizing intrinsic motivation. We argue that this kind of computational typology might be useful for opening new avenues for research both in psychology and developmental robotics.

  3. Mixed model approaches for the identification of QTLs within a maize hybrid breeding program.

    NARCIS (Netherlands)

    Eeuwijk, van F.A.; Boer, M.; Totir, L.; Bink, M.C.A.M.; Wright, D.; Winkler, C.; Podlich, D.; Boldman, K.; Baumgarten, R.; Smalley, M.; Arbelbide, M.; Braak, ter C.J.F.; Cooper, M.

    2010-01-01

    Two outlines for mixed model based approaches to quantitative trait locus (QTL) mapping in existing maize hybrid selection programs are presented: a restricted maximum likelihood (REML) and a Bayesian Markov Chain Monte Carlo (MCMC) approach. The methods use the in-silico-mapping procedure developed

  4. A hybrid and non-modern approach to urban studies

    Directory of Open Access Journals (Sweden)

    Marc Grau i Solés

    2012-03-01

    Full Text Available This article draws upon the so-called Forat de la Vergonya urban controversy and the urban transformation process of a neighborhood in Barcelona: el Casc Antic. Drawing on inputs from Actor-Network Theory (ANT), the city is explored as a multiple urban assemblage. We also analyze the dichotomous nature of the modern notion of politics, exploring in particular the role of the object-subject dichotomy. Through the analysis of citizen participation opportunities we propose a new hybrid notion of citizen participation and urban policy.

  5. Active diagnosis of hybrid systems - A model predictive approach

    DEFF Research Database (Denmark)

    Tabatabaeipour, Seyed Mojtaba; Ravn, Anders P.; Izadi-Zamanabadi, Roozbeh;

    2009-01-01

    A method for active diagnosis of hybrid systems is proposed. The main idea is to predict the future output of both normal and faulty model of the system; then at each time step an optimization problem is solved with the objective of maximizing the difference between the predicted normal and faulty...... outputs constrained by tolerable performance requirements. As in standard model predictive control, the first element of the optimal input is applied to the system and the whole procedure is repeated until the fault is detected by a passive diagnoser. It is demonstrated how the generated excitation signal...

  6. HYBRID AND INTEGRATED APPROACH TO SHORT TERM LOAD FORECASTING

    Directory of Open Access Journals (Sweden)

    J. P. Rothe,

    2010-12-01

    Full Text Available The forecasting of electricity demand has become one of the major research fields in Electrical Engineering. In recent years, much research has been carried out on the application of artificial intelligence techniques to the load-forecasting problem. Various Artificial Intelligence (AI) techniques used for load forecasting are expert systems, fuzzy logic, genetic algorithms, and Artificial Neural Networks (ANN). This research work is an attempt to apply a hybrid and integrated effort to forecast load. Regression, fuzzy, and neural methods, along with a genetic algorithm, will empower analysts to forecast fairly accurate load demand on an hourly basis.

  7. Propagation of computer virus both across the Internet and external computers: A complex-network approach

    Science.gov (United States)

    Gan, Chenquan; Yang, Xiaofan; Liu, Wanping; Zhu, Qingyi; Jin, Jian; He, Li

    2014-08-01

    Based on the assumption that external computers (particularly, infected external computers) are connected to the Internet, and by considering the influence of the Internet topology on computer virus spreading, this paper establishes a novel computer virus propagation model with a complex-network approach. This model possesses a unique (viral) equilibrium which is globally attractive. Some numerical simulations are also given to illustrate this result. Further study shows that the computers with higher node degrees are more susceptible to infection than those with lower node degrees. In this regard, some appropriate protective measures are suggested.
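The abstract's qualitative finding — that nodes with higher degree are more susceptible to infection — can be illustrated with a minimal discrete-time SIS simulation on a preferential-attachment network. All parameters and the graph construction below are illustrative assumptions, not the paper's model:

```python
import random

def preferential_attachment(n, m, rng):
    """Grow a Barabasi-Albert-style graph: each new node links to m
    existing nodes chosen roughly proportionally to degree (sampled
    from a list where each node appears once per incident edge)."""
    adj = {i: set() for i in range(n)}
    targets, repeated = list(range(m)), []
    for new in range(m, n):
        for t in set(targets):      # duplicate picks collapse to one edge
            adj[new].add(t)
            adj[t].add(new)
            repeated.extend([new, t])
        targets = rng.sample(repeated, m)
    return adj

def sis_frequencies(adj, beta, gamma, steps, rng):
    """Discrete-time SIS dynamics; returns how often each node was
    infected, averaged over the simulated steps."""
    infected = {v: rng.random() < 0.1 for v in adj}
    freq = dict.fromkeys(adj, 0)
    for _ in range(steps):
        nxt = {}
        for v, neigh in adj.items():
            if infected[v]:
                nxt[v] = rng.random() > gamma          # recover w.p. gamma
            else:
                k = sum(infected[u] for u in neigh)    # infected neighbours
                nxt[v] = rng.random() < 1 - (1 - beta) ** k
        infected = nxt
        for v in adj:
            freq[v] += infected[v]
    return {v: freq[v] / steps for v in adj}

rng = random.Random(42)
adj = preferential_attachment(200, 2, rng)
freq = sis_frequencies(adj, beta=0.15, gamma=0.2, steps=400, rng=rng)
by_degree = sorted(adj, key=lambda v: len(adj[v]))
hub_rate = sum(freq[v] for v in by_degree[-20:]) / 20   # 20 highest-degree
leaf_rate = sum(freq[v] for v in by_degree[:20]) / 20   # 20 lowest-degree
print(hub_rate > leaf_rate)
```

With these parameters the epidemic is endemic, and the high-degree hubs spend a visibly larger fraction of time infected than the low-degree leaves, matching the paper's conclusion qualitatively.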

  8. Hybrid Cloud Computing Environment for EarthCube and Geoscience Community

    Science.gov (United States)

    Yang, C. P.; Qin, H.

    2016-12-01

    The NSF EarthCube Integration and Test Environment (ECITE) has built a hybrid cloud computing environment that provides cloud resources from private cloud environments using the cloud system software OpenStack and Eucalyptus, and also manages a public cloud, Amazon Web Services, allowing resource synchronizing and bursting between the private and public clouds. On the ECITE hybrid cloud platform, the EarthCube and geoscience community can deploy and manage applications using base or customized virtual machine images, analyze big datasets using virtual clusters, and monitor virtual resource usage on the cloud in real time. Currently, a number of EarthCube projects have deployed or started migrating their projects to this platform, such as CHORDS, BCube, CINERGI, OntoSoft, and some other EarthCube building blocks. To accomplish the deployment or migration, the administrator of the ECITE hybrid cloud platform prepares the specific needs (e.g. images, port numbers, usable cloud capacity, etc.) of each project in advance based on communications between ECITE and participant projects; the scientists or IT technicians in those projects then launch one or multiple virtual machines, access the virtual machine(s) to set up the computing environment if need be, and migrate their codes, documents or data without caring about the heterogeneity in structure and operations among different cloud platforms.

  9. Enhanced NLO response in BODIPY-coumarin hybrids: density functional theory approach

    Indian Academy of Sciences (India)

    YOGESH ERANDE; NAGAIYAN SEKAR

    2017-09-01

    We have thoroughly investigated the first, second and third polarizability characteristics of four hybrid chromophores by spectroscopic and computational methods. The B3LYP, CAM-B3LYP and BHandHLYP functionals in combination with the 6-311+G(d,p) basis set were used to evaluate the polarizability and hyperpolarizability characteristics of these chromophores. Generalized Mulliken-Hush analysis and frontier molecular orbital electronic distribution images of the chromophores obtained from density functional theory computations have established the charge transfer characteristics of these hybrid chromophores. On the basis of their charge transfer characteristics, these red-absorbing and NIR-emissive chromophores possess a high nonlinear optical response. Comparison of the isolated units with their analogous hybrid chromophores shows that fusion of coumarin with BODIPY enhances the nonlinear optical response.

  10. A Hybrid Approach to Structure and Function Modeling of G Protein-Coupled Receptors.

    Science.gov (United States)

    Latek, Dorota; Bajda, Marek; Filipek, Sławomir

    2016-04-25

    The recent GPCR Dock 2013 assessment of serotonin receptor 5-HT1B and 5-HT2B, and smoothened receptor SMO targets, exposed the strengths and weaknesses of the currently used computational approaches. The test cases of 5-HT1B and 5-HT2B demonstrated that both the receptor structure and the ligand binding mode can be predicted with atomic-detail accuracy, as long as the target-template sequence similarity is relatively high. On the other hand, a low target-template sequence similarity, e.g., between SMO from the frizzled GPCR family and members of the rhodopsin family, hampers GPCR structure prediction and ligand docking. Indeed, in GPCR Dock 2013, accurate prediction of the SMO target was still beyond the capabilities of most research groups. Another bottleneck in current GPCR research, as demonstrated by the 5-HT2B target, is the reliable prediction of global conformational changes induced by activation of GPCRs. In this work, we report details of our protocol used during GPCR Dock 2013. Our structure prediction and ligand docking protocol was especially successful in the case of the 5-HT1B and 5-HT2B-ergotamine complexes, for which we provide one of the most accurate predictions. In addition to a description of the GPCR Dock 2013 results, we propose a novel hybrid computational methodology to improve GPCR structure and function prediction. This computational methodology employs two separate rankings for filtering GPCR models. The first ranking is ligand-based while the second is based on the scoring scheme of the recently published BCL method. In this work, we prove that the use of knowledge-based potentials implemented in BCL is an efficient way to cope with major bottlenecks in GPCR structure prediction. Thereby, we also demonstrate that the knowledge-based potentials for membrane proteins were significantly improved, because of the recent surge in available experimental structures.

  11. An Integrated Computer-Aided Approach for Environmental Studies

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Chen, Fei; Jaksland, Cecilia;

    1997-01-01

    A general framework for an integrated computer-aided approach to solve process design, control, and environmental problems simultaneously is presented. Physicochemical properties and their relationships to the molecular structure play an important role in the proposed integrated approach. The scope...... and applicability of the integrated approach is highlighted through examples involving estimation of properties and environmental pollution prevention. The importance of mixture effects on some environmentally important properties is also demonstrated....

  12. 3D Medical Volume Segmentation Using Hybrid Multiresolution Statistical Approaches

    Directory of Open Access Journals (Sweden)

    Shadi AlZu'bi

    2010-01-01

    that 3D methodologies can accurately detect the Region Of Interest (ROI). Automatic segmentation has been achieved using HMMs, where the ROI is detected accurately but suffers from a long computation time for its calculations.

  13. CP-Miner : A hybrid Approach for Colorectal Polyp Detection

    OpenAIRE

    Ms. M. Vanitha,; Prof. P. Tamije Selvy; Dr. V.Palanisamy,; Prof. AR.Sivakumaran

    2010-01-01

    Computed Tomography Colonography (CTC) is the new-generation technique for detecting colorectal polyps using volumetric CT data combined with a Computer Aided Detection (CAD) system. The aim of this paper is to detail the implementation of a fully integrated CP-Miner system that is able to identify the polyps in the CT data. The CP-Miner system has a multistage implementation whose main system components are: 1. Interpolation, 2. Automatic Colon Segmentation, 3. Feature Extraction, 4. Polyp Detecti...

  14. A hybrid approach to urine drug testing using high-resolution mass spectrometry and select immunoassays.

    Science.gov (United States)

    McMillin, Gwendolyn A; Marin, Stephanie J; Johnson-Davis, Kamisha L; Lawlor, Bryan G; Strathmann, Frederick G

    2015-02-01

    The major objective of this research was to propose a simplified approach for the evaluation of medication adherence in chronic pain management patients, using liquid chromatography time-of-flight (TOF) mass spectrometry performed in parallel with select homogeneous enzyme immunoassays (HEIAs); we called this a "hybrid" approach to urine drug testing. The hybrid approach was defined based on anticipated positivity rates, availability of commercial reagents for HEIAs, and assay performance, particularly analytical sensitivity and specificity for the drug(s) of interest. Subsequent to implementation of the hybrid approach, time to result was compared with that observed with other urine drug testing approaches. Opioids, benzodiazepines, zolpidem, amphetamine-like stimulants, and methylphenidate metabolite were detected by TOF mass spectrometry to maximize specificity and sensitivity for these 37 drug analytes. Barbiturates, cannabinoid metabolite, carisoprodol, cocaine metabolite, ethyl glucuronide, methadone, phencyclidine, propoxyphene, and tramadol were detected by HEIAs that performed adequately and/or for which positivity rates were very low. Time to result was significantly reduced compared with the traditional approach. The hybrid approach to urine drug testing provides a simplified and analytically specific testing process that minimizes the need for secondary confirmation. Copyright © by the American Society for Clinical Pathology.

  15. Contributions to Desktop Grid Computing : From High Throughput Computing to Data-Intensive Sciences on Hybrid Distributed Computing Infrastructures

    OpenAIRE

    Fedak, Gilles

    2015-01-01

    Since the mid 90’s, Desktop Grid Computing - i.e the idea of using a large number of remote PCs distributed on the Internet to execute large parallel applications - has proved to be an efficient paradigm to provide a large computational power at the fraction of the cost of a dedicated computing infrastructure.This document presents my contributions over the last decade to broaden the scope of Desktop Grid Computing. My research has followed three different directions. The first direction has ...

  16. Hybrid annealing using a quantum simulator coupled to a classical computer

    CERN Document Server

    Graß, Tobias

    2016-01-01

    Finding the global minimum in a rugged potential landscape is a computationally hard task, often equivalent to relevant optimization problems. Simulated annealing is a computational technique which explores the configuration space by mimicking thermal noise. By slow cooling, it freezes the system in a low-energy configuration, but the algorithm often gets stuck in local minima. In quantum annealing, the thermal noise is replaced by controllable quantum fluctuations, and the technique can be implemented in modern quantum simulators. However, quantum-adiabatic schemes become prohibitively slow in the presence of quasidegeneracies. Here we propose a strategy which combines ideas from simulated annealing and quantum annealing. In such a hybrid algorithm, the outcome of a quantum simulator is processed on a classical device. While the quantum simulator explores the configuration space by repeatedly applying quantum fluctuations and performing projective measurements, the classical computer evaluates each configurati...
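The classical half of such a hybrid scheme is ordinary simulated annealing. A minimal sketch on an assumed rugged 1D landscape follows (the paper's quantum-simulator component is not modeled here; the landscape and cooling schedule are illustrative):

```python
import math
import random

def simulated_annealing(energy, x0, steps, t0, rng):
    """Classical simulated annealing: Gaussian moves accepted with the
    Metropolis rule; temperature decays geometrically. Returns the best
    configuration seen, not just the final (possibly stuck) state."""
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(steps):
        xn = x + rng.gauss(0, 0.5)
        en = energy(xn)
        # accept downhill moves always, uphill moves with Boltzmann weight
        if en < e or rng.random() < math.exp(-(en - e) / t):
            x, e = xn, en
            if e < best_e:
                best_x, best_e = x, e
        t *= 0.999
    return best_x, best_e

# rugged 1D landscape: many local minima, global minimum at x = 2
rough = lambda x: (x - 2) ** 2 / 10 - math.cos(4 * (x - 2))

rng = random.Random(3)
x, e = simulated_annealing(rough, x0=-5.0, steps=20000, t0=2.0, rng=rng)
print(x, e)
```

Tracking the best-seen state mirrors the hybrid idea loosely: a stochastic explorer proposes configurations, while a deterministic bookkeeper evaluates and retains the lowest-energy one.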

  17. Step Response Enhancement of Hybrid Stepper Motors Using Soft Computing Techniques

    Directory of Open Access Journals (Sweden)

    Amged S. El-Wakeel

    2014-05-01

    Full Text Available This paper presents the use of different soft computing techniques for step response enhancement of hybrid stepper motors. The basic differential equations of the hybrid stepper motor are used to build up a model using the MATLAB software package. Fuzzy Logic (FL) and Proportional-Integral-Derivative (PID) controllers are implemented to improve the motor performance. The numerical simulations by a PC-based controller show that the PID controller tuned by a Genetic Algorithm (GA) produces better performance than that tuned by a fuzzy controller. They show that the fuzzy PID-like controller produces better performance than the other linear fuzzy controllers. Finally, the comparison between PID controllers tuned by the genetic algorithm and the fuzzy PID-like controller shows that the fuzzy PID-like controller produces better performance.
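GA-based PID tuning of the kind compared above can be sketched on an assumed toy second-order plant (not the hybrid stepper motor model, whose differential equations are given in the paper; gains, plant constants, and GA settings below are illustrative):

```python
import random

def step_cost(kp, ki, kd, dt=0.01, t_end=2.0):
    """Unit-step response of a toy second-order plant x'' = -2x' - 5x + u
    under PID control; returns the ITAE cost (lower is better)."""
    x = v = integ = 0.0
    prev_e = 1.0            # error before the step avoids derivative kick
    cost, t = 0.0, 0.0
    while t < t_end:
        e = 1.0 - x
        integ += e * dt
        deriv = (e - prev_e) / dt
        prev_e = e
        u = kp * e + ki * integ + kd * deriv
        v += (-2.0 * v - 5.0 * x + u) * dt    # explicit Euler integration
        x += v * dt
        t += dt
        cost += t * abs(e) * dt               # time-weighted absolute error
    return cost

def ga_tune(pop_size=20, gens=30, seed=1):
    """Minimal real-coded GA over (Kp, Ki, Kd): tournament selection,
    blend crossover, Gaussian mutation, one elite kept per generation."""
    rng = random.Random(seed)
    fit = lambda g: step_cost(*g)
    pop = [[rng.uniform(0, 40) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = [min(pop, key=fit)]             # elitism
        while len(nxt) < pop_size:
            p1 = min(rng.sample(pop, 3), key=fit)
            p2 = min(rng.sample(pop, 3), key=fit)
            child = [a + rng.random() * (b - a) for a, b in zip(p1, p2)]
            if rng.random() < 0.3:            # mutate one gain
                i = rng.randrange(3)
                child[i] = max(0.0, child[i] + rng.gauss(0, 2.0))
            nxt.append(child)
        pop = nxt
    return min(pop, key=fit)

best_gains = ga_tune()
baseline = step_cost(1.0, 0.0, 0.0)   # untuned proportional-only controller
tuned = step_cost(*best_gains)
print(tuned < baseline)
```

The fitness here is ITAE, a common choice for step-response tuning; the paper's exact objective and tuning ranges may differ.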

  18. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness on the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to Computing in Danish High Schools based on a conceptual framework derived from ideas related to computational...... thinking. We present two main theses on which the subject is based, and we present the included knowledge areas and didactical design principles. Finally we summarize the status and future plans for the subject and related development projects....

  19. The UF family of hybrid phantoms of the developing human fetus for computational radiation dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Maynard, Matthew R; Geyer, John W; Bolch, Wesley [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL (United States); Aris, John P [Department of Anatomy and Cell Biology, University of Florida, Gainesville, FL (United States); Shifrin, Roger Y, E-mail: wbolch@ufl.edu [Department of Radiology, University of Florida, Gainesville, FL (United States)

    2011-08-07

    Historically, the development of computational phantoms for radiation dosimetry has primarily been directed at capturing and representing adult and pediatric anatomy, with less emphasis devoted to models of the human fetus. As concern grows over possible radiation-induced cancers from medical and non-medical exposures of the pregnant female, the need to better quantify fetal radiation doses, particularly at the organ-level, also increases. Studies such as the European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) hope to improve our understanding of cancer risks following chronic in utero radiation exposure. For projects such as SOLO, currently available fetal anatomic models do not provide sufficient anatomical detail for organ-level dose assessment. To address this need, two fetal hybrid computational phantoms were constructed using high-quality magnetic resonance imaging and computed tomography image sets obtained for two well-preserved fetal specimens aged 11.5 and 21 weeks post-conception. Individual soft tissue organs, bone sites and outer body contours were segmented from these images using 3D-DOCTOR™ and then imported to the 3D modeling software package Rhinoceros™ for further modeling and conversion of soft tissue organs, certain bone sites and outer body contours to deformable non-uniform rational B-spline surfaces. The two specimen-specific phantoms, along with a modified version of the 38 week UF hybrid newborn phantom, comprised a set of base phantoms from which a series of hybrid computational phantoms was derived for fetal ages 8, 10, 15, 20, 25, 30, 35 and 38 weeks post-conception. The methodology used to construct the series of phantoms accounted for the following age-dependent parameters: (1) variations in skeletal size and proportion, (2) bone-dependent variations in relative levels of bone growth, (3) variations in individual organ masses and total fetal masses and (4) statistical percentile variations

  20. The UF family of hybrid phantoms of the developing human fetus for computational radiation dosimetry

    Science.gov (United States)

    Maynard, Matthew R.; Geyer, John W.; Aris, John P.; Shifrin, Roger Y.; Bolch, Wesley

    2011-08-01

    Historically, the development of computational phantoms for radiation dosimetry has primarily been directed at capturing and representing adult and pediatric anatomy, with less emphasis devoted to models of the human fetus. As concern grows over possible radiation-induced cancers from medical and non-medical exposures of the pregnant female, the need to better quantify fetal radiation doses, particularly at the organ-level, also increases. Studies such as the European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) hope to improve our understanding of cancer risks following chronic in utero radiation exposure. For projects such as SOLO, currently available fetal anatomic models do not provide sufficient anatomical detail for organ-level dose assessment. To address this need, two fetal hybrid computational phantoms were constructed using high-quality magnetic resonance imaging and computed tomography image sets obtained for two well-preserved fetal specimens aged 11.5 and 21 weeks post-conception. Individual soft tissue organs, bone sites and outer body contours were segmented from these images using 3D-DOCTOR™ and then imported to the 3D modeling software package Rhinoceros™ for further modeling and conversion of soft tissue organs, certain bone sites and outer body contours to deformable non-uniform rational B-spline surfaces. The two specimen-specific phantoms, along with a modified version of the 38 week UF hybrid newborn phantom, comprised a set of base phantoms from which a series of hybrid computational phantoms was derived for fetal ages 8, 10, 15, 20, 25, 30, 35 and 38 weeks post-conception. The methodology used to construct the series of phantoms accounted for the following age-dependent parameters: (1) variations in skeletal size and proportion, (2) bone-dependent variations in relative levels of bone growth, (3) variations in individual organ masses and total fetal masses and (4) statistical percentile variations in

  1. Fuzzy Inspired Hybrid Genetic Approach to Optimize Travelling Salesman Problem

    Directory of Open Access Journals (Sweden)

    Bindu

    2012-06-01

    Full Text Available NP-complete problems are basically exponential in nature and take considerable time to solve. In the present work we optimize one of the common NP-complete problems, the Travelling Salesman Problem. We define a genetic approach that combines a fuzzy approach with genetics, and we implement a modified DPX crossover to improve the genetic approach. The work is implemented in the MATLAB environment, and the obtained results show that the defined approach improves upon the existing genetic algorithm results.
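A minimal genetic algorithm for the TSP can be sketched as follows. It uses a simple variant of order crossover with swap mutation rather than the paper's modified DPX operator or fuzzy component, which are not specified here:

```python
import math
import random

def tour_length(tour, pts):
    # closed tour: tour[-1] wraps around to tour[0]
    return sum(math.dist(pts[tour[i]], pts[tour[i - 1]])
               for i in range(len(tour)))

def order_crossover(p1, p2, rng):
    """Copy a random slice from parent 1, fill the remaining cities
    in the order they appear in parent 2 (keeps a valid permutation)."""
    n = len(p1)
    i, j = sorted(rng.sample(range(n), 2))
    middle = p1[i:j]
    rest = [c for c in p2 if c not in middle]
    return rest[:i] + middle + rest[i:]

def ga_tsp(pts, pop_size=60, gens=200, seed=7):
    rng = random.Random(seed)
    n = len(pts)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda t: tour_length(t, pts))
        nxt = pop[:10]                       # elitism: keep the 10 best
        while len(nxt) < pop_size:
            p1, p2 = rng.sample(pop[:30], 2) # parents from the better half
            child = order_crossover(p1, p2, rng)
            if rng.random() < 0.2:           # swap mutation
                a, b = rng.sample(range(n), 2)
                child[a], child[b] = child[b], child[a]
            nxt.append(child)
        pop = nxt
    return min(pop, key=lambda t: tour_length(t, pts))

rng = random.Random(0)
pts = [(rng.random(), rng.random()) for _ in range(25)]
best = ga_tsp(pts)
print(tour_length(best, pts) < tour_length(list(range(25)), pts))
```

Elitism plus parent selection from the better half of the sorted population is one common scheme; the paper's selection strategy may differ.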

  2. A HYBRID APPROACH TO HUMAN SKIN REGION DETECTION

    Directory of Open Access Journals (Sweden)

    R. Vijayanandh

    2011-02-01

    Full Text Available Face recognition is important in research areas like machine vision and complex security systems. Skin region detection is a vital factor for processing in such systems. Hence the proposed paper focuses on isolating the regions of an image corresponding to human skin through a hybrid method. This paper combines the skin regions detected in the RGB and YCbCr color space images by explicit skin color conditions with the skin-label cluster identified in the CIE L*a*b* color space image, which is clustered by hill-climbing segmentation with the K-Means clustering algorithm. The resultant image is then dilated with an arbitrary structuring shape and filtered with a median filter, in order to enhance the skin region and to remove noise, respectively. The proposed method has been tested on various real images containing one or more human beings, and the performance of skin region detection is found to be quite satisfactory.
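Explicit skin-color conditions of the kind referenced above are usually written as per-pixel threshold rules. The sketch below uses commonly cited RGB and YCbCr thresholds, which may differ from the exact conditions used in the paper:

```python
def rgb_skin(r, g, b):
    """Explicit RGB skin rule (widely used daylight thresholds)."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def ycbcr_skin(r, g, b):
    """Convert RGB -> YCbCr (ITU-R BT.601) and test a chrominance box."""
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return 77 <= cb <= 127 and 133 <= cr <= 173

def is_skin(r, g, b):
    # hybrid decision: require agreement of both color-space rules
    return rgb_skin(r, g, b) and ycbcr_skin(r, g, b)

print(is_skin(220, 170, 150), is_skin(30, 120, 200))  # True False
```

Requiring agreement of both rules lowers false positives at the cost of some recall; the paper additionally intersects this result with a CIE L*a*b* cluster, which is omitted here.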

  3. Assessing a Bayesian Approach for Detecting Exotic Hybrids between Plantation and Native Eucalypts

    Directory of Open Access Journals (Sweden)

    Matthew J. Larcombe

    2014-01-01

    Full Text Available Eucalyptus globulus is grown extensively in plantations outside its native range in Australia. Concerns have been raised that the species may pose a genetic risk to native eucalypt species through hybridisation and introgression. Methods for identifying hybrids are needed to enable assessment and management of this genetic risk. This paper assesses the efficiency of a Bayesian approach for identifying hybrids between the plantation species E. globulus and E. nitens and four at-risk native eucalypts. Range-wide DNA samples of E. camaldulensis, E. cypellocarpa, E. globulus, E. nitens, E. ovata and E. viminalis, and pedigreed and putative hybrids (n = 606), were genotyped with 10 microsatellite loci. Using a two-way simulation analysis (two species in the model at a time), the accuracy of identification was 98% for first-generation and 93% for second-generation hybrids. However, the accuracy of identifying simulated backcross hybrids was lower (74%). A six-way analysis (all species in the model together) showed that as the number of species increases, the accuracy of hybrid identification decreases. Despite some difficulties identifying backcrosses, the two-way Bayesian modelling approach was highly effective at identifying F1s, which, in the context of E. globulus plantations, are the primary management concern.

  4. Hybrid approach to data reduction for multi-sensor hot wires

    Science.gov (United States)

    Hooper, C. L.; Westphal, R. V.

    1991-01-01

    A hybrid approach to implementing the calibration equations for a multisensor hot-wire probe is discussed. The approach combines some of the speed of a look-up approach with the moderate storage requirements of direct calculation based on functional fitting. Particular attention is given to timing and storage comparisons for an X-wire probe. The method depends on the oft-employed concept of an effective cooling velocity which is a function only of the bridge output voltage.
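The hybrid look-up/functional idea can be sketched for a single sensor: a calibration curve is tabulated once, and each measured bridge voltage is then inverted by a table search plus local interpolation, trading a modest table for the cost of repeated root-finding. The King's-law form and its coefficients below are illustrative assumptions, not the paper's multi-sensor calibration equations.

```python
import bisect

# Assumed single-sensor calibration: E^2 = A + B * U**n (illustrative values).
A, B, n = 1.2, 0.9, 0.45

def voltage(u):
    # Forward calibration: effective cooling velocity -> bridge voltage.
    return (A + B * u ** n) ** 0.5

# Build the look-up table once; voltage is monotonically increasing in u.
us = [0.01 * i for i in range(1, 3001)]   # 0.01 .. 30.0 m/s
es = [voltage(u) for u in us]

def velocity_from_voltage(e):
    # Hybrid step: bracket the voltage in the table, then interpolate linearly.
    j = bisect.bisect_left(es, e)
    j = min(max(j, 1), len(es) - 1)
    e0, e1 = es[j - 1], es[j]
    u0, u1 = us[j - 1], us[j]
    return u0 + (u1 - u0) * (e - e0) / (e1 - e0)
```

With a 0.01 m/s table spacing the interpolation error is far below typical hot-wire measurement uncertainty, while each inversion costs only a binary search.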

  5. Computational and experimental determinations of the UV adsorption of polyvinylsilsesquioxane-silica and titanium dioxide hybrids.

    Science.gov (United States)

    Wang, Haiyan; Lin, Derong; Wang, Di; Hu, Lijiang; Huang, Yudong; Liu, Li; Loy, Douglas A

    2014-01-01

    Sunscreens that absorb UV light without photodegradation could reduce skin cancer. Polyvinyl silsesquioxanes are known to have greater thermal and photochemical stability than organic compounds such as those in sunscreens. This paper evaluates, experimentally and computationally, the UV transparency of vinyl silsesquioxanes (VS) and their hybrids with SiO2 (VSTE) and TiO2 (VSTT). Films of VS were prepared by sol-gel polymerization of vinyltrimethoxysilane (VMS) oligomer followed by thermal curing, using benzoyl peroxide as an initiator. Similarly, VSTE films were prepared from VMS and 5-25 wt-% tetraethoxysilane (TEOS), and VSTT films from VMS and 5-25 wt-% titanium tetrabutoxide (TTB). Experimental average transparencies of the modified films were found to be about 9-14% between 280-320 nm, 67-73% between 320-350 nm, and 86-89% between 350-400 nm. Computed band gaps gave absorption edges for the hybrids in excellent agreement with experimental data. VS, VSTE and VSTT showed good absorption in the UV-C and UV-B ranges, but absorbed virtually no UV-A. Addition of SiO2 or TiO2 does not improve UV-B absorption but, on the contrary, increases the transparency of thin films to UV; this increase was validated with molecular simulations. The results show that computational design can predict better sunscreens and reduce the effort of creating sunscreens capable of absorbing more UV-B and UV-A.

  6. Hybrid energy system evaluation in water supply system energy production: neural network approach

    Energy Technology Data Exchange (ETDEWEB)

    Goncalves, Fabio V.; Ramos, Helena M. [Civil Engineering Department, Instituto Superior Tecnico, Technical University of Lisbon, Av. Rovisco Pais, 1049-001, Lisbon (Portugal); Reis, Luisa Fernanda R. [Universidade de Sao Paulo, EESC/USP, Departamento de Hidraulica e Saneamento., Avenida do Trabalhador Saocarlense, 400, Sao Carlos-SP (Brazil)

    2010-07-01

    Water supply systems are large consumers of energy, and the use of hybrid systems for green energy production is the new proposal of this work. This work presents a computational model based on neural networks to determine the best configuration of a hybrid system to generate energy in water supply systems. In this study the energy sources making up the hybrid system can be the national power grid, micro-hydro and wind turbines. The artificial neural network is composed of six layers, trained using data generated by a model of hybrid configuration and an economic simulator - CES. The reason for developing an advanced forecasting model based on neural networks is to allow rapid simulation and proper interaction with the hydraulic and power model simulator - HPS. The results show that this computational model is useful as an advanced decision support system in the design of hybrid power system configurations applied to water supply systems, improving the solutions for the development of their global energy efficiency.

  7. Hybrid energy system evaluation in water supply system energy production: neural network approach

    Directory of Open Access Journals (Sweden)

    Fabio V. Goncalves, Helena M. Ramos, Luisa Fernanda R. Reis

    2010-01-01

    Full Text Available Water supply systems are large consumers of energy, and the use of hybrid systems for green energy production is the new proposal of this work. This work presents a computational model based on neural networks to determine the best configuration of a hybrid system to generate energy in water supply systems. In this study the energy sources making up the hybrid system can be the national power grid, micro-hydro and wind turbines. The artificial neural network is composed of six layers, trained using data generated by a model of hybrid configuration and an economic simulator – CES. The reason for developing an advanced forecasting model based on neural networks is to allow rapid simulation and proper interaction with the hydraulic and power model simulator – HPS. The results show that this computational model is useful as an advanced decision support system in the design of hybrid power system configurations applied to water supply systems, improving the solutions for the development of their global energy efficiency.

  8. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  9. An approach to computing direction relations between separated object groups

    Science.gov (United States)

    Yan, H.; Wang, Z.; Li, J.

    2013-09-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, theoretically based on gestalt principles and the idea of multi-directions. The approach first triangulates the two object groups, and then constructs the Voronoi diagram between the two groups using the triangular network. After this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed. Finally, the quantitative direction relations are transformed into qualitative ones. Psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and that the results are correct from the point of view of spatial cognition.
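The final step, converting quantitative direction relations into qualitative ones, might look like the following sketch, which bins a direction vector (e.g. a Voronoi-edge normal) into one of eight 45-degree sectors. The sector layout is an assumption for illustration; the triangulation and Voronoi construction are omitted.

```python
import math

def qualitative_direction(dx, dy,
                          sectors=("E", "NE", "N", "NW", "W", "SW", "S", "SE")):
    # Map a direction vector to one of eight qualitative sectors of 45 degrees,
    # centred on the axes (E covers [-22.5, 22.5) degrees, and so on).
    angle = math.degrees(math.atan2(dy, dx)) % 360
    index = int(((angle + 22.5) % 360) // 45)
    return sectors[index]
```

In the full approach, each Voronoi-edge normal would contribute one such qualitative label, weighted by edge length, to the overall group-to-group direction relation.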

  10. A tale of three bio-inspired computational approaches

    Science.gov (United States)

    Schaffer, J. David

    2014-05-01

    I will provide a high-level walk-through of three computational approaches derived from Nature. First, evolutionary computation implements what we may call the "mother of all adaptive processes." Some variants on the basic algorithms will be sketched, and some lessons I have gleaned from three decades of working with EC will be covered. Next come neural networks, computational approaches that have long been studied as possible ways to make "thinking machines", an old human dream, based upon the only known existing example of intelligence. I will then give a little overview of attempts to combine these two approaches, which some hope will allow us to evolve machines we could never hand-craft. Finally, I will touch on artificial immune systems, Nature's highly sophisticated defense mechanism, which has emerged in two major stages, the innate and the adaptive immune systems. This technology is finding applications in the cyber security world.

  11. The Formal Approach to Computer Game Rule Development Automation

    OpenAIRE

    Elena, A

    2009-01-01

    Computer game rules development is one of the weakly automated tasks in game development. This paper gives an overview of an ongoing research project which deals with automation of rules development for turn-based strategy computer games. Rules are the basic elements of these games. This paper proposes a new approach to automation including visual formal rules model creation, model verification and model-based code generation.

  12. The process group approach to reliable distributed computing

    Science.gov (United States)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  13. Public vs Private vs Hybrid vs Community - Cloud Computing: A Critical Review

    Directory of Open Access Journals (Sweden)

    Sumit Goyal

    2014-02-01

    Full Text Available These days cloud computing is booming like no other technology. Every organization, whether small, mid-sized or big, wants to adopt this cutting-edge technology for its business. As cloud technology becomes immensely popular among these businesses, the question arises: which cloud model to consider for your business? There are four types of cloud models available in the market: public, private, hybrid and community. This review paper answers the question of which model would be most beneficial for a given business. All four models are defined, discussed and compared, with their benefits and pitfalls, thus giving a clear idea of which model to adopt for your organization.

  14. Quantum computation in a quantum-dot-Majorana-fermion hybrid system

    CERN Document Server

    Xue, Zheng-Yuan

    2012-01-01

    We propose a scheme to implement universal quantum computation in a quantum-dot-Majorana-fermion hybrid system. Quantum information is encoded on pairs of Majorana fermions, which live on the interface between topologically trivial and nontrivial sections of a quantum nanowire deposited on an s-wave superconductor. Universal single-qubit gates on a topological qubit can be achieved. A measurement-based two-qubit controlled-NOT gate is produced with the help of parity measurements assisted by the quantum dot, followed by prescribed single-qubit gates. The parity measurement, on the quantum dot and a topological qubit, is achieved by the Aharonov-Casher effect.

  15. Hybrid EEG-EOG brain-computer interface system for practical machine control.

    Science.gov (United States)

    Punsawad, Yunyong; Wongsawat, Yodchanan; Parnichkun, Manukid

    2010-01-01

    Practical issues such as accuracy across subjects, number of sensors, and training time are important problems of existing brain-computer interface (BCI) systems. In this paper, we propose a hybrid framework for the BCI system that can make machine control more practical. The electrooculogram (EOG) is employed to control the machine in the left and right directions, while the electroencephalogram (EEG) is employed to control the forward, no-action, and complete-stop motions of the machine. Using only 2-channel biosignals, an average classification accuracy of more than 95% can be achieved.

  16. Treatment of early and late reflections in a hybrid computer model for room acoustics

    DEFF Research Database (Denmark)

    Naylor, Graham

    1992-01-01

    The ODEON computer model for acoustics in large rooms is intended for use both in design (by predicting room acoustical indices quickly and easily) and in research (by forming the basis of an auralization system and allowing study of various room acoustical phenomena). These conflicting demands preclude the use of both "pure" image source and "pure" particle tracing methods. A hybrid model has been developed, in which rays discover potential image sources up to a specified order. Thereafter, the same ray tracing process is used in a different way to rapidly generate a dense reverberant decay...

  17. Assessment of asthmatic inflammation using hybrid fluorescence molecular tomography-x-ray computed tomography

    Science.gov (United States)

    Ma, Xiaopeng; Prakash, Jaya; Ruscitti, Francesca; Glasl, Sarah; Stellari, Fabio Franco; Villetti, Gino; Ntziachristos, Vasilis

    2016-01-01

    Nuclear imaging plays a critical role in asthma research but is limited in its readings of biology due to the short-lived signals of radio-isotopes. We employed hybrid fluorescence molecular tomography (FMT) and x-ray computed tomography (XCT) for the assessment of asthmatic inflammation based on resolving cathepsin activity and matrix metalloproteinase activity in dust mite, ragweed, and Aspergillus species-challenged mice. The reconstructed multimodal fluorescence distribution showed good correspondence with ex vivo cryosection images and histological images, confirming FMT-XCT as an interesting alternative for asthma research.

  18. Large discreet resource allocation: a hybrid approach based on dea efficiency measurement

    Directory of Open Access Journals (Sweden)

    Eliane Gonçalves Gomes

    2008-12-01

    Full Text Available Resource allocation is one of the traditional Operations Research problems. In this paper we propose a hybrid model for resource allocation that uses Data Envelopment Analysis (DEA) efficiency measures. We use Zero Sum Gains DEA models as the starting point to decrease the computational work of the step-by-step algorithm for allocating integer resources in a DEA context. Our approach is illustrated by a numerical example.

  19. An Event Driven Hybrid Identity Management Approach to Privacy Enhanced e-Health

    Directory of Open Access Journals (Sweden)

    Fabio Sanvido

    2012-05-01

    Full Text Available Credential-based authorization offers interesting advantages for ubiquitous scenarios involving limited devices such as sensors and personal mobile equipment: the verification can be done locally; it has a lower computational cost than its competitors for issuing, storing, and verification; and it naturally supports rights delegation. The main drawback is the revocation of rights. Revocation requires handling potentially large revocation lists, or using protocols to check the revocation status, bringing extra communication costs not acceptable for sensors and other limited devices. Moreover, effective revocation of consent, considered a privacy rule in sensitive scenarios, has not been fully addressed. This paper proposes an event-based mechanism empowering a new concept, the sleepyhead credentials, which allows time constraints and explicit revocation to be replaced by activating and deactivating authorization rights according to events. Our approach is to integrate this concept into IdM systems in a hybrid model supporting delegation, which can be an interesting alternative for scenarios where revocation of consent and user privacy are critical. The delegation includes a SAML-compliant protocol, which we have validated through a proof-of-concept implementation. This article also explains the mathematical model describing the event-based approach and offers estimations of the overhead introduced by the system. The paper focuses on health care scenarios, where we show the flexibility of the proposed event-based user consent revocation mechanism.

  20. Computational biomechanics for medicine new approaches and new applications

    CERN Document Server

    Miller, Karol; Wittek, Adam; Nielsen, Poul

    2015-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises twelve of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, France, Spain and Switzerland. Some of the interesting topics discussed are: real-time simulations; growth and remodelling of soft tissues; inverse and meshless solutions; medical image analysis; and patient-specific solid mechanics simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  1. A distributed computing approach to mission operations support. [for spacecraft

    Science.gov (United States)

    Larsen, R. L.

    1975-01-01

    Computing support for mission operations includes orbit determination, attitude processing, maneuver computation, resource scheduling, etc. The large-scale third-generation distributed computer network discussed is capable of fulfilling these dynamic requirements. It is shown that distribution of resources and control leads to increased reliability and exhibits potential for incremental growth. Through functional specialization, a distributed system may be tuned to very specific operational requirements. Fundamental to the approach is the notion of process-to-process communication, which is effected through a high-bandwidth communications network. Both resource-sharing and load-sharing may be realized in the system.

  2. A Low Cost, Hybrid Approach to Data Mining Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort will combine a low cost physical modeling approach with inductive, data-centered modeling in an aerospace relevant context to demonstrate...

  3. Hybrid input function estimation using a single-input-multiple-output (SIMO) approach

    Science.gov (United States)

    Su, Yi; Shoghi, Kooresh I.

    2009-02-01

    A hybrid blood input function (BIF) model that incorporates region-of-interest (ROI) based peak estimation and a two-exponential tail model was proposed to describe the blood input function. The hybrid BIF model was applied to the single-input-multiple-output (SIMO) optimization based approach for BIF estimation, using time activity curves (TACs) obtained from ROIs defined at the left ventricle (LV) blood pool and myocardium regions of dynamic PET images. The proposed BIF estimation method was applied with 0, 1 and 2 blood samples as constraints for BIF estimation using simulated small animal PET data. The relative percentage difference of the area-under-curve (AUC) measurement between the estimated BIF and the true BIF was calculated to evaluate BIF estimation accuracy. SIMO-based BIF estimation using Feng's input function model was also applied for comparison. In our simulation study, the hybrid method provided improved BIF estimation in terms of both mean accuracy and variability compared to Feng's model-based BIF estimation. When two blood samples were used as constraints, the percentage BIF estimation error was 0.82 +/- 4.32% for the hybrid approach and 4.63 +/- 10.67% for the Feng's model-based approach. Improved kinetic parameter estimation was also obtained using the hybrid BIF.
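A minimal sketch of such a hybrid input function follows: an image-derived rising portion joined continuously to a two-exponential tail at the peak. The functional form of the rise (linear) and all parameter values are illustrative assumptions, not the model actually fitted in the paper.

```python
import math

def hybrid_bif(t, t_peak, peak_value, a1, l1, a2, l2):
    # Illustrative hybrid blood input function:
    #   - linear rise to an image-derived (ROI-based) peak at t_peak,
    #   - two-exponential tail anchored to the peak value for continuity.
    if t <= 0:
        return 0.0
    if t <= t_peak:
        return peak_value * t / t_peak            # ROI-based rising portion
    scale = peak_value / (a1 + a2)                # anchor tail at the peak
    dt = t - t_peak
    return scale * (a1 * math.exp(-l1 * dt) + a2 * math.exp(-l2 * dt))
```

In the SIMO setting, the tail parameters (a1, l1, a2, l2) would be the quantities optimized against the LV and myocardium TACs, with any blood samples entering as constraints.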

  4. Vehicle height and posture control of the electronic air suspension system using the hybrid system approach

    Science.gov (United States)

    Sun, Xiaoqiang; Cai, Yingfeng; Chen, Long; Liu, Yanling; Wang, Shaohua

    2016-03-01

    The electronic air suspension (EAS) system can improve ride comfort, fuel economy and handling safety of vehicles by adjusting vehicle height. This paper describes the development of a novel controller using the hybrid system approach to adjust the vehicle height (height control) and to regulate the roll and pitch angles of the vehicle body during the height adjustment process (posture control). The vehicle height adjustment system of EAS poses challenging hybrid control problems, since it features different discrete modes of operation, each with an associated linear continuous-time dynamic. In this paper, we propose a novel approach to the modelling and controller design problem for the vehicle height adjustment system of EAS. The system model is first described in the hybrid system description language (HYSDEL) to obtain a mixed logical dynamical (MLD) hybrid model. For the resulting model, a hybrid model predictive controller is tuned to improve the vehicle height and posture tracking accuracy and to achieve direct control of the on-off statuses of the solenoid valves. The effectiveness and performance of the proposed approach are demonstrated by simulations and actual vehicle tests.

  5. CP-Miner : A hybrid Approach for Colorectal Polyp Detection

    Directory of Open Access Journals (Sweden)

    Ms. M. Vanitha,

    2010-05-01

    Full Text Available Computed Tomography Colonography (CTC) is the new-generation technique for detecting colorectal polyps using volumetric CT data combined with a Computer Aided Detection (CAD) system. The aim of this paper is to detail the implementation of a fully integrated CP-Miner system that is able to identify the polyps in the CT data. The CP-Miner system has a multistage implementation whose main components are: 1. interpolation; 2. automatic colon segmentation; 3. feature extraction; 4. polyp detection. The proposed system aims to reduce the number of false positives compared with the existing system. The developed system provides 100% sensitivity for polyps greater than 6 mm, 83.33% sensitivity for polyps within 3 to 6 mm, and 69.23% sensitivity for polyps less than 3 mm. The overall sensitivity of the CP-Miner system is 84.18%.

  6. Hybrid Enhanced Epidermal SpaceSuit Design Approaches

    Science.gov (United States)

    Jessup, Joseph M.

    A space suit that does not rely on gas pressurization is a multi-faceted problem that requires major stability controls to be incorporated during design and construction. The Hybrid Enhanced Epidermal (HEE) space suit concept integrates evolved human anthropomorphic and physiological adaptations into its functionality, using commercially available bio-medical technologies to address the shortcomings of conventional gas pressure suits and the impracticalities of MCP suits. The prototype HEE space suit explored integumentary homeostasis, thermal control and mobility using advanced bio-medical materials technology and construction concepts. The goal was a space suit that functions as an enhanced, multi-functional bio-mimic of the human epidermal layer, working in attunement with the wearer rather than as a separate system. In addressing the human physiological requirements for the design and construction of the HEE suit, testing regimes were devised and integrated into the prototype, which was then subjected to a series of detailed tests using both anatomical reproduction methods and human subjects.

  7. Hybrid Approach to State Estimation for Bioprocess Control

    Directory of Open Access Journals (Sweden)

    Rimvydas Simutis

    2017-03-01

    Full Text Available An improved state estimation technique for bioprocess control applications is proposed in which a hybrid version of the Unscented Kalman Filter (UKF) is employed. The underlying dynamic system model is formulated as a conventional system of ordinary differential equations based on the mass balances of the state variables biomass, substrate, and product, while the observation model, describing the less established relationship between the state variables and the measurement quantities, is formulated in a data-driven way, by means of a support vector regression (SVR) model. The UKF is applied to a recombinant therapeutic protein production process using Escherichia coli bacteria. Additionally, the state vector was extended by the specific biomass growth rate µ in order to allow estimation of this key variable, which is crucial for the implementation of innovative control algorithms in recombinant therapeutic protein production processes. The state estimates exhibit a sufficiently low noise level, which suits a range of advanced bioprocess control applications.
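The conventional mass-balance part of such a model can be sketched as a simple batch fermentation with Monod growth kinetics and growth-associated product formation. The parameters, the Euler integration, and the batch (non-fed) setting below are illustrative assumptions; the SVR observation model and the UKF update are omitted.

```python
def simulate_batch(mu_max=0.4, Ks=0.1, Yxs=0.5, Ypx=0.2,
                   X0=0.1, S0=10.0, P0=0.0, dt=0.01, t_end=20.0):
    # Batch mass balances with Monod kinetics (illustrative parameters):
    #   dX/dt =  mu * X            (biomass)
    #   dS/dt = -mu * X / Yxs      (substrate)
    #   dP/dt =  Ypx * mu * X      (product)
    # where mu = mu_max * S / (Ks + S). Integrated with explicit Euler.
    X, S, P = X0, S0, P0
    for _ in range(int(t_end / dt)):
        mu = mu_max * S / (Ks + S) if S > 0 else 0.0
        dX = mu * X
        X += dt * dX
        S = max(S - dt * dX / Yxs, 0.0)   # substrate cannot go negative
        P += dt * Ypx * dX
    return X, S, P
```

In the hybrid estimator, these balances supply the UKF prediction step, while the data-driven SVR model maps the predicted states to the measured quantities in the correction step.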

  9. Hybrid Numerical Solvers for Massively Parallel Eigenvalue Computation and Their Benchmark with Electronic Structure Calculations

    CERN Document Server

    Imachi, Hiroto

    2015-01-01

    Optimally hybrid numerical solvers were constructed for the massively parallel generalized eigenvalue problem (GEP). The strong-scaling benchmark was carried out on the K computer and other supercomputers for electronic structure calculation problems with matrix sizes of M = 10^4-10^6 and up to 10^5 cores. The GEP procedure is decomposed into two subprocedures: the reducer to a standard eigenvalue problem (SEP) and the solver of the SEP. A hybrid solver is constructed by choosing a routine for each subprocedure from the three parallel solver libraries ScaLAPACK, ELPA and EigenExa. The hybrid solvers with the two newer libraries, ELPA and EigenExa, give better benchmark results than the conventional ScaLAPACK library. A detailed analysis of the results implies that the reducer can become a bottleneck on next-generation (exa-scale) supercomputers, which provides guidance for future research. The code was developed as a middleware and a mini-application and will appear online.
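The "reducer" subprocedure transforms the GEP A v = λ B v into a standard eigenvalue problem via a Cholesky factorization B = L Lᵀ, giving C y = λ y with C = L⁻¹ A L⁻ᵀ and y = Lᵀ v. A minimal closed-form 2×2 sketch of this reduction (standing in for the ScaLAPACK/ELPA/EigenExa routines, which operate on distributed matrices):

```python
import math

def cholesky2(b11, b12, b22):
    # B = L L^T for a symmetric positive-definite 2x2 matrix [[b11,b12],[b12,b22]].
    l11 = math.sqrt(b11)
    l21 = b12 / l11
    l22 = math.sqrt(b22 - l21 * l21)
    return l11, l21, l22

def gep_eigenvalues(a11, a12, a22, b11, b12, b22):
    # Reducer: A v = lam B v  ->  C y = lam y with C = Linv A Linv^T,
    # then the 2x2 symmetric SEP is solved in closed form.
    l11, l21, l22 = cholesky2(b11, b12, b22)
    # Inverse of the lower-triangular L: [[1/l11, 0], [-l21/(l11*l22), 1/l22]]
    i11, i21, i22 = 1.0 / l11, -l21 / (l11 * l22), 1.0 / l22
    # C = Linv * A * Linv^T (symmetric by construction)
    c11 = i11 * a11 * i11
    c12 = i11 * (a11 * i21 + a12 * i22)
    c22 = (i21 * a11 + i22 * a12) * i21 + (i21 * a12 + i22 * a22) * i22
    tr, det = c11 + c22, c11 * c22 - c12 * c12
    disc = math.sqrt(tr * tr - 4 * det)
    return (tr - disc) / 2, (tr + disc) / 2
```

At scale, the two steps shown (the triangular reduction and the SEP solve) are exactly the subprocedures whose library routines the paper mixes and matches into hybrid solvers.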

  10. Fast 3d Hybrid Seismic Modeling: Ray-fd Approach For Elastic Models With Locally Complex Structures

    Science.gov (United States)

    Oprsal, I.; Brokesova, J.; Faeh, D.; Giardini, D.

    Hybrid approaches may find broad applications wherever full source, path, and site effects modeling methods are too expensive. A new efficient hybrid method is designed that allows computation of the seismic wavefield in large 3D elastic models containing a complex local structure embedded in a large, but considerably simpler, structure. This hybrid method combines the ray approach in the large simple structure with the finite difference (FD) approach in the local complex structure. The hybrid method is based on two successive steps. In the 1st one, the source and path information is carried by the wavefield propagating in the large simple structure. This wavefield, calculated by the ray method, is incident at points along a two-fold formal boundary (excitation box, EB) surrounding that part of the model which is to be replaced by the complex medium in the 2nd step. 3D rays are necessary due to the arbitrary source-EB configuration, even in case the 1st step structure is less dimensional (2D, 1D, homogeneous). Along the EB, the ray endpoints may be distributed sparsely thanks to the relative simplicity of the structure. This reduces computer time requirements and also the size of the excitation file saved on the disk. The ray wavefield along the EB provides (after interpolation in space and time) the input for the second step, which consists in calculating the complete wavefield by the 3D FD method on irregular grids. The FD computational domain contains the EB and its close vicinity. The 2nd step model differs from the 1st step model only inside the EB, where the local complex structure is inserted. To verify the consistency of the binding between the 1st and 2nd steps, the 2nd step computation can be performed on the (unchanged) 1st step model ('replication test'). This should give the same wavefield as the 1st step inside, and zero wavefield outside, the EB. The EB remains fully permeable for all waves propagating within the FD domain. Provided the 1st step structure does not contain too many layers

  11. Research related to improved computer aided design software package. [comparative efficiency of finite, boundary, and hybrid element methods in elastostatics

    Science.gov (United States)

    Walston, W. H., Jr.

    1986-01-01

    The comparative computational efficiencies of the finite element (FEM), boundary element (BEM), and hybrid boundary element-finite element (HVFEM) analysis techniques are evaluated for representative bounded domain interior and unbounded domain exterior problems in elastostatics. Computational efficiency is carefully defined in this study as the computer time required to attain a specified level of solution accuracy. The study found the FEM superior to the BEM for the interior problem, while the reverse was true for the exterior problem. The hybrid analysis technique was found to be comparable or superior to both the FEM and BEM for both the interior and exterior problems.

  12. Computational analysis of electrical conduction in hybrid nanomaterials with embedded non-penetrating conductive particles

    Science.gov (United States)

    Cai, Jizhe; Naraghi, Mohammad

    2016-08-01

    In this work, a comprehensive multi-resolution two-dimensional (2D) resistor network model is proposed to analyze the electrical conductivity of hybrid nanomaterials made of an insulating matrix with conductive particles, such as CNT-reinforced nanocomposites and thick film resistors. Unlike existing approaches, our model takes into account the impenetrability of the particles and their random placement within the matrix. Moreover, our model presents a detailed description of intra-particle conductivity via finite element analysis, which to the authors’ best knowledge has not been addressed before. The inter-particle conductivity is assumed to be primarily due to electron tunneling. The model is then used to predict the electrical conductivity of electrospun carbon nanofibers as a function of microstructural parameters such as turbostratic domain alignment and aspect ratio. To simulate the microstructure of a single CNF, randomly positioned nucleation sites were seeded and grown as turbostratic particles with anisotropic growth rates. Particle growth proceeded in steps, and growth of each particle in each direction was stopped upon contact with other particles. The study points to the significant contribution of both intra-particle and inter-particle conductivity to the overall conductivity of hybrid composites. The influence of particle alignment and anisotropic growth rate ratio on electrical conductivity is also discussed. The results show that partial alignment, in contrast to complete alignment, can result in maximum electrical conductivity of the whole CNF. High degrees of alignment can adversely affect conductivity by lowering the probability of the formation of a conductive path. The results demonstrate approaches to enhance the electrical conductivity of hybrid materials through controlling their microstructure, which is applicable not only to carbon nanofibers but also to many other types of hybrid composites such as thick film resistors.
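At its core, the resistor-network idea reduces to nodal analysis: assemble a conductance matrix over the particle/junction nodes and solve Kirchhoff's equations. A toy sketch (the node layout and values are ours; in the paper the inter-particle conductances would come from the tunneling model and the intra-particle ones from finite element analysis):

```python
import numpy as np

def solve_network(n_nodes, edges, v_fixed):
    """Solve node voltages of a resistor network via nodal analysis.

    edges: list of (i, j, conductance); v_fixed: {node: fixed voltage}."""
    G = np.zeros((n_nodes, n_nodes))
    for i, j, g in edges:           # assemble the conductance (Laplacian) matrix
        G[i, i] += g; G[j, j] += g
        G[i, j] -= g; G[j, i] -= g
    b = np.zeros(n_nodes)
    for node, v in v_fixed.items():  # Dirichlet rows for the electrode nodes
        G[node, :] = 0.0
        G[node, node] = 1.0
        b[node] = v
    return np.linalg.solve(G, b)

# Two 1-ohm resistors in series between a 1 V and a 0 V terminal:
v = solve_network(3, [(0, 1, 1.0), (1, 2, 1.0)], {0: 1.0, 2: 0.0})
# midpoint voltage v[1] is 0.5 V
```

Scaling this to a percolating particle network only changes how the edge list is generated, not the solve.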

  13. Hybrid brain-computer interfaces and hybrid neuroprostheses for restoration of upper limb functions in individuals with high-level spinal cord injury.

    Science.gov (United States)

    Rohm, Martin; Schneiders, Matthias; Müller, Constantin; Kreilinger, Alex; Kaiser, Vera; Müller-Putz, Gernot R; Rupp, Rüdiger

    2013-10-01

    The bilateral loss of the grasp function associated with a lesion of the cervical spinal cord severely limits the affected individuals' ability to live independently and return to gainful employment after sustaining a spinal cord injury (SCI). Any improvement in lost or limited grasp function is highly desirable. With current neuroprostheses, relevant improvements can be achieved in end users with preserved shoulder and elbow, but missing hand function. The aim of this single case study is to show that (1) with the support of hybrid neuroprostheses combining functional electrical stimulation (FES) with orthoses, restoration of hand, finger and elbow function is possible in users with high-level SCI and (2) shared control principles can be effectively used to allow for a brain-computer interface (BCI) control, even if only moderate BCI performance is achieved after extensive training. The individual in this study is a right-handed 41-year-old man who sustained a traumatic SCI in 2009 and has a complete motor and sensory lesion at the level of C4. He is unable to generate functionally relevant movements of the elbow, hand and fingers on either side. He underwent extensive FES training (30-45min, 2-3 times per week for 6 months) and motor imagery (MI) BCI training (415 runs in 43 sessions over 12 months). To meet individual needs, the system was designed in a modular fashion including an intelligent control approach encompassing two input modalities, namely an MI-BCI and shoulder movements. After one year of training, the end user's MI-BCI performance ranged from 50% to 93% (average: 70.5%). The performance of the hybrid system was evaluated with different functional assessments. The user was able to transfer objects of the grasp-and-release-test and he succeeded in eating a pretzel stick, signing a document and eating an ice cream cone, which he was unable to do without the system. This proof-of-concept study has demonstrated that with the support of hybrid FES

  14. DyHAP: Dynamic Hybrid ANFIS-PSO Approach for Predicting Mobile Malware.

    Science.gov (United States)

    Afifi, Firdaus; Anuar, Nor Badrul; Shamshirband, Shahaboddin; Choo, Kim-Kwang Raymond

    2016-01-01

    To deal with the large number of malicious mobile applications (e.g. mobile malware), a number of malware detection systems have been proposed in the literature. In this paper, we propose a hybrid method to find the optimum parameters that can be used to facilitate mobile malware identification. We also present a multi-agent system architecture comprising three system agents (i.e. sniffer, extraction and selection agents) to capture and manage the pcap file for the data preparation phase. In our hybrid approach, we combine an adaptive neuro-fuzzy inference system (ANFIS) and particle swarm optimization (PSO). Evaluations using data captured on a real-world Android device and the MalGenome dataset demonstrate the effectiveness of our approach in comparison to two hybrid optimization methods, namely differential evolution (ANFIS-DE) and ant colony optimization (ANFIS-ACO).

  16. A Hybrid Metaheuristic-Based Approach for the Aerodynamic Optimization of Small Hybrid Wind Turbine Rotors

    Directory of Open Access Journals (Sweden)

    José F. Herbert-Acero

    2014-01-01

    Full Text Available This work presents a novel framework for the aerodynamic design and optimization of blades for small horizontal axis wind turbines (WT. The framework is based on a state-of-the-art blade element momentum model, which is complemented with the XFOIL 6.96 software in order to provide an estimate of the sectional blade aerodynamics. The framework considers an innovative nested-hybrid solution procedure based on two metaheuristics, the virtual gene genetic algorithm and the simulated annealing algorithm, to provide a near-optimal solution to the problem. The objective of the study is to maximize the aerodynamic efficiency of small WT (SWT rotors for a wide range of operational conditions. The design variables are (1 the airfoil shape at the different blade span positions and the radial variation of the geometrical variables of (2 chord length, (3 twist angle, and (4 thickness along the blade span. A wind tunnel validation study of optimized rotors based on the NACA 4-digit airfoil series is presented. Based on the experimental data, improvements in terms of the aerodynamic efficiency, the cut-in wind speed, and the amount of material used during the manufacturing process were achieved. Recommendations for the aerodynamic design of SWT rotors are provided based on field experience.
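Of the two nested metaheuristics, simulated annealing is the simpler to sketch. A generic single-variable minimiser follows; the objective, step size, and cooling schedule are placeholders, not the rotor-design objective or the paper's nested scheme:

```python
import math
import random

def simulated_annealing(f, x0, step=0.1, t0=1.0, cooling=0.995,
                        iters=5000, seed=0):
    """Minimise f by random perturbation with temperature-gated acceptance."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        # Always accept downhill moves; accept uphill with Boltzmann probability.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling               # geometric cooling schedule
    return best, fbest

# Minimise a 1-D test function with its minimum at x = 2:
x_opt, f_opt = simulated_annealing(lambda x: (x - 2.0) ** 2, x0=-5.0)
```

In the paper's framework this inner search would run over the blade design variables, nested with the genetic algorithm.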

  17. PWR hybrid computer model for assessing the safety implications of control systems

    Energy Technology Data Exchange (ETDEWEB)

    Smith, O L; Renier, J P; Difilippo, F C; Clapp, N E; Sozer, A; Booth, R S; Craddick, W G; Morris, D G

    1986-03-01

    The ORNL study of safety-related aspects of nuclear power plant control systems consists of two interrelated tasks: (1) failure mode and effects analysis (FMEA) that identified single and multiple component failures that might lead to significant plant upsets and (2) computer models that used these failures as initial conditions and traced the dynamic impact on the control system and remainder of the plant. This report describes the simulation of Oconee Unit 1, the first plant analyzed. A first-principles, best-estimate model was developed and implemented on a hybrid computer consisting of AD-4 analog and PDP-10 digital machines. Controls were placed primarily on the analog to use its interactive capability to simulate operator action. 48 refs., 138 figs., 15 tabs.

  18. Semiempirical Quantum Chemical Calculations Accelerated on a Hybrid Multicore CPU-GPU Computing Platform.

    Science.gov (United States)

    Wu, Xin; Koslowski, Axel; Thiel, Walter

    2012-07-10

    In this work, we demonstrate that semiempirical quantum chemical calculations can be accelerated significantly by leveraging the graphics processing unit (GPU) as a coprocessor on a hybrid multicore CPU-GPU computing platform. Semiempirical calculations using the MNDO, AM1, PM3, OM1, OM2, and OM3 model Hamiltonians were systematically profiled for three types of test systems (fullerenes, water clusters, and solvated crambin) to identify the most time-consuming sections of the code. The corresponding routines were ported to the GPU and optimized employing both existing library functions and a GPU kernel that carries out a sequence of noniterative Jacobi transformations during pseudodiagonalization. The overall computation times for single-point energy calculations and geometry optimizations of large molecules were reduced by one order of magnitude for all methods, as compared to runs on a single CPU core.
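The pseudodiagonalization kernel mentioned above is built from Jacobi rotations. A plain CPU sketch of the cyclic Jacobi eigenvalue method in NumPy (the GPU version batches these transformations; this shows only the underlying linear algebra, not the OM2 kernel):

```python
import numpy as np

def jacobi_eigen(A, sweeps=10):
    """Diagonalise a symmetric matrix by cyclic Jacobi rotations."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p, q]) < 1e-12:
                    continue
                # Rotation angle that annihilates the (p, q) element.
                theta = 0.5 * np.arctan2(2 * A[p, q], A[q, q] - A[p, p])
                c, s = np.cos(theta), np.sin(theta)
                J = np.eye(n)
                J[p, p] = J[q, q] = c
                J[p, q], J[q, p] = s, -s
                A = J.T @ A @ J     # similarity transform preserves eigenvalues
    return np.sort(np.diag(A))

M = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
eigs = jacobi_eigen(M)
# agrees with np.linalg.eigvalsh(M)
```

Each rotation touches only two rows and columns, which is what makes sequences of Jacobi transformations amenable to parallel (GPU) execution.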

  19. Hybrid approach for attenuation correction in PET/MR scanners

    Energy Technology Data Exchange (ETDEWEB)

    Santos Ribeiro, A., E-mail: afribeiro@fc.ul.pt [Institute of Biophysics and Biomedical Engineering, Faculty of Sciences, University of Lisbon, Lisbon (Portugal); Rota Kops, E.; Herzog, H. [Institute of Neuroscience and Medicine, Forschungszentrum Juelich, Juelich (Germany); Almeida, P. [Institute of Biophysics and Biomedical Engineering, Faculty of Sciences, University of Lisbon, Lisbon (Portugal)

    2014-01-11

    Aim: Attenuation correction (AC) of PET images is still one of the major limitations of hybrid PET/MR scanners. Different methods have been proposed to obtain the AC map from morphological MR images. However, segmentation methods normally fail to differentiate air and bone regions, while template or atlas methods usually cannot accurately represent regions anatomically different from the template image. In this study a feed forward neural network (FFNN) algorithm is presented which directly outputs the attenuation coefficients by non-linear regression of the images acquired with an ultrashort echo time (UTE) sequence guided by the template-based AC map (TAC-map). Materials and methods: MR as well as CT data were acquired in four subjects. The UTE images and the TAC-map were the inputs of the presented FFNN algorithm for training as well as classification. The resulting attenuation maps were compared with CT-based, PNN-based and TAC maps. All the AC maps were used to reconstruct the PET emission data which were then compared for the different methods. Results: For each subject dice coefficients D were calculated between each method and the respective CT-based AC maps. The resulting Ds show higher values for all FFNN-based tissues comparatively to both TAC-based and PNN-based methods, particularly for bone tissue (D=0.77, D=0.51 and D=0.71, respectively). The AC-corrected PET images with the FFNN-based map show an overall lower relative difference (RD=3.90%) than those AC-corrected with the PNN-based (RD=4.44%) or template-based (RD=4.43%) methods. Conclusion: Our results show that an enhancement of current methods can be performed by combining both information of new MR image sequence techniques and general information provided from template techniques. Nevertheless, the number of tested subjects is statistically low, and an analysis of a larger dataset is currently being carried out.
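The Dice coefficient used above to score the AC maps is straightforward to reproduce. A sketch for binary tissue masks (the masks here are toy data, not the study's segmentations):

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice overlap D = 2|A ∩ B| / (|A| + |B|) between two binary masks."""
    a = np.asarray(mask_a, bool)
    b = np.asarray(mask_b, bool)
    denom = a.sum() + b.sum()
    # Convention: two empty masks count as perfect agreement.
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two 4-voxel "bone" masks sharing two positive voxels:
d = dice([1, 1, 1, 0], [1, 1, 0, 1])
# D = 2*2 / (3 + 3) ≈ 0.667
```

D = 1 means identical masks, D = 0 means no overlap, which is why bone (the hardest tissue class to segment from MR) is the informative comparison.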

  20. The UF family of reference hybrid phantoms for computational radiation dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Choonsik [Division of Cancer Epidemiology and Genetics, National Cancer Institute, National Institute of Health, Bethesda, MD 20852 (United States); Lodwick, Daniel; Hurtado, Jorge; Pafundi, Deanna [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL 32611 (United States); Williams, Jonathan L [Department of Radiology, University of Florida, Gainesville, FL 32611 (United States); Bolch, Wesley E [Departments of Nuclear and Radiological and Biomedical Engineering, University of Florida, Gainesville, FL 32611 (United States)], E-mail: wbolch@ufl.edu

    2010-01-21

    Computational human phantoms are computer models used to obtain dose distributions within the human body exposed to internal or external radiation sources. In addition, they are increasingly used to develop detector efficiencies for in vivo whole-body counters. Two classes of computational human phantoms have been widely utilized for dosimetry calculation: stylized and voxel phantoms that describe human anatomy through mathematical surface equations and 3D voxel matrices, respectively. Stylized phantoms are flexible in that changes to organ position and shape are possible given avoidance of region overlap, while voxel phantoms are typically fixed to a given patient anatomy, yet can be proportionally scaled to match individuals of larger or smaller stature, but of equivalent organ anatomy. Voxel phantoms provide much better anatomical realism as compared to stylized phantoms which are intrinsically limited by mathematical surface equations. To address the drawbacks of these phantoms, hybrid phantoms based on non-uniform rational B-spline (NURBS) surfaces have been introduced wherein anthropomorphic flexibility and anatomic realism are both preserved. Researchers at the University of Florida have introduced a series of hybrid phantoms representing the ICRP Publication 89 reference newborn, 15 year, and adult male and female. In this study, six additional phantoms are added to the UF family of hybrid phantoms-those of the reference 1 year, 5 year and 10 year child. Head and torso CT images of patients whose ages were close to the targeted ages were obtained under approved protocols. Major organs and tissues were segmented from these images using an image processing software, 3D-DOCTOR(TM). NURBS and polygon mesh surfaces were then used to model individual organs and tissues after importing the segmented organ models to the 3D NURBS modeling software, Rhinoceros(TM). The phantoms were matched to four reference datasets: (1) standard anthropometric data, (2) reference

  1. A multidisciplinary approach to solving computer related vision problems.

    Science.gov (United States)

    Long, Jennifer; Helland, Magne

    2012-09-01

    This paper proposes a multidisciplinary approach to solving computer-related vision issues by including optometry as a part of the problem-solving team. Computer workstation design is increasing in complexity. There are at least ten different professions that contribute to workstation design or that provide advice to improve worker comfort, safety and efficiency. Optometrists have a role in identifying and solving computer-related vision issues and in prescribing appropriate optical devices. However, it is possible that advice given by optometrists to improve visual comfort may conflict with other requirements and demands within the workplace. A multidisciplinary approach has been advocated for solving computer-related vision issues. There are opportunities for optometrists to collaborate with ergonomists, who coordinate information from physical, cognitive and organisational disciplines to enact holistic solutions to problems. This paper proposes a model of collaboration and examples of successful partnerships at a number of professional levels, including individual relationships between optometrists and ergonomists when they have mutual clients/patients, in undergraduate and postgraduate education and in research. There is also scope for dialogue between optometry and ergonomics professional associations. A multidisciplinary approach offers the opportunity to solve computer-related vision issues in a cohesive, rather than fragmented, way. Further exploration is required to understand the barriers to these professional relationships. © 2012 The College of Optometrists.

  2. BF-PSO-TS: Hybrid Heuristic Algorithms for Optimizing Task Scheduling on Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Hussin M. Alkhashai

    2016-06-01

    Full Text Available Task scheduling is a major problem in Cloud computing because the cloud provider has to serve many users. A good scheduling algorithm also helps in the proper and efficient utilization of the resources, so task scheduling is considered one of the major issues in Cloud computing systems. The objective of this paper is to assign tasks to multiple computing resources such that the total cost of execution is minimized and the load is shared between these computing resources. To this end, two hybrid algorithms based on Particle Swarm Optimization (PSO) have been introduced to schedule the tasks: Best-Fit-PSO (BFPSO) and PSO-Tabu Search (PSOTS). In the BFPSO algorithm, the Best-Fit (BF) algorithm is merged into the PSO algorithm to improve performance: BF is used to generate the initial population of the standard PSO algorithm instead of initiating it randomly. In the proposed PSOTS algorithm, Tabu Search (TS) is used to improve the local search by avoiding the traps of local optimality into which the standard PSO algorithm can fall. The two proposed algorithms (i.e., BFPSO and PSOTS) have been implemented using CloudSim and evaluated against the standard PSO algorithm using five problems with different numbers of independent tasks and resources. The performance parameters considered are execution time (makespan), cost, and resource utilization. The implementation results show that the proposed hybrid algorithms (i.e., BFPSO, PSOTS) outperform the standard PSO algorithm.
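The Best-Fit seeding step can be sketched as a greedy placement whose result seeds the PSO swarm. The reading below (longest task first, onto the currently least-loaded resource) is our assumption; the paper does not spell out the exact rule:

```python
def greedy_seed(task_lengths, n_resources):
    """Longest-task-first greedy placement used to seed a PSO population.

    Returns the per-task resource assignment and the resulting makespan."""
    loads = [0.0] * n_resources
    assignment = []
    for t in sorted(task_lengths, reverse=True):
        r = min(range(n_resources), key=lambda i: loads[i])  # least-loaded
        loads[r] += t
        assignment.append(r)
    return assignment, max(loads)

assignment, makespan = greedy_seed([4, 3, 3, 2], 2)
# loads end up [6, 6]; the seed particle already has makespan 6
```

Seeding one or more particles with such a solution gives the swarm a good starting point while the remaining randomly initialised particles preserve diversity.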

  3. Application of a single-objective, hybrid genetic algorithm approach to pharmacokinetic model building.

    Science.gov (United States)

    Sherer, Eric A; Sale, Mark E; Pollock, Bruce G; Belani, Chandra P; Egorin, Merrill J; Ivy, Percy S; Lieberman, Jeffrey A; Manuck, Stephen B; Marder, Stephen R; Muldoon, Matthew F; Scher, Howard I; Solit, David B; Bies, Robert R

    2012-08-01

    A limitation in traditional stepwise population pharmacokinetic model building is the difficulty in handling interactions between model components. To address this issue, a method was previously introduced which couples NONMEM parameter estimation and model fitness evaluation to a single-objective, hybrid genetic algorithm for global optimization of the model structure. In this study, the generalizability of this approach for pharmacokinetic model building is evaluated by comparing (1) correct and spurious covariate relationships in a simulated dataset resulting from automated stepwise covariate modeling, Lasso methods, and single-objective hybrid genetic algorithm approaches to covariate identification and (2) information criteria values, model structures, convergence, and model parameter values resulting from manual stepwise versus single-objective, hybrid genetic algorithm approaches to model building for seven compounds. Both manual stepwise and single-objective, hybrid genetic algorithm approaches to model building were applied, blinded to the results of the other approach, for selection of the compartment structure as well as inclusion and model form of inter-individual and inter-occasion variability, residual error, and covariates from a common set of model options. For the simulated dataset, stepwise covariate modeling identified three of four true covariates and two spurious covariates; Lasso identified two of four true and 0 spurious covariates; and the single-objective, hybrid genetic algorithm identified three of four true covariates and one spurious covariate. For the clinical datasets, the Akaike information criterion was a median of 22.3 points lower (range of 470.5 point decrease to 0.1 point decrease) for the best single-objective hybrid genetic-algorithm candidate model versus the final manual stepwise model: the Akaike information criterion was lower by greater than 10 points for four compounds and differed by less than 10 points for three
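The covariate-selection part of such a genetic algorithm can be sketched as a bitstring search minimising an information criterion. The sketch below runs on synthetic linear data with ordinary least squares, not NONMEM, and every setting (effect sizes, population size, mutation rate) is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 200 subjects, 6 candidate covariates, of which only
# covariates 0 and 3 truly influence the response (assumed effect sizes).
X = rng.normal(size=(200, 6))
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=200)

def aic(mask):
    """AIC of an ordinary-least-squares fit using the covariates set in mask."""
    cols = [i for i, b in enumerate(mask) if b]
    A = np.column_stack([np.ones(len(y))] + [X[:, i] for i in cols])
    resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    k = A.shape[1] + 1                  # coefficients + error variance
    return len(y) * np.log(resid @ resid / len(y)) + 2 * k

def ga_select(pop_size=20, gens=30, n_bits=6):
    """Minimal bitstring GA: elitism, uniform crossover, per-bit mutation."""
    pop = [tuple(rng.integers(0, 2, n_bits)) for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=aic)
        parents = scored[: pop_size // 2]
        pop = scored[:2]                # keep the two best unchanged
        while len(pop) < pop_size:
            p1 = parents[rng.integers(len(parents))]
            p2 = parents[rng.integers(len(parents))]
            child = [p1[i] if rng.random() < 0.5 else p2[i]
                     for i in range(n_bits)]
            child = [int(b) ^ (rng.random() < 0.1) for b in child]  # mutation
            pop.append(tuple(child))
    return min(pop, key=aic)

best = ga_select()
# the influential covariates 0 and 3 end up selected in `best`
```

The study's hybrid GA searches a much richer space (compartment structure, variability terms, residual error models), but the select-score-recombine loop is the same.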

  4. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  6. A hybrid nudging-ensemble Kalman filter approach to data assimilation. Part I: application in the Lorenz system

    Directory of Open Access Journals (Sweden)

    Lili Lei

    2012-05-01

    Full Text Available A hybrid data assimilation approach combining nudging and the ensemble Kalman filter (EnKF for dynamic analysis and numerical weather prediction is explored here using the non-linear Lorenz three-variable model system with the goal of a smooth, continuous and accurate data assimilation. The hybrid nudging-EnKF (HNEnKF computes the hybrid nudging coefficients from the flow-dependent, time-varying error covariance matrix from the EnKF's ensemble forecasts. It extends the standard diagonal nudging terms to additional off-diagonal statistical correlation terms for greater inter-variable influence of the innovations in the model's predictive equations to assist in the data assimilation process. The HNEnKF promotes a better fit of an analysis to data compared to that achieved by either nudging or incremental analysis update (IAU. When model error is introduced, it produces similar or better root mean square errors compared to the EnKF while minimising the error spikes/discontinuities created by the intermittent EnKF. It provides a continuous data assimilation with better inter-variable consistency and improved temporal smoothness than that of the EnKF. Data assimilation experiments are also compared to the ensemble Kalman smoother (EnKS. The HNEnKF has similar or better temporal smoothness than that of the EnKS, and with much smaller central processing unit (CPU time and data storage requirements.
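A minimal Lorenz-63 experiment shows the nudging idea: relax a model run toward a reference trajectory through an extra term in the tendencies. Here the gain is a fixed scalar chosen by us; in the HNEnKF it would be the flow-dependent matrix derived from the EnKF ensemble covariances:

```python
import numpy as np

def lorenz_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(s0, nsteps, dt=0.005, nudge_to=None, gain=0.0):
    """Forward-Euler Lorenz-63 run, optionally nudged toward a reference.

    `gain` plays the role of the nudging coefficient; the HNEnKF derives a
    flow-dependent matrix for it from EnKF statistics (this scalar is ours)."""
    traj = np.empty((nsteps + 1, 3))
    traj[0] = s0
    for k in range(nsteps):
        tend = lorenz_rhs(traj[k])
        if nudge_to is not None:
            tend = tend + gain * (nudge_to[k] - traj[k])   # nudging term
        traj[k + 1] = traj[k] + dt * tend
    return traj

truth = integrate(np.array([1.0, 1.0, 1.0]), 2000)         # reference run
free = integrate(np.array([1.1, 0.9, 1.05]), 2000)         # no assimilation
nudged = integrate(np.array([1.1, 0.9, 1.05]), 2000,
                   nudge_to=truth, gain=5.0)
# the nudged run tracks the reference; the free run diverges chaotically
```

Because the correction enters the model equations continuously, the analysis stays smooth, which is exactly the property the HNEnKF exploits against the intermittent jumps of a pure EnKF.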

  7. Assessing Trustworthiness in Social Media: A Social Computing Approach

    Science.gov (United States)

    2015-11-17

    31-May-2015. Approved for Public Release; Distribution Unlimited. Final Report: Assessing Trustworthiness in Social Media: A Social Computing Approach. ... We propose to investigate research issues related to social media trustworthiness and its assessment by leveraging social research methods ... attributes of interest associated with a particular social media user related to the received information. This tool provides a way to combine different

  8. A Unified Computational Approach to Oxide Aging Processes

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, D.J.; Fleetwood, D.M.; Hjalmarson, H.P.; Schultz, P.A.

    1999-01-27

    In this paper we describe a unified, hierarchical computational approach to aging and reliability problems caused by materials changes in the oxide layers of Si-based microelectronic devices. We apply this method to a particular low-dose-rate radiation effects problem.

  9. Pedagogical Approaches to Teaching with Computer Simulations in Science Education

    NARCIS (Netherlands)

    Rutten, N.P.G.; van der Veen, Johan (CTIT); van Joolingen, Wouter; McBride, Ron; Searson, Michael

    2013-01-01

    For this study we interviewed 24 physics teachers about their opinions on teaching with computer simulations. The purpose of this study is to investigate whether it is possible to distinguish different types of teaching approaches. Our results indicate the existence of two types. The first type is

  10. Hybrid systems modelling and simulation in DESTECS: a co-simulation approach

    NARCIS (Netherlands)

    Ni, Yunyun; Broenink, Johannes F.; Klumpp, M.

    2012-01-01

    This paper introduces the modelling methodology and tooling of the DESTECS project (www.destecs.org, Design Support and Tooling for Embedded Control Software) as a novel modelling approach for hybrid systems from an executable model perspective. It provides a top-level structure for the system model

  11. Students' Game Performance Improvements during a Hybrid Sport Education-Step-Game-Approach Volleyball Unit

    Science.gov (United States)

    Araújo, Rui; Mesquita, Isabel; Hastie, Peter; Pereira, Cristiana

    2016-01-01

    The purpose of this study was to examine a hybrid combination of sport education and the step-game-approach (SGA) on students' gameplay performance in volleyball, taking into account their sex and skill-level. Seventeen seventh-grade students (seven girls, 10 boys, average age 11.8) participated in a 25-lesson volleyball season, in which the…

  12. On the Use of Hybrid Development Approaches in Software and Systems Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Münch, Jürgen; Diebold, Philipp;

    2016-01-01

    embody this framework with more agile (and/or lean) practices to keep their flexibility. The paper at hand provides insights into the HELENA study with which we aim to investigate the use of “Hybrid dEveLopmENt Approaches in software systems development”. We present the survey design and initial findings...

  13. The business case for condition-based maintenance: a hybrid (non-) financial approach

    NARCIS (Netherlands)

    Tiddens, W.W.; Tinga, T.; Braaksma, A.J.J.; Brouwer, O.; Cepin, Marko; Bris, Radim

    2017-01-01

    Although developing business cases is key for evaluating project success, the costs and benefits of condition-based maintenance (CBM) implementations are often not explicitly defined and evaluated. Using the design science methodology, we developed a hybrid business case approach to help managers

  15. A Hybrid Approach to Combine Physically Based and Data-Driven Models in Simulating Sediment Transportation

    NARCIS (Netherlands)

    Sewagudde, S.

    2008-01-01

    The objective of this study is to develop a methodology for hybrid modelling of sedimentation in a coastal basin or large shallow lake where physically based and data driven approaches are combined. This research was broken down into three blocks. The first block explores the possibility of approxim

  16. A Hybrid Column Generation approach for an Industrial Waste Collection Routing Problem

    DEFF Research Database (Denmark)

    Hauge, Kristian; Larsen, Jesper; Lusby, Richard Martin;

    2014-01-01

    , real-world problem instances. Results indicate that the hybrid column generation outperforms a purely heuristic approach in terms of both running time and solution quality. High quality solutions to problems containing up to 100 orders can be solved in approximately 15 minutes....

  17. A Computationally Based Approach to Homogenizing Advanced Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Jablonski, P D; Cowen, C J

    2011-02-27

    We have developed a computationally based approach to optimizing the homogenization heat treatment of complex alloys. The Scheil module within the Thermo-Calc software is used to predict the as-cast segregation present within alloys, and DICTRA (Diffusion Controlled TRAnsformations) is used to model the homogenization kinetics as a function of time, temperature and microstructural scale. We discuss this approach as it is applied to Ni-based superalloys as well as the computationally more complex case of alloys that solidify with more than one matrix phase as a result of segregation, as is typically observed in martensitic steels. With these alloys it is doubly important to homogenize them correctly, especially at the laboratory scale, since they are austenitic at high temperature and thus constituent elements will diffuse slowly. The computationally designed heat treatment and its subsequent verification on real castings are presented.
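The homogenization kinetics can be bounded with a textbook estimate: the amplitude of a sinusoidal segregation profile of wavelength λ decays as exp(−4π²Dt/λ²), with diffusivity D following an Arrhenius law. All numerical inputs below are hypothetical placeholders, not Thermo-Calc/DICTRA output:

```python
import math

def homogenization_time(D0, Q, T, wavelength, residual=0.01):
    """Time for a sinusoidal segregation profile to decay to `residual`
    of its initial amplitude: delta(t) = exp(-4*pi^2*D*t / lambda^2),
    with D = D0 * exp(-Q / (R*T)). A textbook estimate, not the DICTRA model.

    D0 in m^2/s, Q in J/mol, T in K, wavelength in m; returns seconds."""
    R = 8.314  # gas constant, J/(mol K)
    D = D0 * math.exp(-Q / (R * T))
    return -math.log(residual) * wavelength ** 2 / (4 * math.pi ** 2 * D)

# Hypothetical inputs: D0 = 1e-4 m^2/s, Q = 280 kJ/mol,
# 100 um dendrite arm spacing, hold at 1200 C (1473 K):
t = homogenization_time(1e-4, 280e3, 1473.0, 100e-6)
hours = t / 3600.0   # roughly a day for these inputs
```

The quadratic dependence on wavelength is the practical lever: halving the dendrite arm spacing (e.g. by faster solidification) cuts the required hold time by a factor of four.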

  18. The study of hybrid model identification, computation analysis and fault location for nonlinear dynamic circuits and systems

    Institute of Scientific and Technical Information of China (English)

    XIE Hong; HE Yi-gang; ZENG Guan-da

    2006-01-01

    This paper presents hybrid model identification for a class of nonlinear circuits and systems via a combination of the block-pulse function transform with the Volterra series. After discussing the method to establish the hybrid model and introducing the hybrid model identification, a set of related formulas is derived for calculating the hybrid model and computing the Volterra series solution of nonlinear dynamic circuits and systems. In order to significantly reduce the computation cost of fault location, the paper presents a new fault diagnosis method based on multiple preset models that can be realized online. Examples of identification simulation and fault diagnosis are given. Results show that the method has high accuracy and efficiency for fault location of nonlinear dynamic circuits and systems.

  19. A hybrid approach for addressing ring flexibility in 3D database searching.

    Science.gov (United States)

    Sadowski, J

    1997-01-01

    A hybrid approach for flexible 3D database searching is presented that addresses the problem of ring flexibility. It combines the explicit storage of up to 25 conformations of rings with up to eight atoms, generated by the 3D structure generator CORINA, with the power of a torsional fitting technique implemented in the 3D database system UNITY. A comparison with the original UNITY approach, using a database with about 130,000 entries and five different pharmacophore queries, was performed. The hybrid approach scored, on average, 10-20% more hits than the reference run. Moreover, specific problems with unrealistic hit geometries produced by the original approach can be excluded. In addition, the influence of the maximum number of ring conformations per molecule was investigated. An optimal number of 10 conformations per molecule is recommended.

  20. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    Science.gov (United States)

    Chiadamrong, N.; Piyathanavong, V.

    2017-04-01

    Models that aim to optimize the design of supply chain networks have gained more interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such an optimization problem. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures that run until the difference between subsequent solutions satisfies pre-determined termination criteria. The effectiveness of the proposed approach is illustrated by an example, which yields near-optimal results with much shorter solving time than the conventional simulation-based optimization model. The efficacy of this proposed hybrid approach is promising and it can be applied as a powerful tool in designing a real supply chain network. It also provides the possibility to model and solve more realistic problems, which incorporate dynamism and uncertainty.
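    The iterate-until-convergence idea behind such a hybrid analytical-simulation scheme can be sketched in a few lines (a toy stand-in, not the authors' supply-chain model; all names, the demand model, and the 2% correction step are illustrative): an analytical model proposes a design, a stochastic simulation evaluates it, and the discrepancy is fed back until the termination criterion is met.

```python
import random

def analytical_capacity(demand, correction=1.0):
    # Toy analytical model (hypothetical): size capacity to demand,
    # scaled by a correction factor fed back from simulation.
    return demand * correction

def simulate_shortfall(capacity, demand, n=1000, seed=0):
    # Toy stand-in for a discrete-event simulation: demand fluctuates
    # +/-20%; report the average unmet demand per period.
    rng = random.Random(seed)
    return sum(max(0.0, demand * rng.uniform(0.8, 1.2) - capacity)
               for _ in range(n)) / n

def hybrid_design(demand=100.0, tol=1e-3, max_iter=50):
    correction = 1.0
    capacity = analytical_capacity(demand, correction)
    for _ in range(max_iter):
        shortfall = simulate_shortfall(capacity, demand)
        if shortfall < tol:          # pre-determined termination criterion
            return capacity
        correction *= 1.02           # feed the simulation result back
        capacity = analytical_capacity(demand, correction)
    return capacity

cap = hybrid_design()
```

    The loop converges once the simulated shortfall falls under the tolerance, so far fewer simulation runs are needed than in a pure simulation-based search.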

  1. Computer Forensics for Graduate Accountants: A Motivational Curriculum Design Approach

    Directory of Open Access Journals (Sweden)

    Grover Kearns

    2010-06-01

    Full Text Available Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft of or destruction of intellectual property, and fraud. Education of accountants to use forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting students, however, may not view information technology as vital to their career paths and need motivation to acquire forensic knowledge and skills. This paper presents a curriculum design methodology for teaching graduate accounting students computer forensics. The methodology is tested using perceptions of the students about the success of the methodology and their acquisition of forensics knowledge and skills. An important component of the pedagogical approach is the use of an annotated list of over 50 forensic web-based tools.

  2. Efficient Approach for Load Balancing in Virtual Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Harvinder Singh

    2014-10-01

    Full Text Available Cloud computing technology is changing the focus of the IT world and is becoming popular because of its great characteristics. Load balancing is one of the main challenges in cloud computing: distributing workloads across multiple computers or a computer cluster, network links, central processing units, disk drives, or other resources. Successful load balancing optimizes resource use, maximizes throughput, minimizes response time, and avoids overload. The objective of this paper is to propose an approach to scheduling algorithms that can maintain load balance and provide improved strategies through efficient job scheduling and modified resource allocation techniques. The results discussed in this paper are based on the existing round robin, least connection, throttled load balance, and fastest response time scheduling algorithms, and a newly proposed fastest-with-least-connection scheduling algorithm. With this new algorithm, the overall response time and data centre processing time are improved and cost is reduced in comparison to the existing scheduling parameters.
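    A minimal sketch of a "fastest with least connection" rule as described above (illustrative only; the paper's simulation setup is not reproduced, and all class and server names are assumptions): shortlist the servers holding the fewest active connections, then pick the one with the fastest recent response time.

```python
class Server:
    def __init__(self, name, avg_response_ms):
        self.name = name
        self.avg_response_ms = avg_response_ms  # recent average response time
        self.active = 0                         # active connection count

def pick_server(servers):
    # "Fastest with least connection": shortlist servers holding the
    # fewest active connections, then choose the fastest responder.
    fewest = min(s.active for s in servers)
    shortlist = [s for s in servers if s.active == fewest]
    return min(shortlist, key=lambda s: s.avg_response_ms)

def assign(servers, n_requests):
    log = []
    for _ in range(n_requests):
        s = pick_server(servers)
        s.active += 1               # connection stays open in this toy run
        log.append(s.name)
    return log

servers = [Server("A", 20), Server("B", 10), Server("C", 30)]
order = assign(servers, 4)
# Fastest server B is chosen first, then load evens out: B, A, C, B
```

    The rule degenerates to least-connection when response times are equal, and to fastest-response when loads are equal.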

  3. EFFICIENT APPROACH FOR LOAD BALANCING IN VIRTUAL CLOUD COMPUTING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Harvinder Singh

    2015-10-01

    Full Text Available Cloud computing technology is changing the focus of the IT world and is becoming popular because of its great characteristics. Load balancing is one of the main challenges in cloud computing: distributing workloads across multiple computers or a computer cluster, network links, central processing units, disk drives, or other resources. Successful load balancing optimizes resource use, maximizes throughput, minimizes response time, and avoids overload. The objective of this paper is to propose an approach to scheduling algorithms that can maintain load balance and provide improved strategies through efficient job scheduling and modified resource allocation techniques. The results discussed in this paper are based on the existing round robin, least connection, throttled load balance, and fastest response time scheduling algorithms, and a newly proposed fastest-with-least-connection scheduling algorithm. With this new algorithm, the overall response time and data centre processing time are improved and cost is reduced in comparison to the existing scheduling parameters.

  4. A GPU-Computing Approach to Solar Stokes Profile Inversion

    CERN Document Server

    Harker, Brian J

    2012-01-01

    We present a new computational approach to the inversion of solar photospheric Stokes polarization profiles, under the Milne-Eddington model, for vector magnetography. Our code, named GENESIS (GENEtic Stokes Inversion Strategy), employs multi-threaded parallel-processing techniques to harness the computing power of graphics processing units (GPUs), along with algorithms designed to exploit the inherent parallelism of the Stokes inversion problem. Using a genetic algorithm (GA) engineered specifically for use with a GPU, we produce full-disc maps of the photospheric vector magnetic field from polarized spectral line observations recorded by the Synoptic Optical Long-term Investigations of the Sun (SOLIS) Vector Spectromagnetograph (VSM) instrument. We show the advantages of pairing a population-parallel genetic algorithm with data-parallel GPU-computing techniques, and present an overview of the Stokes inversion problem, including a description of our adaptation to the GPU-computing paradigm. Full-disc vector ma...
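    The genetic-algorithm engine can be illustrated independently of the GPU and Stokes-inversion details with a minimal real-coded GA (tournament selection, blend crossover, Gaussian mutation; all parameters are illustrative and not those of GENESIS), here maximizing a toy one-dimensional fitness:

```python
import random

def genetic_maximize(fitness, bounds, pop_size=40, gens=60, seed=1):
    # Minimal real-coded GA: tournament selection, blend crossover,
    # Gaussian mutation, full generational replacement.
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        def tournament():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) > fitness(b) else b
        children = []
        for _ in range(pop_size):
            p1, p2 = tournament(), tournament()
            w = rng.random()
            child = w * p1 + (1 - w) * p2             # blend crossover
            child += rng.gauss(0, 0.05 * (hi - lo))   # Gaussian mutation
            children.append(min(hi, max(lo, child)))  # clip to bounds
        pop = children
    return max(pop, key=fitness)

# Toy fitness with its peak at x = 2 on [-5, 5]
best = genetic_maximize(lambda x: -(x - 2.0) ** 2, (-5.0, 5.0))
```

    Every individual's fitness in a generation can be evaluated independently, which is exactly the population-level parallelism that maps well onto GPU threads.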

  5. Cloud Computing – A Unified Approach for Surveillance Issues

    Science.gov (United States)

    Rachana, C. R.; Banu, Reshma, Dr.; Ahammed, G. F. Ali, Dr.; Parameshachari, B. D., Dr.

    2017-08-01

    Cloud computing describes highly scalable resources provided as an external service via the Internet on a basis of pay-per-use. From the economic point of view, the main attractiveness of cloud computing is that users only use what they need, and only pay for what they actually use. Resources are available for access from the cloud at any time, and from any location through networks. Cloud computing is gradually replacing the traditional Information Technology Infrastructure. Securing data is one of the leading concerns and biggest issue for cloud computing. Privacy of information is always a crucial point, especially when an individual's personal information or sensitive information is being stored in the organization. It is indeed true that today, cloud authorization systems are not robust enough. This paper presents a unified approach for analyzing the various security issues and techniques to overcome the challenges in the cloud environment.

  6. A hybrid approach to incorporating climate change and variability into climate scenario for impact assessments

    OpenAIRE

    Gebretsadik, Yohannes; Strzepek, Kenneth; Schlosser, C. Adam

    2014-01-01

    The traditional 'delta-change' approach of scenario generation for climate change impact assessment of water resources strongly depends on the selected base-case observed historical climate conditions on which the climate shocks are superimposed. This method disregards the combined effect of climate change and the inherent hydro-climatological variability in the system. Here we demonstrate a hybrid uncertainty approach in which uncertainties in historical climate variability are combined with...

  7. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to parallel approach by performing system level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need is again posing challenges to computer scientists to offer the matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good to have tool' for researchers, providing them significant flexibility, allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would be best suited to manage drug discovery and clinical development data, generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  8. Computational intelligence approaches for pattern discovery in biological systems.

    Science.gov (United States)

    Fogel, Gary B

    2008-07-01

    Biology, chemistry and medicine are faced with tremendous challenges caused by an overwhelming amount of data and the need for rapid interpretation. Computational intelligence (CI) approaches such as artificial neural networks, fuzzy systems and evolutionary computation are being used with increasing frequency to contend with this problem, in light of noise, non-linearity and temporal dynamics in the data. Such methods can be used to develop robust models of processes either on their own or in combination with standard statistical approaches. This is especially true for database mining, where modeling is a key component of scientific understanding. This review provides an introduction to current CI methods, their application to biological problems, and concludes with a commentary about the anticipated impact of these approaches in bioinformatics.

  9. ANN Approach for State Estimation of Hybrid Systems and Its Experimental Validation

    Directory of Open Access Journals (Sweden)

    Shijoh Vellayikot

    2015-01-01

    Full Text Available A novel artificial neural network based state estimator has been proposed to ensure robustness in the state estimation of autonomous switching hybrid systems under various uncertainties. Taking the autonomous switching three-tank system as a benchmark hybrid model working under various additive and multiplicative uncertainties such as process noise, measurement error, process–model parameter variation, initial state mismatch, and hand valve faults, a real-time performance evaluation was carried out by comparing it with other state estimators such as the extended Kalman filter and the unscented Kalman filter. The experimental results reported with the proposed approach show considerable improvement in robustness under the considered uncertainties.

  10. A Hybrid Sensing Approach for Pure and Adulterated Honey Classification

    Directory of Open Access Journals (Sweden)

    Ammar Zakaria

    2012-10-01

    Full Text Available This paper presents a comparison between data from single modality and fusion methods to classify Tualang honey as pure or adulterated using Linear Discriminant Analysis (LDA) and Principal Component Analysis (PCA) statistical classification approaches. Ten different brands of certified pure Tualang honey were obtained throughout peninsular Malaysia and Sumatera, Indonesia. Various concentrations of two types of sugar solution (beet and cane sugar) were used in this investigation to create honey samples of 20%, 40%, 60% and 80% adulteration concentrations. Honey data extracted from an electronic nose (e-nose) and Fourier Transform Infrared Spectroscopy (FTIR) were gathered, analyzed and compared based on fusion methods. Visual observation of classification plots revealed that the PCA approach is able to distinguish pure and adulterated honey samples better than the LDA technique. Overall, the validated classification results based on FTIR data (88.0%) gave higher classification accuracy than e-nose data (76.5%) using the LDA technique. Honey classification based on normalized low-level and intermediate-level FTIR and e-nose fusion data scored classification accuracies of 92.2% and 88.7%, respectively, using the Stepwise LDA method. The results suggested that pure and adulterated honey samples were better classified using FTIR and e-nose fusion data than single modality data.
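    The PCA step of such a classification pipeline can be sketched with NumPy on synthetic stand-ins for the fused FTIR/e-nose features (the data and the adulteration direction are synthetic, purely for illustration; they are not the paper's measurements):

```python
import numpy as np

def pca_project(X, n_components=2):
    # Center the data, then project onto the top principal components via SVD.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(0)
# Synthetic features: pure samples cluster around one "spectrum";
# adulterated samples are shifted along a hypothetical "sugar" direction.
base = rng.normal(0.0, 1.0, size=50)
sugar = rng.normal(0.0, 1.0, size=50)
pure = base + 0.1 * rng.normal(size=(20, 50))
adulterated = base + 1.5 * sugar + 0.1 * rng.normal(size=(20, 50))
X = np.vstack([pure, adulterated])

scores = pca_project(X, 2)
# In this toy setup, the first principal component separates the classes.
gap = scores[:20, 0].mean() - scores[20:, 0].mean()
```

    Plotting the two score columns reproduces the kind of classification plot the abstract describes, with the two classes forming visibly separate clusters.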

  11. Scripting approach in hybrid organic-inorganic condensation simulation: the GPTMS proof-of-concept

    OpenAIRE

    Maly, Marek; Posocco, Paola; Fermeglia, Maurizio; Pricl, Sabrina

    2008-01-01

    Abstract Silica-based hybrid organic-inorganic materials prepared by sol-gel chemistry exhibit unique chemical and physical properties by virtue of their anisotropic organization. (3-glycidoxypropyl)trimethoxysilane (GPTMS)-based networks represent an archetype of this class of substances, with a vast range of applications. In the present study, a new computational recipe has been developed within the Materials Studio software platform to generate atomistic models of GPTMS crosslinked ...

  12. Neuromolecular computing: a new approach to human brain evolution.

    Science.gov (United States)

    Wallace, R; Price, H

    1999-09-01

    Evolutionary approaches in human cognitive neurobiology traditionally emphasize macroscopic structures. It may soon be possible to supplement these studies with models of human information-processing at the molecular level. Thin-film, simulation, fluorescence microscopy, and high-resolution X-ray crystallographic studies provide evidence for transiently organized neural membrane molecular systems with possible computational properties. This review article examines evidence for hydrophobic-mismatch molecular interactions within phospholipid microdomains of a neural membrane bilayer. It is proposed that these interactions constitute a massively parallel algorithm which can rapidly compute near-optimal solutions to complex cognitive and physiological problems. Coupling of microdomain activity to permeant ion movements at ligand-gated and voltage-gated channels permits the conversion of molecular computations into neuron frequency codes. Evidence for microdomain transport of proteins to specific locations within the bilayer suggests that neuromolecular computation may be under some genetic control and thus modifiable by natural selection. A possible experimental approach for examining evolutionary changes in neuromolecular computation is briefly discussed.

  13. Design of new phenothiazine-thiadiazole hybrids via molecular hybridization approach for the development of potent antitubercular agents.

    Science.gov (United States)

    Ramprasad, Jurupula; Nayak, Nagabhushana; Dalimba, Udayakumar

    2015-12-01

    A new library of phenothiazine and 1,3,4-thiadiazole hybrid derivatives (5a-u) was designed based on the molecular hybridization approach and the molecules were synthesized in excellent yields using a facile single-step chloro-amine coupling reaction between 2-chloro-1-(10H-phenothiazin-10-yl)ethanones and 2-amino-5-substituted-1,3,4-thiadiazoles. The compounds were evaluated for their in vitro inhibition activity against Mycobacterium tuberculosis H37Rv (MTB). Compounds 5g and 5n emerged as the most active compounds of the series with MIC of 0.8 μg/mL (∼1.9 μM). Also, compounds 5a, 5b, 5c, 5e, 5l and 5m (MIC = 1.6 μg/mL), and compounds 5j, 5k and 5o (MIC = 3.125 μg/mL) showed significant inhibition activity. The structure-activity relationship demonstrated that alkyl (methyl/n-propyl) or substituted (4-methyl/4-Cl/4-F) phenyl groups on the 1,3,4-thiadiazole ring enhance the inhibition activity of the compounds. The cytotoxicity study revealed that none of the active molecules are toxic to a normal Vero cell line, thus proving the lack of general cellular toxicity. Further, the active molecules were subjected to molecular docking studies with target enzymes InhA and CYP121.

  14. Coordinated Target Tracking via a Hybrid Optimization Approach

    Science.gov (United States)

    Wang, Yin; Cao, Yan

    2017-01-01

    Recent advances in computer science and electronics have greatly expanded the capabilities of unmanned aerial vehicles (UAV) in both defense and civil applications, such as moving ground object tracking. Due to the uncertainties of the application environments and objects’ motion, it is difficult to maintain the tracked object always within the sensor coverage area by using a single UAV. Hence, it is necessary to deploy a group of UAVs to improve the robustness of the tracking. This paper investigates the problem of tracking ground moving objects with a group of UAVs using gimbaled sensors under flight dynamic and collision-free constraints. The optimal cooperative tracking path planning problem is solved using an evolutionary optimization technique based on the framework of chemical reaction optimization (CRO). The efficiency of the proposed method was demonstrated through a series of comparative simulations. The results show that the cooperative tracking paths determined by the newly developed method allow for longer sensor coverage time under flight dynamic restrictions and safety conditions. PMID:28264425

  15. Coordinated Target Tracking via a Hybrid Optimization Approach

    Directory of Open Access Journals (Sweden)

    Yin Wang

    2017-02-01

    Full Text Available Recent advances in computer science and electronics have greatly expanded the capabilities of unmanned aerial vehicles (UAVs) in both defense and civil applications, such as moving ground object tracking. Due to the uncertainties of the application environments and objects’ motion, it is difficult to maintain the tracked object always within the sensor coverage area by using a single UAV. Hence, it is necessary to deploy a group of UAVs to improve the robustness of the tracking. This paper investigates the problem of tracking ground moving objects with a group of UAVs using gimbaled sensors under flight dynamic and collision-free constraints. The optimal cooperative tracking path planning problem is solved using an evolutionary optimization technique based on the framework of chemical reaction optimization (CRO). The efficiency of the proposed method was demonstrated through a series of comparative simulations. The results show that the cooperative tracking paths determined by the newly developed method allow for longer sensor coverage time under flight dynamic restrictions and safety conditions.

  16. One approach for evaluating the Distributed Computing Design System (DCDS)

    Science.gov (United States)

    Ellis, J. T.

    1985-01-01

    The Distributed Computer Design System (DCDS) provides an integrated environment to support the life cycle of developing real-time distributed computing systems. The primary focus of DCDS is to significantly increase system reliability and software development productivity, and to minimize schedule and cost risk. DCDS consists of integrated methodologies, languages, and tools to support the life cycle of developing distributed software and systems. Smooth and well-defined transitions from phase to phase, language to language, and tool to tool provide a unique and unified environment. An approach to evaluating DCDS highlights its benefits.

  17. An evolutionary computational approach for the dynamic Stackelberg competition problems

    Directory of Open Access Journals (Sweden)

    Lorena Arboleda-Castro

    2016-06-01

    Full Text Available Stackelberg competition models are an important family of economic decision problems from game theory, in which the main goal is to find optimal strategies between two competitors taking into account their hierarchical relationship. Although these models have been widely studied in the past, it is important to note that very few works deal with uncertainty scenarios, especially those that vary over time. In this regard, the present research studies this topic and proposes a computational method for efficiently solving dynamic Stackelberg competition models. The computational experiments suggest that the proposed approach is effective for problems of this nature.
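    For intuition, the hierarchical structure of a Stackelberg problem can be shown on the classic linear-demand duopoly, solved here by a simple grid search over the leader's move rather than the paper's evolutionary method (the parameters a=120, b=1, c=20 are illustrative):

```python
def follower_best_response(q1, a=120.0, b=1.0, c=20.0):
    # Follower maximizes (a - b*(q1+q2) - c) * q2, giving
    # q2 = (a - c - b*q1) / (2b), clipped at zero.
    return max(0.0, (a - c - b * q1) / (2.0 * b))

def leader_profit(q1, a=120.0, b=1.0, c=20.0):
    # Leader anticipates the follower's reaction (the hierarchy).
    q2 = follower_best_response(q1, a, b, c)
    price = a - b * (q1 + q2)
    return (price - c) * q1

def solve_stackelberg(step=0.01, a=120.0, b=1.0, c=20.0):
    # Grid search over the leader's quantity.
    best_q1 = max((i * step for i in range(int((a / b) / step))),
                  key=lambda q: leader_profit(q, a, b, c))
    return best_q1, follower_best_response(best_q1, a, b, c)

q1, q2 = solve_stackelberg()
# Analytic solution for these parameters: q1 = (a-c)/(2b) = 50, q2 = 25
```

    A dynamic variant would make a, b or c time-varying, which is where grid search breaks down and evolutionary methods such as the one proposed become attractive.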

  18. The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.

    Science.gov (United States)

    Bronson, Richard

    1986-01-01

    Suggests the use of computer simulation of continuous systems as a problem solving approach to computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special purpose language, DYNAMO. (ML)

  19. A HYBRID APPROACH BASED MEDICAL IMAGE RETRIEVAL SYSTEM USING FEATURE OPTIMIZED CLASSIFICATION SIMILARITY FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Yogapriya Jaganathan

    2013-01-01

    Full Text Available In the past few years, massive progress has been made in the field of Content Based Medical Image Retrieval (CBMIR) for effective utilization of medical images based on visual feature analysis for the purposes of diagnosis and educational research. Existing medical image retrieval systems are still not optimal in solving the feature dimensionality reduction problem, which increases computational complexity and decreases the speed of the retrieval process. The proposed CBMIR uses a hybrid approach based on feature extraction, optimization of feature vectors, classification of features and similarity measurements. This type of CBMIR is called the Feature Optimized Classification Similarity (FOCS) framework. The selected features are textures using Gray Level Co-occurrence Matrix features (GLCM) and Tamura features (TF), from which the extracted features form a feature vector database. The Fuzzy based Particle Swarm Optimization (FPSO) technique is used to reduce the feature vector dimensionality, and classification is performed using a Fuzzy based Relevance Vector Machine (FRVM) to form groups of relevant image features that provide a natural way to classify dimensionally reduced feature vectors of images. The Euclidean Distance (ED) is used as the similarity measurement between the query image and the target images. This FOCS approach takes a query from the user and retrieves the needed images from the databases. The retrieval algorithm performances are estimated in terms of precision and recall. This FOCS framework offers several benefits when compared to existing CBMIR. GLCM and TF are used to extract texture features and form a feature vector database. Fuzzy-PSO is used to reduce the feature vector dimensionality while selecting the important features in the feature vector database, through which computational complexity is decreased. Fuzzy based RVM is used for feature classification in which it increases the
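    The final similarity-measurement step can be sketched with NumPy: rank the stored feature vectors by Euclidean distance to the query vector and return the top-k (random vectors stand in for the GLCM/Tamura features; all names are illustrative):

```python
import numpy as np

def retrieve(query_vec, db_vecs, k=3):
    # Rank database feature vectors by Euclidean distance to the query
    # and return the indices and distances of the k nearest entries.
    d = np.linalg.norm(db_vecs - query_vec, axis=1)
    idx = np.argsort(d)[:k]
    return idx, d[idx]

rng = np.random.default_rng(42)
db = rng.normal(size=(100, 16))             # stand-in for reduced feature vectors
query = db[7] + 0.01 * rng.normal(size=16)  # a query very close to entry 7

idx, dist = retrieve(query, db, k=3)
```

    Precision and recall are then computed over such ranked lists against the known relevance labels of the database images.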

  20. Efficient magnetohydrodynamic simulations on distributed multi-GPU systems using a novel GPU Direct-MPI hybrid approach

    Science.gov (United States)

    Wong, Un-Hong; Aoki, Takayuki; Wong, Hon-Cheng

    2014-07-01

    Modern graphics processing units (GPUs) have been widely utilized in magnetohydrodynamic (MHD) simulations in recent years. Due to the limited memory of a single GPU, distributed multi-GPU systems need to be explored for large-scale MHD simulations. However, the data transfer between GPUs bottlenecks the efficiency of the simulations on such systems. In this paper we propose a novel GPU Direct-MPI hybrid approach to address this problem for overall performance enhancement. Our approach consists of two strategies: (1) We exploit GPU Direct 2.0 to speed up the data transfers between multiple GPUs in a single node and reduce the total number of message passing interface (MPI) communications; (2) We design Compute Unified Device Architecture (CUDA) kernels instead of using memory copy to speed up the fragmented data exchange in the three-dimensional (3D) decomposition. 3D decomposition is usually not preferable for distributed multi-GPU systems due to the low efficiency of the fragmented data exchange. Our approach has made a breakthrough in making 3D decomposition available on distributed multi-GPU systems. As a result, it can reduce the memory usage and computation time of each partition of the computational domain. Experimental results show twice the FLOPS compared to a common 2D-decomposition MPI-only implementation. The proposed approach has been developed into an efficient implementation for MHD simulations on distributed multi-GPU systems, called the MGPU-MHD code. The code realizes the GPU parallelization of a total variation diminishing (TVD) algorithm for solving the multidimensional ideal MHD equations, extending our work from single GPU computation (Wong et al., 2011) to multiple GPUs. Numerical tests and performance measurements are conducted on the TSUBAME 2.0 supercomputer at the Tokyo Institute of Technology. Our code achieves 2 TFLOPS in double precision for the problem with 1200³ grid points using 216 GPUs.
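    The appeal of 3D over 2D decomposition comes from surface-to-volume ratio: smaller subdomain faces mean fewer halo cells exchanged per GPU per step. A back-of-the-envelope sketch (64 GPUs chosen here so the 1200³ grid divides evenly; not the paper's code, and a single-cell ghost layer is assumed):

```python
def subdomain_shape(n, parts):
    # Per-GPU block of an n^3 grid under a (px, py, pz) decomposition
    # (even division assumed for simplicity).
    return tuple(n // p for p in parts)

def halo_cells(shape, width=1):
    # Cells exchanged per step: one ghost layer on each of the six faces.
    nx, ny, nz = shape
    return 2 * width * (nx * ny + ny * nz + nx * nz)

n = 1200                                              # 1200^3 grid as in the record
halo_3d = halo_cells(subdomain_shape(n, (4, 4, 4)))   # 64 GPUs, 3D blocks (300^3)
halo_2d = halo_cells(subdomain_shape(n, (1, 8, 8)))   # 64 GPUs, 2D pencils (1200x150x150)
# 3D blocks expose less surface per GPU, so less data crosses GPU boundaries;
# the catch is that their face exchanges are more fragmented, which is exactly
# what the proposed CUDA-kernel packing addresses.
```

    For these numbers the 3D blocks exchange 540,000 cells per GPU versus 765,000 for the 2D pencils, and the gap widens as the GPU count grows.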

  1. Minimally invasive treatment of the thoracic spine disease: completely percutaneous and hybrid approaches.

    Science.gov (United States)

    Tamburrelli, Francesco Ciro; Scaramuzzo, Laura; Genitiempo, Maurizio; Proietti, Luca

    2013-01-01

    The aim of the study was to evaluate the feasibility of a limited invasive approach for the treatment of upper thoracic spine disease. Seven patients with type-A thoracic fractures and three with tumors underwent long thoracic stabilization through a minimally invasive approach. Four patients underwent a completely percutaneous approach while the other three underwent a modified hybrid technique, a combination of percutaneous and open approaches. The hybrid constructs were realized using a percutaneous approach to the spine distally to the spinal lesion and an open approach proximally. In two patients, the stabilization was extended proximally up to the cervical spine. Clinical and radiographic assessment was performed during the first year after the operation at 3, 6, and 12 months. No technically related complications were seen. The postoperative recovery was rapid even in the tumor patients with neurologic impairment. Blood loss was irrelevant. At one-year follow-up there was no loosening or breakage of the screws or failure of the implants. When technically feasible, a completely percutaneous approach should be taken into consideration; otherwise, a combined open-percutaneous approach can be planned to minimize the invasiveness of a completely open approach to the thoracic spine.

  2. Hybrid approach for left-sided colonic carcinoma obstruction; a case report

    Directory of Open Access Journals (Sweden)

    Chinswangwatanakul Vitoon

    2011-04-01

    Full Text Available Abstract Traditionally, there are several approaches to manage left-sided colonic carcinoma obstruction, such as tumor resection with primary anastomosis, tumor resection with end-colostomy, and loop-colostomy. Recently, colonic stent insertion was introduced as a bridge prior to definitive surgery. We demonstrate a hybrid approach for obstructed sigmoid carcinoma using a colonic stent, followed by single incision laparoscopic colectomy (SILC). A 58 year-old man presented with complete left-sided colonic obstruction. He underwent emergency colonoscopy with metallic stent placement. One week later, SILC was performed. He recovered well after the operation without any postoperative complications. The pathological result showed adequacy of oncologic resection. This hybrid approach of colonic stent insertion and SILC can be safely performed.

  3. A proposed approach for developing next-generation computational electromagnetics software

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.K.; Kruger, R.P. [Los Alamos National Lab., NM (United States); Moraites, S. [Simulated Life Systems, Inc., Chambersburg, PA (United States)

    1993-02-01

    Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of modeling CEM software to date and pointing out the deficiencies thereof, we describe an approach for making CEM tools more truly "user friendly" called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

  4. A proposed approach for developing next-generation computational electromagnetics software

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.K.; Kruger, R.P. (Los Alamos National Lab., NM (United States)); Moraites, S. (Simulated Life Systems, Inc., Chambersburg, PA (United States))

    1993-01-01

    Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of CEM modeling software to date and pointing out its deficiencies, we describe an approach for making CEM tools more truly "user friendly" called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

  5. Efficient Adjoint Computation of Hybrid Systems of Differential Algebraic Equations with Applications in Power Systems

    Energy Technology Data Exchange (ETDEWEB)

    Abhyankar, Shrirang [Argonne National Lab. (ANL), Argonne, IL (United States); Anitescu, Mihai [Argonne National Lab. (ANL), Argonne, IL (United States); Constantinescu, Emil [Argonne National Lab. (ANL), Argonne, IL (United States); Zhang, Hong [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-03-31

    Sensitivity analysis is an important tool to describe power system dynamic behavior in response to parameter variations. It is a central component in preventive and corrective control applications. The existing approaches for sensitivity calculations, namely, finite-difference and forward sensitivity analysis, require a computational effort that increases linearly with the number of sensitivity parameters. In this work, we investigate, implement, and test a discrete adjoint sensitivity approach whose computational effort is effectively independent of the number of sensitivity parameters. The proposed approach is highly efficient for calculating trajectory sensitivities of larger systems and is consistent, within machine precision, with the function whose sensitivity we are seeking. This is an essential feature for use in optimization applications. Moreover, our approach includes a consistent treatment of systems with switching, such as DC exciters, by deriving and implementing the adjoint jump conditions that arise from state and time-dependent discontinuities. The accuracy and the computational efficiency of the proposed approach are demonstrated in comparison with the forward sensitivity analysis approach.
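The key property claimed above, adjoint cost essentially independent of the number of parameters, can be illustrated on a toy discrete linear system x_{k+1} = A(p) x_k with scalar objective J = c^T x_N. This is an illustrative reduction with names of our choosing, not the paper's power-system implementation:

```python
import numpy as np

# Toy discrete dynamics x_{k+1} = A x_k, where A depends on parameters p,
# with scalar objective J = c^T x_N.  dA_dp[i] holds the derivative of A
# with respect to the i-th parameter.  (Illustrative, not the paper's code.)

def forward_sensitivity(A, dA_dp, x0, c, N):
    """Gradient of J via forward sensitivities: one full sweep per parameter."""
    grads = []
    for dA in dA_dp:                       # cost grows linearly with n_params
        x, s = x0.copy(), np.zeros_like(x0)
        for _ in range(N):
            s = A @ s + dA @ x             # sensitivity recurrence (uses old x)
            x = A @ x
        grads.append(c @ s)
    return np.array(grads)

def adjoint_sensitivity(A, dA_dp, x0, c, N):
    """Same gradient via one forward state sweep plus one backward adjoint sweep."""
    xs = [x0]
    for _ in range(N):                     # store the trajectory
        xs.append(A @ xs[-1])
    lam = c.copy()                         # adjoint (costate) at the final step
    grads = np.zeros(len(dA_dp))
    for k in range(N - 1, -1, -1):         # single backward sweep
        for i, dA in enumerate(dA_dp):
            grads[i] += lam @ (dA @ xs[k])
        lam = A.T @ lam
    return grads
```

Both routines return the same gradient to machine precision, but the adjoint version performs one forward and one backward sweep regardless of how many parameters are differentiated.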

  6. Grouping Based Job Scheduling Algorithm Using Priority Queue and Hybrid Algorithm in Grid Computing

    Directory of Open Access Journals (Sweden)

    Pinky Rosemarry

    2013-01-01

    Full Text Available Grid computing is a computing platform in which a collection of heterogeneous computing resources, connected by a network across dynamically and geographically dispersed organizations, forms a distributed high-performance computing infrastructure. Grid computing solves complex, large-scale computational problems by distributing them amongst multiple machines. The main emphasis in grid computing is on resource management and the job scheduler, whose goal is to maximize resource utilization and minimize the processing time of jobs. Existing approaches to grid scheduling place little emphasis on the scheduler's processing-time performance: schedulers typically allocate resources to jobs using the First-Come-First-Serve algorithm. In this paper, we optimize the scheduler's queue using various scheduling methods such as Shortest Job First, First In First Out, and Round Robin. The job scheduling system is responsible for selecting the most suitable machines in a grid for user jobs. The management and scheduling system generates job schedules for each machine in the grid by taking static restrictions and dynamic parameters of jobs and machines into consideration. The main purpose of this paper is to develop an efficient job scheduling algorithm that maximizes resource utilization and minimizes the processing time of jobs. Queues can be optimized using various scheduling algorithms depending on the performance criteria to be improved, e.g. response time or throughput. The work has been done in MATLAB using the Parallel Computing Toolbox.
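The queue-ordering policies the abstract names (Shortest Job First vs. First-Come-First-Serve) trade off average waiting time differently; a minimal sketch of that comparison (illustrative only, not the paper's MATLAB grid scheduler):

```python
def fcfs_wait(bursts):
    """Average waiting time when jobs run in arrival order (FCFS/FIFO)."""
    wait, t = 0, 0
    for b in bursts:
        wait += t          # each job waits for everything queued before it
        t += b
    return wait / len(bursts)

def sjf_wait(bursts):
    """Average waiting time under non-preemptive Shortest-Job-First:
    identical accounting, but the queue is sorted by burst length."""
    return fcfs_wait(sorted(bursts))
```

For job bursts [8, 4, 1], FCFS yields an average wait of 20/3 ≈ 6.67 time units while SJF yields 2, the kind of processing-time gain the paper targets by reordering the scheduler's queue.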

  7. Resource-Efficient, Hierarchical Auto-Tuning of a Hybrid Lattice Boltzmann Computation on the Cray XT4

    OpenAIRE

    Williams, Samuel; Computational Research Division, Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720, USA; NERSC, Lawrence Berkeley National Laboratory; Computer Science Department, University of California, Irvine, CA

    2009-01-01

    We apply auto-tuning to a hybrid MPI-pthreads lattice Boltzmann computation running on the Cray XT4 at National Energy Research Scientific Computing Center (NERSC). Previous work showed that multicore-specific auto-tuning can improve the performance of lattice Boltzmann magnetohydrodynamics (LBMHD) by a factor of 4x when running on dual- and quad-core Opteron dual-socket SMPs. We extend these studies to the distributed memory arena via a hybrid MPI/pthreads implementation. In addition to con...

  8. New MPPT algorithm for PV applications based on hybrid dynamical approach

    KAUST Repository

    Elmetennani, S.

    2016-10-24

    This paper proposes a new Maximum Power Point Tracking (MPPT) algorithm for photovoltaic applications using the multicellular converter as a stage of power adaptation. The proposed MPPT technique has been designed using a hybrid dynamical approach to model the photovoltaic generator. The hybrid dynamical theory has been applied taking advantage of the particular topology of the multicellular converter. Then, a hybrid automaton has been established to optimize power production. The maximization of the produced solar energy is achieved by switching between the different operating modes of the hybrid automaton, governed by invariance and transition conditions. These conditions have been validated by simulation tests under different conditions of temperature and irradiance. Moreover, the performance of the proposed algorithm has then been evaluated, both numerically and by experimental tests, in comparison with standard MPPT techniques under varying external working conditions. The results show the attractive features of the hybrid MPPT technique in terms of performance and simplicity for real-time implementation.
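For context, the standard baseline that hybrid MPPT schemes are compared against is perturb-and-observe hill climbing. A minimal sketch follows; the hybrid-automaton controller itself depends on the multicellular converter topology and is not reproduced here, and all names and constants are illustrative:

```python
def perturb_and_observe(measure_pv, v0, dv=0.1, steps=100):
    """Classic P&O MPPT: perturb the operating voltage, keep the direction
    while power rises, reverse it when power falls."""
    v, direction = v0, 1.0
    p_prev = measure_pv(v)
    for _ in range(steps):
        v += direction * dv
        p = measure_pv(v)
        if p < p_prev:            # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v                      # ends up oscillating around the MPP
```

On a concave power curve the tracker converges to within one perturbation step of the maximum power point, then oscillates around it, which is the steady-state ripple that more elaborate (e.g. hybrid dynamical) schemes aim to reduce.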

  9. Modelling biochemical networks with intrinsic time delays: a hybrid semi-parametric approach

    Directory of Open Access Journals (Sweden)

    Oliveira Rui

    2010-09-01

    Full Text Available Abstract Background This paper presents a method for modelling dynamical biochemical networks with intrinsic time delays. Since the fundamental mechanisms leading to such delays are often unknown, nonconventional modelling approaches become necessary. Herein, a hybrid semi-parametric identification methodology is proposed in which discrete time series are incorporated into fundamental material balance models. This integration results in hybrid delay differential equations which can be applied to identify unknown cellular dynamics. Results The proposed hybrid modelling methodology was evaluated using two case studies. The first deals with dynamic modelling of transcriptional factor A in mammalian cells. The protein transport from the cytosol to the nucleus introduces a delay that was accounted for by a discrete time series formulation. The second case study focused on a simple network with distributed time delays, demonstrating that the discrete time delay formalism has broad applicability to both discrete and distributed delay problems. Conclusions Significantly better prediction quality was obtained with the novel hybrid model than with dynamical structures without time delays, the improvement being more distinctive the more significant the underlying system delay is. The proposed structure also enabled identification of the system delays through studies with different discrete modelling delays. Further, it was shown that the hybrid discrete delay methodology is not limited to discrete delay systems. The proposed method is a powerful tool for identifying time delays in ill-defined biochemical networks.
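The delay differential equations referred to above couple the current state with a discretely delayed one; the core mechanics, reading the delayed state from a stored history buffer, can be sketched with a fixed-step Euler integrator (illustrative, not the paper's semi-parametric identification code):

```python
import numpy as np

def integrate_dde(f, history, tau, dt, t_end):
    """Fixed-step Euler for dx/dt = f(x(t), x(t - tau)).
    `history` must supply at least round(tau/dt) + 1 past samples,
    ending with the state at t = 0."""
    lag = int(round(tau / dt))
    xs = list(history)
    n = int(round(t_end / dt))
    for _ in range(n):
        x, x_del = xs[-1], xs[-1 - lag]   # delayed state read from the buffer
        xs.append(x + dt * f(x, x_del))
    return np.array(xs)
```

As a sanity check, for x'(t) = -x(t - 1) with constant unit history, the method of steps gives x(2) = -0.5 exactly, which the sketch reproduces to O(dt).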

  10. A hybrid approach to device integration on a genetic analysis platform

    Science.gov (United States)

    Brennan, Des; Jary, Dorothee; Kurg, Ants; Berik, Evgeny; Justice, John; Aherne, Margaret; Macek, Milan; Galvin, Paul

    2012-10-01

    Point-of-care (POC) systems require significant component integration to implement biochemical protocols associated with molecular diagnostic assays. Hybrid platforms where discrete components are combined in a single platform are a suitable approach to integration, where combining multiple device fabrication steps on a single substrate is not possible due to incompatible or costly fabrication steps. We integrate three devices each with a specific system functionality: (i) a silicon electro-wetting-on-dielectric (EWOD) device to move and mix sample and reagent droplets in an oil phase, (ii) a polymer microfluidic chip containing channels and reservoirs and (iii) an aqueous phase glass microarray for fluorescence microarray hybridization detection. The EWOD device offers the possibility of fully integrating on-chip sample preparation using nanolitre sample and reagent volumes. A key challenge is sample transfer from the oil phase EWOD device to the aqueous phase microarray for hybridization detection. The EWOD device, waveguide performance and functionality are maintained during the integration process. An on-chip biochemical protocol for arrayed primer extension (APEX) was implemented for single nucleotide polymorphism (SNiP) analysis. The prepared sample is aspirated from the EWOD oil phase to the aqueous phase microarray for hybridization. A bench-top instrumentation system was also developed around the integrated platform to drive the EWOD electrodes, implement APEX sample heating and image the microarray after hybridization.

  11. An improved yeast two-hybrid approach for detection of interacting proteins

    Institute of Scientific and Technical Information of China (English)

    Wan Bingbing; Shi Yan; Huo Keke

    2006-01-01

    The yeast two-hybrid approach is widely used as an important technique for studying protein-protein interactions. Although the yeast two-hybrid system offers clear advantages for identifying interacting proteins and building protein-interaction networks, not all proteins are amenable to the routine yeast two-hybrid method. For many important proteins, such as nucleoprotein transcription factors, the regular method of constructing a bait-BD vector to screen a library of AD vectors usually fails, because the bait contains an activation domain and can self-activate the reporter gene. In this study, we changed the strategy and fused the bait gene (FOXA3) with the AD vector to screen a library of BD vectors; in this way we constructed a two-hybrid library containing BD vectors that bypasses the interference of self-activation. Using this two-hybrid library to screen FOXA3, a hepatocyte nuclear factor, we identified an interacting protein: complement component C3.

  12. A bottom-up approach for the synthesis of highly ordered fullerene-intercalated graphene hybrids

    Directory of Open Access Journals (Sweden)

    Dimitrios Gournis

    2015-02-01

    Full Text Available Much of the research effort on graphene focuses on its use as a building block for the development of new hybrid nanostructures with well-defined dimensions and properties suitable for applications such as gas storage, heterogeneous catalysis, gas/liquid separations, nanosensing and biomedicine. Towards this aim, here we describe a new bottom-up approach, which combines self-assembly with the Langmuir-Schaefer deposition technique, to synthesize graphene-based layered hybrid materials hosting fullerene molecules within the interlayer space. Our film preparation consists of a bottom-up layer-by-layer process that proceeds via the formation of a hybrid organo-graphene oxide Langmuir film. The structure and composition of these hybrid fullerene-containing thin multilayers deposited on hydrophobic substrates were characterized by a combination of X-ray diffraction, Raman and X-ray photoelectron spectroscopies, atomic force microscopy and conductivity measurements. The latter revealed that the presence of C60 within the interlayer spacing leads to an increase in electrical conductivity of the hybrid material as compared to the organo-graphene matrix alone.

  13. Parallel Computing Characteristics of CUPID code under MPI and Hybrid environment

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Ryong; Yoon, Han Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeon, Byoung Jin; Choi, Hyoung Gwon [Seoul National Univ. of Science and Technology, Seoul (Korea, Republic of)

    2014-05-15

    In this paper, the characteristics of a parallel algorithm for solving an elliptic-type equation of CUPID via a domain decomposition method using MPI are presented, and the parallel performance is estimated in terms of scalability, i.e., the speedup ratio. In addition, the time-consuming pattern of the major subroutines is studied. Two different grid systems are taken into account: 40,000 meshes for the coarse system and 320,000 meshes for the fine system. Since the matrix of the CUPID code differs according to whether the flow is single-phase or two-phase, the effect of matrix shape is evaluated, and the effect of the preconditioner for the matrix solver is also investigated. Finally, the hybrid (OpenMP+MPI) parallel algorithm is introduced and discussed in detail for the pressure solver. The component-scale thermal-hydraulics code CUPID has been developed for two-phase flow analysis; it adopts a three-dimensional, transient, three-field model and has been parallelized to fulfill a recent demand for long-transient and highly resolved multi-phase flow behavior. In this study, the parallel performance of the CUPID code was investigated in terms of scalability. The CUPID code was parallelized with a domain decomposition method. The MPI library was adopted to communicate the information at the neighboring domain boundaries. For managing the sparse matrix effectively, the CSR storage format is used. To take into account the characteristics of the pressure matrix, which becomes asymmetric for two-phase flow, both single-phase and two-phase calculations were run. In addition, the effect of the matrix size and preconditioning was also investigated. The fine-mesh calculation shows better scalability than the coarse mesh because the small number of coarse-mesh cells does not allow the computational domain to be decomposed extensively. The fine mesh can present good scalability when the geometry is divided with the ratio between computation and communication time taken into account. For a given mesh, single-phase flow
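The CSR (compressed sparse row) format mentioned above stores only the nonzeros plus per-row pointers, which is what keeps the sparse matrix-vector products in such solvers cheap. A minimal sketch (illustrative, not CUPID's Fortran/MPI implementation):

```python
import numpy as np

def to_csr(dense):
    """Compress a dense matrix into CSR arrays: values, column indices,
    and row pointers delimiting each row's slice of the value array."""
    vals, cols, rowptr = [], [], [0]
    for row in dense:
        for j, v in enumerate(row):
            if v != 0:
                vals.append(v)
                cols.append(j)
        rowptr.append(len(vals))
    return np.array(vals), np.array(cols), np.array(rowptr)

def csr_matvec(vals, cols, rowptr, x):
    """y = A @ x using the CSR arrays; only nonzeros are touched."""
    y = np.zeros(len(rowptr) - 1)
    for i in range(len(y)):
        for k in range(rowptr[i], rowptr[i + 1]):
            y[i] += vals[k] * x[cols[k]]
    return y
```

In a domain-decomposed solver each rank holds the CSR slice for its own rows and exchanges only the halo entries of x with neighboring domains, which is the communication pattern whose cost the abstract weighs against computation.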

  14. Near-term hybrid vehicle program, phase 1. Appendix B: Design trade-off studies report. Volume 3: Computer program listings

    Science.gov (United States)

    1979-01-01

    A description and listing is presented of two computer programs: Hybrid Vehicle Design Program (HYVELD) and Hybrid Vehicle Simulation Program (HYVEC). Both of the programs are modifications and extensions of similar programs developed as part of the Electric and Hybrid Vehicle System Research and Development Project.

  15. Computational Approach for Multi Performances Optimization of EDM

    Directory of Open Access Journals (Sweden)

    Yusoff Yusliza

    2016-01-01

    Full Text Available This paper proposes a new computational approach for obtaining the optimal parameters of multi-performance EDM. Regression and an artificial neural network (ANN) are used as the modeling techniques, while a multi-objective genetic algorithm (multiGA) is used as the optimization technique. The orthogonal array L256 is implemented in the procedure of network function and network architecture selection. Experimental studies are carried out to verify the machining performances suggested by this approach. The highest MRR value obtained from OrthoANN – MPR – MultiGA is 205.619 mg/min and the lowest Ra value is 0.0223 μm.
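The GA ingredient of such a pipeline can be sketched in simplified single-objective, real-coded form (tournament selection, blend crossover, Gaussian mutation); this is a didactic reduction of a multi-objective GA, and all names and constants are ours, not the paper's:

```python
import random

def simple_ga(fitness, bounds, pop=30, gens=60, seed=1):
    """Minimal real-coded GA over a scalar parameter:
    2-way tournament selection, blend (average) crossover, Gaussian mutation."""
    rng = random.Random(seed)
    lo, hi = bounds
    P = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        def pick():                         # tournament of two
            a, b = rng.sample(P, 2)
            return a if fitness(a) > fitness(b) else b
        children = []
        for _ in range(pop):
            c = 0.5 * (pick() + pick()) + rng.gauss(0, 0.1 * (hi - lo))
            children.append(min(hi, max(lo, c)))  # clip to the search bounds
        P = children
    return max(P, key=fitness)
```

A multi-objective variant such as multiGA replaces the scalar fitness comparison with Pareto dominance ranking, but the selection/crossover/mutation loop is the same skeleton.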

  16. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network

    Directory of Open Access Journals (Sweden)

    Lukas Falat

    2016-01-01

    Full Text Available This paper deals with the application of quantitative soft computing prediction models to the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine its ability to forecast exchange rate values for a horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model against autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help eliminate the risk of making bad decisions in the decision-making process.
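The hybrid idea above, an RBF network whose output is corrected using a moving average of its own error part, can be sketched as follows. Centers are fixed and the output weights are fit by least squares; the genetic-algorithm tuning is omitted, and all names are illustrative rather than the authors' implementation:

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian radial-basis design matrix for 1-D inputs."""
    d2 = (X[:, None] - centers[None, :]) ** 2
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(X, y, centers, width):
    """Least-squares output weights for a fixed-centre RBF network."""
    Phi = rbf_design(X, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict_hybrid(X, y, centers, width, window=3):
    """RBF prediction plus a moving average of the residuals (the 'error part')."""
    w = fit_rbf(X, y, centers, width)
    yhat = rbf_design(X, centers, width) @ w
    resid = y - yhat
    corr = np.convolve(resid, np.ones(window) / window, mode="same")
    return yhat + corr                     # additive error correction
```

When the network's residual varies smoothly, the moving-average correction absorbs most of it, so the hybrid output tracks the target more closely than the plain RBF fit.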

  17. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network.

    Science.gov (United States)

    Falat, Lukas; Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models to the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine its ability to forecast exchange rate values for a horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model against autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help eliminate the risk of making bad decisions in the decision-making process.

  18. Requirements for Control Room Computer-Based Procedures for use in Hybrid Control Rooms

    Energy Technology Data Exchange (ETDEWEB)

    Le Blanc, Katya Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States); Oxstrand, Johanna Helene [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    Many plants in the U.S. are currently undergoing control room modernization. The main drivers for modernization are the aging and obsolescence of existing equipment, which typically results in a like-for-like replacement of analogue equipment with digital systems. However, the modernization efforts present an opportunity to employ advanced technology that would not only extend the life, but enhance the efficiency and cost competitiveness of nuclear power. Computer-based procedures (CBPs) are one example of near-term advanced technology that may provide enhanced efficiencies above and beyond like for like replacements of analog systems. Researchers in the LWRS program are investigating the benefits of advanced technologies such as CBPs, with the goal of assisting utilities in decision making during modernization projects. This report will describe the existing research on CBPs, discuss the unique issues related to using CBPs in hybrid control rooms (i.e., partially modernized analog control rooms), and define the requirements of CBPs for hybrid control rooms.

  19. Optimization of a Continuous Hybrid Impeller Mixer via Computational Fluid Dynamics

    Directory of Open Access Journals (Sweden)

    N. Othman

    2014-01-01

    Full Text Available This paper presents the preliminary steps required for conducting experiments to obtain the optimal operating conditions of a hybrid impeller mixer and to determine the residence time distribution (RTD using computational fluid dynamics (CFD. In this paper, impeller speed and clearance parameters are examined. The hybrid impeller mixer consists of a single Rushton turbine mounted above a single pitched blade turbine (PBT. Four impeller speeds, 50, 100, 150, and 200 rpm, and four impeller clearances, 25, 50, 75, and 100 mm, were the operation variables used in this study. CFD was utilized to initially screen the parameter ranges to reduce the number of actual experiments needed. Afterward, the residence time distribution (RTD was determined using the respective parameters. Finally, the Fluent-predicted RTD and the experimentally measured RTD were compared. The CFD investigations revealed that an impeller speed of 50 rpm and an impeller clearance of 25 mm were not viable for experimental investigations and were thus eliminated from further analyses. The determination of RTD using a k-ε turbulence model was performed using CFD techniques. The multiple reference frame (MRF was implemented and a steady state was initially achieved followed by a transient condition for RTD determination.

  20. SWNT-DNA and SWNT-polyC hybrids: AFM study and computer modeling.

    Science.gov (United States)

    Karachevtsev, M V; Lytvyn, O S; Stepanian, S G; Leontiev, V S; Adamowicz, L; Karachevtsev, V A

    2008-03-01

    Hybrids of carbon single-walled nanotubes (SWNT) with fragmented single- or double-stranded DNA (fss- or fds-DNA) or polyC were studied by Atomic Force Microscopy (AFM) and computer modeling. It was found that fragments of the polymer wrap in several layers around the nanotube, forming a strand-like spindle. In contrast to the fss-DNA, the fds-DNA also forms compact structures near the tube surface due to the formation of self-assembled structures consisting of a few DNA fragments. The hybrids of SWNT with wrapped single, double or triple strands of the biopolymer were simulated, and it was shown that such structures are stable. To explain the multi-layer polymeric coating of the nanotube surface, the energy of the intermolecular interactions between different components of polyC was calculated at the MP2/6-31++G** level, as was the interaction energy in the SWNT-cytosine complex.

  1. Feasibility of a Hybrid Brain-Computer Interface for Advanced Functional Electrical Therapy

    Directory of Open Access Journals (Sweden)

    Andrej M. Savić

    2014-01-01

    Full Text Available We present a feasibility study of a novel hybrid brain-computer interface (BCI system for advanced functional electrical therapy (FET of grasp. FET procedure is improved with both automated stimulation pattern selection and stimulation triggering. The proposed hybrid BCI comprises the two BCI control signals: steady-state visual evoked potentials (SSVEP and event-related desynchronization (ERD. The sequence of the two stages, SSVEP-BCI and ERD-BCI, runs in a closed-loop architecture. The first stage, SSVEP-BCI, acts as a selector of electrical stimulation pattern that corresponds to one of the three basic types of grasp: palmar, lateral, or precision. In the second stage, ERD-BCI operates as a brain switch which activates the stimulation pattern selected in the previous stage. The system was tested in 6 healthy subjects who were all able to control the device with accuracy in a range of 0.64–0.96. The results provided the reference data needed for the planned clinical study. This novel BCI may promote further restoration of the impaired motor function by closing the loop between the “will to move” and contingent temporally synchronized sensory feedback.
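The SSVEP stage of such a BCI commonly reduces to picking the stimulation frequency with the strongest spectral power in the EEG. A minimal detector sketch follows; it is illustrative only, since the abstract does not specify the paper's actual SSVEP/ERD signal processing:

```python
import numpy as np

def detect_ssvep(signal, fs, candidate_freqs):
    """Return the candidate stimulus frequency with the largest power
    in the signal's spectrum (a common, simple SSVEP classifier)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    # power at the FFT bin nearest each candidate flicker frequency
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(powers))]
```

With three flicker frequencies mapped to the three grasp types (palmar, lateral, precision), the detected frequency would index the stimulation pattern, as in the first stage described above.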

  2. Hybrid approach in a difficult case of pseudoaneurysm of right common carotid artery.

    Science.gov (United States)

    Kumar, Dilip; Chakraborty, Saujatya; Banerjee, Sunip

    2015-12-01

    We present the case of a 65-year-old gentleman, who presented with a symptomatic pseudoaneurysm of the right common carotid artery. Because of high surgical risk, endovascular approach was decided upon. However, taking hardware across the lesion via the aortic arch provided us with insurmountable difficulties. Therefore, a hybrid approach was resorted to, in which an arteriotomy was done in the carotid artery followed by direct implantation of the stent. We were thus able to create a favorable trade-off between the high surgical risk of a full surgical procedure and the peri-operative benefit of an endovascular approach.

  3. Hybrid phase retrieval approach for reconstruction of in-line digital holograms without twin image

    Science.gov (United States)

    Zhao, Jie; Wang, Dayong; Zhang, Fucai; Wang, Yunxin

    2011-09-01

    A hybrid phase retrieval approach is proposed to address the twin-image problem in the reconstruction of in-line digital holograms. The approach is a variant of the iterative transform algorithm and exploits two natural constraints of a sample, namely finite transmission and finite support. Here, the initial estimate of the sample support is refined by applying the finite transmission constraint with phase flipping. The approach provides better reconstruction than using the finite transmission constraint alone and improves the convergence rate of Fienup's algorithm owing to a better estimate of the support, especially for strong samples with complex structures. Both simulation and experimental results are presented.

  4. Computer Mechatronics: A Radical Approach to Mechatronics Education

    OpenAIRE

    Nilsson, Martin

    2005-01-01

    This paper describes some distinguishing features of a course on mechatronics, based on computer science. We propose a teaching approach called Controlled Problem-Based Learning (CPBL). We have applied this method on three generations (2003-2005) of mainly fourth-year undergraduate students at Lund University (LTH). Although students found the course difficult, there were no dropouts, and all students attended the examination 2005.

  5. COMPTEL skymapping: a new approach using parallel computing

    OpenAIRE

    Strong, A.W.; Bloemen, H.; Diehl, R.; Hermsen, W.; Schoenfelder, V.

    1998-01-01

    Large-scale skymapping with COMPTEL using the full survey database presents challenging problems on account of the complex response and time-variable background. A new approach which attempts to address some of these problems is described, in which the information about each observation is preserved throughout the analysis. In this method, a maximum-entropy algorithm is used to determine image and background simultaneously. Because of the extreme computing requirements, the method has been im...

  6. Review: the physiological and computational approaches for atherosclerosis treatment.

    Science.gov (United States)

    Wang, Wuchen; Lee, Yugyung; Lee, Chi H

    2013-09-01

    Cardiovascular disease has long caused severe losses in the population, especially conditions associated with arterial malfunction, which are attributable to atherosclerosis and subsequent thrombotic formation. This article reviews the physiological mechanisms that underlie the transition from plaque formation in the atherosclerotic process to platelet aggregation and eventually thrombosis. The physiological and computational approaches to detect, evaluate and mitigate this malicious progression, such as percutaneous coronary intervention and stent design modeling, are also discussed.

  7. A spline-based approach for computing spatial impulse responses.

    Science.gov (United States)

    Ellis, Michael A; Guenther, Drake; Walker, William F

    2007-05-01

    Computer simulations are an essential tool for the design of phased-array ultrasonic imaging systems. FIELD II, which determines the two-way temporal response of a transducer at a point in space, is the current de facto standard for ultrasound simulation tools. However, the need often arises to obtain two-way spatial responses at a single point in time, a set of dimensions for which FIELD II is not well optimized. This paper describes an analytical approach for computing the two-way, far-field, spatial impulse response from rectangular transducer elements under arbitrary excitation. The described approach determines the response as the sum of polynomial functions, making computational implementation quite straightforward. The proposed algorithm, named DELFI, was implemented as a C routine under Matlab and results were compared to those obtained under similar conditions from the well-established FIELD II program. Under the specific conditions tested here, the proposed algorithm was approximately 142 times faster than FIELD II for computing spatial sensitivity functions with similar amounts of error. For temporal sensitivity functions with similar amounts of error, the proposed algorithm was about 1.7 times slower than FIELD II using rectangular elements and 19.2 times faster than FIELD II using triangular elements. DELFI is shown to be an attractive complement to FIELD II, especially when spatial responses are needed at a specific point in time.

  8. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This dilemma has a solution: technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  9. Single-Board-Computer-Based Traffic Generator for a Heterogeneous and Hybrid Smart Grid Communication Network

    Directory of Open Access Journals (Sweden)

    Do Nguyet Quang

    2014-02-01

Full Text Available In smart grid communication implementation, the network traffic pattern is one of the main factors affecting the system's performance. Examining different traffic patterns in the smart grid is therefore crucial when analyzing network performance. Due to the heterogeneous and hybrid nature of the smart grid, the type of traffic distribution in the network is still unknown, and the traffic models popularly used for simulation and analysis no longer reflect the real traffic in a multi-technology, bi-directional communication system. Hence, in this study, a single-board computer is implemented as a traffic generator that can generate network traffic similar to that produced by various applications in the fully operational smart grid. Placed at strategic and appropriate positions, a collection of such traffic generators allows network administrators to investigate and test the effect of heavy traffic on the performance of the smart grid communication system.
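As a hedged illustration of the kind of traffic generator described above, the sketch below schedules UDP datagrams with exponential inter-arrival times (a Poisson-process profile). The destination, rate and payload are hypothetical; a real smart-grid generator would reproduce application-specific traffic traces rather than a single analytic distribution.

```python
import random
import socket
import time

def poisson_times(n_packets, rate_hz, rng):
    """Cumulative send times with exponential inter-arrivals (Poisson process)."""
    t, out = 0.0, []
    for _ in range(n_packets):
        t += rng.expovariate(rate_hz)
        out.append(t)
    return out

def generate_traffic(dest, n_packets=20, rate_hz=200.0, payload=b"smartgrid"):
    """Send UDP datagrams following the schedule, a stand-in for one
    smart-grid application's traffic profile on a single-board computer."""
    rng = random.Random(0)
    times = poisson_times(n_packets, rate_hz, rng)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    start = time.monotonic()
    for t in times:
        time.sleep(max(0.0, start + t - time.monotonic()))
        sock.sendto(payload, dest)
    sock.close()
    return len(times)
```

The schedule is generated separately from the send loop so the statistical profile can be checked without opening sockets.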

  10. Microwave-irradiation-assisted hybrid chemical approach for titanium dioxide nanoparticle synthesis: microbial and cytotoxicological evaluation.

    Science.gov (United States)

    Ranjan, Shivendu; Dasgupta, Nandita; Rajendran, Bhavapriya; Avadhani, Ganesh S; Ramalingam, Chidambaram; Kumar, Ashutosh

    2016-06-01

Titanium dioxide nanoparticles (TNPs) are widely used in the pharmaceutical and cosmetics industries, where they provide protection against UV exposure due to their light-scattering properties and high refractive index. Although TNPs are increasingly used, their synthesis is tedious and time consuming; therefore, in the present study a microwave-assisted hybrid chemical approach was used for TNP synthesis. We demonstrate that TNPs can be synthesized in only 2.5 h, whereas the commonly used chemical approach using a muffle furnace takes 5 h. Because the activity of TNPs depends on the synthetic protocol, the present study also determined the effect of the microwave-assisted hybrid chemical synthesis protocol on antimicrobial activity and cytotoxicity. The results showed that TNPs have antibacterial activity in decreasing order against Escherichia coli, Bacillus subtilis, and Staphylococcus aureus. The IC50 values of TNPs for HCT116 and A549 cells were found to be 6.43 and 6.04 ppm, respectively. Cell death was also confirmed by trypan blue exclusion assay, and loss of membrane integrity was observed. The study therefore establishes that the microwave-assisted hybrid chemical approach is time-saving; hence, the technique can be scaled up from the laboratory to industrial scale via a pilot plant. Further work is needed on the mechanism of action at the molecular level to establish the reason for the observed bacterial and cytotoxicological toxicity. Graphical abstract: A graphical representation of TNP synthesis.

  11. Urgent hybrid approach in treatment of the acute myocardial infarction complicated by the ventricular septal rupture

    Directory of Open Access Journals (Sweden)

    Radosavljević-Radovanović Mina

    2014-01-01

Full Text Available Introduction. Ventricular septal rupture (VSR) in acute myocardial infarction (AMI) is a rare but very serious complication, still associated with high mortality despite significant improvements in pharmacological and surgical treatment. Therefore, hybrid approaches have been introduced as new therapeutic options. Case Outline. We present an urgent hybrid approach, consisting of initial percutaneous coronary intervention (PCI) of the infarct-related artery followed by immediate surgical closure of the ventricular septal rupture, for the treatment of a high-risk, hemodynamically unstable female patient with AMI caused by one-vessel disease and complicated by VSR and cardiogenic shock. Since the operative risk was also very high (EUROSCORE II 37%), this therapeutic decision was based on the assumption that preoperative PCI could promptly establish blood flow and thereby lessen the risks, duration and complexity of the urgent cardiosurgical intervention, performed on the same day. This approach proved successful, and the patient was discharged from the hospital on the fifteenth postoperative day in stable condition. Conclusion. In selected cases with high operative risk and an unstable hemodynamic state due to AMI complicated by VSR, an urgent hybrid approach consisting of initial PCI followed by surgical closure of the VSR may represent an acceptable treatment option and contribute to the treatment of this complex group of patients.

  12. An Efficient Framework for EEG Analysis with Application to Hybrid Brain Computer Interfaces Based on Motor Imagery and P300

    Directory of Open Access Journals (Sweden)

    Jinyi Long

    2017-01-01

Full Text Available The hybrid brain computer interface (BCI) based on motor imagery (MI) and P300 has been a preferred strategy aiming to improve detection performance by combining the features of each. However, current methods for combining these two modalities optimize them separately, which does not yield optimal performance. Here, we present an efficient framework to optimize them together by concatenating the features of MI and P300 in a block diagonal form. A linear classifier under a dual spectral norm regularizer is then applied to the combined features. Under this framework, the hybrid features of MI and P300 can be learned, selected, and combined together directly. Experimental results on a data set of hybrid BCI based on MI and P300 illustrate the competitive performance of the proposed method against other conventional methods. This provides evidence that the method contributes to the discrimination performance of the brain state in hybrid BCI.
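A minimal numpy sketch of the block-diagonal feature arrangement described above. The dimensions and the toy classifier weights are hypothetical, and the paper's dual-spectral-norm training is not reproduced here:

```python
import numpy as np

def block_diagonal_features(mi_feat, p300_feat):
    """Arrange the MI and P300 feature vectors of one trial in a block
    diagonal matrix, so a single (matrix-variate) linear classifier can
    weight both modalities jointly."""
    m, p = len(mi_feat), len(p300_feat)
    F = np.zeros((2, m + p))
    F[0, :m] = mi_feat      # first block row: motor-imagery features
    F[1, m:] = p300_feat    # second block row: P300 features
    return F

def linear_score(F, W, b=0.0):
    """Linear decision value <W, F> + b (Frobenius inner product)."""
    return float(np.sum(W * F) + b)

# Hypothetical per-trial feature vectors
mi = np.array([0.2, -0.5, 1.0])
p3 = np.array([0.7, 0.1])
F = block_diagonal_features(mi, p3)
```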

  13. An Efficient Framework for EEG Analysis with Application to Hybrid Brain Computer Interfaces Based on Motor Imagery and P300

    Science.gov (United States)

    Wang, Jue; Yu, Tianyou

    2017-01-01

The hybrid brain computer interface (BCI) based on motor imagery (MI) and P300 has been a preferred strategy aiming to improve detection performance by combining the features of each. However, current methods for combining these two modalities optimize them separately, which does not yield optimal performance. Here, we present an efficient framework to optimize them together by concatenating the features of MI and P300 in a block diagonal form. A linear classifier under a dual spectral norm regularizer is then applied to the combined features. Under this framework, the hybrid features of MI and P300 can be learned, selected, and combined together directly. Experimental results on a data set of hybrid BCI based on MI and P300 illustrate the competitive performance of the proposed method against other conventional methods. This provides evidence that the method contributes to the discrimination performance of the brain state in hybrid BCI. PMID:28316617

  14. A hybrid method for the computation of quasi-3D seismograms.

    Science.gov (United States)

    Masson, Yder; Romanowicz, Barbara

    2013-04-01

The development of powerful computer clusters and efficient numerical methods, such as the Spectral Element Method (SEM), has made possible the computation of seismic wave propagation in a heterogeneous 3D earth. However, the cost of these computations is still problematic for global-scale tomography, which requires hundreds of such simulations. Part of the ongoing research effort is dedicated to the development of faster modeling methods based on the spectral element method. Capdeville et al. (2002) proposed to couple SEM simulations with normal mode calculations (C-SEM). Nissen-Meyer et al. (2007) used 2D SEM simulations to compute 3D seismograms in a 1D earth model. Thanks to these developments, Lekic et al. (2011) developed, for the first time, a 3D global model of the upper mantle using SEM simulations. At the local and continental scale, adjoint tomography, which uses many SEM simulations, can be implemented on current computers (Tape, Liu et al. 2009). Due to their smaller size, these models offer higher resolution and provide us with images of the crust and the upper part of the mantle. In an attempt to extend such local adjoint tomographic inversions into the deep earth, we are developing a hybrid method in which SEM computations are limited to a region of interest within the earth. That region can have an arbitrary shape and size. Outside this region, the seismic wavefield is extrapolated to obtain synthetic data at the Earth's surface. A key feature of the method is the use of a time-reversal mirror to inject the wavefield induced by a distant seismic source into the region of interest (Robertsson and Chapman 2000). We compute synthetic seismograms as follows: inside the region of interest, we use the regional spectral element software RegSEM to compute wave propagation in 3D; outside this region, the wavefield is extrapolated to the surface by convolution with the Green's functions from the mirror to the seismic stations. For now, these
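The extrapolation step, convolving the wavefield recorded on the mirror with Green's functions to the stations, can be sketched as a discrete sum of convolutions. The trace data, Green's functions and the scalar treatment here are illustrative only:

```python
import numpy as np

def extrapolate_to_station(mirror_traces, greens, dt):
    """Extrapolate the wavefield recorded on the mirror surrounding the
    regional (SEM) box to a distant station: sum, over mirror points, of the
    convolution of each recorded trace with the Green's function from that
    point to the station (a scalar representation-theorem sketch)."""
    n = len(mirror_traces[0]) + len(greens[0]) - 1
    out = np.zeros(n)
    for trace, g in zip(mirror_traces, greens):
        out += np.convolve(trace, g) * dt   # dt approximates the time integral
    return out
```

Convolving a discrete delta with a Green's function returns the Green's function itself, which is a quick sanity check on the implementation.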

  15. Demonstration of a semi-autonomous hybrid brain-machine interface using human intracranial EEG, eye tracking, and computer vision to control a robotic upper limb prosthetic.

    Science.gov (United States)

    McMullen, David P; Hotson, Guy; Katyal, Kapil D; Wester, Brock A; Fifer, Matthew S; McGee, Timothy G; Harris, Andrew; Johannes, Matthew S; Vogelstein, R Jacob; Ravitz, Alan D; Anderson, William S; Thakor, Nitish V; Crone, Nathan E

    2014-07-01

To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies; system improvements were implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs.

  16. A computational language approach to modeling prose recall in schizophrenia.

    Science.gov (United States)

    Rosenstein, Mark; Diaz-Asper, Catherine; Foltz, Peter W; Elvevåg, Brita

    2014-06-01

Many cortical disorders are associated with memory problems. In schizophrenia, verbal memory deficits are a hallmark feature. However, the exact nature of this deficit remains elusive. Modeling aspects of the language features used in memory recall has the potential to provide a means for measuring these verbal processes. We employ computational language approaches to assess time-varying semantic and sequential properties of prose recall at various retrieval intervals (immediate, 30 min and 24 h later) in patients with schizophrenia, unaffected siblings and healthy unrelated control participants. First, we model the recall data to quantify the degradation of performance with increasing retrieval interval and the effect of diagnosis (i.e., group membership) on performance. Next, we model the human scoring of recall performance using an n-gram language sequence technique, and then with a semantic feature based on Latent Semantic Analysis. These models show that automated analyses of the recalls can produce scores that accurately mimic human scoring. The final analysis addresses the validity of this approach by ascertaining the ability to predict group membership from models built on the two classes of language features. Taken individually, the semantic feature is most predictive, while a model combining the features improves the accuracy of group membership prediction slightly above the semantic feature alone, as well as over the human rating approach. We discuss the implications of such a computational approach for cognitive neuroscience in exploring the mechanisms of prose recall.
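A toy sketch of the two classes of features: a bag-of-words cosine similarity standing in for the semantic (LSA) feature, and bigram overlap standing in for the n-gram sequential feature. Both are simplifications of the models used in the study:

```python
from collections import Counter
import math

def bow(text):
    """Bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two Counter vectors."""
    num = sum(a[w] * b[w] for w in a)
    den = math.sqrt(sum(v * v for v in a.values())) * \
          math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def bigrams(text):
    w = text.lower().split()
    return set(zip(w, w[1:]))

def recall_scores(source, recall):
    """Semantic proxy (cosine over bag-of-words; LSA would use a learned
    semantic space instead) and a sequential bigram-overlap score."""
    sem = cosine(bow(source), bow(recall))
    bs, br = bigrams(source), bigrams(recall)
    seq = len(bs & br) / len(bs) if bs else 0.0
    return sem, seq

src = "the boy went to the store"
sem, seq = recall_scores(src, src)   # perfect recall scores 1.0 on both
```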

  17. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
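A minimal sketch of the surrogate-accelerated Bayesian idea: a cheap stand-in forward model (here a hand-written linear function in place of a sparse-grid interpolant of a finite element model) inside a basic Metropolis sampler. The damage parameterization, noise level and data value are all hypothetical, and the paper's weighted likelihood and DRAM refinements are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate_strain(a):
    """Cheap surrogate for the forward model: predicted strain as a
    function of a scalar damage parameter a (e.g. crack size)."""
    return 2.0 * a + 0.5

def log_posterior(a, y_obs, sigma=0.1):
    if a < 0.0:                 # prior support: damage size is non-negative
        return -np.inf
    resid = y_obs - surrogate_strain(a)
    return -0.5 * (resid / sigma) ** 2   # Gaussian likelihood, flat prior

def metropolis(y_obs, n_samples=5000, step=0.05, a0=1.0):
    samples, a, lp = [], a0, log_posterior(a0, y_obs)
    for _ in range(n_samples):
        a_new = a + step * rng.standard_normal()
        lp_new = log_posterior(a_new, y_obs)
        if np.log(rng.random()) < lp_new - lp:   # Metropolis accept/reject
            a, lp = a_new, lp_new
        samples.append(a)
    return np.array(samples)

# Strain "measured" for a true damage size of 0.8, i.e. y = 2.1
post = metropolis(y_obs=2.1)
```

Because every MCMC step evaluates only the surrogate, the chain is cheap; in the paper's setting each of these evaluations would otherwise be a finite element solve.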

  18. a Holistic Approach for Inspection of Civil Infrastructures Based on Computer Vision Techniques

    Science.gov (United States)

    Stentoumis, C.; Protopapadakis, E.; Doulamis, A.; Doulamis, N.

    2016-06-01

In this work, the 2D recognition and 3D modelling of concrete tunnel cracks through visual cues are examined. At present, the structural integrity inspection of large-scale infrastructures is mainly performed through visual observations by human inspectors, who identify structural defects, rate them and then categorize their severity. The described approach targets minimum human intervention, for autonomous inspection of civil infrastructures. The shortfalls of existing approaches to crack assessment are addressed by proposing a novel detection scheme. Although efforts have been made in the field, synergies among the proposed techniques are still missing. The holistic approach of this paper exploits state-of-the-art techniques of pattern recognition and stereo matching in order to build accurate 3D crack models. The innovation lies in the hybrid approach for the CNN detector initialization, and in the use of the modified census transformation for stereo matching, along with a binary fusion of two state-of-the-art optimization schemes. The described approach manages to deal with images of harsh radiometry, along with severe radiometric differences in the stereo pair. The effectiveness of this workflow is evaluated on a real dataset gathered in highway and railway tunnels. Promisingly, the computer vision workflow described in this work can be transferred, with adaptations of course, to other infrastructure such as pipelines, bridges and large industrial facilities that are in need of continuous state assessment during their operational life cycle.

  19. An FEM-Based State Estimation Approach to Nonlinear Hybrid Positioning Systems

    Directory of Open Access Journals (Sweden)

    Yu-Xin Zhao

    2013-01-01

Full Text Available For hybrid positioning systems (HPSs), estimator design is a crucial and important problem. In this paper, a finite-element-method (FEM) based state estimation approach for HPSs is proposed. As the weak solution of the hybrid stochastic differential model is described by Kolmogorov's forward equation, this paper constructs interpolating points through the classical fourth-order Runge-Kutta method, and then approximates the solution with a biquadratic interpolation function to obtain a prior probability density function of the state. A posterior probability density function is finally obtained through the Bayesian formula. In theory, the proposed scheme has advantages in complexity and convergence for low-dimensional systems. Numerical experiments on an illustrative example show that the new state estimator is feasible and performs better than the PF and the UKF.
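The interpolating points are built with the classical fourth-order Runge-Kutta method; a standard RK4 step looks like this (a generic sketch, not the paper's specific discretization of Kolmogorov's forward equation):

```python
def rk4_step(f, t, x, h):
    """One classical fourth-order Runge-Kutta step for x' = f(t, x)."""
    k1 = f(t, x)
    k2 = f(t + h / 2, x + h * k1 / 2)
    k3 = f(t + h / 2, x + h * k2 / 2)
    k4 = f(t + h, x + h * k3)
    return x + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

# Example: x' = x with x(0) = 1, so x(1) = e
x, t, h = 1.0, 0.0, 0.1
for _ in range(10):
    x = rk4_step(lambda t, x: x, t, x, h)
    t += h
```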

  20. Modelling the creep behaviour of tempered martensitic steel based on a hybrid approach

    Energy Technology Data Exchange (ETDEWEB)

    Yadav, Surya Deo, E-mail: surya.yadav@tugraz.at [Institute of Materials Science and Welding, Graz University of Technology, Kopernikusgasse 24, A-8010 Graz (Austria); Sonderegger, Bernhard, E-mail: bernhard.sonderegger@tugraz.at [Institute of Materials Science and Welding, Graz University of Technology, Kopernikusgasse 24, A-8010 Graz (Austria); Stracey, Muhammad, E-mail: strmuh001@myuct.ac.za [Centre for Materials Engineering, Department of Mechanical Engineering, University of Cape Town, Cape Town (South Africa); Poletti, Cecilia, E-mail: cecilia.poletti@tugraz.at [Institute of Materials Science and Welding, Graz University of Technology, Kopernikusgasse 24, A-8010 Graz (Austria)

    2016-04-26

In this work, we present a novel hybrid approach to describe and model the creep behaviour of tempered martensitic steels. The hybrid approach couples a physically based model with a continuum damage mechanics (CDM) model. The creep strain is modelled by describing the motions of three categories of dislocations: mobile, dipole and boundary. The initial precipitate state is simulated using the thermodynamic software tool MatCalc. The particle radii and number densities are incorporated into the creep model in terms of the Zener drag pressure. Orowan's equation for the creep strain rate is modified to account for tertiary creep using softening parameters related to precipitate coarsening and cavitation. For the first time, the evolution of internal variables such as dislocation densities, glide velocities, effective stresses on dislocations, internal stress from the microstructure, subgrain size, pressure on subgrain boundaries and softening parameters is discussed in detail. The model is validated with experimental data for P92 steel reported in the literature.
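Orowan's equation referred to above relates the creep strain rate to the mobile dislocation density, the Burgers vector and the mean glide velocity; in its basic form, before the paper's tertiary-creep softening modification, it reads:

```python
def orowan_strain_rate(rho_mobile, burgers, velocity):
    """Orowan's equation: strain rate = rho_m * b * v, from the mobile
    dislocation density (1/m^2), the magnitude of the Burgers vector (m)
    and the mean glide velocity (m/s)."""
    return rho_mobile * burgers * velocity
```

The input values below are illustrative orders of magnitude, not data from the paper.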

  1. A hybrid hopfield network-simulated annealing approach for frequency assignment in satellite communications systems.

    Science.gov (United States)

    Salcedo-Sanz, Sancho; Santiago-Mozos, Ricardo; Bousoño-Calzón, Carlos

    2004-04-01

A hybrid Hopfield network-simulated annealing algorithm (HopSA) is presented for the frequency assignment problem (FAP) in satellite communications. The goal of this NP-complete problem is to minimize the cochannel interference between satellite communication systems by rearranging the frequency assignment, so that the systems can accommodate increasing demands. The HopSA algorithm consists of a fast digital Hopfield neural network, which manages the problem constraints, hybridized with simulated annealing, which improves the quality of the solutions obtained. We analyze the problem and its formulation, describe and discuss the HopSA algorithm, and solve a set of benchmark problems. The results obtained are compared with other existing approaches in order to show the performance of the HopSA approach.
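The simulated-annealing half of HopSA can be sketched on a toy version of the FAP, with interference modeled as a simple cochannel/adjacent-channel penalty. The cost model, cooling schedule and problem instance are all hypothetical, and the Hopfield constraint-handling network is omitted:

```python
import math
import random

random.seed(42)

def interference(assign, pairs):
    """Toy cochannel cost: one penalty unit for each interfering pair of
    carriers assigned to the same or an adjacent channel."""
    return sum(1 for i, j in pairs if abs(assign[i] - assign[j]) <= 1)

def anneal(n_carriers, n_channels, pairs, t0=2.0, cooling=0.995, iters=4000):
    assign = [random.randrange(n_channels) for _ in range(n_carriers)]
    cost, t = interference(assign, pairs), t0
    for _ in range(iters):
        k = random.randrange(n_carriers)      # perturb one carrier
        old = assign[k]
        assign[k] = random.randrange(n_channels)
        new_cost = interference(assign, pairs)
        # Metropolis rule: keep improvements, sometimes keep worse moves
        if new_cost <= cost or random.random() < math.exp(-(new_cost - cost) / t):
            cost = new_cost
        else:
            assign[k] = old                   # reject the move
        t *= cooling
    return assign, cost

pairs = [(0, 1), (1, 2), (2, 3), (0, 3)]      # interfering system pairs
assign, cost = anneal(n_carriers=4, n_channels=10, pairs=pairs)
```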

  2. A Hybrid Approach Using an Artificial Bee Algorithm with Mixed Integer Programming Applied to a Large-Scale Capacitated Facility Location Problem

    Directory of Open Access Journals (Sweden)

    Guillermo Cabrera G.

    2012-01-01

Full Text Available We present a hybridization of two different approaches applied to the well-known Capacitated Facility Location Problem (CFLP). The Artificial Bee algorithm (BA) is used to select a promising subset of locations (warehouses), which are solely included in the Mixed Integer Programming (MIP) model. Next, the algorithm solves the subproblem by considering the entire set of customers. The hybrid implementation allows us to bypass certain inherited weaknesses of each algorithm, which means that we are able to find an optimal solution in an acceptable computational time. In this paper we demonstrate that BA can be significantly improved by use of the MIP algorithm. At the same time, our hybrid implementation allows the MIP algorithm to reach the optimal solution in a considerably shorter time than is needed to solve the model using the entire dataset directly within the model. Our hybrid approach outperforms the results obtained by each technique separately. It is able to find the optimal solution in a shorter time than each technique on its own, and the results are highly competitive with the state-of-the-art in large-scale optimization. Furthermore, according to our results, combining the BA with a mathematical programming approach appears to be an interesting research area in combinatorial optimization.

  3. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Science.gov (United States)

    Abdullahi, Mohammed; Ngadi, Md Asri

    2016-01-01

Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of the easier deployment of application services. Tasks are submitted to cloud datacenters to be processed on a pay-as-you-go basis. Task scheduling is one of the significant research challenges in the cloud computing environment. The current formulation of the task scheduling problem has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic features of cloud resources make optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment using a proposed Simulated Annealing (SA) based SOS (SASOS), in order to improve the convergence rate and solution quality of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploitation ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Simulation results showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.
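The SA ingredient of SASOS can be sketched as a local refinement of a task-to-VM assignment, with makespan as the fitness. The task lengths, VM speeds and cooling schedule are hypothetical, and the SOS population phases (mutualism, commensalism, parasitism) are omitted:

```python
import math
import random

random.seed(1)

task_len = [5.0, 9.0, 3.0, 7.0, 4.0, 8.0]   # hypothetical task lengths
vm_speed = [1.0, 2.0]                       # hypothetical VM speeds

def makespan(assign):
    """Completion time of the busiest VM under the given assignment."""
    load = [0.0] * len(vm_speed)
    for t, v in enumerate(assign):
        load[v] += task_len[t] / vm_speed[v]
    return max(load)

def sa_refine(assign, t0=2.0, cooling=0.99, iters=500):
    """Simulated-annealing local search: accept worse neighbours with
    probability exp(-delta/T), the exploitation step SA adds to SOS."""
    cur = list(assign)
    cost, t = makespan(cur), t0
    for _ in range(iters):
        cand = list(cur)
        cand[random.randrange(len(cand))] = random.randrange(len(vm_speed))
        delta = makespan(cand) - cost
        if delta <= 0 or random.random() < math.exp(-delta / max(t, 1e-12)):
            cur, cost = cand, makespan(cand)
        t *= cooling
    return cur, cost

initial = [random.randrange(len(vm_speed)) for _ in task_len]
best, cost = sa_refine(initial)
```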

  4. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Directory of Open Access Journals (Sweden)

    Mohammed Abdullahi

Full Text Available Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of the easier deployment of application services. Tasks are submitted to cloud datacenters to be processed on a pay-as-you-go basis. Task scheduling is one of the significant research challenges in the cloud computing environment. The current formulation of the task scheduling problem has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic features of cloud resources make optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment using a proposed Simulated Annealing (SA) based SOS (SASOS), in order to improve the convergence rate and solution quality of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploitation ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Simulation results showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.

  5. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment

    Science.gov (United States)

    Abdullahi, Mohammed; Ngadi, Md Asri

    2016-01-01

Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of the easier deployment of application services. Tasks are submitted to cloud datacenters to be processed on a pay-as-you-go basis. Task scheduling is one of the significant research challenges in the cloud computing environment. The current formulation of the task scheduling problem has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic features of cloud resources make optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment using a proposed Simulated Annealing (SA) based SOS (SASOS), in order to improve the convergence rate and solution quality of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploitation ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Simulation results showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan. PMID:27348127

  6. Developing a computationally efficient dynamic multilevel hybrid optimization scheme using multifidelity model interactions.

    Energy Technology Data Exchange (ETDEWEB)

    Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Castro, Joseph Pete Jr. (; .); Giunta, Anthony Andrew

    2006-01-01

    Many engineering application problems use optimization algorithms in conjunction with numerical simulators to search for solutions. The formulation of relevant objective functions and constraints dictate possible optimization algorithms. Often, a gradient based approach is not possible since objective functions and constraints can be nonlinear, nonconvex, non-differentiable, or even discontinuous and the simulations involved can be computationally expensive. Moreover, computational efficiency and accuracy are desirable and also influence the choice of solution method. With the advent and increasing availability of massively parallel computers, computational speed has increased tremendously. Unfortunately, the numerical and model complexities of many problems still demand significant computational resources. Moreover, in optimization, these expenses can be a limiting factor since obtaining solutions often requires the completion of numerous computationally intensive simulations. Therefore, we propose a multifidelity optimization algorithm (MFO) designed to improve the computational efficiency of an optimization method for a wide range of applications. In developing the MFO algorithm, we take advantage of the interactions between multi fidelity models to develop a dynamic and computational time saving optimization algorithm. First, a direct search method is applied to the high fidelity model over a reduced design space. In conjunction with this search, a specialized oracle is employed to map the design space of this high fidelity model to that of a computationally cheaper low fidelity model using space mapping techniques. Then, in the low fidelity space, an optimum is obtained using gradient or non-gradient based optimization, and it is mapped back to the high fidelity space. In this paper, we describe the theory and implementation details of our MFO algorithm. 
We also demonstrate our MFO method on some example problems and on two applications: earth penetrators and
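The loop in the abstract (cheap low-fidelity optimum, corrected by a fitted space mapping, refined by direct search on the expensive model) can be sketched as follows. All functions, grids, and the shift-based mapping are invented for this illustration and are not the authors' MFO implementation:

```python
def f_hi(x):
    # "expensive" high-fidelity model (toy stand-in)
    return (x - 2.1) ** 2

def f_lo(x):
    # cheap low-fidelity surrogate, deliberately misaligned with f_hi
    return (x - 1.6) ** 2

# Space mapping: fit a shift s so that f_lo(x + s) tracks f_hi(x) on a few samples
samples = [0.0, 1.0, 2.0, 3.0]
def misalignment(s):
    return sum((f_lo(x + s) - f_hi(x)) ** 2 for x in samples)
s_best = min((k / 100 for k in range(-100, 101)), key=misalignment)

# Optimize the cheap model, then map its optimum back to the high-fidelity space
z_opt = min((k / 1000 for k in range(-1000, 5001)), key=f_lo)
x = z_opt - s_best

# Direct (compass) search refinement on the high-fidelity model
step = 0.5
while step > 1e-4:
    trial = min((x - step, x, x + step), key=f_hi)
    if trial == x:
        step /= 2      # no improvement: shrink the pattern
    else:
        x = trial      # move to the better neighbour
```

Here the mapping recovers the shift between the two models, so most evaluations happen on the cheap surrogate and only the final refinement touches the expensive one.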

  7. Identification and Prediction of Large Pedestrian Flow in Urban Areas Based on a Hybrid Detection Approach

    OpenAIRE

    Kaisheng Zhang; Mei Wang; Bangyang Wei; Daniel(Jian) Sun

    2016-01-01

    Recently, population density has grown quickly with the increasing acceleration of urbanization. At the same time, overcrowded situations are more likely to occur in populous urban areas, increasing the risk of accidents. This paper proposes a synthetic approach to recognize and identify the large pedestrian flow. In particular, a hybrid pedestrian flow detection model was constructed by analyzing real data from major mobile phone operators in China, including information from smartphones and...

  8. Tandem cylinder flow and noise predictions using a hybrid RANS/LES approach

    OpenAIRE

    M. Weinmann; Sandberg, R.D.; Doolan, C.

    2014-01-01

    The performance of a novel hybrid RANS/LES methodology for accurate flow and noise predictions of the NASA Tandem Cylinder Experiment is investigated. The proposed approach, the modified Flow Simulation Methodology (FSM), is based on scaling the turbulence viscosity and the turbulence kinetic energy dissipation rate with a damping function. This damping function consists of three individual components, a function based on the Kolmogorov length-scale ensuring correct behaviour in the direct nu...

  9. A Hybrid Approach to Composite Damage and Failure Analysis Combining Synergistic Damage Mechanics and Peridynamics

    Science.gov (United States)

    2017-06-30

    Quarterly technical report (04/01/2017 - 06/30/2017) prepared by the Texas A&M Engineering Experiment Station (TEES), 400 Harvey Mitchell Parkway, Suite 300. The remainder of the available record consists only of standard report documentation page boilerplate.

  11. Hybrid imbalanced data classifier models for computational discovery of antibiotic drug targets.

    Science.gov (United States)

    Kocyigit, Yucel; Seker, Huseyin

    2014-01-01

    Identification of drug candidates is an important but difficult process. Given the drug-resistant bacteria that we face, it has become even more important to identify protein candidates that demonstrate antibacterial activity. The aim of this study is therefore to develop a bioinformatics approach that is more capable of identifying a small but effective set of proteins that are expected to show antibacterial activity, to be used subsequently as antibiotic drug targets. As this is regarded as an imbalanced data classification problem due to the small number of available antibiotic drugs, a hybrid classification model was developed and applied to the identification of antibiotic drugs. The model was developed by taking various statistical models into account, leading to six different hybrid models. The best model reached an accuracy as high as 50%, compared with an accuracy of less than 1% in an earlier study, as far as the proportion of actual antibiotics in the identified candidate list is concerned.
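The imbalanced-data difficulty the abstract points to is commonly tackled by combining resampling with an ensemble of simple base learners. A minimal sketch of that generic idea (balanced undersampling plus nearest-centroid voters; the toy dataset, classifier choice, and parameters are invented here, not taken from the paper):

```python
import random
from statistics import mean

def fit_centroids(X, y):
    # one centroid per class
    return {c: [mean(col) for col in zip(*[x for x, yy in zip(X, y) if yy == c])]
            for c in set(y)}

def predict(centroids, x):
    dist2 = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda c: dist2(centroids[c], x))

def balanced_ensemble(X, y, n_models=15, seed=0):
    # undersample the majority class once per base model
    rng = random.Random(seed)
    pos = [i for i, yy in enumerate(y) if yy == 1]
    neg = [i for i, yy in enumerate(y) if yy == 0]
    return [fit_centroids([X[i] for i in idx], [y[i] for i in idx])
            for idx in (pos + rng.sample(neg, len(pos)) for _ in range(n_models))]

def vote(models, x):
    # majority vote over the ensemble
    return 1 if sum(predict(m, x) for m in models) * 2 > len(models) else 0

# toy data: 5 "antibacterial" proteins vs 60 others in a 2-D feature space
pos_X = [(3.0, 3.0), (3.2, 2.9), (2.8, 3.1), (3.1, 3.2), (2.9, 2.8)]
neg_X = [(0.1 * (i % 7), 0.1 * (i % 5)) for i in range(60)]
X, y = pos_X + neg_X, [1] * 5 + [0] * 60
models = balanced_ensemble(X, y)
```

Each base model sees a balanced subsample, so the rare positive class is not swamped; the vote aggregates the weak learners.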

  12. Hybrid-PIC Computer Simulation of the Plasma and Erosion Processes in Hall Thrusters

    Science.gov (United States)

    Hofer, Richard R.; Katz, Ira; Mikellides, Ioannis G.; Gamero-Castano, Manuel

    2010-01-01

    HPHall software simulates and tracks the time-dependent evolution of the plasma and erosion processes in the discharge chamber and near-field plume of Hall thrusters. HPHall is an axisymmetric solver that employs a hybrid fluid/particle-in-cell (Hybrid-PIC) numerical approach. HPHall, originally developed by MIT in 1998, was upgraded to HPHall-2 by the Polytechnic University of Madrid in 2006. The Jet Propulsion Laboratory has continued the development of HPHall-2 through upgrades to the physical models employed in the code, and the addition of entirely new ones. Primary among these are the inclusion of a three-region electron mobility model that more accurately depicts the cross-field electron transport, and the development of an erosion sub-model that allows for the tracking of the erosion of the discharge chamber wall. The code is being developed to provide NASA science missions with a predictive tool of Hall thruster performance and lifetime that can be used to validate Hall thrusters for missions.

  13. A Computer Vision Approach to Identify Einstein Rings and Arcs

    Science.gov (United States)

    Lee, Chien-Hsiu

    2017-03-01

    Einstein rings are rare gems of strong lensing phenomena; the ring images can be used to probe the underlying lens gravitational potential at all position angles, tightly constraining the lens mass profile. In addition, the magnified images also enable us to probe high-z galaxies with enhanced resolution and signal-to-noise ratios. However, only a handful of Einstein rings have been reported, either from serendipitous discoveries or from visual inspections of hundreds of thousands of massive galaxies or galaxy clusters. In the era of large sky surveys, an automated approach to identify ring patterns in the big data to come is in high demand. Here, we present an Einstein ring recognition approach based on computer vision techniques. The workhorse is the circle Hough transform, which recognises circular patterns or arcs in images. We propose a two-tier approach by first pre-selecting massive galaxies associated with multiple blue objects as possible lenses, then using the Hough transform to identify circular patterns. As a proof of concept, we apply our approach to SDSS, with high completeness, albeit with low purity. We also apply our approach to other lenses in the DES, HSC-SSP, and UltraVISTA surveys, illustrating the versatility of our approach.
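The circle Hough transform at the heart of the approach votes, for each edge pixel, for every centre that could place a circle of radius r through it; true circles show up as accumulator peaks. A compact pure-Python sketch on synthetic data (the grid, angular step, and radii are illustrative choices, not those of the paper):

```python
import math
from collections import Counter

def hough_circles(points, radii, step_deg=10):
    """Vote for candidate circle centres (cx, cy, r) from edge points."""
    acc = Counter()
    for (x, y) in points:
        for r in radii:
            for deg in range(0, 360, step_deg):
                t = math.radians(deg)
                cx = round(x - r * math.cos(t))   # quantize centres to a unit grid
                cy = round(y - r * math.sin(t))
                acc[(cx, cy, r)] += 1
    return acc.most_common(1)[0]              # ((cx, cy, r), votes)

# synthetic "edge map": a ring of radius 5 centred at (10, 10)
ring = [(10 + 5 * math.cos(math.radians(a)), 10 + 5 * math.sin(math.radians(a)))
        for a in range(0, 360, 5)]
centre_and_radius, votes = hough_circles(ring, radii=[3, 4, 5, 6])
```

Every edge point of the ring casts one vote for the true centre at the true radius, so the accumulator peak sits at (10, 10, 5); wrong radii scatter their votes thinly.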

  14. Computational neuroscience approach to biomarkers and treatments for mental disorders.

    Science.gov (United States)

    Yahata, Noriaki; Kasai, Kiyoto; Kawato, Mitsuo

    2017-04-01

    Psychiatry research has long experienced a stagnation stemming from a lack of understanding of the neurobiological underpinnings of phenomenologically defined mental disorders. Recently, the application of computational neuroscience to psychiatry research has shown great promise in establishing a link between phenomenological and pathophysiological aspects of mental disorders, thereby recasting current nosology in more biologically meaningful dimensions. In this review, we highlight recent investigations into computational neuroscience that have undertaken either theory- or data-driven approaches to quantitatively delineate the mechanisms of mental disorders. The theory-driven approach, including reinforcement learning models, plays an integrative role in this process by enabling correspondence between behavior and disorder-specific alterations at multiple levels of brain organization, ranging from molecules to cells to circuits. Previous studies have explicated a plethora of defining symptoms of mental disorders, including anhedonia, inattention, and poor executive function. The data-driven approach, on the other hand, is an emerging field in computational neuroscience seeking to identify disorder-specific features among high-dimensional big data. Remarkably, various machine-learning techniques have been applied to neuroimaging data, and the extracted disorder-specific features have been used for automatic case-control classification. For many disorders, the reported accuracies have reached 90% or more. However, we note that rigorous tests on independent cohorts are critically required to translate this research into clinical applications. Finally, we discuss the utility of the disorder-specific features found by the data-driven approach to psychiatric therapies, including neurofeedback. Such developments will allow simultaneous diagnosis and treatment of mental disorders using neuroimaging, thereby establishing 'theranostics' for the first time in clinical

  15. COED Transactions, Vol. IX, No. 3, March 1977. Evaluation of a Complex Variable Using Analog/Hybrid Computation Techniques.

    Science.gov (United States)

    Marcovitz, Alan B., Ed.

    Described is the use of an analog/hybrid computer installation to study those physical phenomena that can be described through the evaluation of an algebraic function of a complex variable. This is an alternative way to study such phenomena on an interactive graphics terminal. The typical problem used, involving complex variables, is that of…

  16. A Hybrid Latent Class Analysis Modeling Approach to Analyze Urban Expressway Crash Risk.

    Science.gov (United States)

    Yu, Rongjie; Wang, Xuesong; Abdel-Aty, Mohamed

    2017-02-07

    Crash risk analysis is a growing research topic because it can reveal the relationships between traffic flow characteristics and crash occurrence risk, which helps explain crash mechanisms and can further refine the design of Active Traffic Management Systems (ATMS). However, the majority of current crash risk analysis studies have ignored the impact of geometric characteristics on crash risk estimation, while recent studies have proved that crash occurrence risk is affected by various alignment features. In this study, a hybrid Latent Class Analysis (LCA) modeling approach was proposed to account for the heterogeneous effects of geometric characteristics. Crashes were first segmented into homogeneous subgroups, where the optimal number of latent classes was identified based on bootstrap likelihood ratio tests. Then, separate crash risk analysis models were developed using a Bayesian random parameter logistic regression technique; data from the Shanghai urban expressway system were employed to conduct the empirical study. Different crash risk contributing factors were unveiled by the hybrid LCA approach, and better model goodness-of-fit was obtained compared with an overall crash model. Finally, the benefits of the proposed hybrid LCA approach were discussed.

  17. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Full Text Available Research in scientific programming enables us to realize more and more complex applications, and on the other hand, application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches become more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is presented. It automatically generates C code from a problem specification expressed in the Lagrange formalism using Maple.
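For reference, the iterative method mentioned above, a conjugate gradient solver for a symmetric positive-definite system, fits in a few lines. This is a generic textbook version, not the SPINET/MUSIC code:

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for a symmetric positive-definite matrix A."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                 # residual b - A x for the zero start vector
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_next = sum(ri * ri for ri in r)
        if rs_next < tol:    # squared residual norm small enough
            break
        p = [r[i] + (rs_next / rs) * p[i] for i in range(n)]
        rs = rs_next
    return x

x = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```

For the 2x2 system above the exact solution is (1/11, 7/11), reached in two iterations, as CG guarantees for an n-by-n SPD system in at most n steps (in exact arithmetic).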

  18. Understanding Plant Nitrogen Metabolism through Metabolomics and Computational Approaches

    Directory of Open Access Journals (Sweden)

    Perrin H. Beatty

    2016-10-01

    Full Text Available A comprehensive understanding of plant metabolism could provide a direct mechanism for improving nitrogen use efficiency (NUE) in crops. One of the major barriers to achieving this outcome is our poor understanding of the complex metabolic networks, physiological factors, and signaling mechanisms that affect NUE in agricultural settings. However, an exciting collection of computational and experimental approaches has begun to elucidate whole-plant nitrogen usage and provides an avenue for connecting nitrogen-related phenotypes to genes. Herein, we describe how metabolomics, computational models of metabolism, and flux balance analysis have been harnessed to advance our understanding of plant nitrogen metabolism. We introduce a model describing the complex flow of nitrogen through crops in a real-world agricultural setting and describe how experimental metabolomics data, such as isotope labeling rates and analyses of nutrient uptake, can be used to refine these models. In summary, the metabolomics/computational approach offers an exciting mechanism for understanding NUE that may ultimately lead to more effective crop management and engineered plants with higher yields.

  19. Computer modeling for investigating the stress-strain state of beams with hybrid reinforcement

    Directory of Open Access Journals (Sweden)

    Rakhmonov Ahmadzhon Dzhamoliddinovich

    2014-01-01

    Full Text Available In this article the operation of a continuous double-span beam with hybrid reinforcement (steel and composite reinforcement) under the action of concentrated forces is considered. The nature of the stress-strain state of the structure is investigated with the help of computer modeling using a three-dimensional model. Five models of beams with different characteristics were studied. According to the results of numerical studies, data on the distribution of stresses and displacements in continuous beams were provided. The dependence of the stress-strain state on an increasing percentage of the top (composite) reinforcement and on a change in the concrete class is determined and presented in the article. Currently, interest in the use of composite reinforcement as working reinforcement of concrete structures in Russia has increased significantly, which is reflected in the growing number of scientific and practical publications devoted to the study of the properties and use of composite materials in construction, as well as in emerging draft documents for the design of such structures. One of the proposals for the application of basalt reinforcement is to use it in bending elements with combined reinforcement. For theoretical justification of the proposed reinforcement scheme and improvement of the calculation method, the authors conduct a study of the stress-strain state of continuous beams with the use of modern computing systems. Among programs for stress-strain state analysis of concrete structures, the software package LIRA is used most often.

  20. A Hybrid Scheme for Fine-Grained Search and Access Authorization in Fog Computing Environment

    Science.gov (United States)

    Xiao, Min; Zhou, Jing; Liu, Xuejiao; Jiang, Mingda

    2017-01-01

    In the fog computing environment, encrypted sensitive data may be transferred to multiple fog nodes on the edge of a network for low latency; thus, fog nodes need to implement search over encrypted data as a cloud server does. Since fog nodes tend to serve IoT applications often running on resource-constrained end devices, it is necessary to design lightweight solutions. At present, there is little research on this issue. In this paper, we propose a fine-grained owner-forced data search and access authorization scheme spanning user-fog-cloud for resource-constrained end users. Compared with existing schemes supporting either index encryption with search ability or data encryption with fine-grained access control, the proposed hybrid scheme supports both abilities simultaneously; index ciphertext and data ciphertext are constructed from a single ciphertext-policy attribute-based encryption (CP-ABE) primitive and share the same key pair, so data access efficiency is significantly improved and the cost of key management is greatly reduced. Moreover, in the proposed scheme, resource-constrained end devices are allowed to rapidly assemble ciphertexts online and securely outsource most of the decryption task to fog nodes, and a mediated encryption mechanism is adopted to achieve instantaneous user revocation instead of re-encrypting ciphertexts with many copies on many fog nodes. The security and performance analysis shows that our scheme is suitable for a fog computing environment. PMID:28629131

  1. Solubility of nonelectrolytes: a first-principles computational approach.

    Science.gov (United States)

    Jackson, Nicholas E; Chen, Lin X; Ratner, Mark A

    2014-05-15

    Using a combination of classical molecular dynamics and symmetry adapted intermolecular perturbation theory, we develop a high-accuracy computational method for examining the solubility energetics of nonelectrolytes. This approach is used to accurately compute the cohesive energy density and Hildebrand solubility parameters of 26 molecular liquids. The energy decomposition of symmetry adapted perturbation theory is then utilized to develop multicomponent Hansen-like solubility parameters. These parameters are shown to reproduce the solvent categorizations (nonpolar, polar aprotic, or polar protic) of all molecular liquids studied while lending quantitative rigor to these qualitative categorizations via the introduction of simple, easily computable parameters. Notably, we find that by monitoring the first-order exchange energy contribution to the total interaction energy, one can rigorously determine the hydrogen bonding character of a molecular liquid. Finally, this method is applied to compute explicitly the Flory interaction parameter and the free energy of mixing for two different small molecule mixtures, reproducing the known miscibilities. This methodology represents an important step toward the prediction of molecular solubility from first principles.
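The Hildebrand parameter computed in the abstract is defined as the square root of the cohesive energy density; given a heat of vaporization and molar volume it is a one-liner. This is the textbook approximation with illustrative water values, not the paper's SAPT-based pipeline:

```python
R = 8.314  # gas constant, J/(mol*K)

def hildebrand_mpa(dh_vap_j_mol, molar_volume_m3_mol, T=298.15):
    """delta = sqrt((dHvap - RT) / Vm), returned in MPa**0.5."""
    ced = (dh_vap_j_mol - R * T) / molar_volume_m3_mol   # cohesive energy density, Pa
    return ced ** 0.5 / 1e3                              # Pa**0.5 -> MPa**0.5

# water at 25 C: dHvap ~ 44.0 kJ/mol, Vm ~ 18.0e-6 m^3/mol (approximate values)
delta_water = hildebrand_mpa(44000.0, 18.0e-6)
```

This gives roughly 48 MPa**0.5, in line with the tabulated value of about 47.8 MPa**0.5 for water.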

  2. A hybrid approach using chaotic dynamics and global search algorithms for combinatorial optimization problems

    Science.gov (United States)

    Igeta, Hideki; Hasegawa, Mikio

    Chaotic dynamics have been effectively applied to improve various heuristic algorithms for combinatorial optimization problems in many studies. Currently, the most common chaotic optimization scheme drives heuristic solution-search algorithms applicable to large-scale problems by chaotic neurodynamics, which include the tabu effect of the tabu search. Alternatively, meta-heuristic algorithms perform combinatorial optimization by combining a neighboring-solution search algorithm, such as tabu, gradient, or another search method, with a global search algorithm, such as genetic algorithms (GA), ant colony optimization (ACO), or others. Among these hybrid approaches, ACO has effectively optimized the solutions of many benchmark problems in the quadratic assignment problem library. In this paper, we propose a novel hybrid method that combines an effective chaotic search algorithm, which outperforms the tabu search, with global search algorithms such as ACO and GA. Our results show that the proposed chaotic hybrid algorithm has better performance than the conventional chaotic search and conventional hybrid algorithms. In addition, we show that the chaotic search algorithm combined with ACO performs better than when it is combined with GA.
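As background for the neighbouring-solution search that such hybrids build on, a plain tabu search over pairwise swaps looks like this (a generic TSP toy with invented parameters, not the chaotic neurodynamics of the paper):

```python
import math
import random

def tour_length(tour, D):
    return sum(D[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def tabu_search(D, iters=200, tenure=5, seed=1):
    n = len(D)
    rng = random.Random(seed)
    current = list(range(n))
    rng.shuffle(current)
    best, best_len = current[:], tour_length(current, D)
    tabu = {}                          # move -> iteration until which it is tabu
    for it in range(iters):
        # evaluate the full pairwise-swap neighbourhood
        moves = []
        for i in range(n - 1):
            for j in range(i + 1, n):
                cand = current[:]
                cand[i], cand[j] = cand[j], cand[i]
                moves.append((tour_length(cand, D), (i, j), cand))
        moves.sort()
        for length, move, cand in moves:
            # take the best non-tabu move; aspiration overrides the tabu list
            if tabu.get(move, -1) < it or length < best_len:
                current = cand
                tabu[move] = it + tenure
                if length < best_len:
                    best, best_len = cand[:], length
                break
    return best, best_len

# six cities on a unit circle: the optimal tour is the hexagon of length 6
pts = [(math.cos(math.radians(60 * k)), math.sin(math.radians(60 * k))) for k in range(6)]
D = [[math.dist(p, q) for q in pts] for p in pts]
tour, length = tabu_search(D)
```

The short-term tabu list forces the search past local minima, which is exactly the role the chaotic neurodynamics play in the scheme described above.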

  3. Hybrid Kalman Filter: A New Approach for Aircraft Engine In-Flight Diagnostics

    Science.gov (United States)

    Kobayashi, Takahisa; Simon, Donald L.

    2006-01-01

    In this paper, a uniquely structured Kalman filter is developed for its application to in-flight diagnostics of aircraft gas turbine engines. The Kalman filter is a hybrid of a nonlinear on-board engine model (OBEM) and piecewise linear models. The utilization of the nonlinear OBEM allows the reference health baseline of the in-flight diagnostic system to be updated to the degraded health condition of the engines through a relatively simple process. Through this health baseline update, the effectiveness of the in-flight diagnostic algorithm can be maintained as the health of the engine degrades over time. Another significant aspect of the hybrid Kalman filter methodology is its capability to take advantage of conventional linear and nonlinear Kalman filter approaches. Based on the hybrid Kalman filter, an in-flight fault detection system is developed, and its diagnostic capability is evaluated in a simulation environment. Through the evaluation, the suitability of the hybrid Kalman filter technique for aircraft engine in-flight diagnostics is demonstrated.
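The core idea, propagating through a nonlinear model and updating through a local linearization, is what an extended Kalman filter does. A scalar sketch with toy models (not the OBEM or the paper's piecewise-linear engine models):

```python
def ekf_step(x, P, z, f, fprime, h, hprime, Q, R):
    """One predict/update cycle of a scalar extended Kalman filter."""
    x_pred = f(x)                       # propagate through the nonlinear process model
    F = fprime(x)
    P_pred = F * P * F + Q
    H = hprime(x_pred)                  # linearize the measurement model locally
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - h(x_pred))
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

# toy decaying state measured through a mildly nonlinear sensor
f = lambda x: 0.95 * x
fprime = lambda x: 0.95
h = lambda x: x + 0.01 * x * x
hprime = lambda x: 1.0 + 0.02 * x

x_true, x_est, P = 10.0, 5.0, 10.0
for k in range(50):
    x_true = f(x_true)
    z = h(x_true) + 0.05 * (-1) ** k    # deterministic stand-in for sensor noise
    x_est, P = ekf_step(x_est, P, z, f, fprime, h, hprime, Q=1e-4, R=0.01)
```

Despite starting far from the truth, the estimate locks on within a few cycles, which is the behaviour a health-baseline update in the hybrid filter relies on.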

  4. Hybrid Block Copolymers Constituted by Peptides and Synthetic Polymers: An Overview of Synthetic Approaches, Supramolecular Behavior and Potential Applications

    Directory of Open Access Journals (Sweden)

    Jordi Puiggalí

    2013-02-01

    Full Text Available Hybrid block copolymers based on peptides and synthetic polymers, displaying different types of topologies, offer new possibilities to integrate the properties and functions of biomacromolecules and synthetic polymers in a single hybrid material. This review provides a current status report of the field concerning peptide-synthetic polymer hybrids. The first section is focused on the different synthetic approaches that have been used within the last three years for the preparation of peptide-polymer hybrids having different topologies. In the last two sections, the attractive properties, displayed in solution or in the solid state, together with the potential applications of this type of macromolecules or supramolecular systems are highlighted.

  5. Research on consistency measurement and weight estimation approach of hybrid uncertain comparison matrix

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The consistency measurement and weight estimation approach of the hybrid uncertain comparison matrix in the analytic hierarchy process (AHP) are studied. First, the decision-making satisfaction membership function is defined based on the decision maker's allowable error. Then, a weight model based on the idea of maximal satisfactory consistency is suggested, and the corresponding consistency index is put forward. Moreover, a weight distributing value model is developed to resolve the misleading decisions that can arise from the multiple optimal solutions of the former model. Finally, the weights are ranked based on the possibility degree approach to obtain the ultimate order.
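For the crisp (non-interval) case, AHP weight estimation and the classical consistency index can be sketched with the geometric-mean method, a standard textbook device rather than the satisfaction-membership model of this paper:

```python
import math

def ahp_weights(A):
    """Geometric-mean weights and consistency index CI of a pairwise matrix."""
    n = len(A)
    gmeans = [math.prod(row) ** (1.0 / n) for row in A]
    total = sum(gmeans)
    w = [g / total for g in gmeans]
    # estimate the principal eigenvalue from A w
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam_max = sum(Aw[i] / w[i] for i in range(n)) / n
    ci = (lam_max - n) / (n - 1)
    return w, ci

# perfectly consistent example: weights should be 4/7, 2/7, 1/7 with CI = 0
w, ci = ahp_weights([[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]])
```

A nonzero CI signals inconsistent judgments; the paper's contribution is precisely how to measure and repair that inconsistency when the entries are uncertain rather than crisp.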

  6. A hybrid approach to calculate the Shielding Failure-Caused Trip-out Rate

    Directory of Open Access Journals (Sweden)

    Zhou Liang

    2016-01-01

    Full Text Available Lightning has become a major threat to the safe operation of main transmission lines. Reasonable and accurate calculation of the shielding failure rate plays an important role in transmission line and tower design. This paper proposes a hybrid approach to calculate the shielding failure-caused trip-out rate, based on the typical electro-geometric model and the regulation method. A case study proves the validity and correctness of this approach by comparison with the actual operational shielding failure rate.
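In the electro-geometric model underlying the approach, the striking distance grows with stroke current, and a trip-out estimate combines an exposure width with the lightning-current distribution. A hedged sketch using common IEEE-style formulas; the exposure width and ground flash density below are illustrative placeholders, not the paper's calibrated values:

```python
def striking_distance_m(i_ka):
    """IEEE-style striking distance r = 10 * I**0.65 (metres, I in kA)."""
    return 10.0 * i_ka ** 0.65

def prob_current_exceeds(i_ka):
    """Cumulative first-stroke current distribution, P(I > i) = 1/(1+(i/31)**2.6)."""
    return 1.0 / (1.0 + (i_ka / 31.0) ** 2.6)

def shielding_failure_rate(ng_per_km2_yr, exposure_width_m, i_min_ka, i_max_ka):
    """Strokes per 100 km-year hitting the exposed arc with current in [i_min, i_max]."""
    p = prob_current_exceeds(i_min_ka) - prob_current_exceeds(i_max_ka)
    return ng_per_km2_yr * (exposure_width_m / 1000.0) * 100.0 * p

rate = shielding_failure_rate(2.8, 20.0, 5.0, 30.0)
```

Only strokes strong enough to flash over but weak enough to slip past the shield wire contribute, which is why the rate integrates the current distribution between two bounds.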

  7. A Hybrid Wavelet Transform Based Short-Term Wind Speed Forecasting Approach

    OpenAIRE

    Jujie Wang

    2014-01-01

    It is important to improve the accuracy of wind speed forecasting for wind park management and wind power utilization. In this paper, a novel hybrid approach known as WTT-TNN is proposed for wind speed forecasting. In the first step of the approach, a wavelet transform technique (WTT) is used to decompose wind speed into an approximate scale and several detailed scales. In the second step, a two-hidden-layer neural network (TNN) is used to predict both the approximated scale and the detailed scales,...
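The decomposition step described here, splitting a series into an approximation and detail scales, can be illustrated with the simplest wavelet, the (unnormalized) Haar transform. This is a generic sketch, not the paper's WTT-TNN pipeline:

```python
def haar_step(signal):
    """One level of the Haar transform: pairwise averages and half-differences."""
    approx = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    return approx, detail

def haar_decompose(signal, levels):
    details = []
    a = list(signal)
    for _ in range(levels):
        a, d = haar_step(a)
        details.append(d)
    return a, details

def haar_reconstruct(approx, details):
    a = list(approx)
    for d in reversed(details):
        a = [x for ai, di in zip(a, d) for x in (ai + di, ai - di)]
    return a

speeds = [4.0, 2.0, 5.0, 7.0, 1.0, 3.0, 8.0, 6.0]   # toy wind-speed series
approx, details = haar_decompose(speeds, levels=2)
restored = haar_reconstruct(approx, details)
```

The approximation carries the smooth trend and the details carry fluctuations at each scale; a forecaster like the TNN can then be trained on each component separately before recombining.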

  8. [Computer work and De Quervain's tenosynovitis: an evidence based approach].

    Science.gov (United States)

    Gigante, M R; Martinotti, I; Cirla, P E

    2012-01-01

    The debate about the role of personal computer work as a cause of De Quervain's tenosynovitis has developed only partially, without considering the multidisciplinary data available. A systematic review of the literature, using an evidence-based approach, was performed. Among disorders associated with the use of VDUs, we must distinguish those of the upper limbs, and among them those related to overload. Experimental studies on the occurrence of De Quervain's tenosynovitis are quite limited, and it is clinically quite difficult to prove an occupational etiology, considering the interference of other activities of daily living and of biological susceptibility (i.e. anatomical variability, sex, age, exercise). At present there is no evidence of any connection between De Quervain syndrome and time spent using a personal computer or keyboard; limited evidence of a correlation is found with time using a mouse. No data are available regarding exclusive or predominant use of laptops or mobile smartphones.

  9. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    Full Text Available High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that the pathogenic genomic regions of many bacteria are horizontally transferred from other bacteria; these regions are also known as pathogenicity islands (PAIs). PAIs have detectable properties, such as genomic signatures that differ from the rest of the host genome, and mobility genes that allow them to be integrated into the host genome. In this review, we discuss various pathogenicity island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources are also discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.
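One of the detectable properties mentioned, a genomic signature that differs from the host, can be screened for with a simple GC-content sliding window. This is a deliberately crude sketch on synthetic DNA; real PAI predictors combine many signals:

```python
def gc_content(seq):
    return sum(1 for base in seq if base in "GC") / len(seq)

def atypical_windows(genome, window, threshold):
    """Flag non-overlapping windows whose GC content deviates from the
    genome-wide average by more than `threshold`."""
    avg = gc_content(genome)
    flagged = []
    for start in range(0, len(genome) - window + 1, window):
        win = genome[start:start + window]
        if abs(gc_content(win) - avg) > threshold:
            flagged.append((start, start + window))
    return flagged

# host DNA at 50% GC with a 100 bp GC-rich "island" inserted in the middle
host, island = "ATGC" * 50, "GGCCGGCCGC" * 10
genome = host + island + host
```

Horizontally transferred regions tend to retain the donor's base composition, so a window whose GC content stands out from the host average is a candidate island worth closer inspection.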

  10. Benchmarking of computer codes and approaches for modeling exposure scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R.R. [EG and G Idaho, Inc., Idaho Falls, ID (United States); Rittmann, P.D.; Wood, M.I. [Westinghouse Hanford Co., Richland, WA (United States); Cook, J.R. [Westinghouse Savannah River Co., Aiken, SC (United States)

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided.
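At the "most fundamental level" that the subteam's spreadsheet comparison refers to, a pathway dose is simply concentration times intake times dose coefficient. For the ingestion pathway (the parameter values below are illustrative textbook-style numbers, not those from the report):

```python
def ingestion_dose_sv(conc_bq_per_l, intake_l_per_yr, dcf_sv_per_bq):
    """Annual committed dose (Sv/yr) = C (Bq/L) * U (L/yr) * DCF (Sv/Bq)."""
    return conc_bq_per_l * intake_l_per_yr * dcf_sv_per_bq

# unit concentration of Cs-137 in drinking water, 730 L/yr adult intake,
# ingestion dose coefficient ~1.3e-8 Sv/Bq
dose = ingestion_dose_sv(1.0, 730.0, 1.3e-8)
```

Codes like GENII or PATHRAE differ mainly in which pathways they chain together and which transfer factors they apply before this final multiplication, which is why a spreadsheet makes a useful baseline comparison.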

  11. Computational approaches for rational design of proteins with novel functionalities

    Directory of Open Access Journals (Sweden)

    Manish Kumar Tiwari

    2012-09-01

    Full Text Available Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein designing has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes.

  12. Computational approaches for rational design of proteins with novel functionalities.

    Science.gov (United States)

    Tiwari, Manish Kumar; Singh, Ranjitha; Singh, Raushan Kumar; Kim, In-Won; Lee, Jung-Kul

    2012-01-01

    Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein designing has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes.

  13. Modelling of Hybrid Materials and Interface Defects through Homogenization Approach for the Prediction of Effective Thermal Conductivity of FRP Composites Using Finite Element Method

    Directory of Open Access Journals (Sweden)

    C. Mahesh

    2013-01-01

    Full Text Available The finite element method is effectively used to homogenize the thermal conductivity of FRP composites consisting of hybrid materials and fibre-matrix debonds at some of the fibres. The homogenized result at the microlevel is used to determine the property of the layer using macromechanics principles; thereby, it is possible to minimize the computational effort required to solve the problem compared with treating it entirely through the micromechanics approach. The working of the proposed procedure is verified for three different problems: (i) a hybrid composite having two different fibres in alternate layers, (ii) fibre-matrix interface debonds in alternate layers, and (iii) a fibre-matrix interface debond at one fibre in a group of four fibres in one unit cell. It is observed that the results are in good agreement with those obtained through the pure micromechanics approach.
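Homogenized conductivities produced by such finite element runs should fall between the classical series and parallel (Reuss/Voigt-type) bounds, which make a useful sanity check. These are the generic rule-of-mixtures formulas with nominal material values, not the paper's model:

```python
def k_parallel(vf, kf, km):
    """Effective conductivity along the fibres (rule of mixtures)."""
    return vf * kf + (1.0 - vf) * km

def k_series(vf, kf, km):
    """Effective conductivity across the fibres (inverse rule of mixtures)."""
    return 1.0 / (vf / kf + (1.0 - vf) / km)

# e.g. 60% glass fibre (kf ~ 1.0 W/m-K) in epoxy (km ~ 0.2 W/m-K), nominal values
kp = k_parallel(0.6, 1.0, 0.2)
ks = k_series(0.6, 1.0, 0.2)
```

A homogenized FEM result outside the [k_series, k_parallel] interval for the same volume fractions usually indicates a meshing or boundary-condition error; debonds push the effective value toward or below the series bound.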

  14. Mapping Seasonal Evapotranspiration and Root Zone Soil Moisture using a Hybrid Modeling Approach over Vineyards

    Science.gov (United States)

    Geli, H. M. E.

    2015-12-01

    Estimates of actual crop evapotranspiration (ETa) at field scale over the growing season are required for improving agricultural water management, particularly in water-limited and drought-prone regions. Remote sensing data from multiple platforms such as airborne and Landsat-based sensors can be used to provide these estimates. Combining these data with surface energy balance models can provide ETa estimates at sub-field scale as well as information on vegetation stress and soil moisture conditions. However, the temporal resolution of airborne and Landsat data does not allow for continuous ETa monitoring over the course of the growing season. This study presents the application of a hybrid ETa modeling approach developed for monitoring daily ETa and root zone available water at high spatial resolutions. The hybrid ETa modeling approach couples a thermal-based energy balance model with a water balance-based scheme using data assimilation. The two-source energy balance (TSEB) model is used to estimate instantaneous ETa, which can be extrapolated to daily ETa using a water balance model modified to use the reflectance-based basal crop coefficient for interpolating ETa in between airborne and/or Landsat overpass dates. Moreover, since it is a water balance model, the soil moisture profile is also estimated. The hybrid ETa approach is applied over vineyard fields in central California. High resolution airborne and Landsat imagery were used to drive the hybrid model. These images were collected during periods that represented different vine phenological stages in the 2013 growing season. Estimates of daily ETa and surface energy balance fluxes will be compared with ground-based eddy covariance tower measurements. Estimates of soil moisture at multiple depths will be compared with measurements.
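    The reflectance-based interpolation step described above can be sketched as follows: a basal crop coefficient Kcb estimated from vegetation reflectance (e.g. NDVI) on overpass dates is linearly interpolated to daily values and scaled by a stress factor and reference ET. This is a minimal sketch of the general dual-crop-coefficient idea; the linear Kcb-NDVI coefficients are illustrative, not the study's calibrated values.

    ```python
    def kcb_from_ndvi(ndvi, a=1.36, b=-0.06):
        """Illustrative linear Kcb-NDVI relation (coefficients assumed)."""
        return max(0.0, a * ndvi + b)

    def interpolate_kcb(day, day0, kcb0, day1, kcb1):
        """Linearly interpolate Kcb between two satellite overpass dates."""
        w = (day - day0) / (day1 - day0)
        return (1.0 - w) * kcb0 + w * kcb1

    def daily_eta(kcb, ks, et_ref):
        """Daily actual ET: basal coefficient x water-stress factor (0..1,
        from the water balance) x reference ET (mm/day)."""
        return kcb * ks * et_ref
    ```

    In the hybrid scheme, the stress factor would come from the water-balance soil moisture state, while TSEB retrievals on overpass dates anchor the interpolation.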

  15. Hybrid Modelling Approach to Prairie hydrology: Fusing Data-driven and Process-based Hydrological Models

    Science.gov (United States)

    Mekonnen, B.; Nazemi, A.; Elshorbagy, A.; Mazurek, K.; Putz, G.

    2012-04-01

    Modeling the hydrological response in prairie regions, characterized by flat and undulating terrain, and thus, large non-contributing areas, is a known challenge. The hydrological response (runoff) is the combination of the traditional runoff from the hydrologically contributing area and the occasional overflow from the non-contributing area. This study provides a unique opportunity to analyze the issue of fusing the Soil and Water Assessment Tool (SWAT) and Artificial Neural Networks (ANNs) in a hybrid structure to model the hydrological response in prairie regions. A hybrid SWAT-ANN model is proposed, where the SWAT component and the ANN module deal with the effective (contributing) area and the non-contributing area, respectively. The hybrid model is applied to the case study of the Moose Jaw watershed, located in southern Saskatchewan, Canada. As an initial exploration, a comparison between ANN and SWAT models is established based on daily runoff (streamflow) prediction accuracy using multiple error measures. This is done to identify the merits and drawbacks of each modeling approach. It was found that the SWAT model performs better during low flow periods but with degraded efficiency during periods of high flows. The case is different for the ANN model, as ANNs exhibit improved simulation during high flow periods but biased estimates during low flow periods. The modelling results show that the new hybrid SWAT-ANN model is capable of exploiting the strengths of both SWAT and ANN models in an integrated framework. The new hybrid SWAT-ANN model simulates daily runoff quite satisfactorily, with NSE measures of 0.80 and 0.83 during calibration and validation periods, respectively. Furthermore, an experimental assessment was performed to identify the effects of the ANN training method on the performance of the hybrid model as well as the parametric identifiability. Overall, the results obtained in this study suggest that the fusion of data-driven and process-based hydrological models is a promising approach for modelling prairie hydrology.
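    The NSE figures quoted above (0.80 calibration, 0.83 validation) are Nash-Sutcliffe efficiencies, which can be computed as below. This is a minimal sketch of the standard metric, not the study's code.

    ```python
    def nse(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 minus the ratio of the simulation
        error variance to the variance of the observations.  NSE = 1 is a
        perfect fit; NSE = 0 means no better than the observed mean."""
        mean_obs = sum(observed) / len(observed)
        sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
        ss_tot = sum((o - mean_obs) ** 2 for o in observed)
        return 1.0 - sse / ss_tot
    ```

    Because NSE squares the errors, it weights high-flow mismatches heavily, which is why the SWAT/ANN comparison above distinguishes low-flow and high-flow performance separately.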

  16. Automatic artefact removal in a self-paced hybrid brain-computer interface system

    Directory of Open Access Journals (Sweden)

    Yong Xinyi

    2012-07-01

    Full Text Available Abstract Background A novel artefact removal algorithm is proposed for a self-paced hybrid brain-computer interface (BCI) system. This hybrid system combines a self-paced BCI with an eye-tracker to operate a virtual keyboard. To select a letter, the user must gaze at the target for at least a specific period of time (dwell time) and then activate the BCI by performing a mental task. Unfortunately, electroencephalogram (EEG) signals are often contaminated with artefacts. Artefacts change the quality of EEG signals and subsequently degrade the BCI’s performance. Methods To remove artefacts in EEG signals, the proposed algorithm uses the stationary wavelet transform combined with a new adaptive thresholding mechanism. To evaluate the performance of the proposed algorithm and other artefact handling/removal methods, semi-simulated EEG signals (i.e., real EEG signals mixed with simulated artefacts) and real EEG signals obtained from seven participants are used. For real EEG signals, the hybrid BCI system’s performance is evaluated in an online-like manner, i.e., using the continuous data from the last session as in a real-time environment. Results With semi-simulated EEG signals, we show that the proposed algorithm achieves lower signal distortion in both time and frequency domains. With real EEG signals, we demonstrate that for a dwell time of 0.0 s, the number of false-positives/minute is 2 and the true positive rate (TPR) achieved by the proposed algorithm is 44.7%, which is more than 15.0% higher compared to other state-of-the-art artefact handling methods. As dwell time increases to 1.0 s, the TPR increases to 73.1%. Conclusions The proposed artefact removal algorithm greatly improves the BCI’s performance. It also has the following advantages: (a) it does not require additional electrooculogram/electromyogram channels, long data segments or a large number of EEG channels, (b) it allows real-time processing, and (c) it reduces signal distortion.
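    The core wavelet-thresholding idea behind such artefact removal can be sketched with a single-level Haar transform and soft thresholding. The paper's algorithm uses the stationary wavelet transform with an adaptive threshold; the Haar decomposition and fixed threshold below stand in only to illustrate the shrink-in-the-wavelet-domain mechanism.

    ```python
    import math

    def haar_level1(x):
        """One-level orthonormal Haar transform: approximation and detail
        coefficients (input length assumed even)."""
        a = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
        d = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
        return a, d

    def inverse_haar_level1(a, d):
        """Exact inverse of haar_level1."""
        out = []
        for ai, di in zip(a, d):
            out += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
        return out

    def soft_threshold(coeffs, t):
        """Shrink coefficients toward zero by t; large artefact spikes are
        attenuated while small signal coefficients pass through reduced."""
        return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

    def denoise(x, t):
        """Threshold the detail band only, then reconstruct."""
        a, d = haar_level1(x)
        return inverse_haar_level1(a, soft_threshold(d, t))
    ```

    With threshold zero the round trip is exact, which is a useful unit test; the adaptive mechanism in the paper would instead choose t per band from the data.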

  17. Computational systems biology approaches to anti-angiogenic cancer therapeutics.

    Science.gov (United States)

    Finley, Stacey D; Chu, Liang-Hui; Popel, Aleksander S

    2015-02-01

    Angiogenesis is an exquisitely regulated process that is required for physiological processes and is also important in numerous diseases. Tumors utilize angiogenesis to generate the vascular network needed to supply the cancer cells with nutrients and oxygen, and many cancer drugs aim to inhibit tumor angiogenesis. Anti-angiogenic therapy involves inhibiting multiple cell types, molecular targets, and intracellular signaling pathways. Computational tools are useful in guiding treatment strategies, predicting the response to treatment, and identifying new targets of interest. Here, we describe progress that has been made in applying mathematical modeling and bioinformatics approaches to study anti-angiogenic therapeutics in cancer.

  18. Approaches to Computer Modeling of Phosphate Hide-Out.

    Science.gov (United States)

    1984-06-28

    phosphate acts as a buffer to keep pH above the value at which acid corrosion occurs and below the value at which caustic corrosion becomes significant. Difficulties are... ionization of dihydrogen phosphate:

    H2PO4(-) <=> H(+) + HPO4(2-)          K     (B-7)
    H(+) + OH(-) <=> H2O                  1/Kw  (B-8)
    H2PO4(-) + OH(-) <=> HPO4(2-) + H2O   K/Kw  (B-9)

    Such zero heat... (NRL Memorandum Report 5361, Approaches to Computer Modeling of Phosphate Hide-Out, K. A. S. Hardy and J. C
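    The buffering action described follows the second phosphate ionization (equilibrium B-7); its pH can be sketched with the standard Henderson-Hasselbalch relation. The pKa2 value of about 7.2 (25 °C, dilute solution) is a textbook figure, not taken from the report.

    ```python
    import math

    def buffer_ph(conc_hpo4, conc_h2po4, pka2=7.2):
        """pH of an HPO4(2-)/H2PO4(-) buffer via Henderson-Hasselbalch:
        pH = pKa2 + log10([HPO4(2-)] / [H2PO4(-)])."""
        return pka2 + math.log10(conc_hpo4 / conc_h2po4)
    ```

    An equimolar mix sits at pH = pKa2, the midpoint of the buffering range between the acid- and caustic-corrosion regimes the abstract describes.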

  19. Autonomous corrosion detection in gas pipelines: a hybrid-fuzzy classifier approach using ultrasonic nondestructive evaluation protocols.

    Science.gov (United States)

    Qidwai, Uvais A

    2009-12-01

    In this paper, a customized classifier is presented for the industry-practiced nondestructive evaluation (NDE) protocols using a hybrid-fuzzy inference system (FIS) to classify corrosion and distinguish it from geometric defects or the normal/healthy state of the steel pipes used in the gas/petroleum industry. The presented system is hybrid in the sense that it utilizes both soft computing through fuzzy set theory, as well as conventional parametric modeling through H(infinity) optimization methods. Due to significant uncertainty in the power spectral density of the noise in ultrasonic NDE procedures, the use of optimal H(2) estimators for defect characterization is not so accurate. A more appropriate criterion is the H(infinity) norm of the estimation error spectrum, which is based on minimization of the magnitude of this spectrum and hence produces more robust estimates. A hybrid feature set is developed in this work that corresponds to (a) geometric features extracted directly from the raw ultrasonic A-scan data (which are the one-dimensional ultrasonic echo pulses traveling inside the metal perpendicular to its two surfaces) and (b) mapped features from the impulse response of the estimated model of the defect waveform under study. An experimental strategy is first outlined, through which the necessary data are collected as A-scans. Then, using the H(infinity) estimation approach, a parametric transfer function is obtained for each pulse. In this respect, each A-scan is treated as output from a defining function when a pure/healthy metal's A-scan is used as its input. Three defining states are considered in the paper: healthy, corroded, and defective, where the defective class represents metal with artificial or other defects. The necessary features are then calculated and supplied to the fuzzy inference system as input to be used in the classification. The resulting system has shown excellent corrosion classification with very low misclassification and false-alarm rates.
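    The fuzzy-classification step alone can be sketched with triangular membership functions over a single illustrative feature (say, a normalized echo-attenuation score) and one rule per class. The feature ranges and class breakpoints below are invented for illustration; the paper's H(infinity)-derived features are not reproduced here.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function: 0 outside (a, c), peak 1 at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # One membership function per defining state (breakpoints assumed).
    CLASS_MF = {
        "healthy":   lambda x: tri(x, -0.2, 0.0, 0.4),
        "corroded":  lambda x: tri(x, 0.2, 0.5, 0.8),
        "defective": lambda x: tri(x, 0.6, 1.0, 1.2),
    }

    def classify(feature):
        """Return the state whose rule fires most strongly for the feature."""
        return max(CLASS_MF, key=lambda state: CLASS_MF[state](feature))
    ```

    A full FIS would combine several such features per rule and defuzzify; the max-firing decision here is the simplest usable form of that pipeline.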

  20. A hybrid mixture discriminant analysis-random forest computational model for the prediction of volume of distribution of drugs in human.

    Science.gov (United States)

    Lombardo, Franco; Obach, R Scott; Dicapua, Frank M; Bakken, Gregory A; Lu, Jing; Potter, David M; Gao, Feng; Miller, Michael D; Zhang, Yao

    2006-04-06

    A computational approach is described that can predict the VD(ss) of new compounds in humans, with an accuracy of within 2-fold of the actual value. A dataset of VD values for 384 drugs in humans was used to train a hybrid mixture discriminant analysis-random forest (MDA-RF) model using 31 computed descriptors. Descriptors included terms describing lipophilicity, ionization, molecular volume, and various molecular fragments. For a test set of 23 proprietary compounds not used in model construction, the geometric mean fold-error (GMFE) was 1.78-fold (+/-11.4%). The model was also tested using a leave-class out approach wherein subsets of drugs based on therapeutic class were removed from the training set of 384, the model was recast, and the VD(ss) values for each of the subsets were predicted. GMFE values ranged from 1.46 to 2.94-fold, depending on the subset. Finally, for an additional set of 74 compounds, VD(ss) predictions made using the computational model were compared to predictions made using previously described methods dependent on animal pharmacokinetic data. Computational VD(ss) predictions were, on average, 2.13-fold different from the VD(ss) predictions from animal data. The computational model described can predict human VD(ss) with an accuracy comparable to predictions requiring substantially greater effort and can be applied in place of animal experimentation.
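    The fold-error statistics quoted above can be reproduced with a minimal sketch (not the authors' code): GMFE is the geometric mean of per-compound fold errors, where each fold error is the larger of predicted/observed and observed/predicted, so it is always at least 1.

    ```python
    import math

    def gmfe(observed, predicted):
        """Geometric mean fold error between observed and predicted values
        (e.g. VDss in L/kg); 1.0 is perfect, 2.0 means 2-fold off on average."""
        logs = [abs(math.log10(p / o)) for o, p in zip(observed, predicted)]
        return 10.0 ** (sum(logs) / len(logs))
    ```

    Averaging absolute log fold errors means a 2-fold over-prediction and a 2-fold under-prediction both count as 2-fold, matching how "within 2-fold of the actual value" is assessed above.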