Sample records for current computer systems

  1. Current Trends in Cloud Computing: A Survey of Cloud Computing Systems

    Directory of Open Access Journals (Sweden)

    Harjit Singh


    Full Text Available Cloud computing, an increasingly important trend, is a virtualization technology that uses the internet and central remote servers to share resources, including infrastructures, software, applications and business processes, to meet elastic demand. In today’s competitive environment, the service vitality, elasticity, choice and flexibility offered by this scalable technology are so attractive that cloud computing is becoming an integral part of the enterprise computing environment. This paper presents a survey of the current state of cloud computing, including a discussion of its evolution, the characteristics of the cloud, and the technologies currently adopted. The paper also presents a comparative study of cloud computing platforms (Amazon, Google and Microsoft) and their challenges.

  2. Tensor computations in computer algebra systems

    CERN Document Server

    Korolkova, A V; Sevastyanov, L A


    This paper considers three types of tensor computations. On their basis, we attempt to formulate criteria that must be satisfied by a computer algebra system dealing with tensors. We briefly overview the current state of tensor computations in different computer algebra systems. The tensor computations are illustrated with appropriate examples implemented in specific systems: Cadabra and Maxima.
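
A concrete instance of the computations such criteria refer to can be sketched outside a dedicated tensor package. As an illustration only (the paper's examples use Cadabra and Maxima; this sketch uses SymPy instead), the following computes Christoffel symbols for the unit 2-sphere, a typical tensor computation a computer algebra system is expected to handle:

```python
# Illustrative sketch: Christoffel symbols of the unit 2-sphere via SymPy.
# The paper itself demonstrates Cadabra and Maxima; SymPy is our substitute.
import sympy as sp

theta, phi = sp.symbols('theta phi')
coords = [theta, phi]

# Metric of the unit 2-sphere, g = diag(1, sin^2 theta)
g = sp.Matrix([[1, 0], [0, sp.sin(theta)**2]])
g_inv = g.inv()

def christoffel(a, b, c):
    """Gamma^a_{bc} = 1/2 g^{ad} (d_b g_{dc} + d_c g_{db} - d_d g_{bc})."""
    return sp.simplify(sum(
        sp.Rational(1, 2) * g_inv[a, d] * (
            sp.diff(g[d, c], coords[b])
            + sp.diff(g[d, b], coords[c])
            - sp.diff(g[b, c], coords[d]))
        for d in range(2)))

print(christoffel(0, 1, 1))   # Gamma^theta_{phi phi}
print(christoffel(1, 0, 1))   # Gamma^phi_{theta phi}
```

The two nonzero symbols come out as -sin(theta)cos(theta) and cot(theta), which is the standard textbook result and a quick correctness check for any such system.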

  3. Computer systems (United States)

    Olsen, Lola


    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available to those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants, with some opportunities for hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop for one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of the introductory presentations are included.

  4. Improving Control System Security through the Evaluation of Current Trends in Computer Security Research

    Energy Technology Data Exchange (ETDEWEB)



    At present, control system security efforts are primarily technical and reactive in nature. What has been overlooked is the need for proactive efforts, focused on the IT security research community from which new threats might emerge. Evaluating cutting-edge IT security research and how it is evolving can provide defenders with valuable information regarding what new threats and tools they can anticipate in the future. Only known attack methodologies can be blocked, and there is a gap between what is known to the general security community and what is being done by cutting-edge researchers, both those trying to protect systems and those trying to compromise them. The best security researchers communicate with others in their field; they know what cutting-edge research is being done; what software can be penetrated via this research; and what new attack techniques and methodologies are being circulated in the black hat community. Standardization of control system applications, operating systems, and networking protocols is occurring at a rapid rate, following a path similar to the standardization of modern IT networks. Many attack methodologies used on IT systems can be ported over to the control system environment with little difficulty. It is extremely important to take advantage of the lag time between new research, its use on traditional IT networks, and the time it takes to port the research over for use on a control system network. Analyzing nascent trends in IT security and determining their applicability to control system networks provides significant information regarding the defense mechanisms needed to secure critical infrastructure more effectively. This work provides the critical infrastructure community with a better understanding of how new attacks might be launched, what layers of defense will be needed to deter them, how the attacks could be detected, and how their impact could be limited.

  5. Computer-aided detection system performance on current and previous digital mammograms in patients with contralateral metachronous breast cancer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Ja (Dept. of Radiology, Seoul Metropolitan Government - Seoul National Univ. Boramae Medical Center, Seoul (Korea, Republic of)); Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min (Dept. of Radiology, Seoul National Univ. Hospital, Seoul (Korea, Republic of)), email:


    Background: The computer-aided detection (CAD) system is widely used for screening mammography. The performance of the CAD system for contralateral breast cancer has not been reported for women with a history of breast cancer. Purpose: To retrospectively evaluate the performance of a CAD system on current and previous mammograms in patients with contralateral metachronous breast cancer. Material and Methods: During a 3-year period, 4945 postoperative patients had follow-up examinations, from whom we selected 55 women with contralateral breast cancers. Among them, 38 had visible malignant signs on the current mammograms. We analyzed the sensitivity and false-positive marks of the system on the current and previous mammograms according to lesion type and breast density. Results: The total visible lesion components on the current mammograms included 27 masses and 14 calcifications in 38 patients. The case-based sensitivity for all lesion types was 63.2% (24/38) with false-positive marks of 0.71 per patient. The lesion-based sensitivity for masses and calcifications was 59.3% (16/27) and 71.4% (10/14), respectively. The lesion-based sensitivity for masses in fatty and dense breasts was 68.8% (11/16) and 45.5% (5/11), respectively. The lesion-based sensitivity for calcifications in fatty and dense breasts was 100.0% (3/3) and 63.6% (7/11), respectively. The total visible lesion components on the previous mammograms included 13 masses and three calcifications in 16 patients, and the sensitivity for all lesion types was 31.3% (5/16) with false-positive marks of 0.81 per patient. On these mammograms, the sensitivity for masses and calcifications was 30.8% (4/13) and 33.3% (1/3), respectively. The sensitivity in fatty and dense breasts was 28.6% (2/7) and 33.3% (3/9), respectively. Conclusion: In the women with a history of breast cancer, the sensitivity of the CAD system in visible contralateral breast cancer was lower than in most previous reports using the same CAD
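
The quoted sensitivities are simple ratios of detected to visible lesions, and a few lines of Python reproduce them from the counts in the abstract (the helper name is ours, not from the paper):

```python
# Recomputing the headline figures from the abstract's counts.
def sensitivity(detected, total):
    """Percentage of lesions (or cases) the CAD system marked correctly."""
    return round(100.0 * detected / total, 1)

case_based = sensitivity(24, 38)   # all lesion types, current mammograms
mass_based = sensitivity(16, 27)   # masses, lesion-based
calc_based = sensitivity(10, 14)   # calcifications, lesion-based
```

Each value matches the percentage reported in the text for the corresponding count.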

  6. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony


    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, the general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  7. Resilient computer system design

    CERN Document Server

    Castano, Victor


    This book presents a paradigm for designing new-generation resilient and evolving computer systems, including their key concepts, elements of supportive theory, methods of analysis and synthesis of ICT with new properties of evolving functioning, as well as implementation schemes and their prototyping. The book explains why new ICT applications require a complete redesign of computer systems to address challenges of extreme reliability, high performance, and power efficiency. The authors present a comprehensive treatment for designing the next generation of computers, especially addressing safety-critical, autonomous, real-time, military, banking, and wearable health care systems. It describes design solutions for a new computer system, the evolving reconfigurable architecture (ERA), that is free from drawbacks inherent in current ICT and related engineering models, and pursues simplicity, reliability, and scalability principles of design implemented through redundancy and re-configurability; targeted for energy-,...

  8. Exploring end users' system requirements for a handheld computer supporting both sepsis test workflow and current IT solutions. (United States)

    Samson, Lasse Lefevre; Pape-Haugaard, Louise Bilenberg; Søgaard, Mette; Schønheyder, Henrik Carl; Hejlesen, Ole K


    Sepsis is a systemic response associated with very high mortality. Early initiation of the correct antimicrobial therapy remains a cornerstone in the treatment of sepsis. Currently, a new microbiological test is under development, which aims to detect major, prevalent pathogens in positive blood cultures within an hour. Concurrently, a tablet-based data entry and reporting system will be developed to facilitate the workflow of the test. This study investigated the system requirements for the tablet-based data entry and reporting system in order to support the clinical workflow. By observing the workflow of the blood culture analysis and through interviews with medical laboratory technicians, four main system requirements were identified. The system requirements are: the ability to receive and send data to the laboratory information system, support for the use of barcodes, the ability to access a browser-based instruction system, and communication of results between medical laboratory technicians and physicians. These system requirements will be used as a basis in the future development of the tablet-based data entry and reporting system.

  9. Computer system identification


    Lesjak, Borut


    The concept of computer system identity in computer science bears just as much importance as does the identity of an individual in a human society. Nevertheless, the identity of a computer system is incomparably harder to determine, because there is no standard system of identification we could use and, moreover, a computer system during its life-time is quite indefinite, since all of its regular and necessary hardware and software upgrades soon make it almost unrecognizable: after a number o...

  10. Distributed computer control systems

    Energy Technology Data Exchange (ETDEWEB)

    Suski, G.J.


    This book focuses on recent advances in the theory, applications and techniques for distributed computer control systems. Contents (partial): Real-time distributed computer control in a flexible manufacturing system. Semantics and implementation problems of channels in a DCCS specification. Broadcast protocols in distributed computer control systems. Design considerations of distributed control architecture for a thermal power plant. The conic toolset for building distributed systems. Network management issues in distributed control systems. Interprocessor communication system architecture in a distributed control system environment. Uni-level homogenous distributed computer control system and optimal system design. A-nets for DCCS design. A methodology for the specification and design of fault tolerant real time systems. An integrated computer control system - architecture design, engineering methodology and practical experience.

  11. ALMA correlator computer systems (United States)

    Pisano, Jim; Amestica, Rodrigo; Perez, Jesus


    We present a design for the computer systems which control, configure, and monitor the Atacama Large Millimeter Array (ALMA) correlator and process its output. Two distinct computer systems implement this functionality: a rack-mounted PC controls and monitors the correlator, and a cluster of 17 PCs process the correlator output into raw spectral results. The correlator computer systems interface to other ALMA computers via gigabit Ethernet networks utilizing CORBA and raw socket connections. ALMA Common Software provides the software infrastructure for this distributed computer environment. The control computer interfaces to the correlator via multiple CAN busses and the data processing computer cluster interfaces to the correlator via sixteen dedicated high speed data ports. An independent array-wide hardware timing bus connects to the computer systems and the correlator hardware ensuring synchronous behavior and imposing hard deadlines on the control and data processor computers. An aggregate correlator output of 1 gigabyte per second with 16 millisecond periods and computational data rates of approximately 1 billion floating point operations per second define other hard deadlines for the data processing computer cluster.
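
The hard deadlines quoted above translate into a simple per-period budget. A back-of-envelope check (decimal units, 1 GB = 10^9 bytes; the variable names are ours, and the even split across the 17 cluster PCs is an assumption, not stated in the text):

```python
# Per-period budgets implied by the ALMA correlator figures in the abstract.
output_rate_Bps = 1e9    # aggregate correlator output, bytes per second
period_s = 16e-3         # hard-deadline period, 16 ms
flops = 1e9              # approximate computational rate, FLOP/s
cluster_size = 17        # data-processing PCs

bytes_per_period = output_rate_Bps * period_s     # 16 MB to move each period
flop_per_period = flops * period_s                # compute budget per period
bytes_per_node = bytes_per_period / cluster_size  # naive even split (assumed)
```

So each 16 ms window must absorb 16 MB of correlator output, roughly 0.94 MB per cluster node under an even split.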

  12. Fault tolerant computing systems

    CERN Document Server

    Randell, B


    Fault tolerance involves the provision of strategies for error detection, damage assessment, fault treatment and error recovery. A survey is given of the different sorts of strategies used in highly reliable computing systems, together with an outline of recent research on the problems of providing fault tolerance in parallel and distributed computing systems. (15 refs).

  13. Computer controlled antenna system (United States)

    Raumann, N. A.


    The application of small computers using digital techniques for operating the servo and control system of large antennas is discussed. The advantages of the system are described. The techniques were evaluated with a forty-foot antenna and the Sigma V computer. Programs have been completed which drive the antenna directly without the need for a servo amplifier, antenna position programmer or a scan generator.

  14. Attacks on computer systems

    Directory of Open Access Journals (Sweden)

    Dejan V. Vuletić


    Full Text Available Computer systems are a critical component of the human society in the 21st century. Economic sector, defense, security, energy, telecommunications, industrial production, finance and other vital infrastructure depend on computer systems that operate at local, national or global scales. A particular problem is that, due to the rapid development of ICT and the unstoppable growth of its application in all spheres of the human society, their vulnerability and exposure to very serious potential dangers increase. This paper analyzes some typical attacks on computer systems.

  15. Computational approaches to analogical reasoning: current trends

    CERN Document Server

    Richard, Gilles


    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it has been early studied in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...

  16. Computation of current distributions using FEMLAB

    Energy Technology Data Exchange (ETDEWEB)

    Shankar, M.S.; Pullabhotla, S.R. [Vellore Institute of Technology, Tamilnadu (India); Vijayasekaran, B. [Central Electrochemical Research Institute, Tamilnadu (India); Chemical Engineering, Tennessee Technological University, Tennessee (United States); Basha, C.A.


    An efficient method for the computation of current density and surface concentration distributions in electrochemical processes is analyzed using the commercial mathematical software FEMLAB. To illustrate the utility of the software, the procedure is applied to some realistic problems encountered in electrochemical engineering, such as current distribution in a continuous moving electrode, parallel plate electrode, hull cell, curvilinear hull cell, thin layer galvanic cell, through-hole plating, and a recessed disc electrode. The model equations of the above cases are considered and their implementations into the software, FEMLAB, are analyzed. The technique is attractive because it involves a systematic way of coupling equations to perform case studies. (Abstract Copyright [2009], Wiley Periodicals, Inc.)
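
The simplest of the cases listed, the primary current distribution in a parallel-plate cell, reduces to Laplace's equation for the potential. The following is a finite-difference stand-in for the finite-element approach the paper takes in FEMLAB; it is a sketch only, with illustrative grid size and boundary values:

```python
# Primary current distribution between parallel plate electrodes:
# Laplace's equation solved by Jacobi iteration on a finite-difference grid.
# FEMLAB uses finite elements; this is only an illustration of the physics.
import numpy as np

n = 21                        # grid points per side
phi = np.zeros((n, n))        # electric potential
phi[0, :] = 1.0               # working electrode held at 1 V (top)
phi[-1, :] = 0.0              # counter electrode at 0 V (bottom)

for _ in range(5000):         # Jacobi sweeps until converged
    new = phi.copy()
    new[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1]
                              + phi[1:-1, :-2] + phi[1:-1, 2:])
    new[1:-1, 0] = new[1:-1, 1]     # insulating walls: zero normal gradient
    new[1:-1, -1] = new[1:-1, -2]
    phi = new
```

With insulating side walls the converged solution is the one-dimensional linear profile, so the midplane potential is 0.5 V and the current density (proportional to the potential gradient) is uniform; realistic cell geometries, like those in the paper, are where a tool such as FEMLAB earns its keep.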

  17. Current limiter circuit system

    Energy Technology Data Exchange (ETDEWEB)

    Witcher, Joseph Brandon; Bredemann, Michael V.


    An apparatus comprising a steady state sensing circuit, a switching circuit, and a detection circuit. The steady state sensing circuit is connected to a first, a second and a third node. The first node is connected to a first device, the second node is connected to a second device, and the steady state sensing circuit causes a scaled current to flow at the third node. The scaled current is proportional to a voltage difference between the first and second node. The switching circuit limits an amount of current that flows between the first and second device. The detection circuit is connected to the third node and the switching circuit. The detection circuit monitors the scaled current at the third node and controls the switching circuit to limit the amount of the current that flows between the first and second device when the scaled current is greater than a desired level.
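
The interplay of the three circuits can be summarized behaviorally: the scaled current at the third node tracks the voltage difference between the two devices, and the detection circuit opens the switch once that scaled current exceeds a set level. A minimal sketch; the gain and threshold values are illustrative, not taken from this record:

```python
# Behavioral sketch of the current limiter's detection logic.
# gain and threshold are illustrative placeholders.
def limiter_state(v1, v2, gain=1e-3, threshold=0.5e-3):
    """Return (scaled_current, switch_closed) for node voltages v1, v2."""
    scaled = gain * (v1 - v2)            # third-node current ~ (V1 - V2)
    return scaled, scaled <= threshold   # open the switch above threshold
```

A small voltage difference leaves the switch closed and current flowing; a large difference (a fault drawing excess current) trips the limiter.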

  18. Computer network defense system (United States)

    Urias, Vincent; Stout, William M. S.; Loverro, Caleb


    A method and apparatus for protecting virtual machines. A computer system creates a copy of a group of the virtual machines in an operating network in a deception network to form a group of cloned virtual machines in the deception network when the group of the virtual machines is accessed by an adversary. The computer system creates an emulation of components from the operating network in the deception network. The components are accessible by the group of the cloned virtual machines as if the group of the cloned virtual machines was in the operating network. The computer system moves network connections for the group of the virtual machines in the operating network used by the adversary from the group of the virtual machines in the operating network to the group of the cloned virtual machines, enabling protecting the group of the virtual machines from actions performed by the adversary.

  19. Computer system operation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jae; Lee, Hae Cho; Lee, Ho Yeun; Kim, Young Taek; Lee, Sung Kyu; Park, Jeong Suk; Nam, Ji Wha; Kim, Soon Kon; Yang, Sung Un; Sohn, Jae Min; Moon, Soon Sung; Park, Bong Sik; Lee, Byung Heon; Park, Sun Hee; Kim, Jin Hee; Hwang, Hyeoi Sun; Lee, Hee Ja; Hwang, In A. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)


    The report described the operation and the trouble shooting of main computer and KAERINet. The results of the project are as follows; 1. The operation and trouble shooting of the main computer system. (Cyber 170-875, Cyber 960-31, VAX 6320, VAX 11/780). 2. The operation and trouble shooting of the KAERINet. (PC to host connection, host to host connection, file transfer, electronic-mail, X.25, CATV etc.). 3. The development of applications -Electronic Document Approval and Delivery System, Installation the ORACLE Utility Program. 22 tabs., 12 figs. (Author).

  20. Defining and resolving current systems in geospace (United States)

    Ganushkina, N. Y.; Liemohn, M. W.; Dubyagin, S.; Daglis, I. A.; Dandouras, I.; De Zeeuw, D. L.; Ebihara, Y.; Ilie, R.; Katus, R.; Kubyshkina, M.; Milan, S. E.; Ohtani, S.; Ostgaard, N.; Reistad, J. P.; Tenfjord, P.; Toffoletto, F.; Zaharia, S.; Amariutei, O.


    Electric currents flowing through near-Earth space (R ≤ 12 RE) can support a highly distorted magnetic field topology, changing particle drift paths and therefore having a nonlinear feedback on the currents themselves. A number of current systems exist in the magnetosphere, most commonly defined as (1) the dayside magnetopause Chapman-Ferraro currents, (2) the Birkeland field-aligned currents with high-latitude "region 1" and lower-latitude "region 2" currents connected to the partial ring current, (3) the magnetotail currents, and (4) the symmetric ring current. In the near-Earth nightside region, however, several of these current systems flow in close proximity to each other. Moreover, the existence of other temporal current systems, such as the substorm current wedge or "banana" current, has been reported. It is very difficult to identify a local measurement as belonging to a specific system. Such identification is important, however, because how the current closes and how these loops change in space and time governs the magnetic topology of the magnetosphere and therefore controls the physical processes of geospace. Furthermore, many methods exist for identifying the regions of near-Earth space carrying each type of current. This study presents a robust collection of these definitions of current systems in geospace, particularly in the near-Earth nightside magnetosphere, as viewed from a variety of observational and computational analysis techniques. The influence of definitional choice on the resulting interpretation of physical processes governing geospace dynamics is presented and discussed.

  1. Computer Vision Systems (United States)

    Gunasekaran, Sundaram

    Food quality is of paramount consideration for all consumers, and its importance is perhaps second only to food safety. By some definitions, food safety is also incorporated into the broad categorization of food quality. Hence, the need for careful and accurate evaluation of food quality is at the forefront of research and development in both academia and industry. Among the many available methods for food quality evaluation, computer vision has proven to be the most powerful, especially for nondestructively extracting and quantifying many features that have direct relevance to food quality assessment and control. Furthermore, computer vision systems serve to rapidly evaluate the most readily observable food quality attributes: external characteristics such as color, shape, size, and surface texture. In addition, it is now possible, using advanced computer vision technologies, to “see” inside a food product and/or package to examine important quality attributes ordinarily unavailable to human evaluators. With rapid advances in electronic hardware and other associated imaging technologies, the cost-effectiveness and speed of computer vision systems have greatly improved, and many practical systems are already in place in the food industry.

  2. Computational systems chemical biology. (United States)

    Oprea, Tudor I; May, Elebeoba E; Leitão, Andrei; Tropsha, Alexander


    There is a critical need for improving the level of chemistry awareness in systems biology. The data and information related to modulation of genes and proteins by small molecules continue to accumulate at the same time as simulation tools in systems biology and whole-body physiologically based pharmacokinetics (PBPK) continue to evolve. We called this emerging area at the interface between chemical biology and systems biology "systems chemical biology" (SCB) (Nat Chem Biol 3: 447-450, 2007). The overarching goal of computational SCB is to develop tools for integrated chemical-biological data acquisition, filtering and processing, by taking into account relevant information related to interactions between proteins and small molecules, possible metabolic transformations of small molecules, as well as associated information related to genes, networks, small molecules, and, where applicable, mutants and variants of those proteins. There is as yet an unmet need to develop an integrated in silico pharmacology/systems biology continuum that embeds drug-target-clinical outcome (DTCO) triplets, a capability that is vital to the future of chemical biology, pharmacology, and systems biology. Through the development of the SCB approach, scientists will be able to start addressing, in an integrated simulation environment, questions that make the best use of our ever-growing chemical and biological data repositories at the system-wide level. This chapter reviews some of the major research concepts and describes key components that constitute the emerging area of computational systems chemical biology.

  3. Response Current from Spin-Vortex-Induced Loop Current System to Feeding Current (United States)

    Morisaki, Tsubasa; Wakaura, Hikaru; Abou Ghantous, Michel; Koizumi, Hiroyasu


    The spin-vortex-induced loop current (SVILC) is a loop current generated around a spin-vortex formed by itinerant electrons. It is generated by a U(1) instanton created by the single-valued requirement of wave functions with respect to the coordinate, and protected by the topological number, "winding number". In a system with SVILCs, a macroscopic persistent current is generated as a collection of SVILCs. In the present work, we consider the situation where external currents are fed in the SVILC system and response currents are measured as spontaneous currents that flow through leads attached to the SVILC system. The response currents from SVILC systems are markedly different from the feeding currents in their directions and magnitude, and depend on the original current pattern of the SVILC system; thus, they may be used in the readout process in the recently proposed SVILC quantum computer, a quantum computer that utilizes SVILCs as qubits. We also consider the use of the response current to detect SVILCs.

  4. The Remote Computer Control (RCC) system (United States)

    Holmes, W.


    A system to remotely control job flow on a host computer from any touchtone telephone is briefly described. Using this system a computer programmer can submit jobs to a host computer from any touchtone telephone. In addition the system can be instructed by the user to call back when a job is finished. Because of this system, every touchtone telephone becomes a conversant computer peripheral. This system, known as the Remote Computer Control (RCC) system, utilizes touchtone input, touchtone output, voice input, and voice output. The RCC system is microprocessor based and is currently using the INTEL 80/30 microcomputer. Using the RCC system a user can submit, cancel, and check the status of jobs on a host computer. The RCC system peripherals consist of a CRT for operator control, a printer for logging all activity, mass storage for the storage of user parameters, and a PROM card for program storage.

  5. HLS bunch current measurement system

    Institute of Scientific and Technical Information of China (English)


    Bunch current is an important parameter for studying the injection fill-pattern in the storage ring and the instability threshold of the bunch, and the bunch current monitor is also an indispensable tool for top-up injection. A bunch current measurement (BCM) system has been developed to meet the needs of the upgrade project of Hefei Light Source (HLS). This paper presents the layout of the BCM system. The system, based on a high-speed digital oscilloscope, can be used to measure the bunch current and the synchronous phase shift. To obtain the absolute value of the bunch-by-bunch current, the calibration coefficient is measured and analyzed. Error analysis shows that the RMS of the bunch current is less than 0.01 mA when the bunch current is about 5 mA, which meets the project requirement.
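
The calibration described amounts to scaling per-bunch oscilloscope readings so that they sum to an absolute beam-current value. A minimal sketch, assuming a total-current reference such as a DC current transformer (the function name and sample numbers are illustrative, not from the paper):

```python
# Sketch of bunch-by-bunch current reconstruction with a calibration
# coefficient k chosen so the bunch currents sum to a measured total.
def calibrate(signals, total_beam_current_mA):
    """Return per-bunch currents scaled so they sum to the reference value."""
    k = total_beam_current_mA / sum(signals)   # calibration coefficient
    return [k * s for s in signals]

# Three-bunch toy example: relative signals 0.8 : 1.0 : 1.2, 15 mA total.
bunch_currents = calibrate([0.8, 1.0, 1.2], 15.0)
```

The toy example yields 4, 5, and 6 mA, preserving the relative fill pattern while pinning the absolute scale to the reference measurement.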

  6. Energy current loss instability model on a computer (United States)

    Edighoffer, John A.


    The computer program called Energy Stability in a Recirculating Accelerator (ESRA) Free Electron Laser (FEL) has been written to model bunches of particles in longitudinal phase space traversing a recirculating accelerator and the associated rf changes and aperture current losses. This energy-current loss instability was first seen by Los Alamos's FEL group in their energy recovery experiments. This code addresses these stability issues and determines the transport, noise, feedback and other parameters for which these FEL systems are stable or unstable. Two representative systems are modeled: one is the Novosibirsk high-power FEL racetrack microtron for photochemical research; the other is the CEBAF proposed UV FEL system. Both of these systems are stable with prudent choices of parameters.

  7. Current Computer Network Security Issues/Threats

    National Research Council Canada - National Science Library

    Ammar Yassir; Alaa A K Ismaeel


    Computer network security has been a subject of concern for a long period. Many efforts have been made to address the existing and emerging threats, such as viruses and Trojans, among others, without any significant success...

  8. Computational thermodynamics in electric current metallurgy

    DEFF Research Database (Denmark)

    Bhowmik, Arghya; Qin, R.S.


    A priori derivation for the extra free energy caused by the passing electric current in metal is presented. The analytical expression and its discrete format in support of the numerical calculation of thermodynamics in electric current metallurgy have been developed. This enables the calculation of electric current distribution, current-induced temperature distribution and the free energy sequence of various phase transitions in multiphase materials. The work is particularly suitable for the study of magnetic materials that contain various magnetic phases. The latter has not been considered in the literature. The method has been validated against the analytical solution of current distribution and experimental observation of microstructure evolution. It provides a basis for the design, prediction and implementation of electric current metallurgy. The applicability of the theory is discussed in the derivations.

  9. Digital optical computers at the optoelectronic computing systems center (United States)

    Jordan, Harry F.


    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  10. The Computational Sensorimotor Systems Laboratory (United States)

    Federal Laboratory Consortium — The Computational Sensorimotor Systems Lab focuses on the exploration, analysis, modeling and implementation of biological sensorimotor systems for both scientific...


    Directory of Open Access Journals (Sweden)

    Frantisek Horvat


    Full Text Available This paper deals with the idea of an energy harvesting (EH system that uses the mechanical energy from finger presses on the buttons of a computer mouse by means of a piezomaterial (PVF2. The piezomaterial is placed in the mouse at the interface between the button and the body. This paper reviews the parameters of the PVF2 piezomaterial and tests their possible implementation into EH systems utilizing these types of mechanical interactions. The paper tests the viability of two EH concepts: a battery management system, and a semi-autonomous system. A statistical estimate of the button operations is performed for various computer activities, showing that an average of up to 3300 mouse clicks per hour was produced for gaming applications, representing a tip frequency of 0.91 Hz on the PVF2 member. This frequency is tested on the PVF2 system, and an assessment of the two EH systems is reviewed. The results show that fully autonomous systems are not suitable for capturing low-frequency mechanical interactions, due to the parameters of current piezomaterials, and the resulting very long startup phase. However, a hybrid EH system which uses available power to initiate the circuit and eliminate the startup phase may be explored for future studies.
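
    As a sanity check on the statistic above (illustrative arithmetic only, not data taken from the paper), the quoted tip frequency of roughly 0.91 Hz follows directly from 3300 clicks per hour:

    ```python
    # Convert the reported click rate into the excitation frequency seen by the
    # PVF2 member: one click per press cycle, 3600 seconds per hour.
    clicks_per_hour = 3300
    freq_hz = clicks_per_hour / 3600.0
    print(f"tip frequency ~ {freq_hz:.2f} Hz")
    ```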

  12. Computer program for allocation of generators in isolated systems of direct current using genetic algorithm; Programa computacional para alocacao de geradores em sistemas isolados de corrente continua utilizando algoritmo genetico

    Energy Technology Data Exchange (ETDEWEB)

    Gewehr, Diego N.; Vargas, Ricardo B.; Melo, Eduardo D. de; Paschoareli Junior, Dionizio [Universidade Estadual Paulista (DEE/UNESP), Ilha Solteira, SP (Brazil). Dept. de Engenharia Eletrica. Grupo de Pesquisa em Fontes Alternativas e Aproveitamento de Energia


    This paper presents a methodology for siting electric power sources in isolated direct-current microgrids using a genetic algorithm. In this work, photovoltaic panels are considered, although the methodology can be extended to any kind of DC source. A computational tool is developed in Matlab to obtain the DC system configuration that reduces the number of panels and the costs while improving system performance. (author)
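
    The allocation idea can be sketched as a small genetic algorithm over candidate panel sites. Everything below (site powers, costs, demand, GA parameters) is made-up illustration, not the paper's Matlab tool:

    ```python
    import random

    random.seed(42)

    # Hypothetical isolated DC microgrid: decide which candidate sites get a
    # PV panel so that total power covers the demand at minimum cost.
    DEMAND_W = 500.0
    PANEL_W = [120, 90, 150, 110, 80, 130, 100, 95]   # deliverable power per site
    PANEL_COST = [10, 7, 14, 9, 6, 12, 8, 7]          # relative installed cost
    N = len(PANEL_W)

    def fitness(bits):
        power = sum(w for w, b in zip(PANEL_W, bits) if b)
        cost = sum(c for c, b in zip(PANEL_COST, bits) if b)
        penalty = 1000.0 if power < DEMAND_W else 0.0  # punish infeasible layouts
        return cost + penalty

    def evolve(pop_size=40, generations=60, p_mut=0.05):
        pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness)
            survivors = pop[: pop_size // 2]           # truncation selection
            children = []
            while len(children) < pop_size - len(survivors):
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, N)           # one-point crossover
                child = a[:cut] + b[cut:]
                child = [bit ^ (random.random() < p_mut) for bit in child]  # mutation
                children.append(child)
            pop = survivors + children
        return min(pop, key=fitness)

    best = evolve()
    print("best layout:", best, "cost:", fitness(best))
    ```

    Because the top half of each generation survives unchanged, the best feasible layout found so far is never lost.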

  13. Secure computing on reconfigurable systems

    NARCIS (Netherlands)

    Fernandes Chaves, R.J.


    This thesis proposes a Secure Computing Module (SCM) for reconfigurable computing systems. SC provides a protected and reliable computational environment, where data security and protection against malicious attacks to the system is assured. SC is strongly based on encryption algorithms and on the

  15. Computer systems a programmer's perspective

    CERN Document Server

    Bryant, Randal E


    Computer Systems: A Programmer’s Perspective explains the underlying elements common to all computer systems and how they affect general application performance. Written from the programmer’s perspective, this book strives to teach readers how understanding the basic elements of computer systems and engaging in real practice can lead them to create better programs. Spanning computer science themes such as hardware architecture, the operating system, and systems software, the Third Edition serves as a comprehensive introduction to programming. This book strives to create programmers who understand all elements of computer systems and will be able to engage in any application of the field--from fixing faulty software, to writing more capable programs, to avoiding common flaws. It lays the groundwork for readers to delve into more intensive topics such as computer architecture, embedded systems, and cybersecurity. This book focuses on systems that execute x86-64 machine code, and recommends th...

  16. Central nervous system and computation. (United States)

    Guidolin, Diego; Albertin, Giovanna; Guescini, Michele; Fuxe, Kjell; Agnati, Luigi F


    Computational systems are useful in neuroscience in many ways. For instance, they may be used to construct maps of brain structure and activation, or to describe brain processes mathematically. Furthermore, they inspired a powerful theory of brain function, in which the brain is viewed as a system characterized by intrinsic computational activities or as a "computational information processor." Although many neuroscientists believe that neural systems really perform computations, some are more cautious about computationalism or reject it. Thus, does the brain really compute? Answering this question requires getting clear on a definition of computation that is able to draw a line between physical systems that compute and systems that do not, so that we can discern on which side of the line the brain (or parts of it) could fall. In order to shed some light on the role of computational processes in brain function, available neurobiological data will be summarized from the standpoint of a recently proposed taxonomy of notions of computation, with the aim of identifying which brain processes can be considered computational. The emerging picture shows the brain as a very peculiar system, in which genuine computational features act in concert with noncomputational dynamical processes, leading to continuous self-organization and remodeling under the action of external stimuli from the environment and from the rest of the organism.

  17. Computer-aided dispatching system design specification

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, M.G.


    This document defines the performance requirements for a graphic display dispatching system to support the Hanford Patrol Operations Center. It reflects the as-built requirements for the system delivered by GTE Northwest, Inc.: a commercial off-the-shelf computer-aided dispatching system and alarm monitoring system currently in operation at the Hanford Patrol Operations Center, Building 2721E. The system also provides alarm back-up capability for the Plutonium Finishing Plant (PFP).

  18. Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Friday, Adrian


    First introduced two decades ago, the term ubiquitous computing is now part of the common vernacular. Ubicomp, as it is commonly called, has grown not just quickly but broadly so as to encompass a wealth of concepts and technology that serves any number of purposes across all of human endeavor... an original ubicomp pioneer, Ubiquitous Computing Fundamentals brings together eleven ubiquitous computing trailblazers who each report on his or her area of expertise. Starting with a historical introduction, the book moves on to summarize a number of self-contained topics. Taking a decidedly human perspective, the book includes discussion on how to observe people in their natural environments and evaluate the critical points where ubiquitous computing technologies can improve their lives. Among a range of topics this book examines: How to build an infrastructure that supports ubiquitous computing...

  19. Capability-based computer systems

    CERN Document Server

    Levy, Henry M


    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  20. New computing systems and their impact on computational mechanics (United States)

    Noor, Ahmed K.


    Recent advances in computer technology that are likely to impact computational mechanics are reviewed. The technical needs for computational mechanics technology are outlined. The major features of new and projected computing systems, including supersystems, parallel processing machines, special-purpose computing hardware, and small systems are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed, and a novel partitioning strategy is outlined for maximizing the degree of parallelism on multiprocessor computers with a shared memory.

  1. A Management System for Computer Performance Evaluation. (United States)


    large unused capacity indicates a potential cost-performance improvement (i.e., the potential to perform more within current costs or to reduce costs)...necessary to bring the performance of the computer system in line with operational goals (Ref. 18:7). The General Accounting Office estimates that the...tasks in attempting to improve the efficiency and effectiveness of their computer systems. Cost began to play an important role in the life of a

  2. Trusted computing for embedded systems

    CERN Document Server

    Soudris, Dimitrios; Anagnostopoulos, Iraklis


    This book describes the state-of-the-art in trusted computing for embedded systems. It shows how a variety of security and trusted computing problems are addressed currently and what solutions are expected to emerge in the coming years. The discussion focuses on attacks aimed at hardware and software for embedded systems, and the authors describe specific solutions to create security features. Case studies are used to present new techniques designed as industrial security solutions. Coverage includes development of tamper resistant hardware and firmware mechanisms for lightweight embedded devices, as well as those serving as security anchors for embedded platforms required by applications such as smart power grids, smart networked and home appliances, environmental and infrastructure sensor networks, etc.
    * Enables readers to address a variety of security threats to embedded hardware and software;
    * Describes design of secure wireless sensor networks, to address secure authen...

  3. Computational structures technology at Grumman: Current practice/future needs (United States)

    Pifko, Allan B.; Eidinoff, Harvey


    The current practice for the design analysis of new airframe structural systems is to construct a master finite element model of the vehicle in order to develop internal load distributions. The inputs to this model include the geometry which is taken directly from CADAM and CATIA structural layout and aerodynamic loads and mass distribution computer models. This master model is sufficiently detailed to define major load paths and for the computation of dynamic mode shapes and structural frequencies, but not detailed enough to define local stress gradients and notch stresses. This master model is then used to perform structural optimization studies that will provide minimum weights for major structural members. The post-processed output from the master model, load, stress, and strain analysis is then used by structural analysts to perform detailed stress analysis of local regions in order to design local structure with all its required details. This local analysis consists of hand stress analysis and life prediction analysis with the assistance of manuals, design charts, computer stress and structural life analysis and sometimes finite element or boundary element analysis. The resulting design is verified by fatigue tests.

  4. Computer Security Systems Enable Access. (United States)

    Riggen, Gary


    A good security system enables access and protects information from damage or tampering, but the most important aspects of a security system aren't technical. A security procedures manual addresses the human element of computer security. (MLW)

  5. Current Trends in Computer-Based Education in Medicine (United States)

    Farquhar, Barbara B.; Votaw, Robert G.


    Important current trends in the use of computer technology to enhance medical education are reported in the areas of simulation and assessment of clinical competence, curriculum integration, financial support, and means of exchanging views and scientific information. (RAO)

  6. Energy efficient distributed computing systems

    CERN Document Server

    Lee, Young-Choon


    The energy consumption issue in distributed computing systems raises various monetary, environmental and system performance concerns. Electricity consumption in the US doubled from 2000 to 2005.  From a financial and environmental standpoint, reducing the consumption of electricity is important, yet these reforms must not lead to performance degradation of the computing systems.  These contradicting constraints create a suite of complex problems that need to be resolved in order to lead to 'greener' distributed computing systems.  This book brings together a group of outsta

  7. Dynamical Systems Some Computational Problems

    CERN Document Server

    Guckenheimer, J; Guckenheimer, John; Worfolk, Patrick


    We present several topics involving the computation of dynamical systems. The emphasis is on work in progress and the presentation is informal -- there are many technical details which are not fully discussed. The topics are chosen to demonstrate the various interactions between numerical computation and mathematical theory in the area of dynamical systems. We present an algorithm for the computation of stable manifolds of equilibrium points, describe the computation of Hopf bifurcations for equilibria in parametrized families of vector fields, survey the results of studies of codimension two global bifurcations, discuss a numerical analysis of the Hodgkin and Huxley equations, and describe some of the effects of symmetry on local bifurcation.
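
    As a taste of the numerical side of such work, the first step behind stable-manifold and bifurcation computations is locating an equilibrium of the vector field. The sketch below applies Newton's method to a damped-pendulum system; the system, starting point, and tolerances are illustrative choices, not taken from the paper:

    ```python
    import math

    # Planar vector field of a damped pendulum: x' = y, y' = -sin(x) - 0.5*y.
    def f(x, y):
        return (y, -math.sin(x) - 0.5 * y)

    def jacobian(x, y):
        return ((0.0, 1.0), (-math.cos(x), -0.5))

    def newton(x, y, tol=1e-12, max_iter=50):
        for _ in range(max_iter):
            fx, fy = f(x, y)
            if abs(fx) + abs(fy) < tol:
                break
            (a, b), (c, d) = jacobian(x, y)
            det = a * d - b * c
            dx = (fx * d - b * fy) / det   # solve J [dx, dy]^T = f by Cramer's rule
            dy = (a * fy - fx * c) / det
            x, y = x - dx, y - dy
        return x, y

    eq = newton(3.0, 0.2)   # converges to the equilibrium at (pi, 0)
    print(eq)
    ```

    The Jacobian computed at the equilibrium is also the starting point for classifying it and growing its stable manifold.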

  8. Computational Systems Chemical Biology


    Oprea, Tudor I.; Elebeoba E. May; Leitão, Andrei; Tropsha, Alexander


    There is a critical need for improving the level of chemistry awareness in systems biology. The data and information related to the modulation of genes and proteins by small molecules continue to accumulate, at the same time as simulation tools in systems biology and whole-body physiologically-based pharmacokinetics (PBPK) continue to evolve. We called this emerging area at the interface between chemical biology and systems biology "systems chemical biology", SCB (Oprea et al., 2007).

  9. Hybridity in Embedded Computing Systems

    Institute of Scientific and Technical Information of China (English)

    虞慧群; 孙永强


    An embedded system is a system in which a computer is used as a component of a larger device. In this paper, we study hybridity in embedded systems and present an interval-based temporal logic to express and reason about the hybrid properties of such systems.

  10. Building Low Cost Cloud Computing Systems

    Directory of Open Access Journals (Sweden)

    Carlos Antunes


    Full Text Available The current models of cloud computing are based on oversized hardware solutions whose implementation and maintenance are unaffordable for the majority of service providers. The use of jail services is an alternative to current models of cloud computing based on virtualization. Models based on jail environments instead of virtualization systems can provide large gains in the optimization of hardware resources at the computation level, as well as in storage and energy consumption. This paper addresses the practical implementation of jail environments in real scenarios, which reveals the areas where their application is relevant and makes inevitable the redefinition of the models currently defined for cloud computing. In addition, it brings new opportunities for the development of support features for jail environments in the majority of operating systems.

  11. Computer algebra in systems biology

    CERN Document Server

    Laubenbacher, Reinhard


    Systems biology focuses on the study of entire biological systems rather than on their individual components. With the emergence of high-throughput data generation technologies for molecular biology and the development of advanced mathematical modeling techniques, this field promises to provide important new insights. At the same time, with the availability of increasingly powerful computers, computer algebra has developed into a useful tool for many applications. This article illustrates the use of computer algebra in systems biology by way of a well-known gene regulatory network, the Lac Operon in the bacterium E. coli.
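
    The gene-regulatory-network idea can be illustrated with a toy Boolean model iterated to a fixed point. The three update rules below are simplified assumptions in the spirit of the lac operon, not the article's actual model:

    ```python
    # Toy Boolean rules (illustrative): mRNA (M) is transcribed when lactose (L)
    # is present and glucose is absent; the enzymes (E) follow the mRNA with a
    # one-step delay; internal lactose persists while external lactose is
    # available and either the enzymes or internal lactose already exist.
    def step(state, glucose, lactose_ext):
        M, E, L = state
        return (
            L and not glucose,          # M'
            M,                          # E'
            lactose_ext and (E or L),   # L'
        )

    def fixed_point(state, glucose, lactose_ext, max_steps=10):
        for _ in range(max_steps):
            nxt = step(state, glucose, lactose_ext)
            if nxt == state:
                return state
            state = nxt
        return state

    # External lactose present, no glucose: the operon switches fully on.
    print(fixed_point((False, False, True), glucose=False, lactose_ext=True))
    ```

    Fixed points of such update rules are exactly the steady states that computer algebra systems can solve for symbolically.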

  12. Students "Hacking" School Computer Systems (United States)

    Stover, Del


    This article deals with students hacking school computer systems. School districts are getting tough with students "hacking" into school computers to change grades, poke through files, or just pit their high-tech skills against district security. Dozens of students have been prosecuted recently under state laws on identity theft and unauthorized…

  14. Current Cloud Computing Review and Cost Optimization by DERSP

    Directory of Open Access Journals (Sweden)

    M. Gomathy


    Full Text Available Cloud computing promises to deliver cost saving through the “pay as you use” paradigm. The focus is on adding computing resources when needed and releasing them when the need is serviced. Since cloud computing relies on providing computing power through multiple interconnected computers, there is a paradigm shift from one large machine to a combination of multiple smaller machine instances. In this paper, we review the current cloud computing scenario and provide a set of recommendations that can be used for designing custom applications suited for cloud deployment. We also present a comparative study on the change in cost incurred while using different combinations of machine instances for running an application on cloud; and derive the case for optimal cost
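
    The instance-combination trade-off described above can be sketched as a tiny brute-force cost search. All instance names, sizes, and prices below are hypothetical, not the paper's measured data:

    ```python
    from itertools import product

    INSTANCES = {            # name: (compute units, $ per hour) -- made-up prices
        "small":  (1, 0.05),
        "medium": (2, 0.09),
        "large":  (4, 0.20),
    }
    REQUIRED_UNITS = 10      # capacity the application needs

    def cheapest_mix(max_each=10):
        """Exhaustively search instance counts; return (cost, counts) of the
        cheapest combination meeting the capacity requirement."""
        best = None
        names = list(INSTANCES)
        for counts in product(range(max_each + 1), repeat=len(names)):
            units = sum(n * INSTANCES[k][0] for n, k in zip(counts, names))
            cost = sum(n * INSTANCES[k][1] for n, k in zip(counts, names))
            if units >= REQUIRED_UNITS and (best is None or cost < best[0]):
                best = (cost, dict(zip(names, counts)))
        return best

    cost, mix = cheapest_mix()
    print(f"cheapest: {mix} at ${cost:.2f}/h")
    ```

    With these example prices the medium instance has the lowest cost per unit, so the search settles on five of them rather than one big machine.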

  15. Current Computational Challenges for CMC Processes, Properties, and Structures (United States)

    DiCarlo, James


    In comparison to current state-of-the-art metallic alloys, ceramic matrix composites (CMC) offer a variety of performance advantages, such as higher temperature capability (greater than the approx. 2100 F capability for best metallic alloys), lower density (approx. 30-50% metal density), and lower thermal expansion. In comparison to other competing high-temperature materials, CMC are also capable of providing significantly better static and dynamic toughness than un-reinforced monolithic ceramics and significantly better environmental resistance than carbon-fiber reinforced composites. Because of these advantages, NASA, the Air Force, and other U.S. government agencies and industries are currently seeking to implement these advanced materials into hot-section components of gas turbine engines for both propulsion and power generation. For applications such as these, CMC are expected to result in many important performance benefits, such as reduced component cooling air requirements, simpler component design, reduced weight, improved fuel efficiency, reduced emissions, higher blade frequencies, reduced blade clearances, and higher thrust. Although much progress has been made recently in the development of CMC constituent materials and fabrication processes, major challenges still remain for implementation of these advanced composite materials into viable engine components. The objective of this presentation is to briefly review some of those challenges that are generally related to the need to develop physics-based computational approaches to allow CMC fabricators and designers to model (1) CMC processes for fiber architecture formation and matrix infiltration, (2) CMC properties of high technical interest such as multidirectional creep, thermal conductivity, matrix cracking stress, damage accumulation, and degradation effects in aggressive environments, and (3) CMC component life times when all of these effects are interacting in a complex stress and service

  16. Robot computer problem solving system (United States)

    Becker, J. D.; Merriam, E. W.


    The conceptual, experimental, and practical aspects of the development of a robot computer problem-solving system were investigated. The distinctive characteristics of the approach taken were formulated in relation to various studies of cognition and robotics. Vehicle and eye control systems were structured, and the information to be generated by the visual system was defined.

  17. Operating systems. [of computers (United States)

    Denning, P. J.; Brown, R. L.


    A computer operating system creates a hierarchy of levels of abstraction, so that at a given level all details concerning lower levels can be ignored. This hierarchical structure separates functions according to their complexity, characteristic time scale, and level of abstraction. The lowest levels include the system's hardware; concepts associated explicitly with the coordination of multiple tasks appear at intermediate levels, which conduct 'primitive processes'. The software semaphore is the mechanism controlling primitive processes that must be synchronized. At higher levels lie, in rising order, access to the secondary storage devices of a particular machine, a 'virtual memory' scheme for managing the main and secondary memories, communication between processes by way of a mechanism called a 'pipe', access to external input and output devices, and a hierarchy of directories cataloguing the hardware and software objects to which access must be controlled.
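
    The semaphore mechanism mentioned above can be sketched in a few lines. The two "processes" here are ordinary threads and the producer/consumer scenario is illustrative:

    ```python
    import threading

    # A semaphore starting at 0 forces process B to run its step only after
    # process A has signalled (V/release), no matter how the threads are scheduled.
    ready = threading.Semaphore(0)
    events = []

    def process_a():
        events.append("A: produced data")
        ready.release()              # V operation: wake the waiter

    def process_b():
        ready.acquire()              # P operation: block until A signals
        events.append("B: consumed data")

    b = threading.Thread(target=process_b)
    a = threading.Thread(target=process_a)
    b.start()                        # started first, but must wait
    a.start()
    a.join()
    b.join()
    print(events)                    # always A before B
    ```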

  18. Fundamentals, current state of the development of, and prospects for further improvement of the new-generation thermal-hydraulic computational HYDRA-IBRAE/LM code for simulation of fast reactor systems (United States)

    Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.


    The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational HYDRA-IBRAE/LM code are presented. The code is intended to simulate the thermal-hydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal cooled fast reactor systems under normal operation and anticipated operational occurrences and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and gives grounds for the necessity of development of a new-generation HYDRA-IBRAE/LM code. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and the phenomena are singled out that require a detailed analysis and development of the models to be correctly described by the system thermal-hydraulic code in question. Information on the functionality of the computational code is provided, viz., the thermal-hydraulic two-phase model, the properties of the sodium and the lead coolants, the closing equations for simulation of the heat-mass exchange processes, the models to describe the processes that take place during the steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as possibilities of taking advantage of modern computer technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code; it also presents information on the principles of constructing and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, introduction of new models into it, and enhancement of its usability.
It is shown that the program of development and

  19. Computer System Design System-on-Chip

    CERN Document Server

    Flynn, Michael J


    The next generation of computer system designers will be less concerned about details of processors and memories, and more concerned about the elements of a system tailored to particular applications. These designers will have a fundamental knowledge of processors and other elements in the system, but the success of their design will depend on the skills in making system-level tradeoffs that optimize the cost, performance and other attributes to meet application requirements. This book provides a new treatment of computer system design, particularly for System-on-Chip (SOC), which addresses th

  20. On Dependability of Computing Systems

    Institute of Scientific and Technical Information of China (English)

    XU Shiyi


    With the rapid development and wide application of computing systems, on which more and more reliance is being put, a dependable system will be much more important than ever. This paper is first aimed at giving informal but precise definitions characterizing the various attributes of dependability of computing systems, and then the importance of (and the relationships among) all the attributes are explained. Dependability is first introduced as a global concept which subsumes the usual attributes of reliability, availability, maintainability, safety and security. The basic definitions given here are then commented on and supplemented by detailed material and additional explanations in the subsequent sections. The presentation has been structured as follows so as to attract the reader's attention to the important attributes of dependability:
    * Search for a small number of concise concepts enabling the dependability attributes to be expressed as clearly as possible.
    * Use of terms which are identical or as close as possible to those commonly used nowadays.
    This paper is also intended to provoke people's interest in designing a dependable computing system.

  1. Conflict Resolution in Computer Systems

    Directory of Open Access Journals (Sweden)

    G. P. Mojarov


    A shortcoming of preventing impasses is the need for a priori information on the future demand for resources, which is not always available. One way to combat impasses when there is no a priori information on the process demand for resources is to detect deadlocks. Deadlock detection (without yet resolving the deadlock) is the periodic use of an algorithm that checks the current distribution of resources to reveal whether an impasse exists and, if so, which processes are involved in it. The objective of this work is to develop methods and algorithms that minimize the losses caused by impasses in computer systems (CS) using an optimal strategy of conflict resolution. The offered approach is especially effective for eliminating deadlocks in management (control) computer systems with a fixed set of programs. The article offers an efficient strategy for managing information processes in multiprocessing CS that detects and prevents impasses. The strategy is based on allocating indivisible resources to computing processes so that the losses caused by conflicts are minimized. The article studies a multi-criterion problem of allocating indivisible resources to processes, with the optimality principle expressed by a known binary relation over the set of average vectors of penalties for conflicts in each of the resources. It is shown that combining tools from decision theory with classical ones allows a more efficient solution of the deadlock-elimination problem. A distinctive feature of the suggested methods and algorithms is that they can be used during CS development and operation in real time. The example given in the article shows that the proposed method and algorithm for impasse resolution in multiprocessing CS are workable and promising. The offered method and algorithm reduce the average number of CS conflicts by 30-40 %.
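
    The periodic detection step described above can be sketched as cycle detection in a wait-for graph: processes are nodes, an edge P→Q means P waits for a resource held by Q, and a cycle is an impasse. Process names and graphs below are illustrative:

    ```python
    def find_deadlock(wait_for):
        """Return one cycle (list of processes) in the wait-for graph, or None."""
        WHITE, GRAY, BLACK = 0, 1, 2          # unvisited / on current path / done
        color = {p: WHITE for p in wait_for}
        stack = []

        def dfs(p):
            color[p] = GRAY
            stack.append(p)
            for q in wait_for.get(p, ()):
                if color.get(q, WHITE) == GRAY:           # back edge -> cycle
                    return stack[stack.index(q):]
                if color.get(q, WHITE) == WHITE and (cyc := dfs(q)):
                    return cyc
            stack.pop()
            color[p] = BLACK
            return None

        for p in list(wait_for):
            if color[p] == WHITE and (cyc := dfs(p)):
                return cyc
        return None

    # P1 waits for P2, P2 waits for P3, P3 waits for P1: a classic impasse.
    print(find_deadlock({"P1": ["P2"], "P2": ["P3"], "P3": ["P1"]}))
    print(find_deadlock({"P1": ["P2"], "P2": ["P3"], "P3": []}))   # no cycle
    ```

    The processes returned in the cycle are exactly the candidates for the resolution step (e.g., rolling one of them back).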

  2. Computational Intelligence for Engineering Systems

    CERN Document Server

    Madureira, A; Vale, Zita


    "Computational Intelligence for Engineering Systems" provides an overview and original analysis of new developments and advances in several areas of computational intelligence. Computational Intelligence have become the road-map for engineers to develop and analyze novel techniques to solve problems in basic sciences (such as physics, chemistry and biology) and engineering, environmental, life and social sciences. The contributions are written by international experts, who provide up-to-date aspects of the topics discussed and present recent, original insights into their own experien

  3. Computing at DESY — current setup, trends and strategic directions (United States)

    Ernst, Michael


    Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. Having run mainframe-centred computing for more than 20 years, DESY switched to a heterogeneous, fully distributed computing environment within only about two years in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever-increasing demands are the driving force for central computing to always move close to the technology edge. While DESY's central computing has multi-decade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide clear and homogeneous concepts in the desktop area. Given that lowest-level commodity hardware draws more and more attention, combined with the financial constraints we are facing already today, we quickly need concepts for integrated support of a versatile device which has the potential to move into basically any computing area in HEP. Though commercial solutions, especially addressing the PC management/support issues, are expected to come to market in the next 2-3 years, we need to provide suitable solutions now. Buying PCs at DESY currently at a rate of about 30/month will otherwise absorb any available manpower in central computing and still leave hundreds of unhappy people alone. Though certainly not the only area, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration, meaning sharing experience and development resources within the HEP community, is a predominant factor for us.

  4. Cloud Computing for Standard ERP Systems

    DEFF Research Database (Denmark)

    Schubert, Petra; Adisa, Femi

Cloud Computing is a topic that has gained momentum in recent years. Current studies show that an increasing number of companies are evaluating the promised advantages and considering making use of cloud services. In this paper we investigate the phenomenon of cloud computing and its importance for the operation of ERP systems. We argue that the phenomenon of cloud computing could lead to a decisive change in the way business software is deployed in companies. Our reference framework contains three levels (IaaS, PaaS, SaaS) and clarifies the meaning of public, private and hybrid clouds. The three levels of cloud computing and their impact on ERP systems operation are discussed. From the literature we identify areas for future research and propose a research agenda.

  5. Computers in Information Sciences: On-Line Systems. (United States)


  6. Littoral drift computations on mutual wave and current influence

    NARCIS (Netherlands)

    Bijker, E.W.


At the 11th Conference on Coastal Engineering in London in 1968, the author presented a method for computing the littoral drift starting from the longshore current velocity as generated by the waves, under the assumption that the material is stirred up by the waves. In this paper, measurements in a m...

  7. Computed tomography: acquisition process, technology and current state

    Directory of Open Access Journals (Sweden)

    Óscar Javier Espitia Mendoza


Full Text Available Computed tomography is a noninvasive scanning technique widely applied in areas such as medicine, industry, and geology. This technique allows the three-dimensional reconstruction of the internal structure of an object which is illuminated with an X-ray source. The reconstruction is formed from two-dimensional cross-sectional images of the object. Each cross-section is obtained from measurements of physical phenomena, such as attenuation, dispersion, and diffraction of X-rays, as a result of their interaction with the object. In general, measurement acquisition is performed with methods based on any of these phenomena and according to various architectures classified in generations. Furthermore, in response to the need to simulate acquisition systems for CT, software dedicated to this task has been developed. The objective of this research is to determine the current state of CT techniques; to this end, a review of methods, of the different architectures used for acquisition, and of some of their applications is presented. Additionally, results of simulations are presented. The main contributions of this work are the detailed description of acquisition methods and the presentation of possible trends of the technique.

  8. A New System Architecture for Pervasive Computing

    CERN Document Server

    Ismail, Anis; Ismail, Ziad


We present a new system architecture: a distributed framework designed to support pervasive computing applications. We propose an architecture consisting of a search engine and peripheral clients that addresses issues in scalability, data sharing, data transformation and inherent platform heterogeneity. A key feature of our application is a type-aware data transport that is capable of extracting data and presenting it through handheld devices (PDAs (personal digital assistants), mobile phones, etc.). Pervasive computing uses web technology, portable devices, wireless communications and nomadic or ubiquitous computing systems. The web, and the simple standard HTTP protocol that it is based on, facilitate this kind of ubiquitous access. This can be implemented on a variety of devices - PDAs, laptops, information appliances such as digital cameras and printers. Mobile users get transparent access to resources outside their current environment. We discuss our system's architecture and its implementation. Through experimental...

  9. Computational chemistry reviews of current trends v.4

    CERN Document Server


This volume presents a balanced blend of methodological and applied contributions. It supplements the first three volumes of the series well, revealing results of current research in computational chemistry. It also reviews the topographical features of several molecular scalar fields. A brief discussion of topographical concepts is followed by examples of their application to several branches of chemistry. The size of a basis set applied in a calculation determines the amount of computer resources necessary for a particular task. The details of a common strategy - the ab initio model potential...

  10. Aging and computational systems biology. (United States)

    Mooney, Kathleen M; Morgan, Amy E; Mc Auley, Mark T


    Aging research is undergoing a paradigm shift, which has led to new and innovative methods of exploring this complex phenomenon. The systems biology approach endeavors to understand biological systems in a holistic manner, by taking account of intrinsic interactions, while also attempting to account for the impact of external inputs, such as diet. A key technique employed in systems biology is computational modeling, which involves mathematically describing and simulating the dynamics of biological systems. Although a large number of computational models have been developed in recent years, these models have focused on various discrete components of the aging process, and to date no model has succeeded in completely representing the full scope of aging. Combining existing models or developing new models may help to address this need and in so doing could help achieve an improved understanding of the intrinsic mechanisms which underpin aging.
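The computational modeling the abstract describes typically means expressing a biological process as differential equations and simulating them. The following sketch is purely illustrative and not drawn from the article: a hypothetical one-variable model in which damage accumulates at a constant rate while repair capacity saturates as damage grows, integrated with a simple forward-Euler step.

```python
# Minimal sketch of the kind of ODE model used in computational systems
# biology of aging. The equation and every parameter here are invented for
# illustration: damage D accumulates at rate a and is repaired at an
# effective rate r*D/(1+D), i.e. repair capacity saturates as D grows.

def simulate_damage(a=0.05, r=0.4, d0=0.0, dt=0.1, steps=1000):
    """Forward-Euler integration of dD/dt = a - r*D/(1+D)."""
    d = d0
    trajectory = [d]
    for _ in range(steps):
        d += dt * (a - r * d / (1.0 + d))
        trajectory.append(d)
    return trajectory

traj = simulate_damage()
print(f"final damage level: {traj[-1]:.3f}")
```

Even a toy model like this illustrates the workflow: the trajectory rises monotonically toward the steady state where accumulation balances repair (here D* = 1/7), and real models differ mainly in having many coupled variables and experimentally fitted parameters.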

  11. Adaptive Fuzzy Systems in Computational Intelligence (United States)

    Berenji, Hamid R.


In recent years, interest in computational intelligence techniques, which currently include neural networks, fuzzy systems, and evolutionary programming, has grown significantly, and a number of their applications have been developed in government and industry. In the future, an essential element in these systems will be fuzzy systems that can learn from experience by using neural networks to refine their performance. The GARIC architecture, introduced earlier, is an example of a fuzzy reinforcement learning system which has been applied in several control domains such as cart-pole balancing, simulation of Space Shuttle orbital operations, and tether control. A number of examples from GARIC's applications in these domains will be demonstrated.

  12. Computational Systems for Multidisciplinary Applications (United States)

    Soni, Bharat; Haupt, Tomasz; Koomullil, Roy; Luke, Edward; Thompson, David


    In this paper, we briefly describe our efforts to develop complex simulation systems. We focus first on four key infrastructure items: enterprise computational services, simulation synthesis, geometry modeling and mesh generation, and a fluid flow solver for arbitrary meshes. We conclude by presenting three diverse applications developed using these technologies.

  13. The current state of computing in building design and practise

    DEFF Research Database (Denmark)

    Sørensen, Lars Schiøtt; Andersen, Tom


The paper outlines a general survey of computer use in the Danish AEC sector, including a detailed study of the use of knowledge-based systems. It is concluded that the use of AI-based technology is next to nothing, simply because of a lack of awareness of such technology.

  14. Parallel Computational Fluid Dynamics: Current Status and Future Requirements (United States)

    Simon, Horst D.; VanDalsem, William R.; Dagum, Leonardo; Kutler, Paul (Technical Monitor)


One of the key objectives of the Applied Research Branch in the Numerical Aerodynamic Simulation (NAS) Systems Division at NASA Ames Research Center is the accelerated introduction of highly parallel machines into a full operational environment. In this report we discuss the performance results obtained from the implementation of some computational fluid dynamics (CFD) applications on the Connection Machine CM-2 and the Intel iPSC/860. We summarize some of the experience gained so far with the parallel testbed machines at the NAS Applied Research Branch. Then we discuss the long-term computational requirements for accomplishing some of the grand challenge problems in computational aerosciences. We argue that only massively parallel machines will be able to meet these grand challenge requirements, and we outline the computer science and algorithm research challenges ahead.

  15. A microcomputer based system for current-meter data acquisition (United States)

    Cheng, R.T.; Gartner, J.W.


The U.S. Geological Survey is conducting current measurements as part of an interdisciplinary study of the San Francisco Bay estuarine system. The current meters used in the study record current speed, direction, temperature, and conductivity in digital codes on magnetic tape cartridges. Upon recovery of the current meters, the data tapes are translated by a tape reader into computer codes for further analysis. Quite often the importance of the data-processing phase of a current-measurement program is underestimated and downplayed. In this paper a data-processing system which performs the complete data processing and analysis is described. The system, which is configured around an LSI-11 microcomputer, has been assembled to provide the capabilities of data translation, reduction, tabulation, and graphical display immediately following recovery of current meters. The flexibility inherent in a microcomputer has also made it possible to perform many other research functions which would normally be done on an institutional computer.

  16. Computational Aeroacoustic Analysis System Development (United States)

    Hadid, A.; Lin, W.; Ascoli, E.; Barson, S.; Sindir, M.


Many industrial and commercial products operate in a dynamic flow environment, and aerodynamically generated noise has become a very important factor in the design of these products. In light of the importance of characterizing this dynamic environment, Rocketdyne has initiated a multiyear effort to develop an advanced general-purpose Computational Aeroacoustic Analysis System (CAAS) to address these issues. This system will provide a high-fidelity predictive capability for aeroacoustic design and analysis. The numerical platform is able to provide the high temporal and spatial accuracy required for aeroacoustic calculations through the development of a high-order spectral element numerical algorithm. The analysis system is integrated with well-established CAE tools, such as a graphical user interface (GUI) through PATRAN, to provide cost-effective access to all of the necessary tools. These include preprocessing (geometry import, grid generation and boundary condition specification), code setup (problem specification, user parameter definition, etc.), and postprocessing. The purpose of the present paper is to assess the feasibility of such a system and to demonstrate the efficiency and accuracy of the numerical algorithm through numerical examples. Computations of vortex shedding noise were carried out in the context of a two-dimensional low-Mach-number turbulent flow past a square cylinder. The computational aeroacoustic approach used in CAAS relies on coupling a base flow solver to the acoustic solver throughout a computational cycle. The unsteady fluid motion, which is responsible for both the generation and propagation of acoustic waves, is calculated using a high-order flow solver. The results of the flow field are then passed to the acoustic solver through an interpolator to map the field values onto the acoustic grid. The acoustic field, which is governed by the linearized Euler equations, is then calculated using the flow results computed...
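The interpolation step mentioned in the abstract (mapping flow-solver results onto the acoustic grid) can be sketched in its simplest form. This is not the CAAS code: it is a one-dimensional, piecewise-linear stand-in for what a production coupler does on multidimensional spectral-element grids, with all grids and values invented here.

```python
import bisect

# Illustrative sketch of the flow-to-acoustic coupling step: samples of a
# flow field on a coarse grid are linearly interpolated onto a finer
# acoustic grid before the acoustic solve. 1D only; grids are invented.

def interpolate_to_acoustic_grid(flow_x, flow_vals, acoustic_x):
    """Linearly interpolate (flow_x, flow_vals) samples onto acoustic_x."""
    out = []
    for x in acoustic_x:
        i = bisect.bisect_right(flow_x, x)
        i = min(max(i, 1), len(flow_x) - 1)   # clamp to a valid segment
        x0, x1 = flow_x[i - 1], flow_x[i]
        v0, v1 = flow_vals[i - 1], flow_vals[i]
        t = (x - x0) / (x1 - x0)
        out.append(v0 + t * (v1 - v0))
    return out

# Coarse flow grid and a finer acoustic grid on the same interval.
fx = [0.0, 1.0, 2.0, 3.0]
fv = [0.0, 2.0, 4.0, 6.0]             # a linear field v(x) = 2x
ax = [0.5 * k for k in range(7)]      # 0.0, 0.5, ..., 3.0
print(interpolate_to_acoustic_grid(fx, fv, ax))
```

A linear test field is reproduced exactly by linear interpolation, which is the usual sanity check for any such transfer operator; higher-order (spectral) interpolation would be needed to preserve the accuracy of a spectral element solver.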

  17. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid


Computational and mathematical models provide us with opportunities to investigate the complexities of real-world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and to exhaustively test our solutions before committing expensive resources. This is made possible by constraining parameters to a bounded environment, allowing for controllable experimentation that is not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines, and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window into the novel endeavours of the research communities, presenting works that highlight the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  18. Redundant computing for exascale systems.

    Energy Technology Data Exchange (ETDEWEB)

    Stearley, Jon R.; Riesen, Rolf E.; Laros, James H., III; Ferreira, Kurt Brian; Pedretti, Kevin Thomas Tauke; Oldfield, Ron A.; Brightwell, Ronald Brian


Exascale systems will have hundreds of thousands of compute nodes and millions of components, which increases the likelihood of faults. Today, applications use checkpoint/restart to recover from these faults. Even under ideal conditions, applications running on more than 50,000 nodes will spend more than half of their total running time saving checkpoints, restarting, and redoing work that was lost. Redundant computing is a method that allows an application to continue working even when failures occur. Instead of each failure causing an application interrupt, multiple failures can be absorbed by the application until redundancy is exhausted. In this paper we present a method to analyze the benefits of redundant computing, present simulation results of its cost, and compare it to other proposed methods for fault resilience.
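The scaling problem the abstract describes can be illustrated with a back-of-the-envelope model (this is our own sketch, not the paper's simulator): with per-node MTBF m and n nodes, the system MTBF is roughly m/n, and Daly's well-known approximation puts the optimal checkpoint interval near sqrt(2 * delta * MTBF) for checkpoint cost delta. All numbers below are assumptions for illustration.

```python
import math

# Back-of-the-envelope model of why checkpoint/restart degrades at scale.
# Counts only time spent writing checkpoints; restart and lost-work costs,
# which the paper also accounts for, would make the totals worse.

def checkpoint_overhead(nodes, node_mtbf_hours=43800.0, delta_hours=0.25):
    """Fraction of wall-clock time lost to checkpoint writes alone."""
    system_mtbf = node_mtbf_hours / nodes          # crude 1/n scaling
    interval = math.sqrt(2.0 * delta_hours * system_mtbf)  # Daly's approx.
    return delta_hours / (interval + delta_hours)

for n in (1000, 50000, 200000):
    print(f"{n:>6} nodes: {checkpoint_overhead(n):.1%} of time checkpointing")
```

Under these assumed numbers the overhead grows steadily with node count, which is the regime where absorbing failures through redundancy, rather than rolling back, starts to pay off.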

  19. Computer-aided system design (United States)

    Walker, Carrie K.


    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  20. Brain-computer interfaces current trends and applications

    CERN Document Server

    Azar, Ahmad


The success of a BCI system depends as much on the system itself as on the user's ability to produce distinctive EEG activity. BCI systems can be divided into two groups according to the placement of the electrodes used to detect and measure neurons firing in the brain: invasive systems, in which electrodes inserted directly into the cortex are used for single-cell or multi-unit recording, or are placed on the surface of the cortex (or dura) for electrocorticography (ECoG); and noninvasive systems, in which electrodes placed on the scalp use electroencephalography (EEG) or magnetoencephalography (MEG) to detect neuron activity. The book is divided into three parts. The first part covers the basic concepts and an overview of brain-computer interfaces. The second part describes new theoretical developments of BCI systems. The third part covers views on real applications of BCI systems.

  1. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L


This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big picture" view of networking...

  2. Computation of the distribution of current density in a system of rectangular solid conductors for alternating current by combining the separation and boundary integral equation methods; Berechnung der Stromverteilung in einem System rechteckiger Massivleiter bei Wechselstrom durch Kombination der Separations- mit der Randintegralgleichungsmethode

    Energy Technology Data Exchange (ETDEWEB)

    Fichte, Lars Ole


    The author of the contribution under consideration investigates the distribution of the current density in the interior of a plurality of straight parallel conductors with a rectangular cross section after the decay of the transient switch-on processes. The cross-sectional dimensions of the electrical conductors substantially are smaller than the longitudinal extension of the arrangement. Only those conductors are investigated in which the edges of the conductor cross-sections are parallel to each other. The resulting spatially varying distributions of the current density are computed numerically. The results of the numerical computations are compared with solutions from commercial field calculation programs.
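As a rough, standard point of reference for the physics behind these numerical computations (this is not the thesis's separation/boundary-integral method), the classical skin-depth formula delta = sqrt(2*rho / (omega*mu)) gives the length scale over which alternating current crowds toward a conductor's surface; the material values below are illustrative.

```python
import math

# Classical skin depth for a good conductor: the depth at which AC current
# density falls to 1/e of its surface value. A textbook formula, offered
# only as context for the full current-density computation described above.

def skin_depth(resistivity, frequency, mu_r=1.0):
    """Skin depth in metres, for resistivity in ohm*m and frequency in Hz."""
    mu0 = 4e-7 * math.pi                 # vacuum permeability, H/m
    omega = 2.0 * math.pi * frequency    # angular frequency, rad/s
    return math.sqrt(2.0 * resistivity / (omega * mu_r * mu0))

# Copper (rho ~ 1.68e-8 ohm*m) at 50 Hz mains frequency: about 9.2 mm,
# which is why current distribution matters for thick solid busbars.
print(f"{skin_depth(1.68e-8, 50.0) * 1000:.1f} mm")
```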

  3. A Brief Talk on Teaching Reform Program of Computer Network Course System about Computer Related Professional

    Institute of Scientific and Technical Information of China (English)

    Wang Jian-Ping; Huang Yong


The computer network course is a core required course in college computer-related programs. An analysis of current teaching conditions shows that the teaching of this course has not yet formed a complete system: new knowledge points are not added in promptly, while outdated technology is still taught. This article describes the current situation and shortcomings of computer network teaching for computer-related majors at universities, and presents teaching systems and teaching reform schemes for the computer network course.



    Amanuel Ayde Ergado


In the computer domain, professionals are limited in number, while the number of institutions looking for computer professionals is high. The aim of this study is to develop a self-learning expert system that provides troubleshooting information about problems occurring in computer systems, enabling information and communication technology technicians and computer users to solve problems effectively and efficiently and to utilize computer and computer-related resources. Domain know...

  5. Cloud Computing for Network Security Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Jin Yang


Full Text Available In recent years, as a new distributed computing model, cloud computing has developed rapidly and become a focus of academia and industry. But the security of cloud computing is now a critical problem faced by most enterprise customers. In the current network environment, relying on a single terminal to detect Trojans and viruses is considered increasingly unreliable. This paper analyzes the characteristics of current cloud computing and then proposes a comprehensive real-time network risk evaluation model for cloud computing, based on the correspondence between artificial immune system antibodies and pathogen invasion intensity. The paper also combines an assets evaluation system and a network integration evaluation system, considering factors from the application layer, the host layer, and the network layer that may affect network risk. The experimental results show that this model improves the ability of intrusion detection and can support the security of current cloud computing.
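The layered evaluation the abstract describes (application, host, and network factors weighted by asset value into one risk figure) can be sketched as a simple weighted aggregation. The function, layer names, weights, and scores below are all invented for illustration and are not the paper's model, which additionally drives the scores from immune-system-style detectors.

```python
# Illustrative sketch of layered risk aggregation: per-layer risk scores
# in [0, 1] are combined with asset-importance weights into a single
# network risk value. All numbers here are hypothetical.

def network_risk(layer_scores, asset_weights):
    """Weighted average of per-layer risk scores."""
    total_weight = sum(asset_weights.values())
    return sum(layer_scores[layer] * w
               for layer, w in asset_weights.items()) / total_weight

scores = {"application": 0.7, "host": 0.4, "network": 0.2}
weights = {"application": 3.0, "host": 2.0, "network": 1.0}   # asset values
print(f"overall risk: {network_risk(scores, weights):.2f}")
```

The point of the aggregation is that a high-risk score on a heavily weighted layer (here the application layer) dominates the overall figure, which is what lets an evaluator prioritize responses by asset value rather than by raw alert count.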

  6. Computers as components principles of embedded computing system design

    CERN Document Server

    Wolf, Marilyn


    Computers as Components: Principles of Embedded Computing System Design, 3e, presents essential knowledge on embedded systems technology and techniques. Updated for today's embedded systems design methods, this edition features new examples including digital signal processing, multimedia, and cyber-physical systems. Author Marilyn Wolf covers the latest processors from Texas Instruments, ARM, and Microchip Technology plus software, operating systems, networks, consumer devices, and more. Like the previous editions, this textbook: Uses real processors to demonstrate both technology and tec

  7. Computer-Based Integrated Learning Systems: Research and Theory. (United States)

    Hativa, Nira, Ed.; Becker, Henry Jay, Ed.


    The eight chapters of this theme issue discuss recent research and theory concerning computer-based integrated learning systems. Following an introduction about their theoretical background and current use in schools, the effects of using computer-based integrated learning systems in the elementary school classroom are considered. (SLD)

  8. Automated Computer Access Request System (United States)

    Snook, Bryan E.


The Automated Computer Access Request (AutoCAR) system is a Web-based account provisioning application that replaces the time-consuming paper-based computer-access request process at Johnson Space Center (JSC). AutoCAR combines rules-based and role-based functionality in one application to provide a centralized system that is easily and widely accessible. The system features a work-flow engine that facilitates request routing, a user registration directory containing contact information and user metadata, an access request submission and tracking process, and a system administrator account management component. This provides full, end-to-end disposition approval chain accountability from the moment a request is submitted. By blending both rules-based and role-based functionality, AutoCAR has the flexibility to route requests based on a user's nationality, JSC affiliation status, and other export-control requirements, while ensuring a user's request is addressed by either a primary or backup approver. All user accounts that are tracked in AutoCAR are recorded and mapped to the native operating system schema on the target platform where user accounts reside. This allows for future extensibility for supporting creation, deletion, and account management directly on the target platforms by way of AutoCAR. The system's directory-based lookup and day-to-day change analysis of directory information determines personnel moves, deletions, and additions, and automatically notifies a user via e-mail to revalidate his/her account access as a result of such changes. AutoCAR is a Microsoft classic active server page (ASP) application hosted on a Microsoft Internet Information Server (IIS).

  9. Nephrogenic systemic fibrosis: Current concepts

    Directory of Open Access Journals (Sweden)

    Prasanta Basak


    Full Text Available Nephrogenic systemic fibrosis (NSF was first described in 2000 as a scleromyxedema-like illness in patients on chronic hemodialysis. The relationship between NSF and gadolinium contrast during magnetic resonance imaging was postulated in 2006, and subsequently, virtually all published cases of NSF have had documented prior exposure to gadolinium-containing contrast agents. NSF has been reported in patients from a variety of ethnic backgrounds from America, Europe, Asia and Australia. Skin lesions may evolve into poorly demarcated thickened plaques that range from erythematous to hyperpigmented. With time, the skin becomes markedly indurated and tethered to the underlying fascia. Extracutaneous manifestations also occur. The diagnosis of NSF is based on the presence of characteristic clinical features in the setting of chronic kidney disease, and substantiated by skin histology. Differential diagnosis is with scleroderma, scleredema, scleromyxedema, graft-versus-host disease, etc. NSF has a relentlessly progressive course. While there is no consistently successful treatment for NSF, improving renal function seems to slow or arrest the progression of this condition. Because essentially all cases of NSF have developed following exposure to a gadolinium-containing contrast agent, prevention of this devastating condition involves the careful avoidance of administering these agents to individuals at risk.

  10. Research on computer systems benchmarking (United States)

    Smith, Alan Jay (Principal Investigator)


    This grant addresses the topic of research on computer systems benchmarking and is more generally concerned with performance issues in computer systems. This report reviews work in those areas during the period of NASA support under this grant. The bulk of the work performed concerned benchmarking and analysis of CPUs, compilers, caches, and benchmark programs. The first part of this work concerned the issue of benchmark performance prediction. A new approach to benchmarking and machine characterization was reported, using a machine characterizer that measures the performance of a given system in terms of a Fortran abstract machine. Another report focused on analyzing compiler performance. The performance impact of optimization in the context of our methodology for CPU performance characterization was based on the abstract machine model. Benchmark programs are analyzed in another paper. A machine-independent model of program execution was developed to characterize both machine performance and program execution. By merging these machine and program characterizations, execution time can be estimated for arbitrary machine/program combinations. The work was continued into the domain of parallel and vector machines, including the issue of caches in vector processors and multiprocessors. All of the afore-mentioned accomplishments are more specifically summarized in this report, as well as those smaller in magnitude supported by this grant.
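The prediction idea the abstract describes (merging a machine characterization with a program characterization to estimate execution time) reduces, in its simplest form, to combining per-operation times with operation counts. The sketch below is our own minimal illustration of that scheme; the abstract-machine operation names and all numbers are invented, not the grant's data.

```python
# Sketch of abstract-machine performance prediction: a machine is
# characterized by the time it takes per abstract operation, a program by
# how many of each operation it executes, and the estimated run time is
# the sum of count * time over all operations.

def estimate_runtime(machine_profile, program_counts):
    """Estimated seconds: sum over ops of count * per-operation time."""
    return sum(n * machine_profile[op] for op, n in program_counts.items())

machine = {"flop": 2e-9, "mem_ref": 8e-9, "branch": 1e-9}   # seconds/op
program = {"flop": 5e9, "mem_ref": 1e9, "branch": 2e9}      # op counts
print(f"estimated run time: {estimate_runtime(machine, program):.1f} s")
```

The appeal of the decomposition is that each side is measured once: a machine profile works for any characterized program and vice versa, so execution time can be estimated for arbitrary machine/program combinations without running them, at the cost of ignoring interactions such as cache behavior that the grant's later work addresses.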

  11. Computer vision in control systems

    CERN Document Server

    Jain, Lakhmi


Volume 1: This book is focused on recent advances in computer vision methodologies and technical solutions using conventional and intelligent paradigms. The contributions include: Morphological Image Analysis for Computer Vision Applications; Methods for Detecting Structural Changes in Computer Vision Systems; Hierarchical Adaptive KL-based Transform: Algorithms and Applications; Automatic Estimation of Parameters of Image Projective Transforms Based on Object-invariant Cores; A Way of Energy Analysis for Image and Video Sequence Processing; Optimal Measurement of Visual Motion Across Spatial and Temporal Scales; Scene Analysis Using Morphological Mathematics and Fuzzy Logic; Digital Video Stabilization in Static and Dynamic Scenes; Implementation of Hadamard Matrices for Image Processing; A Generalized Criterion...

  12. When does a physical system compute? (United States)

    Horsman, Clare; Stepney, Susan; Wagner, Rob C; Kendon, Viv


Computing is a high-level process of a physical system. Recent interest in non-standard computing systems, including quantum and biological computers, has brought this physical basis of computing to the forefront. There has been, however, no consensus on how to tell if a given physical system is acting as a computer or not, leading to confusion over novel computational devices and even to claims that every physical event is a computation. In this paper, we introduce a formal framework that can be used to determine whether a physical system is performing a computation. We demonstrate how the abstract computational level interacts with the physical device level, in comparison with the use of mathematical models in experimental science. This powerful formulation allows a precise description of experiments, technology, computation and simulation, giving our central conclusion: physical computing is the use of a physical system to predict the outcome of an abstract evolution. We give conditions for computing, illustrated using a range of non-standard computing scenarios. The framework also covers broader computing contexts, where there is no obvious human computer user. We introduce the notion of a 'computational entity', and its critical role in defining when computing is taking place in physical systems.

  13. `95 computer system operation project

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)


This report describes the overall project work related to the operation of mainframe computers, the management of nuclear computer codes, and the nuclear computer code conversion project. The results of the project are as follows: 1. The operation and maintenance of the three mainframe computers and other utilities. 2. The management of the nuclear computer codes. 3. The completion of the computer code conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  14. Computing abstractions of nonlinear systems

    CERN Document Server

    Reißig, Gunther


We present an efficient algorithm for computing discrete abstractions of arbitrary memory span for nonlinear discrete-time and sampled systems, in which, apart from possibly numerically integrating ordinary differential equations, the only nontrivial operation to be performed repeatedly is to distinguish empty from non-empty convex polyhedra. We also provide sufficient conditions for the convexity of attainable sets, which is an important requirement for the correctness of the method we propose. It turns out that this requirement can be met under rather mild conditions, which essentially reduce to sufficient smoothness in the case of sampled systems. The practicability of our approach in the design of discrete controllers for continuous plants is demonstrated by an example.
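The core idea of a discrete abstraction can be shown in a drastically simplified form (our own one-dimensional toy, far simpler than the paper's polyhedral algorithm): partition the state interval into cells, and connect cell i to cell j whenever the image of cell i under the sampled dynamics intersects cell j. With a monotone map the image of a cell is just the interval between the images of its endpoints, so the "emptiness test" collapses to an interval-overlap check; the dynamics below are an invented affine example.

```python
# Toy discrete abstraction of a 1D sampled system on [lo, hi]: cells are
# equal-width intervals, and an edge (i, j) means the image of cell i
# under f overlaps cell j. Assumes f is continuous and monotone on each
# cell, so its image is the interval between the endpoint images.

def abstract(f, cells=4, lo=0.0, hi=1.0):
    width = (hi - lo) / cells
    edges = set()
    for i in range(cells):
        a, b = lo + i * width, lo + (i + 1) * width
        fa, fb = sorted((f(a), f(b)))        # image interval [fa, fb]
        for j in range(cells):
            c, d = lo + j * width, lo + (j + 1) * width
            if fa < d and fb > c:            # open-interval overlap test
                edges.add((i, j))
    return edges

# A contracting affine map standing in for the sampled dynamics.
print(sorted(abstract(lambda x: 0.5 * x + 0.25)))
```

The resulting edge set is a finite transition system over-approximating the continuous dynamics, which is the object a discrete controller is then synthesized against; the paper's contribution is doing this soundly for nonlinear systems in higher dimensions, where cells become convex polyhedra.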

  15. Hydronic distribution system computer model

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, J.W.; Strasser, J.J.


A computer model of a hot-water boiler and its associated hydronic thermal distribution loop has been developed at Brookhaven National Laboratory (BNL). It is intended to be incorporated as a submodel in a comprehensive model of residential-scale thermal distribution systems developed at Lawrence Berkeley National Laboratory (LBL). This will give the combined model the capability of modeling forced-air and hydronic distribution systems in the same house using the same supporting software. This report describes the development of the BNL hydronics model, initial results and internal consistency checks, and its intended relationship to the LBL model. A method of interacting with the LBL model that does not require physical integration of the two codes is described. This will provide capability now, with reduced up-front cost, as long as the number of runs required is not large.

  16. Computer systems and software engineering (United States)

    Mckay, Charles W.


    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  17. Current and Future Flight Operating Systems (United States)

    Cudmore, Alan


    This viewgraph presentation reviews the current real time operating system (RTOS) type in use with current flight systems. A new RTOS model is described, i.e. the process model. Included is a review of the challenges of migrating from the classic RTOS to the Process Model type.

  18. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL


    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  19. Using Expert Systems For Computational Tasks (United States)

    Duke, Eugene L.; Regenie, Victoria A.; Brazee, Marylouise; Brumbaugh, Randal W.


    A transformation technique enables inefficient expert systems to run in real time. The paper suggests the use of a knowledge compiler to transform the knowledge base and inference mechanism of an expert-system computer program into a conventional computer program. The main benefits are faster execution and reduced processing demands. In avionic systems, the transformation reduces the need for special-purpose computers.
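To illustrate the idea of compiling a knowledge base into a conventional program (a hypothetical sketch; the paper's actual knowledge compiler and rule format are not described here), a rule base can be flattened into a plain function with a fixed evaluation order, removing the run-time inference engine entirely:

```python
# Hypothetical rule base: (condition, action) pairs, highest priority first.
# The state keys and actions are invented for this illustration.
RULES = [
    (lambda s: s["altitude"] < 500, "pull_up"),
    (lambda s: s["fuel"] < 0.1, "divert"),
]

def compile_rules(rules, default="continue"):
    """'Compile' a rule base into an ordinary function.

    Instead of a general inference engine that matches rules at run time,
    the returned function evaluates the conditions in a fixed order,
    which is cheap and has predictable timing (important for avionics).
    """
    def decide(state):
        for cond, action in rules:
            if cond(state):
                return action
        return default
    return decide
```

A real knowledge compiler would also specialize the condition expressions and share common subexpressions, but the fixed-order evaluation above captures why the transformed program runs faster than interpreted inference.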

  20. Software For Monitoring VAX Computer Systems (United States)

    Farkas, Les; Don, Ken; Lavery, David; Baron, Amy


    VAX Continuous Monitoring System (VAXCMS) computer program developed at NASA Headquarters to aid system managers in monitoring performances of VAX computer systems through generation of graphic images summarizing trends in performance metrics over time. VAXCMS written in DCL and VAX FORTRAN for use with DEC VAX-series computers running VMS 5.1 or later.

  1. Computer Aided Control System Design (CACSD) (United States)

    Stoner, Frank T.


    The design of modern aerospace systems relies on the efficient utilization of computational resources and the availability of computational tools to provide accurate system modeling. This research focuses on the development of a computer aided control system design application which provides a full range of stability analysis and control design capabilities for aerospace vehicles.

  2. Information Systems: Current Developments and Future Expansion. (United States)


    On May 20, 1970, a one-day seminar was held for Congressional members and staff. The papers given at this seminar and included in the proceedings are: (1) "Understanding Information Systems" by J. D. Aron, (2) "Computer Applications in Political Science" by Kenneth Janda, (3) "Who's the Master of Your Information System?" by Marvin Kornbluh, (4)…

  3. Impact of new computing systems on finite element computations (United States)

    Noor, A. K.; Storaasli, O. O.; Fulton, R. E.


    Recent advances in computer technology that are likely to impact finite element computations are reviewed. The characteristics of supersystems, highly parallel systems, and small systems (mini and microcomputers) are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario is presented for future hardware/software environment and finite element systems. A number of research areas which have high potential for improving the effectiveness of finite element analysis in the new environment are identified.

  4. Transient Faults in Computer Systems (United States)

    Masson, Gerald M.


    A powerful technique particularly appropriate for the detection of errors caused by transient faults in computer systems was developed. The technique can be implemented in either software or hardware; the research conducted thus far primarily considered software implementations. The error detection technique developed has the distinct advantage of having provably complete coverage of all errors caused by transient faults that affect the output produced by the execution of a program. In other words, the technique does not have to be tuned to a particular error model to enhance error coverage. Also, the correctness of the technique can be formally verified. The technique uses time and software redundancy. The foundation for an effective, low-overhead, software-based certification trail approach to real-time error detection resulting from transient fault phenomena was developed.
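As a toy illustration of the time-and-software-redundancy idea (not Masson's actual certification-trail construction), a first execution can emit a trail, here the sorting permutation, that lets a second, independent execution check the result in linear time rather than recompute it; all names below are illustrative:

```python
def sort_with_trail(xs):
    """First execution: sort the input and record a certification trail.

    The trail is the permutation mapping output positions back to
    input positions.
    """
    perm = sorted(range(len(xs)), key=lambda i: xs[i])
    return [xs[i] for i in perm], perm

def check_with_trail(xs, ys, perm):
    """Second execution: verify the claimed output ys using the trail.

    Runs in O(n): checks that perm is a permutation, that ys is the
    image of xs under perm, and that ys is ordered. Any transient
    fault that corrupts the output is caught regardless of its cause.
    """
    n = len(xs)
    if len(ys) != n or len(perm) != n:
        return False
    seen = [False] * n
    for i in perm:
        if not (0 <= i < n) or seen[i]:
            return False
        seen[i] = True
    if any(ys[k] != xs[perm[k]] for k in range(n)):
        return False
    return all(ys[k] <= ys[k + 1] for k in range(n - 1))
```

The checker never trusts the first execution, which is what gives the approach its complete coverage of output-affecting errors: a corrupted result simply fails one of the three checks.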

  5. Computer system reliability safety and usability

    CERN Document Server

    Dhillon, BS


    Computer systems have become an important element of the world economy, with billions of dollars spent each year on development, manufacture, operation, and maintenance. Combining coverage of computer system reliability, safety, usability, and other related topics into a single volume, Computer System Reliability: Safety and Usability eliminates the need to consult many different and diverse sources in the hunt for the information required to design better computer systems.After presenting introductory aspects of computer system reliability such as safety, usability-related facts and figures,

  6. Computational approaches for systems metabolomics. (United States)

    Krumsiek, Jan; Bartel, Jörg; Theis, Fabian J


    Systems genetics is defined as the simultaneous assessment and analysis of multi-omics datasets. In the past few years, metabolomics has been established as a robust tool describing an important functional layer in this approach. The metabolome of a biological system represents an integrated state of genetic and environmental factors and has been referred to as a 'link between genotype and phenotype'. In this review, we summarize recent progress in statistical analysis methods for metabolomics data in combination with other omics layers. We put a special focus on complex, multivariate statistical approaches as well as pathway-based and network-based analysis methods. Moreover, we outline current challenges and pitfalls of metabolomics-focused multi-omics analyses and discuss future steps for the field.

  7. Integrated Computer System of Management in Logistics (United States)

    Chwesiuk, Krzysztof


    This paper aims at presenting a concept of an integrated computer system of management in logistics, particularly in supply and distribution chains. Consequently, the paper includes the basic idea of the concept of computer-based management in logistics and components of the system, such as CAM and CIM systems in production processes, and management systems for storage, materials flow, and for managing transport, forwarding and logistics companies. The platform which integrates computer-aided management systems is that of electronic data interchange.

  8. Thermal currents in highly correlated systems


    Moreno, J.; Coleman, P.


    Conventional approaches to thermal conductivity in itinerant systems neglect the contribution to thermal current due to interactions. We derive this contribution to the thermal current and show how it produces important corrections to the thermal conductivity in anisotropic superconductors. We discuss the possible relevance of these corrections for the interpretation of the thermal conductivity of anisotropic superconductors.

  9. Multiple Currents in the Gulf Stream System


    Fuglister, F. C.


    A new interpretation of the accumulated temperature and salinity data from the Gulf Stream area indicates that the System is made up of a series of overlapping currents. These currents are separated by relatively weak countercurrents. Data from a recent survey are presented as supporting this hypothesis. DOI: 10.1111/j.2153-3490.1951.tb00804.x

  10. Current frontiers in systemic sclerosis pathogenesis

    NARCIS (Netherlands)

    Ciechomska, Marzena; van Laar, Jacob; O'Reilly, Steven


    Systemic sclerosis is an autoimmune disease characterised by vascular dysfunction, impaired angiogenesis, inflammation and fibrosis. There is currently no accepted disease-modifying treatment, with only autologous stem cell transplant showing clinically meaningful benefit. The lack of treatment options ...

  11. Implementation of Computational Electromagnetic on Distributed Systems

    Institute of Scientific and Technical Information of China (English)


    The new generation of network technology raises the bar for distributed computing, and solving computational electromagnetics problems on distributed systems with parallel computing techniques has become a clear trend. In this paper, we analyze the parallel characteristics of the distributed system and the possibility of setting up a tightly coupled distributed system using the LAN in our lab. An analysis of the performance of different computational methods, such as FEM, MoM, FDTD and the finite difference method, is given. Our work on setting up a distributed system and the performance of the test bed are also included. Finally, we mention the implementation of one of our computational electromagnetics codes.

  12. Superpersistent Currents in Dirac Fermion Systems (United States)


    Grant FA9550-15-1-0151. This project investigated superpersistent currents in 2D Dirac material systems and pertinent phenomena in the emerging field of relativistic quantum nonlinear dynamics and chaos. Reported results include anomalous optical transitions and spin control in topological insulator quantum dots, and the discovery of nonlinear dynamics induced anomalous Hall ...

  13. Performance Aspects of Synthesizable Computing Systems

    DEFF Research Database (Denmark)

    Schleuniger, Pascal

    However, high setup and design costs make ASICs economically viable only for high volume production. Therefore, FPGAs are increasingly being used in low and medium volume markets. The evolution of FPGAs has reached a point where multiple processor cores, dedicated accelerators, and a large number of interfaces can be integrated on a single device. This thesis consists of five parts that address performance aspects of synthesizable computing systems on FPGAs. First, it is evaluated how synthesizable processor cores can exploit current state-of-the-art FPGA architectures. This evaluation results in a processor architecture optimized for a high throughput on modern FPGA architectures. The current hardware implementation, the Tinuso I core, can be clocked as high as 376 MHz on a Xilinx Virtex 6 device and consumes fewer hardware resources than similar commercial processor configurations. The Tinuso...

  14. Computer-assisted Orthopaedic Surgery: Current State and Future Perspective

    Directory of Open Access Journals (Sweden)

    Guoyan Zheng


    Full Text Available Introduced about two decades ago, computer-assisted orthopaedic surgery (CAOS) has emerged as a new and independent area, due to the importance of treatment of musculoskeletal diseases in orthopaedics and traumatology, increasing availability of different imaging modalities, and advances in analytics and navigation tools. The aim of this paper is to present the basic elements of CAOS devices and to review state-of-the-art examples of different imaging modalities used to create the virtual representations, of different position tracking devices for navigation systems, of different surgical robots, of different methods for registration and referencing, and of CAOS modules that have been realized for different surgical procedures. Future perspectives will also be outlined.

  15. Cybersecurity of embedded computers systems


    Carlioz, Jean


    International audience; Several articles have recently raised the issue of the computer security of commercial flights, evoking "connected aircraft, hackers' target", "Wi-Fi on planes, an open door for hackers?", or "Can you hack the computer of an Airbus or a Boeing?". The feared scenario consists of a takeover of operational aircraft software that would intentionally cause an accident. Moreover, several computer security experts have lately announced they had detected flaws in embedded syste...

  16. P300 brain computer interface: current challenges and emerging trends

    Directory of Open Access Journals (Sweden)

    Reza eFazel-Rezai


    Full Text Available A brain-computer interface (BCI) enables communication without movement based on brain signals measured with electroencephalography (EEG). BCIs usually rely on one of three types of signals: the P300 and other components of the event-related potential (ERP), the steady-state visual evoked potential (SSVEP), or event-related desynchronization (ERD). Although P300 BCIs were introduced over twenty years ago, the past few years have seen a strong increase in P300 BCI research. This closed-loop BCI approach relies on the P300 and other components of the ERP, elicited by an oddball paradigm presented to the subject. In this paper, we overview the current status of P300 BCI technology, and then discuss new directions: paradigms for eliciting P300s; signal processing methods; applications; and hybrid BCIs. We conclude that P300 BCIs are quite promising, as several emerging directions have not yet been fully explored and could lead to improvements in bit rate, reliability, usability, and flexibility.
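The oddball-paradigm processing described above can be sketched in a few lines: EEG epochs time-locked to each stimulus are averaged, and the target is distinguished by a larger mean amplitude in the P300 window (roughly 250-450 ms post-stimulus). The sample indices and function names below are assumptions for illustration, not a real BCI pipeline:

```python
def average_epochs(epochs):
    """Average a list of equal-length EEG epochs sample-by-sample.

    Averaging time-locked epochs suppresses background EEG and leaves
    the event-related potential (ERP).
    """
    n = len(epochs)
    return [sum(e[t] for e in epochs) / n for t in range(len(epochs[0]))]

def p300_score(avg, window):
    """Mean amplitude of the averaged epoch over a sample-index window.

    In a real system the window would correspond to ~250-450 ms after
    the stimulus; the stimulus row/column with the highest score is
    taken as the attended (target) one.
    """
    lo, hi = window
    return sum(avg[lo:hi]) / (hi - lo)
```

A P300 speller repeats this for every row and column of the stimulus matrix and selects the row/column pair with the highest scores; practical systems add filtering and a trained classifier on top of this averaging step.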

  17. Applied computation and security systems

    CERN Document Server

    Saeed, Khalid; Choudhury, Sankhayan; Chaki, Nabendu


    This book contains the extended version of the works that have been presented and discussed in the First International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2014) held during April 18-20, 2014 in Kolkata, India. The symposium has been jointly organized by the AGH University of Science & Technology, Cracow, Poland and University of Calcutta, India. The Volume I of this double-volume book contains fourteen high quality book chapters in three different parts. Part 1 is on Pattern Recognition and it presents four chapters. Part 2 is on Imaging and Healthcare Applications contains four more book chapters. The Part 3 of this volume is on Wireless Sensor Networking and it includes as many as six chapters. Volume II of the book has three Parts presenting a total of eleven chapters in it. Part 4 consists of five excellent chapters on Software Engineering ranging from cloud service design to transactional memory. Part 5 in Volume II is on Cryptography with two book...

  18. Universal blind quantum computation for hybrid system (United States)

    Huang, He-Liang; Bao, Wan-Su; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Zhang, Hai-Long; Wang, Xiang


    As progress on building quantum computers continues to advance, first-generation practical quantum computers will become available to ordinary users in the cloud, in a style similar to today's IBM Quantum Experience. Clients can remotely access the quantum servers using simple devices. In such a situation, it is of prime importance to protect the security of the client's information. Blind quantum computation protocols enable a client with limited quantum technology to delegate her quantum computation to a quantum server without leaking any privacy. To date, blind quantum computation has been considered only for individual quantum systems. However, a practical universal quantum computer is likely to be a hybrid system. Here, we take the first step towards constructing a framework of blind quantum computation for hybrid systems, which provides a more feasible way towards scalable blind quantum computation.

  19. Computer Simulation and Computability of Biological Systems

    CERN Document Server

    Baianu, I C


    The ability to simulate a biological organism by employing a computer is related to the ability of the computer to calculate the behavior of such a dynamical system, or the "computability" of the system. However, the two questions of computability and simulation are not equivalent. Since the question of computability can be given a precise answer in terms of recursive functions, automata theory and dynamical systems, it will be appropriate to consider it first. The more elusive question of adequate simulation of biological systems by a computer will be then addressed and a possible connection between the two answers given will be considered as follows. A symbolic, algebraic-topological "quantum computer" (as introduced in Baianu, 1971b) is here suggested to provide one such potential means for adequate biological simulations based on QMV Quantum Logic and meta-Categorical Modeling, as for example in a QMV-based Quantum Topos (Baianu and Glazebrook, 2004).

  20. Medical Robots: Current Systems and Research Directions


    Beasley, Ryan A.


    First used medically in 1985, robots now make an impact in laparoscopy, neurosurgery, orthopedic surgery, emergency response, and various other medical disciplines. This paper provides a review of medical robot history and surveys the capabilities of current medical robot systems, primarily focusing on commercially available systems while covering a few prominent research projects. By examining robotic systems across time and disciplines, trends are discernible that imply future capabilities ...

  1. The Computational Complexity of Evolving Systems

    NARCIS (Netherlands)

    Verbaan, P.R.A.


    Evolving systems are systems that change over time. Examples of evolving systems are computers with soft-and hardware upgrades and dynamic networks of computers that communicate with each other, but also colonies of cooperating organisms or cells within a single organism. In this research, several m

  2. Large-scale neuromorphic computing systems (United States)

    Furber, Steve


    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  3. Computational Models for Nonlinear Aeroelastic Systems Project (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate new and efficient computational methods of modeling nonlinear aeroelastic systems. The...

  4. ACSES, An Automated Computer Science Education System. (United States)

    Nievergelt, Jurg; And Others

    A project to accommodate the large and increasing enrollment in introductory computer science courses by automating them with a subsystem for computer science instruction on the PLATO IV Computer-Based Education system at the University of Illinois was started. The subsystem was intended to be used for supplementary instruction at the University…

  5. On-line current feed and computer aided control tactics for automatic balancing head

    Institute of Scientific and Technical Information of China (English)


    In the designed automatic balancing head, a non-contact induction transformer is used to deliver driving energy, solving the problem of on-line current feed and control. Computer-controlled automatic balancing experiments with phase-magnitude control tactics were performed on a flexible rotor system. Results of the experiments prove that the energy feeding method and the control tactics are effective in the automatic balancing head for vibration control.

  6. Current experience with computed tomographic cystography and blunt trauma. (United States)

    Deck, A J; Shaves, S; Talner, L; Porter, J R


    We present our experience with computed tomographic (CT) cystography for the diagnosis of bladder rupture in patients with blunt abdominal and pelvic trauma and compare the results of CT cystography to operative exploration. We identified all blunt trauma patients diagnosed with bladder rupture from January 1992 to September 1998. We also reviewed the radiology computerized information system (RIS) for all CT cystograms performed for the evaluation of blunt trauma during the same time period. The medical records and pertinent radiographs of the patients with bladder rupture who underwent CT cystography as part of their admission evaluation were reviewed. Operative findings were compared to radiographic findings. Altogether, 316 patients had CT cystograms as part of an initial evaluation for blunt trauma. Of these patients, 44 had an ultimate diagnosis of bladder rupture; 42 patients had CT cystograms indicating bladder rupture. A total of 28 patients underwent formal bladder exploration; 23 (82%) had operative findings that exactly (i.e., presence and type of rupture) matched the CT cystogram interpretation. The overall sensitivity and specificity of CT cystography for detection of bladder rupture were 95% and 100%, respectively. For intraperitoneal rupture, the sensitivity and specificity were 78% and 99%, respectively. CT cystography provides an expedient evaluation for bladder rupture caused by blunt trauma and has an accuracy comparable to that reported for plain film cystography. We recommend CT cystography over plain film cystography for patients undergoing CT evaluation for other blunt trauma-related injuries.

  7. Task allocation in a distributed computing system (United States)

    Seward, Walter D.


    A conceptual framework is examined for task allocation in distributed systems. Application and computing system parameters critical to task allocation decision processes are discussed. Task allocation techniques are addressed which focus on achieving a balance in the load distribution among the system's processors. Equalization of computing load among the processing elements is the goal. Examples of system performance are presented for specific applications. Both static and dynamic allocation of tasks are considered and system performance is evaluated using different task allocation methodologies.
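One classic static allocation heuristic of the kind surveyed here is longest-processing-time-first greedy assignment, which repeatedly gives the largest remaining task to the least-loaded processor. This sketch is a generic illustration (names and cost model assumed), not the paper's specific methodology:

```python
import heapq

def balance_tasks(task_costs, n_processors):
    """Static load balancing: longest-processing-time-first greedy.

    task_costs[i] is the estimated cost of task i. Returns a dict
    mapping each processor index to the list of tasks assigned to it.
    Sorting tasks by decreasing cost and always picking the currently
    least-loaded processor keeps the final loads close to equal.
    """
    heap = [(0.0, p) for p in range(n_processors)]  # (load, processor)
    heapq.heapify(heap)
    assignment = {p: [] for p in range(n_processors)}
    for task, cost in sorted(enumerate(task_costs), key=lambda tc: -tc[1]):
        load, p = heapq.heappop(heap)   # least-loaded processor
        assignment[p].append(task)
        heapq.heappush(heap, (load + cost, p))
    return assignment
```

Dynamic schemes, also discussed in the paper, differ in that they migrate or re-assign tasks at run time as measured loads diverge from these static estimates.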

  8. Distributed computer systems theory and practice

    CERN Document Server

    Zedan, H S M


    Distributed Computer Systems: Theory and Practice is a collection of papers dealing with the design and implementation of operating systems, including distributed systems, such as the amoeba system, argus, Andrew, and grapevine. One paper discusses the concepts and notations for concurrent programming, particularly language notation used in computer programming, synchronization methods, and also compares three classes of languages. Another paper explains load balancing or load redistribution to improve system performance, namely, static balancing and adaptive load balancing. For program effici

  9. Comparing the architecture of Grid Computing and Cloud Computing systems

    Directory of Open Access Journals (Sweden)

    Abdollah Doavi


    Full Text Available Grid Computing, or computationally connected networks, is a new network model that allows massive computational operations using connected resources; in effect, it is a new generation of distributed networks. Grid architecture is recommended because the widespread nature of the Internet makes it an attractive environment, called the 'Grid', for creating a scalable, high-performance, generalized and secure system. The central architecture serving this goal is a firmware layer named GridOS. The term 'cloud computing' refers to the development and deployment of Internet-based computing technology. This is a style of computing in which IT-related capabilities are offered as services, allowing users to access technology-based services over the Internet without specific knowledge of the underlying technology and without having to control the IT infrastructure that supports them. In the paper, general explanations are given of the Grid and Cloud systems. Then the components and services provided by these systems, and their security, are examined.

  10. Cluster-based localization and tracking in ubiquitous computing systems

    CERN Document Server

    Martínez-de Dios, José Ramiro; Torres-González, Arturo; Ollero, Anibal


    Localization and tracking are key functionalities in ubiquitous computing systems and techniques. In recent years a very high variety of approaches, sensors and techniques for indoor and GPS-denied environments have been developed. This book briefly summarizes the current state of the art in localization and tracking in ubiquitous computing systems focusing on cluster-based schemes. Additionally, existing techniques for measurement integration, node inclusion/exclusion and cluster head selection are also described in this book.

  11. Potential of Cognitive Computing and Cognitive Systems (United States)

    Noor, Ahmed K.


    Cognitive computing and cognitive technologies are game changers for future engineering systems, as well as for engineering practice and training. They are major drivers for knowledge automation work, and the creation of cognitive products with higher levels of intelligence than current smart products. This paper gives a brief review of cognitive computing and some of the cognitive engineering systems activities. The potential of cognitive technologies is outlined, along with a brief description of future cognitive environments, incorporating cognitive assistants - specialized proactive intelligent software agents designed to follow and interact with humans and other cognitive assistants across the environments. The cognitive assistants engage, individually or collectively, with humans through a combination of adaptive multimodal interfaces, and advanced visualization and navigation techniques. The realization of future cognitive environments requires the development of a cognitive innovation ecosystem for the engineering workforce. The continuously expanding major components of the ecosystem include integrated knowledge discovery and exploitation facilities (incorporating predictive and prescriptive big data analytics); novel cognitive modeling and visual simulation facilities; cognitive multimodal interfaces; and cognitive mobile and wearable devices. The ecosystem will provide timely, engaging, personalized / collaborative, learning and effective decision making. It will stimulate creativity and innovation, and prepare the participants to work in future cognitive enterprises and develop new cognitive products of increasing complexity.

  12. Intelligent computing systems emerging application areas

    CERN Document Server

    Virvou, Maria; Jain, Lakhmi


    This book at hand explores emerging scientific and technological areas in which Intelligent Computing Systems provide efficient solutions and, thus, may play a role in the years to come. It demonstrates how Intelligent Computing Systems make use of computational methodologies that mimic nature-inspired processes to address real world problems of high complexity for which exact mathematical solutions, based on physical and statistical modelling, are intractable. Common intelligent computational methodologies are presented including artificial neural networks, evolutionary computation, genetic algorithms, artificial immune systems, fuzzy logic, swarm intelligence, artificial life, virtual worlds and hybrid methodologies based on combinations of the previous. The book will be useful to researchers, practitioners and graduate students dealing with mathematically-intractable problems. It is intended for both the expert/researcher in the field of Intelligent Computing Systems, as well as for the general reader in t...

  13. Computer simulation of transport driven current in tokamaks

    Energy Technology Data Exchange (ETDEWEB)

    Nunan, W.J.; Dawson, J.M. (University of California at Los Angeles, Department of Physics, 405 Hilgard Avenue, Los Angeles, California 90024-1547 (United States))


    We have investigated transport driven current in tokamaks via 2+1/2 dimensional, electromagnetic, particle-in-cell simulations. These have demonstrated a steady increase of toroidal current in centrally fueled plasmas. Neoclassical theory predicts that the bootstrap current vanishes at large aspect ratio, but we see equal or greater current growth in straight cylindrical plasmas. These results indicate that a centrally fueled and heated tokamak may sustain its toroidal current, even without the "seed current" which the neoclassical bootstrap theory requires.

  14. Automatic system for ionization chamber current measurements. (United States)

    Brancaccio, Franco; Dias, Mauro S; Koskinas, Marina F


    The present work describes an automatic system developed for current integration measurements at the Laboratório de Metrologia Nuclear of Instituto de Pesquisas Energéticas e Nucleares. This system includes software (graphic user interface and control) and a module connected to a microcomputer, by means of a commercial data acquisition card. Measurements were performed in order to check the performance and for validating the proposed design.
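The core computation of such a current-integration system, reducing sampled ionization-chamber current to accumulated charge, can be sketched with simple trapezoidal integration (a generic illustration, not the laboratory's actual acquisition software):

```python
def integrate_current(samples, dt):
    """Integrate equally spaced current samples to accumulated charge.

    samples: current readings in amperes, taken every dt seconds.
    Returns the charge in coulombs via the trapezoidal rule:
    Q = sum over intervals of (I_i + I_{i+1}) * dt / 2.
    """
    return sum((samples[i] + samples[i + 1]) * dt / 2.0
               for i in range(len(samples) - 1))
```

In a metrology setting the activity of a source is then inferred from the measured charge using the chamber's calibration factor, with the acquisition card supplying the timed samples.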

  15. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S


    To date, the most common simulators of computer systems are software-based and run on standard computers. One promising approach to improving simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches to using FPGAs to accelerate software-implemented simulation of computer systems, along with selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  16. Direct current power delivery system and method (United States)

    Zhang, Di; Garces, Luis Jose; Dai, Jian; Lai, Rixin


    A power transmission system includes a first unit for carrying out the steps of receiving high voltage direct current (HVDC) power from an HVDC power line, generating an alternating current (AC) component indicative of a status of the first unit, and adding the AC component to the HVDC power line. Further, the power transmission system includes a second unit for carrying out the steps of generating a direct current (DC) voltage to transfer the HVDC power on the HVDC power line, wherein the HVDC power line is coupled between the first unit and the second unit, detecting a presence or an absence of the added AC component in the HVDC power line, and determining the status of the first unit based on the added AC component.

  17. Computer Simulation of Transport Driven Current in Tokamaks (United States)

    Nunan, William Joseph, III


    Plasma transport phenomena can drive large currents parallel to an externally applied magnetic field. The Bootstrap Current Theory accounts for the effect of Banana Diffusion on toroidal current, but the effect is not confined to that transport regime, or even to toroidal geometry. Our electromagnetic particle simulations have demonstrated that Maxwellian plasmas in static toroidal and vertical fields spontaneously develop significant toroidal current, even in the absence of the "seed current" which the Bootstrap Theory requires. Other simulations, in both cylindrical and toroidal geometries, and without any externally imposed electric field, show that if the plasma column is centrally fueled, then an initial toroidal current grows steadily, apparently due to a dynamo effect. The straight cylinder does not exhibit kink instabilities because k_ {z} = 0 in this 2 + 1/2 dimensional model. When the plasma is fueled at the edge rather than the center, the effect is diminished. Fueling at an intermediate radius should produce a level of current drive in between these two limits, because the key to the current drive seems to be the amount of total poloidal flux which the plasma crosses in the process of escaping. In a reactor, injected (cold) fuel ions must reach the center, and be heated up in order to burn; therefore, central fueling is needed anyway, and the resulting influx of cold plasma and outflux of hot plasma drives the toroidal current. Our simulations indicate that central fueling, coupled with the central heating due to fusion reactions may provide all of the required toroidal current. The Neoclassical Theory predicts that the Bootstrap Current approaches zero as the aspect ratio approaches infinity; however, in straight cylindrical plasma simulations, axial current increases over time at nearly the same rate as in the toroidal case. 
These results indicate that a centrally fueled and heated tokamak may sustain its own toroidal current, even in the absence of

  18. Formal Protection Architecture for Cloud Computing System

    Institute of Scientific and Technical Information of China (English)

    Yasha Chen; Jianpeng Zhao; Junmao Zhu; Fei Yan


    Cloud computing systems play a vital role in national security. This paper describes a conceptual framework called dual-system architecture for protecting computing environments. While attempting to be logical and rigorous, heavy formalism is avoided; instead, the paper adopts the process algebra Communicating Sequential Processes (CSP).

  19. Computer Literacy in a Distance Education System (United States)

    Farajollahi, Mehran; Zandi, Bahman; Sarmadi, Mohamadreza; Keshavarz, Mohsen


    In a Distance Education (DE) system, students must be equipped with seven skills of computer (ICDL) usage. This paper aims at investigating the effect of a DE system on the computer literacy of Master of Arts students at Tehran University. The design of this study is quasi-experimental. Pre-test and post-test were used in both control and…

  20. Computer-Controlled, Motorized Positioning System (United States)

    Vargas-Aburto, Carlos; Liff, Dale R.


    Computer-controlled, motorized positioning system developed for use in robotic manipulation of samples in a custom-built secondary-ion mass spectrometry (SIMS) system. Positions the sample repeatably and accurately, even during analysis, in three linear orthogonal coordinates and one angular coordinate under manual local control, microprocessor-based local control, or remote control by computer via the general-purpose interface bus (GPIB).

  1. Advanced Hybrid Computer Systems. Software Technology. (United States)

    This software technology final report evaluates advances made in Advanced Hybrid Computer System software technology. The report describes what... automatic patching software is available, as well as which analog/hybrid programming languages would be most feasible for the Advanced Hybrid Computer... compiler software. The problem of how software would interface with the hybrid system is also presented.

  2. Biomolecular computing systems: principles, progress and potential. (United States)

    Benenson, Yaakov


    The task of information processing, or computation, can be performed by natural and man-made 'devices'. Man-made computers are made from silicon chips, whereas natural 'computers', such as the brain, use cells and molecules. Computation also occurs on a much smaller scale in regulatory and signalling pathways in individual cells and even within single biomolecules. Indeed, much of what we recognize as life results from the remarkable capacity of biological building blocks to compute in highly sophisticated ways. Rational design and engineering of biological computing systems can greatly enhance our ability to study and to control biological systems. Potential applications include tissue engineering and regeneration and medical treatments. This Review introduces key concepts and discusses recent progress that has been made in biomolecular computing.

  3. Computer system for monitoring power boiler operation

    Energy Technology Data Exchange (ETDEWEB)

    Taler, J.; Weglowski, B.; Zima, W.; Duda, P.; Gradziel, S.; Sobota, T.; Cebula, A.; Taler, D. [Cracow University of Technology, Krakow (Poland). Inst. for Process & Power Engineering


    The computer-based boiler performance monitoring system was developed to perform thermal-hydraulic computations of the boiler working parameters in an on-line mode. Measurements of temperatures, heat flux, pressures, mass flowrates, and gas analysis data were used to perform the heat transfer analysis in the evaporator, furnace, and convection pass. A new construction technique of heat flux tubes for determining heat flux absorbed by membrane water-walls is also presented. The current paper presents the results of heat flux measurement in coal-fired steam boilers. During changes of the boiler load, the necessary natural water circulation cannot be exceeded. A rapid increase of pressure may cause fading of the boiling process in water-wall tubes, whereas a rapid decrease of pressure leads to water boiling in all elements of the boiler's evaporator - water-wall tubes and downcomers. Both cases can cause flow stagnation in the water circulation leading to pipe cracking. Two flowmeters were assembled on central downcomers, and an investigation of natural water circulation in an OP-210 boiler was carried out. On the basis of these measurements, the maximum rates of pressure change in the boiler evaporator were determined. The on-line computation of the conditions in the combustion chamber allows for real-time determination of the heat flowrate transferred to the power boiler evaporator. Furthermore, with a quantitative indication of surface cleanliness, selective sootblowing can be directed at specific problem areas. A boiler monitoring system is also incorporated to provide details of changes in boiler efficiency and operating conditions following sootblowing, so that the effects of a particular sootblowing sequence can be analysed and optimized at a later stage.
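The pressure-rate constraint described above lends itself to a simple on-line check. The sketch below is illustrative only: the limit values are placeholders, not the maximum rates of pressure change actually determined for the OP-210 boiler.

```python
# Hedged sketch of an on-line pressure-rate check for a boiler evaporator.
# The limits below are illustrative placeholders, not measured OP-210 values.
MAX_RISE = 0.05    # MPa/s, placeholder limit for pressure increase
MAX_DROP = 0.03    # MPa/s, placeholder limit for pressure decrease

def pressure_rate_ok(p_prev, p_now, dt):
    """True if the observed dp/dt stays within the allowed band."""
    rate = (p_now - p_prev) / dt
    return -MAX_DROP <= rate <= MAX_RISE

print(pressure_rate_ok(10.00, 10.02, 1.0))   # True: 0.02 MPa/s rise is allowed
print(pressure_rate_ok(10.00, 9.90, 1.0))    # False: 0.10 MPa/s drop exceeds limit
```

A real monitoring system would of course filter the pressure signal before differencing it; the point here is only the form of the check.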


    Directory of Open Access Journals (Sweden)

    S. R. Tajane et al.


    Full Text Available The purpose of this review on pulsatile drug delivery systems (PDDS) is to compile the recent literature, with special focus on the different types and approaches involved in the development of the formulation. The pulsatile drug delivery system is a most interesting time- and site-specific system, designed for chronopharmacotherapy. Thus, to mimic the function of living systems and in view of emerging chronotherapeutic approaches, pulsatile delivery, which is meant to release a drug following a programmed lag phase, has attracted increasing interest in recent years. Diseases wherein PDDS are promising include asthma, peptic ulcer, cardiovascular diseases, arthritis, attention deficit syndrome in children, cancer, diabetes, and hypercholesterolemia. Pulsatile drug delivery systems are divided into two types: preplanned systems and stimulus-induced systems. Preplanned systems are based on osmosis, rupturable layers, and erodible barrier coatings; stimulus-induced systems are based on electrical, temperature and chemically induced triggers. This review also summarizes some current PDDS already available on the market. These systems are useful for several problems encountered during the development of a pharmaceutical dosage form.

  5. Automated Diversity in Computer Systems (United States)


    P(EBMI) = Me2a; P(ELMP) = ps and P(EBMP) = ps. We are interested in the probability of a successful branch (escape) out of a sequence of n... reference is still legal. Both can generate false positives, although CRED is less computationally expensive. The common theme in all these

  6. Positron Computed Tomography: Current State, Clinical Results and Future Trends (United States)

    Schelbert, H. R.; Phelps, M. E.; Kuhl, D. E.


    An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends. (ACR)

  7. Ubiquitous Wireless Computing: Current Research Progress, Challenging, and Future Directions


    Elyas, Palantei


    The aggressive research activities and extensive studies focusing on ubiquitous mobile computing carried out during the last two decades have produced tremendous outcomes applied across broad areas of modern society. In the near future, this computing technology is highly likely to emerge as the dominant method of connecting objects to the global ICT infrastructure, the internet. This talk mainly discusses several R&D achievements performed during the last five yea...

  8. Medical Robots: Current Systems and Research Directions

    Directory of Open Access Journals (Sweden)

    Ryan A. Beasley


    Full Text Available First used medically in 1985, robots now make an impact in laparoscopy, neurosurgery, orthopedic surgery, emergency response, and various other medical disciplines. This paper provides a review of medical robot history and surveys the capabilities of current medical robot systems, primarily focusing on commercially available systems while covering a few prominent research projects. By examining robotic systems across time and disciplines, trends are discernible that imply future capabilities of medical robots, for example, increased usage of intraoperative images, improved robot arm design, and haptic feedback to guide the surgeon.

  9. Laser Imaging Systems For Computer Vision (United States)

    Vlad, Ionel V.; Ionescu-Pallas, Nicholas; Popa, Dragos; Apostol, Ileana; Vlad, Adriana; Capatina, V.


    Computer vision is becoming an essential feature of high-level artificial intelligence. Laser imaging systems act as a special kind of image preprocessor/converter, extending the reach of computer "intelligence" to inspection, analysis and decision-making in new "worlds": nanometric, three-dimensional (3D), ultrafast, hostile to humans, etc. Considering that the heart of the problem is the matching of optical methods and computer software, some of the most promising interferometric, projection and diffraction systems are reviewed, with discussions of our present results and of their potential for precise 3D computer vision.

  10. Current trends on knowledge-based systems

    CERN Document Server

    Valencia-García, Rafael


    This book presents innovative and high-quality research on the implementation of conceptual frameworks, strategies, techniques, methodologies, informatics platforms and models for developing advanced knowledge-based systems and their application in different fields, including Agriculture, Education, Automotive, Electrical Industry, Business Services, Food Manufacturing, Energy Services, Medicine and others. Knowledge-based technologies employ artificial intelligence methods to heuristically address problems that cannot be solved by means of formal techniques. These technologies draw on standard and novel approaches from various disciplines within Computer Science, including Knowledge Engineering, Natural Language Processing, Decision Support Systems, Artificial Intelligence, Databases, Software Engineering, etc. As a combination of different fields of Artificial Intelligence, the area of Knowledge-Based Systems applies knowledge representation, case-based reasoning, neural networks, Semantic Web and TICs used...

  11. Computer Bits: The Ideal Computer System for Your Center. (United States)

    Brown, Dennis; Neugebauer, Roger


    Reviews five computer systems that can address the needs of a child care center: (1) Sperry PC IT with Bernoulli Box, (2) Compaq DeskPro 286, (3) Macintosh Plus, (4) Epson Equity II, and (5) Leading Edge Model "D." (HOD)

  12. An Optical Tri-valued Computing System

    Directory of Open Access Journals (Sweden)

    Junjie Peng


    Full Text Available A new optical computing experimental system is presented. Designed on the basis of tri-valued logic, the system is built as a photoelectric hybrid computer system with clear advantages over its electronic counterparts: its tri-valued logic makes it more powerful in information processing than systems with binary logic, and its optical character makes it far more capable of processing huge volumes of data than electronic computers. The optical computing system includes two parts, an electronic part and an optical part. The electronic part consists of a PC and two embedded systems, which are used for data input/output, monitoring, synchronous control, user data combination and separation, and so on. The optical part includes three components: an optical encoder, a logic calculator and a decoder. It is mainly responsible for encoding users' requests into tri-valued optical information, computing and processing the requests, and decoding the tri-valued optical information back into binary electronic information. Experimental results show that the system performs optical information processing correctly, which demonstrates the feasibility and correctness of the optical computing system.
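The abstract does not specify the system's logic connectives, so the sketch below is illustrative only: one common textbook way to define tri-valued logic is as min/max/complement over the ordered truth values {0, 1, 2}, which also shows where the claimed information-density advantage comes from.

```python
# Illustrative tri-valued (ternary) logic connectives defined over the
# ordered truth values {0, 1, 2}; this encoding is an assumption, not
# the paper's actual optical encoding.
TRITS = (0, 1, 2)   # e.g. false / intermediate / true

def t_and(a, b):
    return min(a, b)

def t_or(a, b):
    return max(a, b)

def t_not(a):
    return 2 - a

# A single trit carries log2(3) ≈ 1.585 bits, versus 1 bit per binary
# digit -- the source of the information-processing advantage claimed
# for tri-valued systems.
for a in TRITS:
    assert t_not(t_not(a)) == a          # complement is an involution

print(t_and(1, 2), t_or(1, 2), t_not(0))  # → 1 2 2
```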

  13. Hybrid Systems: Computation and Control. (United States)


    elbow) and a pinned first joint (shoulder) (see Figure 2); it is termed an underactuated system since it is a mechanical system with fewer... Montreal, PQ, Canada, 1998. [10] M. W. Spong. Partial feedback linearization of underactuated mechanical systems. In Proceedings, IROS, pages 314-321... control mechanism and search for optimal combinations of control variables. Besides the nonlinear and hybrid nature of powertrain systems, hardware

  14. Review of Current Nuclear Vacuum System Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Carroll, M.; McCracken, J.; Shope, T.


    Nearly all industrial operations generate unwanted dust, particulate matter, and/or liquid wastes. Waste dust and particulates can be readily tracked to other work locations, and airborne particulates can be spread through ventilation systems to all locations within a building, and even vented outside the building - a serious concern for processes involving hazardous, radioactive, or nuclear materials. Several varieties of vacuum systems have been proposed and/or are commercially available for clean up of both solid and liquid hazardous and nuclear materials. A review of current technologies highlights both the advantages and disadvantages of the various systems, and demonstrates the need for a system designed to address issues specific to hazardous and nuclear material cleanup. A review of previous and current hazardous/nuclear material cleanup technologies is presented. From simple conventional vacuums modified for use in industrial operations, to systems specifically engineered for such purposes, the advantages and disadvantages are examined in light of the following criteria: minimal worker exposure; minimal secondary waste generation; reduced equipment maintenance and consumable parts; simplicity of design, yet fully compatible with all waste types; and ease of use. The work effort reviews past, existing and proposed technologies in light of such considerations. Accomplishments of selected systems are presented, including identified areas where technological improvements could be suggested.

  15. MTA Computer Based Evaluation System. (United States)

    Brenner, Lisa P.; And Others

    The MTA PLATO-based evaluation system, which has been implemented by a consortium of schools of medical technology, is designed to be general-purpose, modular, data-driven, and interactive, and to accommodate other national and local item banks. The system provides a comprehensive interactive item-banking system in conjunction with online student…

  17. Computer Jet-Engine-Monitoring System (United States)

    Disbrow, James D.; Duke, Eugene L.; Ray, Ronald J.


    "Intelligent Computer Assistant for Engine Monitoring" (ICAEM), computer-based monitoring system intended to distill and display data on conditions of operation of two turbofan engines of F-18, is in preliminary state of development. System reduces burden on propulsion engineer by providing single display of summary information on statuses of engines and alerting engineer to anomalous conditions. Effective use of prior engine-monitoring system requires continuous attention to multiple displays.

  18. A computational system for a Mars rover (United States)

    Lambert, Kenneth E.


    This paper presents an overview of an onboard computing system that can be used for meeting the computational needs of a Mars rover. The paper begins by presenting an overview of some of the requirements which are key factors affecting the architecture. The rest of the paper describes the architecture. Particular emphasis is placed on the criteria used in defining the system and how the system qualitatively meets the criteria.

  20. Intelligent computational systems for space applications (United States)

    Lum, Henry, Jr.; Lau, Sonie


    The evolution of intelligent computation systems is discussed starting with the Spaceborne VHSIC Multiprocessor System (SVMS). The SVMS is a six-processor system designed to provide at least a 100-fold increase in both numeric and symbolic processing over the i386 uniprocessor. The significant system performance parameters necessary to achieve the performance increase are discussed.

  1. A Neuron Model Based Ultralow Current Sensor System for Bioapplications

    Directory of Open Access Journals (Sweden)

    A. K. M. Arifuzzman


    Full Text Available An ultralow current sensor system based on the Izhikevich neuron model is presented in this paper. The Izhikevich neuron model has been used for its superior computational efficiency and greater biological plausibility over other well-known neuron spiking models. Of the many biological neuron spiking features, regular spiking, chattering, and neostriatal spiny projection spiking have been reproduced by adjusting the parameters associated with the model at hand. This paper also presents a modified interpretation of the regular spiking feature, in which the firing pattern is similar to that of regular spiking but offers an improved dynamic range. The sensor current ranges between 2 pA and 8 nA and exhibits linearity in the range of 0.9665 to 0.9989 for the different spiking features. The efficacy of the sensor system in detecting low amounts of current, along with its high linearity, makes it very suitable for biomedical applications.
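For readers unfamiliar with the model named above, a minimal sketch of the Izhikevich neuron follows. The "regular spiking" parameters (a, b, c, d) are the standard published values; the input current, time step, and duration are illustrative choices, not the paper's sensor configuration.

```python
# Minimal Euler integration of the two-variable Izhikevich neuron model:
#   v' = 0.04 v^2 + 5 v + 140 - u + I,   u' = a (b v - u)
#   with reset v <- c, u <- u + d when v reaches the spike peak (30 mV).
# Parameters here are the standard "regular spiking" values; I, dt and
# t_max are illustrative.

def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5, t_max=200.0):
    """Integrate the model for t_max ms and return the spike times (ms)."""
    v, u = -65.0, b * -65.0          # membrane potential and recovery variable
    spike_times, t = [], 0.0
    while t < t_max:
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                # spike peak reached: record and reset
            spike_times.append(t)
            v, u = c, u + d
        t += dt
    return spike_times

spike_times = izhikevich()
print(len(spike_times))   # sustained input produces a train of spikes
```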

  2. Human computer interaction issues in Clinical Trials Management Systems. (United States)

    Starren, Justin B; Payne, Philip R O; Kaufman, David R


    Clinical trials increasingly rely upon web-based Clinical Trials Management Systems (CTMS). As with clinical care systems, Human Computer Interaction (HCI) issues can greatly affect the usefulness of such systems. Evaluation of the user interface of one web-based CTMS revealed a number of potential human-computer interaction problems, in particular, increased workflow complexity associated with a web application delivery model and potential usability problems resulting from the use of ambiguous icons. Because these design features are shared by a large fraction of current CTMS, the implications extend beyond this individual system.

  3. Computation of Weapons Systems Effectiveness (United States)


    Notation: VOx and VOz are the initial weapon release velocities along the x- and z-axes, h is the release altitude, TOF is the time of fall, and RB is the ballistic range. The impact relations are:

    Impact velocity (x-axis): Vix = VOx (3.4)
    Impact velocity (z-axis): Viz = VOz + (g * TOF) (3.5)
    Impact velocity: Vi = sqrt(Vix^2 + Viz^2) (3.6)

    ...compute the ballistic partials to examine the effects that varying h, VOx and VOz have on RB using the following equations: ∂RB/∂h = New RB − Old RB
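Equations (3.4)-(3.6) can be checked numerically; the sketch below uses illustrative release conditions (the velocities and time of fall are made-up values, and drag is neglected as in the equations themselves).

```python
import math

# Impact velocity of a free-fall weapon per equations (3.4)-(3.6):
# the x component is unchanged, the z component gains g * TOF.
# Release conditions below are illustrative values only.
g = 9.81  # gravitational acceleration, m/s^2

def impact_velocity(v_ox, v_oz, tof):
    """Return (Vix, Viz, Vi) for release velocities (m/s) and time of fall (s)."""
    v_ix = v_ox                      # (3.4)
    v_iz = v_oz + g * tof            # (3.5)
    return v_ix, v_iz, math.hypot(v_ix, v_iz)   # (3.6)

v_ix, v_iz, v_i = impact_velocity(v_ox=200.0, v_oz=0.0, tof=10.0)
print(round(v_i, 1))   # ≈ 222.8 m/s
```

The ballistic partials mentioned afterwards are finite differences of the same kind: perturb one release condition, recompute the range, and difference the results.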

  4. A cost modelling system for cloud computing


    Ajeh, Daniel; Ellman, Jeremy; Keogh, Shelagh


    An advance in technology unlocks new opportunities for organizations to increase their productivity, efficiency and process automation while reducing the cost of doing business as well. The emergence of cloud computing addresses these prospects through the provision of agile systems that are scalable, flexible and reliable as well as cost effective. Cloud computing has made hosting and deployment of computing resources cheaper and easier with no up-front charges but pay per-use flexible payme...

  5. The university computer network security system

    Institute of Scientific and Technical Information of China (English)



    With the development of the times and advances in technology, computer network technology has penetrated deep into all aspects of people's lives; it plays an increasingly important role and is an important tool for information exchange. Colleges and universities are the cradle in which new technologies are cultivated and nurtured, and so, as institutions of higher learning, they should pay attention to the construction of computer network security systems.


    Directory of Open Access Journals (Sweden)

    Vladimir Hahanov


    Full Text Available Qubit models and methods for improving the performance of software and hardware for analyzing digital devices, through increasing the dimension of the data structures and memory, are proposed. The basic concepts, terminology and definitions necessary for the implementation of quantum computing in the analysis of virtual computers are introduced. Investigation results concerning the design and modeling of computer systems in cyberspace, based on the use of a two-component structure, are presented.

  7. Computational Intelligence in Information Systems Conference

    CERN Document Server

    Au, Thien-Wan; Omar, Saiful


    This book constitutes the Proceedings of the Computational Intelligence in Information Systems conference (CIIS 2016), held in Brunei, November 18–20, 2016. The CIIS conference provides a platform for researchers to exchange the latest ideas and to present new research advances in general areas related to computational intelligence and its applications. The 26 revised full papers presented in this book have been carefully selected from 62 submissions. They cover a wide range of topics and application areas in computational intelligence and informatics.

  8. Current therapy of systemic sclerosis (scleroderma). (United States)

    Müller-Ladner, U; Benning, K; Lang, B


    Treatment of systemic sclerosis (scleroderma) presents a challenge to both the patient and the physician. Established approaches include long-term physiotherapy, disease-modifying agents such as D-penicillamine, and treatment of organ involvement. These efforts are often unsatisfactory since the results are poor. However, recent advances include treatment of Raynaud's phenomenon (plasmapheresis, stanozolol, and prostacyclin analogues), scleroderma renal crisis (angiotensin-converting enzyme inhibitors), and gastric hypomotility (cisapride). This article covers the current approaches to the disease-modifying therapy including those related to the function of collagen-producing fibroblasts, vascular alterations, and the cellular and humoral immune system, as well as treatment of involved organs.

  9. Impact of currents on surface flux computations and their feedback on dynamics at regional scales (United States)

    Olita, A.; Iermano, I.; Fazioli, L.; Ribotti, A.; Tedesco, C.; Pessini, F.; Sorgente, R.


    A twin numerical experiment was conducted in the seas around the island of Sardinia (Western Mediterranean) to assess the impact, at regional and coastal scales, of using relative winds (i.e., taking into account ocean surface currents) in the computation of heat and momentum fluxes through standard (Fairall et al., 2003) bulk formulas. The Regional Ocean Modelling System (ROMS) was implemented at 3 km resolution in order to resolve well the mesoscale processes that are known to have a large influence on the dynamics of the area. Small changes (a few percentage points) in terms of spatially averaged fluxes correspond to quite large differences in such quantities in spatial terms (about 15 %) and in terms of kinetics (more than 20 %). As a consequence, the wind power input P is also reduced by ~ 14 % on average. Quantitative validation against satellite SST suggests that this modification of the fluxes improves the model solution, especially on the western side of the domain, where mesoscale activity (as indicated by eddy kinetic energy) is stronger. Surface currents change in both their stable and fluctuating parts. In particular, the path and intensity of the Algerian Current and of the Western Sardinia Current (WSC) are affected by the modification of the fluxes. Both the total and eddy kinetic energies of the surface current field are reduced in the experiment in which the fluxes took the surface currents into account. The main dynamical correction is observed in the SW area, where the different location and strength of the eddies influence the path and intensity of the WSC. Our results suggest that, even at local scales and in temperate regions, it is preferable to take this contribution into account in flux computations. The modification of the original code, essentially cost-free in terms of numerical computation, improves the model response in terms of surface fluxes (validated against SST) and likely also improves the dynamics, as suggested by qualitative comparison with
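The effect of using relative winds can be illustrated with a heavily simplified constant-coefficient bulk formula; the actual study uses the full Fairall et al. (2003) algorithm, and the air density, drag coefficient, wind and current speeds below are typical illustrative magnitudes, not the paper's values.

```python
# Sketch of the "relative wind" correction: the bulk momentum flux is
# computed from (wind minus surface current) rather than wind alone.
# Constant-coefficient simplification with illustrative values; the real
# COARE-style algorithm varies the transfer coefficients with stability.
rho_air = 1.22   # air density, kg/m^3
c_d = 1.2e-3     # neutral drag coefficient (typical magnitude)

def wind_stress(u_wind, u_current=0.0):
    """Bulk wind stress (N/m^2) from the wind relative to the moving surface."""
    u_rel = u_wind - u_current
    return rho_air * c_d * abs(u_rel) * u_rel

absolute = wind_stress(8.0)        # ignoring the surface current
relative = wind_stress(8.0, 1.0)   # 1 m/s surface current following the wind
print(f"{(absolute - relative) / absolute:.1%}")  # → 23.4% stress reduction
```

Even a current of an eighth of the wind speed removes over a fifth of the stress, which is why the correction feeds back visibly into the kinetic energy of the surface field.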

  10. Optimization of Operating Systems towards Green Computing

    Directory of Open Access Journals (Sweden)

    Appasami Govindasamy


    Full Text Available Green computing is one of the emerging technologies in the field of computer science and engineering, aimed at providing Green Information Technology (Green IT). It is mainly used to protect the environment, optimize energy consumption and keep the environment green. Green computing also refers to environmentally sustainable computing. In recent years, companies in the computer industry have come to realize that going green is in their best interest, both in terms of public relations and reduced costs. Information and communication technology (ICT) has now become an important department for the success of any organization. Making IT "green" can not only save money but help save our world by making it a better place through reducing and/or eliminating wasteful practices. In this paper we focus on green computing by optimizing operating systems and the scheduling of hardware resources. The objectives of green computing are the reduction of human effort, electrical energy, time and cost, without polluting the environment while developing the software. Operating system (OS) optimization is very important for green computing, because the OS is the bridge between hardware components and application software. The important steps for energy-efficient usage by users are also discussed in this paper.

  11. Resilience assessment and evaluation of computing systems

    CERN Document Server

    Wolter, Katinka; Vieira, Marco


    The resilience of computing systems includes their dependability as well as their fault tolerance and security. It defines the ability of a computing system to perform properly in the presence of various kinds of disturbances and to recover from any service degradation. These properties are immensely important in a world where many aspects of our daily life depend on the correct, reliable and secure operation of often large-scale distributed computing systems. Wolter and her co-editors grouped the 20 chapters from leading researchers into seven parts: an introduction and motivating examples,

  12. Establishing performance requirements of computer based systems subject to uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, D.


    An organized systems design approach is dictated by the increasing complexity of computer based systems. Computer based systems are unique in many respects but share many of the same problems that have plagued design engineers for decades. The design of complex systems is difficult at best, but as a design becomes intensively dependent on computer processing of external and internal information, the design process quickly borders on chaos. This situation is exacerbated by the requirement that these systems operate with a minimal quantity of information, generally corrupted by noise, regarding the current state of the system. Establishing performance requirements for such systems is particularly difficult. This paper briefly sketches a general systems design approach with emphasis on the design of computer based decision processing systems subject to parameter and environmental variation. The approach is demonstrated with an application to an on-board diagnostic (OBD) system for automotive emissions systems now mandated by the state of California and the Federal Clean Air Act. The emphasis is on an approach for establishing probabilistically based performance requirements for computer based systems.
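A probabilistically stated performance requirement of the kind described can be estimated by Monte Carlo simulation. In the sketch below, the fault size, measurement noise level, detection threshold, and 95 % required detection probability are all illustrative assumptions, not values from the paper.

```python
import random

# Monte Carlo estimate of the probability that a noisy diagnostic detects
# a fault, compared against a probabilistically stated requirement.
# All numeric values here are illustrative assumptions.
random.seed(0)

def detects(fault_size, noise_sd=0.2, threshold=0.5):
    """One noisy measurement of the fault, compared to a fixed threshold."""
    return fault_size + random.gauss(0.0, noise_sd) > threshold

trials = 10_000
p_detect = sum(detects(1.0) for _ in range(trials)) / trials
print(p_detect >= 0.95)   # does the design meet a 95 % detection requirement?
```

The same loop, wrapped around a full system model with parameter and environmental variation, is the shape of the requirement-setting procedure the abstract alludes to.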

  13. The evolution of the PVM concurrent computing system

    Energy Technology Data Exchange (ETDEWEB)

    Geist, G.A. [Oak Ridge National Lab., TN (United States); Sunderam, V.S. [Emory Univ., Atlanta, GA (United States). Dept. of Mathematics and Computer Science


    Concurrent and distributed computing, using portable software systems or environments on general purpose networked computing platforms, has recently gained widespread attention. Many such systems have been developed, and several are in production use. This paper describes the evolution of the PVM system, a software infrastructure for concurrent computing in networked environments. PVM has evolved over the past several years; it is currently in use at several hundred institutions worldwide for applications ranging from scientific supercomputing to high performance computations in medicine, discrete mathematics, and databases, and for learning parallel programming. We describe the historical evolution of the PVM system, outline the programming model and supported features, present results gained from its use, list representative applications from a variety of disciplines, and comment on future trends and ongoing research projects.

  14. Computation of charged current neutrino-Te reactions cross sections (United States)

    Tsakstara, V.; Kosmas, T. S.; Sinatkas, J.


    Neutrino-nucleus reactions, involving both neutral-current (NC) and charged-current (CC) interactions, are important probes in modern neutrino physics searches. In the present work, we study the concrete CC reactions ¹³⁰Te(ν_ℓ, ℓ⁻)¹³⁰I and ¹³⁰Te(ν̄_ℓ, ℓ⁺)¹³⁰Sb, which are of current experimental interest for the CUORE and COBRA experiments operating at the Gran Sasso underground laboratory in Italy. The nuclear wave functions for the required initial and final nuclear states are derived by employing the proton-neutron (p-n) quasi-particle random phase approximation (QRPA), which has been previously tested in our neutral-current ν-nucleus studies for Te isotopes.

  15. Fundamentals of computational intelligence: neural networks, fuzzy systems, and evolutionary computation

    CERN Document Server

    Keller, James M; Fogel, David B


    This book covers the three fundamental topics that form the basis of computational intelligence: neural networks, fuzzy systems, and evolutionary computation. The text focuses on inspiration, design, theory, and practical aspects of implementing procedures to solve real-world problems. While other books in the three fields that comprise computational intelligence are written by specialists in one discipline, this book is co-written by a former Editor-in-Chief of IEEE Transactions on Neural Networks and Learning Systems, a former Editor-in-Chief of IEEE Transactions on Fuzzy Systems, and the founding Editor-in-Chief of IEEE Transactions on Evolutionary Computation. The coverage across the three topics is both uniform and consistent in style and notation. Discusses single-layer and multilayer neural networks, radial-basis function networks, and recurrent neural networks Covers fuzzy set theory, fuzzy relations, fuzzy logic inference, fuzzy clustering and classification, fuzzy measures and fuzz...

  16. Rendezvous Facilities in a Distributed Computer System

    Institute of Scientific and Technical Information of China (English)

    廖先Zhi; 金兰


    The distributed computer system described in this paper is a set of computer nodes interconnected in an interconnection network via packet-switching interfaces.The nodes communicate with each other by means of message-passing protocols.This paper presents the implementation of rendezvous facilities as high-level primitives provided by a parallel programming language to support interprocess communication and synchronization.
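A minimal sketch of the rendezvous semantics described above, written in Python rather than the paper's parallel programming language: send() blocks until a matching receive() has consumed the message. This illustrates the concept only and is not the paper's implementation.

```python
import threading
import queue

class Rendezvous:
    """Synchronous message-passing primitive: the sender blocks until
    the receiver has taken the message, so the two processes meet
    ('rendezvous') at the communication point."""
    def __init__(self):
        self._msg = queue.Queue(maxsize=1)
        self._ack = queue.Queue(maxsize=1)

    def send(self, message):
        self._msg.put(message)  # hand the message over
        self._ack.get()         # block until the receiver acknowledges

    def receive(self):
        message = self._msg.get()
        self._ack.put(None)     # release the blocked sender
        return message

r = Rendezvous()
out = []
t = threading.Thread(target=lambda: out.append(r.receive()))
t.start()
r.send("hello")  # returns only after the receiver has taken the message
t.join()
print(out)       # → ['hello']
```

The same two-queue handshake generalizes to node-to-node message passing when the queues are replaced by the network interfaces the record describes.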

  17. Cloud Computing in the Curricula of Schools of Computer Science and Information Systems (United States)

    Lawler, James P.


    The cloud continues to be a developing area of information systems. Evangelistic literature in the practitioner field indicates benefit for business firms but disruption for technology departments of the firms. Though the cloud currently is immature in methodology, this study defines a model program by which computer science and information…

  18. Current Cloud Computing Security Concerns from Consumer Perspective

    Institute of Scientific and Technical Information of China (English)

    Hafiz Gulfam Ahmad; Zeeshan Ahmad


    In recent years cloud computing has been the subject of extensive research in the emerging field of information technology and has become a promising business. The reason behind this widespread interest is its ability to increase the capacity and capability of enterprises with no investment in new infrastructure, no software license requirement and no need for any training. Security concern is the main factor limiting the growth of this new-born technology. The security responsibilities of the provider and the consumer greatly differ between cloud service models. In this paper we discuss a variety of security risks, authentication issues, trust, and legal regulation in the cloud environment from the consumer perspective. Early research focused only on technical and business consequences of cloud computing and ignored the consumer perspective. Therefore, this paper discusses consumer security and privacy preferences.

  19. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George


    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  20. What's New in Software? Current Sources of Information Boost Effectiveness of Computer-Assisted Instruction. (United States)

    Ellsworth, Nancy J.


    This article reviews current resources on computer-assisted instruction. Included are sources of software and hardware evaluations, advances in current technology, research, an information hotline, and inventories of available technological assistance. (DB)

  1. Sandia Laboratories technical capabilities: computation systems

    Energy Technology Data Exchange (ETDEWEB)


    This report characterizes the computation systems capabilities at Sandia Laboratories. Selected applications of these capabilities are presented to illustrate the extent to which they can be applied in research and development programs. 9 figures.

  2. Console Networks for Major Computer Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ophir, D; Shepherd, B; Spinrad, R J; Stonehill, D


    A concept for interactive time-sharing of a major computer system is developed in which satellite computers mediate between the central computing complex and the various individual user terminals. These techniques allow the development of a satellite system substantially independent of the details of the central computer and its operating system. Although the user terminals' roles may be rich and varied, the demands on the central facility are merely those of a tape drive or similar batched information transfer device. The particular system under development provides service for eleven visual display and communication consoles, sixteen general purpose, low rate data sources, and up to thirty-one typewriters. Each visual display provides a flicker-free image of up to 4000 alphanumeric characters or tens of thousands of points by employing a swept raster picture generating technique directly compatible with that of commercial television. Users communicate either by typewriter or a manually positioned light pointer.

  3. The structural robustness of multiprocessor computing system

    Directory of Open Access Journals (Sweden)

    N. Andronaty


    Full Text Available The model of a multiprocessor computing system based on transputers, which makes it possible to evaluate structural robustness (viability, survivability), is described.
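One simple ingredient of such a structural-robustness (survivability) valuation is checking whether the surviving processors remain mutually reachable after node failures. A hedged sketch with an invented 4-node ring topology; the paper's transputer model is more elaborate:

```python
def connected_after_failures(adj, failed):
    """Return True if all surviving nodes of a processor network are
    still mutually reachable after the given node failures (plain BFS
    over an adjacency-list dict; illustrative only)."""
    alive = set(adj) - set(failed)
    if not alive:
        return False
    start = next(iter(alive))
    seen = {start}
    frontier = [start]
    while frontier:
        node = frontier.pop()
        for nb in adj[node]:
            if nb in alive and nb not in seen:
                seen.add(nb)
                frontier.append(nb)
    return seen == alive

# 4-node ring: survives any single failure, but not two opposite failures
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(connected_after_failures(ring, {1}))     # → True
print(connected_after_failures(ring, {0, 2}))  # → False
```

Sweeping this check over all failure sets of a given size yields a crude survivability measure for a candidate interconnection topology.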

  4. Computational Models for Nonlinear Aeroelastic Systems Project (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate a new and efficient computational method of modeling nonlinear aeroelastic systems. The...

  5. 10 CFR 73.54 - Protection of digital computer and communication systems and networks. (United States)


    ... 10 Energy 2 2010-01-01 2010-01-01 false Protection of digital computer and communication systems... computer and communication systems and networks. By November 23, 2009 each licensee currently licensed to... provide high assurance that digital computer and communication systems and networks are...

  6. Cloud Computing for Standard ERP Systems

    DEFF Research Database (Denmark)

    Schubert, Petra; Adisa, Femi

    for the operation of ERP systems. We argue that the phenomenon of cloud computing could lead to a decisive change in the way business software is deployed in companies. Our reference framework contains three levels (IaaS, PaaS, SaaS) and clarifies the meaning of public, private and hybrid clouds. The three levels...... of cloud computing and their impact on ERP systems operation are discussed. From the literature we identify areas for future research and propose a research agenda....

  7. Computer support for mechatronic control system design

    NARCIS (Netherlands)

    van Amerongen, J.; Coelingh, H.J.; de Vries, Theodorus J.A.


    This paper discusses the demands for proper tools for computer aided control system design of mechatronic systems and identifies a number of tasks in this design process. Real mechatronic design, involving input from specialists from varying disciplines, requires that the system can be represented

  8. Computer Systems for Distributed and Distance Learning. (United States)

    Anderson, M.; Jackson, David


    Discussion of network-based learning focuses on a survey of computer systems for distributed and distance learning. Both Web-based systems and non-Web-based systems are reviewed in order to highlight some of the major trends of past projects and to suggest ways in which progress may be made in the future. (Contains 92 references.) (Author/LRW)

  9. Virtual smile design systems: a current review. (United States)

    Zimmermann, Moritz; Mehl, Albert


    In the age of digital dentistry, virtual treatment planning is becoming an increasingly important element of dental practice. Thanks to new technological advances in the computer-assisted design and computer-assisted manufacturing (CAD/CAM) of dental restorations, predictable interdisciplinary treatment using the backward planning approach appears useful and feasible. Today, a virtual smile design can be used as the basis for creating an esthetic virtual setup of the desired final result. The virtual setup, in turn, is used to plan further treatment steps in an interdisciplinary team approach, and communicate the results to the patient. The smile design concept and the esthetic analyses required for it are described in this article. We include not only a step-by-step description of the virtual smile design workflow, but also describe and compare the several available smile design options and systems. Subsequently, a brief discussion of the advantages and limitations of virtual smile design is followed by a section on different ways to integrate a two-dimensional (2D) smile design into the digital three-dimensional (3D) workflow. New technological developments are also described, such as the integration of smile designs in digital face scans, and 3D diagnostic follow-up using intraoral scanners.

  10. SLA for E-Learning System Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Doaa Elmatary


    Full Text Available The Service Level Agreement (SLA) has become an important issue, especially for Cloud Computing and online services based on the 'pay-as-you-use' model. Establishing service level agreements (SLAs), which can be defined as a negotiation between the service provider and the user, is needed for many types of current applications, such as E-Learning systems. The work in this paper presents an approach to optimizing the SLA parameters to serve any E-Learning system over a Cloud Computing platform, defining the negotiation process, a suitable framework, and the sequence diagram to accommodate E-Learning systems.

  12. Computational Modeling, Formal Analysis, and Tools for Systems Biology. (United States)

    Bartocci, Ezio; Lió, Pietro


    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of developments in theoretical computer science have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  13. Information systems and computing technology

    CERN Document Server

    Zhang, Lei


    Invited papers: Incorporating the multi-cross-sectional temporal effect in Geographically Weighted Logit Regression (K. Wu, B. Liu, B. Huang & Z. Lei); One shot learning human actions recognition using key poses (W.H. Zou, S.G. Li, Z. Lei & N. Dai); Band grouping pansharpening for WorldView-2 satellite images (X. Li); Research on GIS based haze trajectory data analysis system (Y. Wang, J. Chen, J. Shu & X. Wang). Regular papers: A warning model of systemic financial risks (W. Xu & Q. Wang); Research on smart mobile phone user experience with grounded theory (J.P. Wan & Y.H. Zhu); The software reliability analysis based on

  14. The current state of computing in building design and construction

    DEFF Research Database (Denmark)

    Sørensen, Lars Schiøtt; Andersen, Tom


    An exhaustive IT survey was conducted in the Danish construction industry in 1995. The outcome of this survey is outlined for a selected number of issues, such as the number and types of CAD systems, database systems, networking, etc. The survey shows that the use of IT is massive. A detailed investigation...... in relation to AI-based systems is presented. This indicates that AI-based systems and approaches have not yet penetrated the Danish construction industry. Determinants relevant to technology penetration are described, and the immediate potential for AI penetration is discussed and compared with the actual...... status. Finally, we conclude that there is a severe lack of awareness of AI systems in the building sector....

  15. Computational systems biology for aging research. (United States)

    Mc Auley, Mark T; Mooney, Kathleen M


    Computational modelling is a key component of systems biology and integrates with the other techniques discussed thus far in this book by utilizing a myriad of data that are being generated to quantitatively represent and simulate biological systems. This chapter will describe what computational modelling involves; the rationale for using it, and the appropriateness of modelling for investigating the aging process. How a model is assembled and the different theoretical frameworks that can be used to build a model are also discussed. In addition, the chapter will describe several models which demonstrate the effectiveness of each computational approach for investigating the constituents of a healthy aging trajectory. Specifically, a number of models will be showcased which focus on the complex age-related disorders associated with unhealthy aging. To conclude, we discuss the future applications of computational systems modelling to aging research.

  16. Artificial immune system applications in computer security

    CERN Document Server

    Tan, Ying


    This book provides state-of-the-art information on the use, design, and development of the Artificial Immune System (AIS) and AIS-based solutions to computer security issues. Artificial Immune System: Applications in Computer Security focuses on the technologies and applications of AIS in malware detection proposed in recent years by the Computational Intelligence Laboratory of Peking University (CIL@PKU). It offers a theoretical perspective as well as practical solutions for readers interested in AIS, machine learning, pattern recognition and computer security. The book begins by introducing the basic concepts, typical algorithms, important features, and some applications of AIS. The second chapter introduces malware and its detection methods, especially immune-based malware detection approaches. Successive chapters present a variety of advanced detection approaches for malware, including the Virus Detection System, K-Nearest Neighbour (KNN), RBF networks, Support Vector Machines (SVM), Danger theory, ...

  17. Quantum Computing in Solid State Systems

    CERN Document Server

    Ruggiero, B; Granata, C


    The aim of Quantum Computation in Solid State Systems is to report on recent theoretical and experimental results on the macroscopic quantum coherence of mesoscopic systems, as well as on solid state realization of qubits and quantum gates. Particular attention has been given to coherence effects in Josephson devices. Other solid state systems, including quantum dots, optical, ion, and spin devices which exhibit macroscopic quantum coherence are also discussed. Quantum Computation in Solid State Systems discusses experimental implementation of quantum computing and information processing devices, and in particular observations of quantum behavior in several solid state systems. On the theoretical side, the complementary expertise of the contributors provides models of the various structures in connection with the problem of minimizing decoherence.

  18. Design of current meter data acquisition system based on AVR single chip computer

    Institute of Scientific and Technical Information of China (English)

    陈长安; 吴建岚; 王升


    Since the commonly used direct-reading current meter has the disadvantages of high power consumption, short continuous working time, low reliability and no storage function, a new current meter data acquisition system with low power consumption and high reliability was designed, based on the basic principle of counting interrupts, to improve the reliability of the direct-reading current meter. The AVR microcomputer ATmega8 is adopted as the kernel processor of the data acquisition system, and the pulse-signal output of the underwater unit is utilized in the system design. Prototype test results show that the system is easy to operate, stable in performance and low in cost, and can work continuously and reliably under water for long periods, fully meeting the requirements of various current-measurement tasks.
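On the processing side, the interrupt-counting principle reduces to converting a gated pulse count from the rotor into a flow speed. A sketch of that conversion; the linear calibration and all constants below are invented for illustration and are not the instrument's actual calibration:

```python
def flow_speed(pulse_count, interval_s, pulses_per_rev=20, a=0.05, b=0.02):
    """Convert an interrupt-driven pulse count accumulated over a gate
    interval into a water-speed estimate (m/s). The linear calibration
    speed = a*f + b and all constants are illustrative assumptions."""
    revs_per_s = pulse_count / pulses_per_rev / interval_s  # rotor frequency f
    return a * revs_per_s + b

# 600 pulses in a 10 s gate -> 3 rev/s -> about 0.17 m/s with these constants
print(round(flow_speed(600, 10.0), 3))
```

On the microcontroller itself, the same arithmetic would run over a hardware counter incremented in an external-interrupt service routine, with the result written to storage between gates.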

  19. Computational electromagnetics and model-based inversion a modern paradigm for eddy-current nondestructive evaluation

    CERN Document Server

    Sabbagh, Harold A; Sabbagh, Elias H; Aldrin, John C; Knopp, Jeremy S


    Computational Electromagnetics and Model-Based Inversion: A Modern Paradigm for Eddy Current Nondestructive Evaluation describes the natural marriage of the computer to eddy-current NDE. Three distinct topics are emphasized in the book: (a) fundamental mathematical principles of volume-integral equations as a subset of computational electromagnetics, (b) mathematical algorithms applied to signal-processing and inverse scattering problems, and (c) applications of these two topics to problems in which real and model data are used. By showing how mathematics and the computer can solve problems more effectively than current analog practices, this book defines the modern technology of eddy-current NDE. This book will be useful to advanced students and practitioners in the fields of computational electromagnetics, electromagnetic inverse-scattering theory, nondestructive evaluation, materials evaluation and biomedical imaging. Users of eddy-current NDE technology in industries as varied as nuclear power, aerospace,...

  20. Impact of currents on surface fluxes computation and their feedback on coastal dynamics

    Directory of Open Access Journals (Sweden)

    A. Olita


    Full Text Available A twin numerical experiment was conducted in the seas of Sardinia (Western Mediterranean) to assess the impact, at coastal scales, of using relative winds (i.e. taking ocean surface currents into account) in the computation of heat and momentum fluxes through bulk formulas. The model, the Regional Ocean Modeling System (ROMS), was implemented at 2 km resolution in order to resolve (sub-)mesoscale dynamics well. Small changes (1–2%) in the spatially averaged fluxes correspond to quite large spatial differences in these quantities (up to 15–20%) and to comparably significant differences in the mean velocities of the surface currents. The wind power input P of the wind stress to the ocean surface is also reduced by about 15%, especially where surface currents are stronger. Quantitative validation against satellite SST suggests that this modification of the fluxes improves the model solution, especially in areas of cyclonic circulation, where the heat-flux correction predominates over the dynamical correction. The surface currents change above all in their fluctuating part, while the stable part of the flow changes mainly in magnitude and less in its path. Both the total and the eddy kinetic energy of the surface current field are reduced in the experiment whose fluxes account for surface currents. Dynamically, the largest correction is observed in the SW area, where anticyclonic eddies approach the continental slope. This reduction also affects the vertical dynamics, and specifically the local upwelling, which is diminished both in spatial extent and in magnitude. The simulations suggest that, even at local scales and in temperate regions, it is preferable to account for this component in flux computation. The results also confirm the tight relationship between local coastal upwelling and eddy-slope interactions in the area.
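The "relative wind" correction amounts to evaluating the bulk stress formula with the wind minus the surface current, τ = ρ_air C_D |U_wind − U_cur| (U_wind − U_cur). A sketch with a constant drag coefficient, which is a simplification: real bulk schemes make C_D depend on wind speed and atmospheric stability.

```python
def wind_stress(u_wind, v_wind, u_cur=0.0, v_cur=0.0,
                rho_air=1.22, cd=1.3e-3):
    """Bulk-formula wind stress components (N/m^2). Passing surface
    current components gives the 'relative wind' form; rho_air and the
    constant drag coefficient cd are illustrative values."""
    du, dv = u_wind - u_cur, v_wind - v_cur
    speed = (du * du + dv * dv) ** 0.5
    return rho_air * cd * speed * du, rho_air * cd * speed * dv

tau_abs = wind_stress(10.0, 0.0)             # absolute wind
tau_rel = wind_stress(10.0, 0.0, u_cur=1.0)  # relative to a 1 m/s downwind current
print(tau_abs[0], tau_rel[0])  # relative-wind stress is smaller
```

Because stress scales quadratically with the relative speed, even a modest downwind current reduces the momentum (and power) input noticeably, consistent with the ~15% reduction in P reported above.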

  1. Computer-Aided Design of a Direct Current Electromagnet

    Directory of Open Access Journals (Sweden)

    Iancu Tătucu


    Full Text Available The paper presents the mathematical model and the simulation of a direct current electromagnet used for the transport of steel ingots. To simulate any device one must have a mathematical model able to describe the relevant phenomena as accurately as possible. As the processes occurring in an electromagnet are electromagnetic in nature, the model of the electromagnetic potentials was used, and the simulation was performed with the help of the specialized software ANSYS.

  2. Soft computing in green and renewable energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Gopalakrishnan, Kasthurirangan [Iowa State Univ., Ames, IA (United States). Iowa Bioeconomy Inst.; US Department of Energy, Ames, IA (United States). Ames Lab; Kalogirou, Soteris [Cyprus Univ. of Technology, Limassol (Cyprus). Dept. of Mechanical Engineering and Materials Sciences and Engineering; Khaitan, Siddhartha Kumar (eds.) [Iowa State Univ. of Science and Technology, Ames, IA (United States). Dept. of Electrical Engineering and Computer Engineering


    Soft Computing in Green and Renewable Energy Systems provides a practical introduction to the application of soft computing techniques and hybrid intelligent systems for designing, modeling, characterizing, optimizing, forecasting, and predicting the performance of green and renewable energy systems. Research on renewable energy (energy derived from natural resources such as sunlight, wind, tides, rain, geothermal heat, biomass, hydrogen, etc.) is proceeding rapidly as policy makers, researchers, economists, and world agencies have joined forces in finding alternative sustainable energy solutions to current critical environmental, economic, and social issues. The innovative models, environmentally benign processes, data analytics, etc. employed in renewable energy systems are computationally intensive, non-linear and complex, and involve a high degree of uncertainty. Soft computing technologies, such as fuzzy sets and systems, neural science and systems, evolutionary algorithms and genetic programming, and machine learning, are ideal for handling the noise, imprecision, and uncertainty in the data while still achieving robust, low-cost solutions. As a result, intelligent and soft computing paradigms are finding increasing application in the study of renewable energy systems. Researchers, practitioners, and undergraduate and graduate students engaged in the study of renewable energy systems will find this book very useful. (orig.)

  3. Complex system modelling and control through intelligent soft computations

    CERN Document Server

    Azar, Ahmad


    The book offers a snapshot of the theories and applications of soft computing in the area of complex systems modeling and control. It presents the most important findings discussed during the 5th International Conference on Modelling, Identification and Control, held in Cairo, from August 31-September 2, 2013. The book consists of twenty-nine selected contributions, which have been thoroughly reviewed and extended before their inclusion in the volume. The different chapters, written by active researchers in the field, report on both current theories and important applications of soft-computing. Besides providing the readers with soft-computing fundamentals, and soft-computing based inductive methodologies/algorithms, the book also discusses key industrial soft-computing applications, as well as multidisciplinary solutions developed for a variety of purposes, like windup control, waste management, security issues, biomedical applications and many others. It is a perfect reference guide for graduate students, r...

  4. Telemetry Computer System at Wallops Flight Center (United States)

    Bell, H.; Strock, J.


    This paper describes the Telemetry Computer System in operation at NASA's Wallops Flight Center for real-time or off-line processing, storage, and display of telemetry data from rockets and aircraft. The system accepts one or two PCM data streams and one FM multiplex, converting each type of data into computer format and merging time-of-day information. A data compressor merges the active streams, and removes redundant data if desired. Dual minicomputers process data for display, while storing information on computer tape for further processing. Real-time displays are located at the station, at the rocket launch control center, and in the aircraft control tower. The system is set up and run by standard telemetry software under control of engineers and technicians. Expansion capability is built into the system to take care of possible future requirements.
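The redundancy-removal step of such a data compressor can be sketched as a deadband filter that keeps a sample only when it differs sufficiently from the last kept value. The deadband parameter and sample format are invented for illustration, not taken from the Wallops system:

```python
def compress(samples, deadband=0.0):
    """Remove redundant telemetry samples: keep a (time, value) pair
    only if the value differs from the last kept value by more than
    the deadband (zero-order compression sketch)."""
    kept = []
    last = None
    for t, value in samples:
        if last is None or abs(value - last) > deadband:
            kept.append((t, value))
            last = value
    return kept

stream = [(0, 5.0), (1, 5.0), (2, 5.1), (3, 7.0), (4, 7.0)]
print(compress(stream, deadband=0.5))  # → [(0, 5.0), (3, 7.0)]
```

Slowly varying channels compress heavily under such a filter, while the original samples remain on tape for off-line processing, as the record describes.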

  5. Honeywell Modular Automation System Computer Software Documentation

    Energy Technology Data Exchange (ETDEWEB)



    This document provides a Computer Software Documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-211 and vertical denitration calciner in HC-230C-2.

  6. Computation and design of autonomous intelligent systems (United States)

    Fry, Robert L.


    This paper describes a theory of intelligent systems and its reduction to engineering practice. The theory is based on a broader theory of computation wherein information and control are defined within the subjective frame of a system. At its most primitive level, the theory describes what it computationally means to both ask and answer questions which, like traditional logic, are also Boolean. The logic of questions describes the subjective rules of computation that are objective in the sense that all the described systems operate according to its principles. Therefore, all systems are autonomous by construct. These systems include thermodynamic, communication, and intelligent systems. Although interesting, the important practical consequence is that the engineering framework for intelligent systems can borrow efficient constructs and methodologies from both thermodynamics and information theory. Thermodynamics provides the Carnot cycle which describes intelligence dynamics when operating in the refrigeration mode. It also provides the principle of maximum entropy. Information theory has recently provided the important concept of dual-matching useful for the design of efficient intelligent systems. The reverse engineered model of computation by pyramidal neurons agrees well with biology and offers a simple and powerful exemplar of basic engineering concepts.

  7. Remote computer monitors corrosion protection system

    Energy Technology Data Exchange (ETDEWEB)

    Kendrick, A.

    Effective corrosion protection with electrochemical methods requires routine monitoring that provides reliable data free of human error. A test installation of a remote computer-controlled monitoring system for electrochemical corrosion protection is described. The unit can handle up to six channel inputs; each channel comprises three analog signals and one digital signal. The operation of the system is discussed.

  8. Terrace Layout Using a Computer Assisted System (United States)

    Development of a web-based terrace design tool based on the MOTERR program is presented, along with representative layouts for conventional and parallel terrace systems. Using digital elevation maps and geographic information systems (GIS), this tool utilizes personal computers to rapidly construct ...

  9. Computational systems analysis of dopamine metabolism.

    Directory of Open Access Journals (Sweden)

    Zhen Qi

    Full Text Available A prominent feature of Parkinson's disease (PD) is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment of PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.
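As a toy instance of the biochemical-systems-theory style of model mentioned above, consider a single dopamine-like pool with constant production and first-order degradation, dX/dt = a − bX, integrated with Euler steps. The rate constants are illustrative, not fitted dopamine parameters:

```python
def simulate(x0, rates, dt=0.01, steps=1000):
    """Euler integration of a one-pool production/degradation model,
    dX/dt = a - b*X, the simplest instance of the power-law formalism
    used in biochemical systems theory (illustrative constants only)."""
    a, b = rates
    x = x0
    for _ in range(steps):
        x += dt * (a - b * x)
    return x

# the steady state is a/b = 2.0; starting away from it, the pool relaxes toward 2.0
print(round(simulate(0.5, (1.0, 0.5)), 3))
```

A full presynaptic homeostasis model replaces the two constants with power-law terms for synthesis, vesicular storage, release, and reuptake, but the diagnosis workflow (simulate, compare to data, refine) is the same.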

  10. An E-learning System based on Affective Computing (United States)

    Duo, Sun; Song, Lu Xue

    In recent years, e-learning has become very popular. However, current e-learning systems cannot instruct students effectively, since they do not consider the student's emotional state in the context of instruction. The emerging theory of "affective computing" can address this problem: it allows the computer's intelligence to be more than purely cognitive. In this paper, we construct an emotionally intelligent e-learning system based on affective computing. A dimensional model is put forward to recognize and analyze the student's emotional state, and a virtual teacher's avatar is offered to regulate the student's learning psychology, with the teaching style chosen according to the student's personality traits. A "man-to-man" learning environment is built to simulate traditional classroom pedagogy in the system.
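
A dimensional emotion model of the kind mentioned above typically places an emotional state on valence and arousal axes. A hedged sketch, with quadrant labels that are illustrative assumptions rather than the paper's taxonomy:

```python
def emotion_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) reading to a coarse emotion label."""
    if valence >= 0:
        return "excited/happy" if arousal >= 0 else "relaxed/content"
    return "angry/anxious" if arousal >= 0 else "bored/sad"

state = emotion_quadrant(-0.4, -0.7)  # low valence, low arousal
```

A tutoring system could use such a label to pick a regulating response, e.g. encouragement for a "bored/sad" student.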

  11. Computer networks ISE a systems approach

    CERN Document Server

    Peterson, Larry L


    Computer Networks, 4E is the only introductory computer networking book written by authors who have had first-hand experience with many of the protocols discussed in the book, who have actually designed some of them as well, and who are still actively designing computer networks today. This newly revised edition continues to provide an enduring, practical understanding of networks and their building blocks through rich, example-based instruction. The authors' focus is on the why of network design, not just the specifications comprising today's systems but how key technologies and p

  12. Unified Computational Intelligence for Complex Systems

    CERN Document Server

    Seiffertt, John


    Computational intelligence encompasses a wide variety of techniques that allow computation to learn, to adapt, and to seek. That is, they may be designed to learn information without explicit programming regarding the nature of the content to be retained, they may be imbued with the functionality to adapt to maintain their course within a complex and unpredictably changing environment, and they may help us seek out truths about our own dynamics and lives through their inclusion in complex system modeling. These capabilities place our ability to compute in a category apart from our ability to e

  13. An Improved Electron Pre-Sheath Model for TSS-1R Current Enhancement Computations

    Directory of Open Access Journals (Sweden)

    Chunpei Cai


    Full Text Available This report presents improved investigations of the Tethered Satellite System (TSS-1R) electron current enhancement due to magnetically limited collection. New analytical expressions are obtained for the potential and temperature changes across the pre-sheath. The mathematical treatments in this work are more rigorous than a previous approach. Additional experimental measurements collected in the ionosphere during the TSS-1R mission are adopted for validation. The relations developed in this work offer two curves that bound these data points quite successfully; the average of the two curves is close to the curve fit of the measurements, and an average enhancement 2.95 times larger than the Parker-Murphy theory is revealed. The results indicate that including the pre-sheath analysis is important for computing the electron current enhancement due to magnetic limitations.

  14. 8th International Conference on Computer Recognition Systems

    CERN Document Server

    Jackowski, Konrad; Kurzynski, Marek; Wozniak, Michał; Zolnierek, Andrzej


    Computer recognition systems are nowadays one of the most promising directions in artificial intelligence. This book is the most comprehensive study of this field. It contains a collection of 86 carefully selected articles contributed by experts in pattern recognition. It reports on current research with respect to both methodology and applications. In particular, it includes the following sections: Biometrics; Data Stream Classification and Big Data Analytics; Features, Learning, and Classifiers; Image Processing and Computer Vision; Medical Applications; Miscellaneous Applications; Pattern Recognition and Image Processing in Robotics; Speech and Word Recognition. This book is a great reference tool for scientists who deal with the problems of designing computer pattern recognition systems. Its target readers include researchers as well as students of computer science, artificial intelligence or robotics.

  15. 9th International Conference on Computer Recognition Systems

    CERN Document Server

    Jackowski, Konrad; Kurzyński, Marek; Woźniak, Michał; Żołnierek, Andrzej


    Computer recognition systems are nowadays one of the most promising directions in artificial intelligence. This book is the most comprehensive study of this field. It contains a collection of 79 carefully selected articles contributed by experts in pattern recognition. It reports on current research with respect to both methodology and applications. In particular, it includes the following sections: Features, Learning, and Classifiers; Biometrics; Data Stream Classification and Big Data Analytics; Image Processing and Computer Vision; Medical Applications; Applications; RGB-D Perception: Recent Developments and Applications. This book is a great reference tool for scientists who deal with the problems of designing computer pattern recognition systems. Its target readers include researchers as well as students of computer science, artificial intelligence or robotics.

  16. Computer surety: computer system inspection guidance. [Contains glossary]

    Energy Technology Data Exchange (ETDEWEB)


    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  17. Fault tolerant hypercube computer system architecture (United States)

    Madan, Herb S. (Inventor); Chow, Edward (Inventor)


    A fault-tolerant multiprocessor computer system of the hypercube type comprising a hierarchy of computers of like kind which can be functionally substituted for one another as necessary is disclosed. Communication between the working nodes is via one communications network while communications between the working nodes and watch dog nodes and load balancing nodes higher in the structure is via another communications network separate from the first. A typical branch of the hierarchy reporting to a master node or host computer comprises: a plurality of first computing nodes; a first network of message conducting paths for interconnecting the first computing nodes as a hypercube. The first network provides a path for message transfer between the first computing nodes; a first watch dog node; and a second network of message conducting paths for connecting the first computing nodes to the first watch dog node independent from the first network, the second network provides an independent path for test message and reconfiguration affecting transfers between the first computing nodes and the first watch dog node. There are, additionally, a plurality of second computing nodes; a third network of message conducting paths for interconnecting the second computing nodes as a hypercube. The third network provides a path for message transfer between the second computing nodes; a fourth network of message conducting paths for connecting the second computing nodes to the first watch dog node independent from the third network. The fourth network provides an independent path for test message and reconfiguration affecting transfers between the second computing nodes and the first watch dog node; and a first multiplexer disposed between the first watch dog node and the second and fourth networks for allowing the first watch dog node to selectively communicate with individual ones of the computing nodes through the second and fourth networks; as well as, a second watch dog node
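
The hypercube interconnect underlying the system can be sketched in a few lines: in a d-dimensional hypercube, node i is linked to the d nodes whose binary labels differ from i in exactly one bit.

```python
def hypercube_neighbors(node: int, dim: int) -> list:
    """Labels of all nodes one bit-flip away from `node` in a dim-cube."""
    return [node ^ (1 << b) for b in range(dim)]

neighbors = hypercube_neighbors(0, 3)  # node 0 of a 3-cube links to 1, 2, 4
```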

  18. Catalytic currents in dithiophosphate-iodide systems

    Energy Technology Data Exchange (ETDEWEB)

    Gabdullin, M.G.; Garifzyanov, A.R.; Toropova, V.F.


    Catalytic currents of oxidizing agents are used to determine rate constants of simultaneous chemical reactions. In the present paper, the authors investigated the electrochemical oxidation of iodide ions at a glassy carbon electrode in the presence of a series of dithiophosphates (RO)/sub 2/PSS/sup -/ (R=CH/sub 3/, C/sub 2/H/sub 5/, n-C/sub 3/H/sub 7/, n-C/sub 4/H/sub 9/, iso-C/sub 4/H/sub 9/, and sec-C/sub 4/H/sub 9/). It is known that dithiophosphates (DTP) are strong reducing agents and are oxidized by iodine. At the same time, as shown previously, electrochemical oxidation of DTP occurs at more positive potentials in comparison with the oxidation potential of iodide ions. This suggested that a catalytic effect might be manifested in DTP-I/sup -/ systems. Current-voltage curves are shown for solutions of I/sup -/ in the absence and in the presence of DTP. All data indicate a catalytic nature of the electrode process. The obtained data show that the rates of the reactions of DTP with iodine decrease with increasing volume and branching of the substituents at the phosphorus atom.

  19. Research on computer virus database management system (United States)

    Qi, Guoquan


    The growing proliferation of computer viruses has become a lethal threat and a research focus in network information security. New viruses keep emerging, the number of viruses is growing, and virus classification is becoming increasingly complex. Virus naming cannot be unified because agencies capture samples at different times. Although each agency has its own virus database, communication between agencies is lacking, virus information is incomplete, or only a small number of samples is available. This paper introduces the current state of virus database construction at home and abroad, analyzes how to standardize and complete the description of virus characteristics, and then gives a computer virus database design scheme covering information integrity, storage security, and manageability.
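
A minimal sketch of the kind of standardized record such a database might hold, with a unified name, per-agency aliases, and an integrity hash; the table and column names are illustrative assumptions, not the paper's scheme.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE virus (
        id INTEGER PRIMARY KEY,
        canonical_name TEXT NOT NULL,  -- unified name across agencies
        aliases TEXT,                  -- per-agency names, semicolon-separated
        capture_time TEXT,             -- when the sample was captured
        characteristics TEXT,          -- standardized behaviour description
        sample_sha256 TEXT UNIQUE      -- integrity check for the stored sample
    )
""")
conn.execute(
    "INSERT INTO virus (canonical_name, aliases, sample_sha256) VALUES (?, ?, ?)",
    ("Example.Worm.A", "W32/Example;Worm.Ex", "ab" * 32),
)
count = conn.execute("SELECT COUNT(*) FROM virus").fetchone()[0]
```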

  20. Computational Methods for Predictive Simulation of Stochastic Turbulence Systems (United States)


    AFRL-AFOSR-VA-TR-2015-0363, Computational Methods for Predictive Simulation of Stochastic Turbulence Systems. AFOSR Grant FA9550-12-1-0191, William Layton and Catalin Trenchea, Department of Mathematics, University of Pittsburgh. Personnel supported during the grant: Nan Jian, graduate student, Univ. of Pittsburgh (currently postdoc at FSU); Sarah Khankan, graduate student, Univ. of Pittsburgh.

  1. Monitoring SLAC High Performance UNIX Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Lettsome, Annette K.; /Bethune-Cookman Coll. /SLAC


    Knowledge of the effectiveness and efficiency of computers is important when working with high-performance systems. Monitoring such systems is advantageous in order to foresee possible misfortunes or system failures. Ganglia is a software system designed to retrieve specific monitoring information from high-performance computing systems. An alternative storage facility for Ganglia's collected data is needed, since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process followed in the creation and implementation of the MySQL database for use by Ganglia. Comparisons of data storage in the two databases are made using gnuplot and Ganglia's real-time graphical user interface.
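
A script-driven relational store of this kind can be sketched as below; sqlite3 stands in for MySQL so the example is self-contained, and the table layout is an assumption, not the paper's schema.

```python
import sqlite3
import time

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE metrics (host TEXT, metric TEXT, value REAL, ts INTEGER)")

def record(host: str, metric: str, value: float) -> None:
    """Store one Ganglia-style sample with a timestamp."""
    db.execute("INSERT INTO metrics VALUES (?, ?, ?, ?)",
               (host, metric, value, int(time.time())))

record("node01", "load_one", 0.42)
record("node01", "load_one", 0.57)
latest = db.execute(
    "SELECT value FROM metrics WHERE host = ? AND metric = ? "
    "ORDER BY ts DESC, rowid DESC LIMIT 1",
    ("node01", "load_one"),
).fetchone()[0]
```

Unlike a round-robin database, nothing is aggregated away: every raw sample is retained and can be re-queried or re-plotted later.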

  2. Operator support system using computational intelligence techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bueno, Elaine Inacio, E-mail: [Instituto Federal de Educacao, Ciencia e Tecnologia de Sao Paulo (IFSP), Sao Paulo, SP (Brazil); Pereira, Iraci Martinez, E-mail: [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)


    Computational Intelligence Systems have been widely applied in Monitoring and Fault Detection Systems in several processes and in different kinds of applications. These systems use interdependent components ordered in modules. It is a typical behavior of such systems to ensure early detection and diagnosis of faults. Monitoring and Fault Detection Techniques can be divided into two categories: estimative and pattern recognition methods. The estimative methods use a mathematical model, which describes the process behavior. The pattern recognition methods use a database to describe the process. In this work, an operator support system using Computational Intelligence Techniques was developed. This system will show the information obtained by different CI techniques in order to help operators to take decision in real time and guide them in the fault diagnosis before the normal alarm limits are reached. (author)
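
The estimative branch described above reduces to a one-line idea: compare each measurement with a model prediction and flag a fault when the residual exceeds a limit. The model values and threshold below are illustrative.

```python
def detect_fault(measured: float, predicted: float, threshold: float = 0.1) -> bool:
    """Flag a fault when |measurement - model prediction| exceeds the limit."""
    return abs(measured - predicted) > threshold

# (measured, model-predicted) pairs; the last reading drifts from the model
readings = [(1.00, 1.02), (1.01, 1.00), (1.35, 1.01)]
faults = [detect_fault(m, p) for m, p in readings]
```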

  3. Attacker Modelling in Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Papini, Davide

    Within the last five to ten years we have experienced an incredible growth of ubiquitous technologies, which has allowed for improvements in several areas, including energy distribution and management, health care services, border surveillance, secure monitoring and management of buildings, localisation services and many others. These technologies can be classified under the name of ubiquitous systems. The term Ubiquitous System dates back to 1991, when Mark Weiser at Xerox PARC Lab first referred to it in writing. He envisioned a future where computing technologies would be melded into our everyday life. This future is visible to everyone nowadays: terms like smartphone, cloud, sensor, network etc. are widely known and used in our everyday life. But what about the security of such systems? Ubiquitous computing devices can be limited in terms of energy, computing power and memory...

  4. Metasynthetic computing and engineering of complex systems

    CERN Document Server

    Cao, Longbing


    Provides a comprehensive overview and introduction to the concepts, methodologies, analysis, design and applications of metasynthetic computing and engineering. The author: Presents an overview of complex systems, especially open complex giant systems such as the Internet, complex behavioural and social problems, and actionable knowledge discovery and delivery in the big data era. Discusses ubiquitous intelligence in complex systems, including human intelligence, domain intelligence, social intelligence, network intelligence, data intelligence and machine intelligence, and their synergy thro

  5. Reliable computer systems design and evaluation

    CERN Document Server

    Siewiorek, Daniel


    Enhance your hardware/software reliability. Enhancement of system reliability has been a major concern of computer users and designers, and this major revision of the 1982 classic meets users' continuing need for practical information on this pressing topic. Included are case studies of reliable systems from manufacturers such as Tandem, Stratus, IBM, and Digital, as well as coverage of special systems such as the Galileo Orbiter fault protection system and AT&T telephone switching processors.

  6. Model for personal computer system selection. (United States)

    Blide, L


    Successful computer software and hardware selection is best accomplished by following an organized approach such as the one described in this article. The first step is to decide what you want to be able to do with the computer. Second, select software that is user friendly, well documented, bug free, and that does what you want done. Next, select the computer, printer and other needed equipment from the group of machines on which the software will run. Key factors here are reliability and compatibility with other microcomputers in your facility. Lastly, select a reliable vendor who will provide good, dependable service in a reasonable time. The ability to correctly select computer software and hardware is a key skill needed by medical record professionals today and in the future. Professionals can make quality computer decisions by selecting software and systems that are compatible with other computers in their facility and that allow for future networking, ease of use, and adaptability for expansion as new applications are identified. The key to success is to not only provide for your present needs, but to be prepared for future rapid expansion and change in your computer usage as technology and your skills grow.
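
The selection steps above can be turned into a simple weighted-scoring exercise; the criteria, weights, and scores here are illustrative assumptions.

```python
def score(option: dict, weights: dict) -> float:
    """Weighted sum of an option's per-criterion scores."""
    return sum(option[k] * w for k, w in weights.items())

weights = {"software_fit": 0.4, "reliability": 0.3,
           "compatibility": 0.2, "vendor_service": 0.1}
options = {
    "System A": {"software_fit": 9, "reliability": 7,
                 "compatibility": 8, "vendor_service": 6},
    "System B": {"software_fit": 6, "reliability": 9,
                 "compatibility": 7, "vendor_service": 9},
}
best = max(options, key=lambda name: score(options[name], weights))
```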

  7. An introduction to computer simulation methods applications to physical systems

    CERN Document Server

    Gould, Harvey; Christian, Wolfgang


    Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Topics include: Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics. For all readers interested in developing programming habits in the context of doing phy...
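
The book's examples are in Java; as a language-neutral flavor of its "Oscillatory Systems" material, here is a minimal Euler-Cromer integration of a unit simple harmonic oscillator, x'' = -x (a sketch, not the book's code).

```python
def simulate(x: float = 1.0, v: float = 0.0,
             dt: float = 0.001, steps: int = 1000) -> float:
    """Euler-Cromer: update velocity first, then position from the new velocity."""
    for _ in range(steps):
        v += -x * dt
        x += v * dt
    return x

x_end = simulate()  # position at t = 1; the exact answer is cos(1) ~ 0.5403
```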

  8. Analysis of Security Techniques in Future Cloud Computing vs. Current Cloud Computing: A Survey Paper

    Directory of Open Access Journals (Sweden)

    Beny Nugraha


    Full Text Available Cloud computing is one of today's fastest-growing network technologies, because it can dynamically increase the flexibility and capability of computing processes without large expenditures on new infrastructure; improving the security of cloud computing networks is therefore essential. This study examines the security techniques of current cloud computing and of a future cloud architecture, NEBULA. These techniques are compared with respect to their ability to handle the security attacks that may occur in cloud computing. The method used in this study is attack-centric: each security attack is analyzed for its characteristics, and the security mechanisms for handling it are then examined. Four security attacks are studied; by understanding how an attack works, one also learns which security mechanism can counter it. The study finds that NEBULA provides the highest level of security. NEBULA introduces three new techniques: Proof of Consent (PoC), Proof of Path (PoP), and the ICING cryptographic technique. These three techniques, combined with onion routing, can counter the security attacks analyzed in this study.
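
The onion-routing idea mentioned above can be sketched in a few lines: the sender wraps the payload in one layer per relay, and each relay peels exactly one layer. XOR with a per-hop key stands in for real layered encryption here; the keys and payload are made up.

```python
def xor_layer(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for encryption: XOR data with a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

hop_keys = [b"key-a", b"key-b", b"key-c"]  # one illustrative key per relay
payload = b"hello cloud"

onion = payload
for key in reversed(hop_keys):   # sender wraps layers, innermost first
    onion = xor_layer(onion, key)

for key in hop_keys:             # each relay peels its own layer in turn
    onion = xor_layer(onion, key)
# after the last relay, onion equals the original payload
```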

  9. Computer-aided diagnosis in radiological imaging: current status and future challenges (United States)

    Doi, Kunio


    Computer-aided diagnosis (CAD) has become one of the major research subjects in medical imaging and diagnostic radiology. Many different types of CAD schemes are being developed for detection and/or characterization of various lesions in medical imaging, including conventional projection radiography, CT, MRI, and ultrasound imaging. Commercial systems for detection of breast lesions on mammograms have been developed and have received FDA approval for clinical use. CAD may be defined as a diagnosis made by a physician who takes into account the computer output as a "second opinion". The purpose of CAD is to improve the quality and productivity of physicians in their interpretation of radiologic images. The quality of their work can be improved in terms of the accuracy and consistency of their radiologic diagnoses. In addition, the productivity of radiologists is expected to be improved by a reduction in the time required for their image readings. The computer output is derived from quantitative analysis of radiologic images by use of various methods and techniques in computer vision, artificial intelligence, and artificial neural networks (ANNs). The computer output may indicate a number of important parameters, for example, the locations of potential lesions such as lung cancer and breast cancer, the likelihood of malignancy of detected lesions, and the likelihood of various diseases based on differential diagnosis in a given image and clinical parameters. In this review article, the basic concept of CAD is first defined, and the current status of CAD research is then described. In addition, the potential of CAD in the future is discussed and predicted.

  10. Current status of dental caries diagnosis using cone beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Park, Young Seok; Ahn, Jin Soo; Kwon, Ho Beom; Lee, Seung Pyo [School of Dentistry, Seoul National University, Seoul (Korea, Republic of)


    The purpose of this article is to review the current status of dental caries diagnosis using cone beam computed tomography (CBCT). An online PubMed search was performed to identify studies on caries research using CBCT. Despite its usefulness, conventional radiography has inherent limitations in the detection of caries lesions, mainly due to its two-dimensional (2D) representation of the lesions. Several efforts were made to obtain three-dimensional (3D) images of lesions, but they gained little popularity. Recently, CBCT was introduced and has been used for the diagnosis of caries in several reports. Some of them asserted the superiority of CBCT systems; however, this remains controversial. CBCT systems are promising, but they should not yet be considered a primary choice for caries diagnosis in everyday practice. Further studies under more standardized conditions should be performed in the near future.

  11. Dynamics of the southern California current system (United States)

    di Lorenzo, Emanuele

    The dynamics of seasonal to long-term variability of the Southern California Current System (SCCS) is studied using a four dimensional space-time analysis of the 52 year (1949--2000) California Cooperative Oceanic Fisheries Investigations (CalCOFI) hydrography combined with a sensitivity analysis of an eddy permitting primitive equation ocean model under various forcing scenarios. The dynamics of the seasonal cycle in the SCCS can be summarized as follows. In spring upwelling favorable winds force an upward tilt of the isopycnals along the coast (equatorward flow). Quasi-linear Rossby waves are excited by the ocean adjustment to the isopycnal displacement. In summer as these waves propagate offshore poleward flow develops at the coast and the Southern California Eddy (SCE) reaches its seasonal maxima. Positive wind stress curl in the Southern California Bight is important in maintaining poleward flow and locally reinforcing the SCE with an additional upward displacement of isopycnals through Ekman pumping. At the end of summer and throughout the fall instability processes within the SCE are a generating mechanism for mesoscale eddies, which fully develop in the offshore waters during winter. On decadal timescales a warming trend in temperature (1 C) and a deepening trend in the depth of the mean thermocline (20 m) between 1950 and 1998 are found to be primarily forced by large-scale decadal fluctuations in surface heat fluxes combined with horizontal advection by the mean currents. After 1998 the surface heat fluxes suggest the beginning of a period of cooling, which is consistent with colder observed ocean temperatures. The temporal and spatial distribution of the warming is coherent over the entire northeast Pacific Ocean. Salinity changes are decoupled from temperature and uncorrelated with indices of large-scale oceanic variability. Temporal modulation of southward horizontal advection by the California Current is the primary mechanism controlling local

  12. Monolith ERP and current IT-trends : Creating a step by step development model (SSDM) for existing monolith ERP system to adapt to the current IT-trends


    Vuorenmaa, Riku


    ERP systems have been around for 25 years and have gone through several phases of evolution. None of the previous phases, however, has posed as big a challenge, or offered as big an opportunity, to ERP producers and vendors as the current outsourcing and cloud computing trends. Traditional on-premises (monolith) ERP systems are under pressure to cope with the demands imposed by cloud computing and other IT trends (such as mobility). Traditional ERP systems have to adapt or face the risk of becoming obsolete. ...

  13. Architecture, systems research and computational sciences

    CERN Document Server


    The Winter 2012 (vol. 14 no. 1) issue of the Nexus Network Journal is dedicated to the theme “Architecture, Systems Research and Computational Sciences”. This is an outgrowth of the session by the same name which took place during the eighth international, interdisciplinary conference “Nexus 2010: Relationships between Architecture and Mathematics, held in Porto, Portugal, in June 2010. Today computer science is an integral part of even strictly historical investigations, such as those concerning the construction of vaults, where the computer is used to survey the existing building, analyse the data and draw the ideal solution. What the papers in this issue make especially evident is that information technology has had an impact at a much deeper level as well: architecture itself can now be considered as a manifestation of information and as a complex system. The issue is completed with other research papers, conference reports and book reviews.

  14. NIF Integrated Computer Controls System Description

    Energy Technology Data Exchange (ETDEWEB)

    VanArsdall, P.


    This System Description introduces the NIF Integrated Computer Control System (ICCS). The architecture is sufficiently abstract to allow the construction of many similar applications from a common framework. As discussed below, over twenty software applications derived from the framework comprise the NIF control system. This document lays the essential foundation for understanding the ICCS architecture. The NIF design effort is motivated by the magnitude of the task. Figure 1 shows a cut-away rendition of the coliseum-sized facility. The NIF requires integration of about 40,000 atypical control points, must be highly automated and robust, and will operate continuously around the clock. The control system coordinates several experimental cycles concurrently, each at different stages of completion. Furthermore, facilities such as the NIF represent major capital investments that will be operated, maintained, and upgraded for decades. The computers, control subsystems, and functionality must be relatively easy to extend or replace periodically with newer technology.

  16. Current-potential characteristics of electrochemical systems

    Energy Technology Data Exchange (ETDEWEB)

    Battaglia, V.S.


    This dissertation contains investigations in three distinct areas. Chapters 1 and 2 provide an analysis of the effects of electromagnetic phenomena during the initial stages of cell discharge. Chapter 1 includes the solution to Maxwell's equations for the penetration of the axial component of an electric field into an infinitely long cylindrical conductor. Chapter 2 contains the analysis of the conductor included in a radial circuit. Chapter 3 provides a complete description of the equations that describe the growth of an oxide film; a finite difference program was written to solve them. The system investigated is iron/iron oxide in a basic, aqueous solution. Chapters 4 and 5 describe experimental attempts at replacing formaldehyde with an innocuous reducing agent for electroless deposition. In Chapter 4, current-versus-voltage curves are provided for a sodium thiosulfate bath in the presence of a copper disk electrode, along with the cathodic polarization curves of a copper/EDTA bath in the presence of a copper electrode. Chapter 5 contains the experimental results of work done with sodium hypophosphite as a reducing agent. Mixed-potential-versus-time curves for solutions containing various combinations of copper sulfate, nickel chloride, and hypophosphite in the presence of a palladium disk electrode provide an indication of the reducing power of the solutions.
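
The finite-difference approach mentioned for Chapter 3 can be illustrated with one explicit (FTCS) step of a 1-D diffusion equation, dc/dt = D d2c/dx2; the grid, diffusivity, and boundary values are illustrative, not the dissertation's film-growth model.

```python
def diffuse_step(c: list, D: float = 1e-2, dx: float = 0.1, dt: float = 0.1) -> list:
    """One explicit FTCS update; endpoints held fixed (Dirichlet boundaries).
    Stable here because r = D*dt/dx**2 = 0.1 <= 0.5."""
    r = D * dt / dx**2
    return ([c[0]] +
            [c[i] + r * (c[i + 1] - 2 * c[i] + c[i - 1])
             for i in range(1, len(c) - 1)] +
            [c[-1]])

profile = [1.0] + [0.0] * 9   # concentration held at 1 at the left boundary
for _ in range(100):
    profile = diffuse_step(profile)
# the concentration front has spread into the interior of the grid
```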

  17. Some Unexpected Results Using Computer Algebra Systems. (United States)

    Alonso, Felix; Garcia, Alfonsa; Garcia, Francisco; Hoya, Sara; Rodriguez, Gerardo; de la Villa, Agustin


    Shows how teachers can often use unexpected outputs from Computer Algebra Systems (CAS) to reinforce concepts and to show students the importance of thinking about how they use the software and reflecting on their results. Presents different examples where DERIVE, MAPLE, or Mathematica does not work as expected and suggests how to use them as a…

  18. High performance computing on vector systems

    CERN Document Server

    Roller, Sabine


    Presents the developments in high-performance computing and simulation on modern supercomputer architectures. This book covers trends in hardware and software development in general and specifically the vector-based systems and heterogeneous architectures. It presents innovative fields like coupled multi-physics or multi-scale simulations.

  19. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.


    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data dissem

  20. Computer Graphics for System Effectiveness Analysis. (United States)


    OCR fragments of the scanned thesis front matter; the recoverable content includes a citation (Chapra, Steven C., and Raymond P. Canale (1985), Numerical Methods for Engineers with Personal Computer Applications, New York), an outline of the thesis, a chapter presenting the method of analysis of systems effectiveness, and a note that Chapter VII summarizes the results and gives recommendations for future research.

  1. Characterizing Video Coding Computing in Conference Systems

    NARCIS (Netherlands)

    Tuquerres, G.


    In this paper, a number of coding operations are provided for computing continuous data streams, in particular, video streams. A coding capability of the operations is expressed by a pyramidal structure in which coding processes and requirements of a distributed information system are represented. Th

  2. Lumber Grading With A Computer Vision System (United States)

    Richard W. Conners; Tai-Hoon Cho; Philip A. Araman


    Over the past few years significant progress has been made in developing a computer vision system for locating and identifying defects on surfaced hardwood lumber. Unfortunately, until September of 1988 little research had gone into developing methods for analyzing rough lumber. This task is arguably more complex than the analysis of surfaced lumber. The prime...

  3. Computer Algebra Systems, Pedagogy, and Epistemology (United States)

    Bosse, Michael J.; Nandakumar, N. R.


    The advent of powerful Computer Algebra Systems (CAS) continues to dramatically affect curricula, pedagogy, and epistemology in secondary and college algebra classrooms. However, epistemological and pedagogical research regarding the role and effectiveness of CAS in the learning of algebra lags behind. This paper investigates concerns regarding…

  4. Computer system SANC: its development and applications (United States)

    Arbuzov, A.; Bardin, D.; Bondarenko, S.; Christova, P.; Kalinovskaya, L.; Sadykov, R.; Sapronov, A.; Riemann, T.


    The SANC system is used for systematic calculations of various processes within the Standard Model in the one-loop approximation. QED, electroweak, and QCD corrections are computed to a number of processes being of interest for modern and future high-energy experiments. Several applications for the LHC physics program are presented. Development of the system and the general problems and perspectives for future improvement of the theoretical precision are discussed.

  5. Personal healthcare system using cloud computing. (United States)

    Takeuchi, Hiroshi; Mayuzumi, Yuuki; Kodama, Naoki; Sato, Keiichi


    A personal healthcare system used with cloud computing has been developed. It enables a daily time-series of personal health and lifestyle data to be stored in the cloud through mobile devices. The cloud automatically extracts personally useful information, such as rules and patterns concerning lifestyle and health conditions embedded in the personal big data, by using a data mining technology. The system provides three editions (Diet, Lite, and Pro) corresponding to users' needs.

  6. The CMS Computing System: Successes and Challenges

    CERN Document Server

    Bloom, Kenneth


    Each LHC experiment will produce datasets with sizes of order one petabyte per year. All of this data must be stored, processed, transferred, simulated and analyzed, which requires a computing system of a larger scale than ever mounted for any particle physics experiment, and possibly for any enterprise in the world. I discuss how CMS has chosen to address these challenges, focusing on recent tests of the system that demonstrate the experiment's readiness for producing physics results with the first LHC data.

  7. Integrative Genomics and Computational Systems Medicine

    Energy Technology Data Exchange (ETDEWEB)

    McDermott, Jason E.; Huang, Yufei; Zhang, Bing; Xu, Hua; Zhao, Zhongming


    The exponential growth in generation of large amounts of genomic data from biological samples has driven the emerging field of systems medicine. This field is promising because it improves our understanding of disease processes at the systems level. However, the field is still in its early stages. There is a great need for novel computational methods and approaches to effectively utilize and integrate various omics data.

  8. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C


    This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking, and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in. Describing a complicated system abstractly with mathematical equations requires a careful choice of assumptions.
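To give a flavor of what "writing equations to describe performance behavior" means, here is the textbook M/M/1 single-server queue, a standard first example in this area. The sketch is illustrative and not taken from the book; the function name and rates are hypothetical.

```python
# Closed-form M/M/1 queue metrics: a minimal analytical performance model.
def mm1_metrics(arrival_rate, service_rate):
    """Return (utilization, mean response time, mean jobs in system)."""
    assert arrival_rate < service_rate, "queue is unstable otherwise"
    rho = arrival_rate / service_rate                    # fraction of time server is busy
    mean_response = 1.0 / (service_rate - arrival_rate)  # mean time a job spends in system
    mean_jobs = rho / (1.0 - rho)                        # mean number of jobs in system
    return rho, mean_response, mean_jobs
```

With 8 jobs/s arriving at a 10 jobs/s server, utilization is 0.8 and mean response time is 0.5 s; note that mean jobs equals arrival rate times mean response time, which is Little's law.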

  9. Computational requirements for on-orbit identification of space systems (United States)

    Hadaegh, Fred Y.


    For future space systems, on-orbit identification (ID) capability will be required to complement on-orbit control, because the dynamics of large space structures, spacecraft, and antennas will not be known sufficiently from ground modeling and testing. The computational requirements for ID of flexible structures such as the space station (SS) or the large deployable reflectors (LDR) are, however, extensive due to the large number of modes, sensors, and actuators. For these systems the ID algorithm operations need not be computed in real time, only in near real time, or an appropriate mission time. Consequently the space systems will need advanced processors and efficient parallel-processing algorithm designs and architectures to implement the identification algorithms in near real time. The MAX computer currently being developed may handle such computational requirements. The purpose is to specify the on-board computational requirements for dynamic and static identification of large space structures. The computational requirements for six ID algorithms are presented in the context of three examples: the JPL/AFAL ground antenna facility, the space station (SS), and the large deployable reflector (LDR).

  10. Neuromorphic Computing – From Materials Research to Systems Architecture Roundtable

    Energy Technology Data Exchange (ETDEWEB)

    Schuller, Ivan K. [Univ. of California, San Diego, CA (United States); Stevens, Rick [Argonne National Lab. (ANL), Argonne, IL (United States); Univ. of Chicago, IL (United States); Pino, Robinson [Dept. of Energy (DOE) Office of Science, Washington, DC (United States); Pechan, Michael [Dept. of Energy (DOE) Office of Science, Washington, DC (United States)


    Computation in its many forms is the engine that fuels our modern civilization. Modern computation, based on the von Neumann architecture, has allowed, until now, the development of continuous improvements, as predicted by Moore's law. However, computation using current architectures and materials will inevitably, within the next 10 years, reach a limit because of fundamental scientific reasons. DOE convened a roundtable of experts in neuromorphic computing systems, materials science, and computer science in Washington on October 29-30, 2015 to address the following basic questions: Can brain-like ("neuromorphic") computing devices based on new material concepts and systems be developed to dramatically outperform conventional CMOS based technology? If so, what are the basic research challenges for materials science and computing? The overarching answer that emerged was: The development of novel functional materials and devices incorporated into unique architectures will allow a revolutionary technological leap toward the implementation of a fully "neuromorphic" computer. To address this challenge, the following issues were considered: the main differences between neuromorphic and conventional computing as related to signaling models, timing/clock, non-volatile memory, architecture, fault tolerance, integrated memory and compute, noise tolerance, analog vs. digital, and in situ learning; new neuromorphic architectures needed to produce lower energy consumption, potential novel nanostructured materials, and enhanced computation; device and materials properties needed to implement functions such as hysteresis, stability, and fault tolerance; and comparisons of different implementations (spin torque, memristors, resistive switching, phase change, and optical schemes) for enhanced breakthroughs in performance, cost, fault tolerance, and/or manufacturability.

  11. Cluster Computing for Embedded/Real-Time Systems (United States)

    Katz, D.; Kepner, J.


    Embedded and real-time systems, like other computing systems, seek to maximize computing power for a given price, and thus can significantly benefit from the advancing capabilities of cluster computing.

  12. Landauer Bound for Analog Computing Systems

    CERN Document Server

    Diamantini, M Cristina; Trugenberger, Carlo A


    By establishing a relation between information erasure and continuous phase transitions we generalise the Landauer bound to analog computing systems. The entropy production per degree of freedom during erasure of an analog variable (reset to standard value) is given by the logarithm of the configurational volume measured in units of its minimal quantum. As a consequence every computation has to be carried on with a finite number of bits and infinite precision is forbidden by the fundamental laws of physics, since it would require an infinite amount of energy.

  13. Landauer bound for analog computing systems (United States)

    Diamantini, M. Cristina; Gammaitoni, Luca; Trugenberger, Carlo A.


    By establishing a relation between information erasure and continuous phase transitions we generalize the Landauer bound to analog computing systems. The entropy production per degree of freedom during erasure of an analog variable (reset to standard value) is given by the logarithm of the configurational volume measured in units of its minimal quantum. As a consequence, every computation has to be carried on with a finite number of bits and infinite precision is forbidden by the fundamental laws of physics, since it would require an infinite amount of energy.
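For scale, the classic two-state (one-bit) Landauer bound that this work generalizes is k_B T ln 2 per erasure; the sketch below simply evaluates that number (the analog generalization described above replaces ln 2 with the log of the configurational volume in units of its minimal quantum). The function name is hypothetical.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def landauer_energy(temperature_k, states=2):
    """Minimum erasure energy for a variable with `states` distinguishable values."""
    return K_B * temperature_k * math.log(states)
```

At room temperature (300 K) this gives about 2.87e-21 J per bit; erasing a four-state variable costs twice that.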

  14. International Conference on Soft Computing Systems

    CERN Document Server

    Panigrahi, Bijaya


    The book is a collection of high-quality peer-reviewed research papers presented in International Conference on Soft Computing Systems (ICSCS 2015) held at Noorul Islam Centre for Higher Education, Chennai, India. These research papers provide the latest developments in the emerging areas of Soft Computing in Engineering and Technology. The book is organized in two volumes and discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. It presents invited papers from the inventors/originators of new applications and advanced technologies.

  15. Embedded systems for supporting computer accessibility. (United States)

    Mulfari, Davide; Celesti, Antonio; Fazio, Maria; Villari, Massimo; Puliafito, Antonio


    Nowadays, customized AT software solutions allow their users to interact with various kinds of computer systems. Such tools are generally available on personal devices (e.g., smartphones, laptops and so on) commonly used by a person with a disability. In this paper, we investigate a way of using the aforementioned AT equipment in order to access many different devices without assistive preferences. The solution takes advantage of open source hardware and its core component consists of an affordable Linux embedded system: it grabs data coming from the assistive software, which runs on the user's personal device, then, after processing, it generates native keyboard and mouse HID commands for the target computing device controlled by the end user. This process supports any operating system available on the target machine and requires no specialized software installation; therefore, the user with a disability can rely on a single assistive tool to control a wide range of computing platforms, including conventional computers and many kinds of mobile devices, which receive input commands through the USB HID protocol.

  16. Music Genre Classification Systems - A Computational Approach

    DEFF Research Database (Denmark)

    Ahrendt, Peter


    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast to systems which use e.g. a symbolic representation or textual information about the music. The approach to music genre classification systems has here been system-oriented. In other words, all the different aspects of the systems have been considered and it is emphasized that the systems should...

  17. A universal computer and interface system for the disabled (UNICAID). (United States)

    Bolton, M P; Taylor, A C


    A prototype text handling and computing system using microprocessor technology has been developed for the severely handicapped user. The input uses a serial 'suck-puff' code (TWC code version DR2). In addition to the usual video display and full editing and printout facilities of a word processor, the unit acts as a terminal for other computers and can also function as a stand-alone microcomputer using commercial software. The 'bus' structure allows simple expansion to incorporate additional functions, and a remote control link is used for the input. The prototype is currently being evaluated by a tetraplegic student.

  18. Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue (United States)

    Zornetzer, Steve; Gage, Douglas


    Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion.

  19. Nature-inspired computing for control systems

    CERN Document Server


    The book presents recent advances in nature-inspired computing, giving special emphasis to control systems applications. It reviews different techniques used for simulating physical, chemical, biological or social phenomena with the aim of designing robust, predictive and adaptive control strategies. The book is a collection of several contributions, covering either more general approaches in control systems, or methodologies for control tuning and adaptive controllers, as well as exciting applications of nature-inspired techniques in robotics. On the one hand, the book is expected to motivate readers with a background in conventional control systems to try out these powerful techniques inspired by nature. On the other hand, the book provides advanced readers with a deeper understanding of the field and a broad spectrum of different methods and techniques. All in all, the book is an outstanding, practice-oriented reference guide to nature-inspired computing addressing graduate students, researchers and practi...

  20. Biological Computation as the Revolution of Complex Engineered Systems

    CERN Document Server

    Gómez-Cruz, Nelson Alfonso


    Provided that there is no theoretical frame for complex engineered systems (CES) as yet, this paper claims that bio-inspired engineering can help provide such a frame. Within CES, bio-inspired systems play a key role. The disclosure from bio-inspired systems and biological computation has not been sufficiently worked out, however. Biological computation is to be taken as the processing of information by living systems that is carried out in polynomial time, i.e., efficiently; such processing, however, is grasped by current science and research as an intractable problem (for instance, the protein folding problem). A remark is needed here: P versus NP problems should be well defined and delimited, but biological computation problems are not. The shift from conventional engineering to bio-inspired engineering needs to bring the subject (or problem) of computability to a new level. Within the frame of computation, so far, the prevailing paradigm is still the Turing-Church thesis. In other words, conventional engineering...


    Directory of Open Access Journals (Sweden)

    MILDEOVÁ, Stanislava


    Full Text Available When seeking solutions to current problems in the field of computer science, and other fields, we encounter situations where traditional approaches no longer bring the desired results. Our cognitive skills also limit the implementation of reliable mental simulation within the basic set of relations. The world around us is becoming more complex and mutually interdependent, and this is reflected in the demands on computer support. Thus, today's education and science in the field of computer science, and in all other disciplines and areas of life, need to address the issue of a paradigm shift, which is generally accepted by experts. The goal of the paper is to present systems thinking, which facilitates and extends the understanding of the world through relations and linkages. Moreover, the paper introduces the essence of systems thinking and the possibilities of achieving a mental shift toward systems thinking skills. At the same time, the link between systems thinking and functional literacy is presented. We adopted the "Bathtub Test" from the variety of systems thinking tests that allow people to assess the understanding of basic systemic concepts, in order to assess the level of systems thinking. University students (potential information managers) were the subjects of an examination of systems thinking that was conducted over a longer time period and whose aim was to determine the status of systems thinking. The paper demonstrates that some pedagogical concepts and activities, in our case the subject of System Dynamics, lead to the appropriate integration of systems thinking in education. There is some evidence that basic knowledge of system dynamics and systems thinking principles will affect students, and their thinking will contribute to an improved approach to solving problems of computer science both in theory and practice.

  2. Decomposability queueing and computer system applications

    CERN Document Server

    Courtois, P J


    Decomposability: Queueing and Computer System Applications presents a set of powerful methods for systems analysis. This 10-chapter text covers the theory of nearly completely decomposable systems upon which specific analytic methods are based.The first chapters deal with some of the basic elements of a theory of nearly completely decomposable stochastic matrices, including the Simon-Ando theorems and the perturbation theory. The succeeding chapters are devoted to the analysis of stochastic queuing networks that appear as a type of key model. These chapters also discuss congestion problems in
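The object of study above is the stationary behavior of stochastic matrices. As a small-scale point of reference, the sketch below finds a stationary distribution by plain power iteration; Courtois's decomposability methods reach the same answer far more economically on large nearly completely decomposable matrices by analyzing the blocks separately. The function name and example matrix are hypothetical.

```python
# Stationary distribution of a stochastic matrix by power iteration
# (brute-force baseline; NCD aggregation instead exploits block structure).
def stationary(p, iters=10000):
    """Return the stationary row vector pi satisfying pi = pi * P."""
    n = len(p)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * p[i][j] for i in range(n)) for j in range(n)]
    return pi
```

For the two-state chain [[0.9, 0.1], [0.2, 0.8]] this converges to (2/3, 1/3).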

  3. Computer-aided Analysis of Physiological Systems

    Directory of Open Access Journals (Sweden)

    Balázs Benyó


    Full Text Available This paper presents the recent biomedical engineering research activity of the Medical Informatics Laboratory at the Budapest University of Technology and Economics. The research projects are carried out in the fields as follows: Computer aided identification of physiological systems; Diabetic management and blood glucose control; Remote patient monitoring and diagnostic system; Automated system for analyzing cardiac ultrasound images; Single-channel hybrid ECG segmentation; Event recognition and state classification to detect brain ischemia by means of EEG signal processing; Detection of breathing disorders like apnea and hypopnea; Molecular biology studies with DNA-chips; Evaluation of the cry of normal hearing and hard of hearing infants.

  4. Construction and assessment of hierarchical edge elements for three-dimensional computations of eddy currents

    Energy Technology Data Exchange (ETDEWEB)

    Midtgaard, Ole-Morten


    This thesis considers the feasibility of doing calculations to optimize electrical machines without the need to build expensive prototypes. It deals with the construction and assessment of new, hierarchical, hexahedral edge elements for three-dimensional computations of eddy currents with the electric vector potential formulation. The new elements, five in all, gave up to second-order approximations for both the magnetic field and the current density. Theoretical arguments showed these elements to be more economical for a given polynomial order of the approximated fields than the serendipity family of nodal elements. Further, it was pointed out how the support of a source field computed by using edge elements could be made very small, provided that a proper spanning tree was used in the edge element mesh. This was exploited for the voltage forcing technique, where source fields were used as basis functions, with unknown total currents in voltage-forced conductors as degrees of freedom. The practical assessment of the edge elements showed that the accuracy improved with increasing polynomial order, both for local and global quantities. The most economical element was, however, one giving only complete first-order approximations for both fields. Further, the edge elements turned out to be better than the nodal elements in practice as well. For the voltage forcing technique, source field basis functions with small support resulted in a large reduction of the CPU time for solving the main equation system, compared to source fields with large support. The new elements can be used in a p-type adaptive scheme, and they should also be applicable to other tangentially continuous field problems. 67 refs., 34 figs., 10 tabs.

  5. A regional climatology of the Humboldt Current System (United States)

    Grados Quispe, M.; Chaigneau, A.; Blanco, J.; Vasquez, L.; Dominguez, N.


    A 3-dimensional, high-resolution, regional climatology of the Humboldt Current System (HCS) north of 25°S is presented. The methodology is based on a four-dimensional ocean interpolation scheme using locally weighted least square fitting, as developed by Dunn and Ridgway [2001] and Ridgway et al. [2002] in the Australian Seas. The method is applied to all the available historical profiles from the National Oceanographic Data Center [WOD05, Boyer et al., 2006], ARGO buoy profiles [] for 2000-2007 and historical in situ long-term information from the Peruvian Marine Research Institute (IMARPE) and Fisheries Development Institute (IFOP) for the period 1960-2008. The regional climatology, which extends from the equator to 25°S and from the coast to 8° offshore with a resolution of 0.1°x0.1°, is thus constructed from more than 70 000 temperature profiles, 38 000 salinity profiles and 43 000 oxygen profiles to form a seasonal climatology of temperature and salinity along Peru and northern Chile. The resulting maps depict interesting small-scale coastal properties such as distinct upwelling centers and frontal zones. Geostrophic currents relative to 500 m depth are also computed from the density field, highlighting new circulation features. This study provides a contemporaneous view of the circulation and the water mass characteristics in the Humboldt Current System at seasonal scales. This regional climatology represents coastal boundary features (upwelling cells, frontal regions) better than other climatologies. In view of ongoing international research efforts to understand the coastal upwelling and coastal currents in the southern ocean off Peru, the main characteristics of the upwelling cell, currents and coastal wind variability of the Pisco (13°S)-San Juan (15°S) region are presented. This improved gridded product is expected to be used for initializing and validating high-resolution regional numerical models.

  6. Applicability of Computational Systems Biology in Toxicology

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Hadrup, Niels; Audouze, Karine Marie Laure


    Systems biology as a research field has emerged within the last few decades. Systems biology, often defined as the antithesis of the reductionist approach, integrates information about individual components of a biological system. In integrative systems biology, large data sets from various sources and databases are used to model and predict effects of chemicals on, for instance, human health. Experimentally determined targets of a chemical of interest can be fed into networks of protein-protein interactions and protein-disease associations to obtain information that can be used to establish hypotheses on links between the chemical and human diseases. Such information can also be applied for designing more intelligent animal/cell experiments that can test the established hypotheses. Here, we describe how and why to apply an integrative systems biology method...

  7. Comparing current cluster, massively parallel, and accelerated systems

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Kevin J [Los Alamos National Laboratory; Davis, Kei [Los Alamos National Laboratory; Hoisie, Adolfy [Los Alamos National Laboratory; Kerbyson, Darren J [Los Alamos National Laboratory; Pakin, Scott [Los Alamos National Laboratory; Lang, Mike [Los Alamos National Laboratory; Sancho Pitarch, Jose C [Los Alamos National Laboratory


    Currently there is large architectural diversity in high performance computing systems. They include 'commodity' cluster systems that optimize per-node performance for small jobs, massively parallel processors (MPPs) that optimize aggregate performance for large jobs, and accelerated systems that optimize both per-node and aggregate performance but only for applications custom-designed to take advantage of such systems. Because of these dissimilarities, meaningful comparisons of achievable performance are not straightforward. In this work we utilize a methodology that combines both empirical analysis and performance modeling to compare clusters (represented by a 4,352-core IB cluster), MPPs (represented by a 147,456-core BG/P), and accelerated systems (represented by the 129,600-core Roadrunner) across a workload of four applications. Strengths of our approach include the ability to compare architectures (as opposed to specific implementations of an architecture), to attribute each application's performance bottlenecks to characteristics unique to each system, and to explore performance scenarios in advance of their availability for measurement. Our analysis illustrates that application performance is essentially unrelated to relative peak performance but that application performance can be both predicted and explained using modeling.

  8. Advances in computational design and analysis of airbreathing propulsion systems (United States)

    Klineberg, John M.


    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview is provided of several NASA Lewis research efforts that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  9. Low Power Dynamic Scheduling for Computing Systems

    CERN Document Server

    Neely, Michael J


    This paper considers energy-aware control for a computing system with two states: "active" and "idle." In the active state, the controller chooses to perform a single task using one of multiple task processing modes. The controller then saves energy by choosing an amount of time for the system to be idle. These decisions affect processing time, energy expenditure, and an abstract attribute vector that can be used to model other criteria of interest (such as processing quality or distortion). The goal is to optimize time average system performance. Applications of this model include a smart phone that makes energy-efficient computation and transmission decisions, a computer that processes tasks subject to rate, quality, and power constraints, and a smart grid energy manager that allocates resources in reaction to a time-varying energy price. The solution methodology of this paper uses the theory of optimization for renewal systems developed in our previous work. This paper is written in tutorial form and devel...
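The active-state decision described above, picking one of several task processing modes that trade processing time against energy, can be caricatured as a feasibility check plus minimization. This sketch is an illustration under assumed inputs, not the paper's renewal-system optimization; the mode names and (time, energy) pairs are hypothetical.

```python
# Toy mode selection: among (processing_time, energy) modes, choose the
# feasible one with least energy. (The paper's controller optimizes a
# time-average objective over renewal frames; this is only a caricature.)
def pick_mode(modes, deadline):
    """modes: dict name -> (processing_time, energy). Returns None if nothing is feasible."""
    feasible = {name: energy for name, (t, energy) in modes.items() if t <= deadline}
    return min(feasible, key=feasible.get) if feasible else None
```

With a tight deadline only the fast, energy-hungry mode qualifies; with slack, the slow low-energy mode wins.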

  10. Applicability of computational systems biology in toxicology. (United States)

    Kongsbak, Kristine; Hadrup, Niels; Audouze, Karine; Vinggaard, Anne Marie


    Systems biology as a research field has emerged within the last few decades. Systems biology, often defined as the antithesis of the reductionist approach, integrates information about individual components of a biological system. In integrative systems biology, large data sets from various sources and databases are used to model and predict effects of chemicals on, for instance, human health. In toxicology, computational systems biology enables identification of important pathways and molecules from large data sets; tasks that can be extremely laborious when performed by a classical literature search. However, computational systems biology offers more advantages than providing a high-throughput literature search; it may form the basis for establishment of hypotheses on potential links between environmental chemicals and human diseases, which would be very difficult to establish experimentally. This is possible due to the existence of comprehensive databases containing information on networks of human protein-protein interactions and protein-disease associations. Experimentally determined targets of the specific chemical of interest can be fed into these networks to obtain additional information that can be used to establish hypotheses on links between the chemical and human diseases. Such information can also be applied for designing more intelligent animal/cell experiments that can test the established hypotheses. Here, we describe how and why to apply an integrative systems biology method in the hypothesis-generating phase of toxicological research.

  11. Superconducting Current Leads for Cryogenic Systems Project (United States)

    National Aeronautics and Space Administration — Space flight cryocoolers will be able to handle limited heat loads at their expected operating temperatures and the current leads may be the dominant contributor to...

  12. Computational intelligence in gait research: a perspective on current applications and future challenges. (United States)

    Lai, Daniel T H; Begg, Rezaul K; Palaniswami, Marimuthu


    Our mobility is an important daily requirement so much so that any disruption to it severely degrades our perceived quality of life. Studies in gait and human movement sciences, therefore, play a significant role in maintaining the well-being of our mobility. Current gait analysis involves numerous interdependent gait parameters that are difficult to adequately interpret due to the large volume of recorded data and lengthy assessment times in gait laboratories. A proposed solution to these problems is computational intelligence (CI), which is an emerging paradigm in biomedical engineering most notably in pathology detection and prosthesis design. The integration of CI technology in gait systems facilitates studies in disorders caused by lower limb defects, cerebral disorders, and aging effects by learning data relationships through a combination of signal processing and machine learning techniques. Learning paradigms, such as supervised learning, unsupervised learning, and fuzzy and evolutionary algorithms, provide advanced modeling capabilities for biomechanical systems that in the past have relied heavily on statistical analysis. CI offers the ability to investigate nonlinear data relationships, enhance data interpretation, design more efficient diagnostic methods, and extrapolate model functionality. These are envisioned to result in more cost-effective, efficient, and easy-to-use systems, which would address global shortages in medical personnel and rising medical costs. This paper surveys current signal processing and CI methodologies followed by gait applications ranging from normal gait studies and disorder detection to artificial gait simulation. We review recent systems focusing on the existing challenges and issues involved in making them successful. We also examine new research in sensor technologies for gait that could be combined with these intelligent systems to develop more effective healthcare solutions.

  13. Fuzzy Controller based Neutral Current Harmonic Suppression in Distribution System

    Directory of Open Access Journals (Sweden)

    T.Guna Sekar


    Full Text Available Recent surveys of three-phase four-wire electric systems in buildings and industrial plants with computers and non-linear loads show excessive currents in the neutral conductor, caused mainly by system unbalance and non-linear loads. Third-order harmonics dominate in the neutral conductor because of the presence of zero-sequence components. In response to this concern, this paper presents a series active filter scheme that suppresses the neutral current harmonics and thereby reduces the burden on the secondary of the distribution transformer. In this scheme, the series active filter is connected in series with the neutral conductor to eliminate the zero-sequence components. A fuzzy-based controller is used to extract the harmonic component in the neutral conductor. The proposed method improves the overall performance of the system and relieves the burden on the neutral conductor. To validate the simulation results, a scaled-down prototype experimental model was developed.
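    The reason third-order (triplen) harmonics dominate the neutral is that they are zero-sequence: the per-phase third-harmonic terms arrive in phase and add in the neutral instead of cancelling. A small numerical sketch (illustrative amplitudes only, not data from the paper):

```python
# Illustrative numbers: each phase carries a unit fundamental plus a 30%
# third harmonic. The fundamentals are 120 degrees apart and cancel in
# the neutral; the third-harmonic terms are shifted by 3*120 = 360
# degrees, i.e. in phase, so they add to three times the per-phase value.
import math

def phase_current(t, shift, fund=1.0, third=0.3, f=50.0):
    w = 2 * math.pi * f
    return (fund * math.sin(w * t + shift)
            + third * math.sin(3 * (w * t + shift)))

def neutral_current(t):
    shifts = (0.0, -2 * math.pi / 3, 2 * math.pi / 3)
    return sum(phase_current(t, s) for s in shifts)

# The neutral current reduces to 3 * 0.3 * sin(3*w*t): pure third harmonic.
```

    This is why the record's series filter targets the zero-sequence component: removing it removes exactly the part that accumulates in the neutral.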

  14. Computer Aided Facial Prosthetics Manufacturing System

    Directory of Open Access Journals (Sweden)

    Peng H.K.


    Full Text Available Facial deformities impose a burden on the patient. There are many remedies, such as plastic surgery and facial prosthetics; however, current methods of fabricating facial prosthetics are costly and time-consuming. This study aimed to identify a new method for constructing a customized facial prosthesis. A 3D scanner, computer software and a 3D printer were used. Results showed that the newly developed method can produce a customized facial prosthesis. Its advantages over the conventional process are lower cost and reduced material waste and pollution, in keeping with green manufacturing principles.

  15. Interactive computer-enhanced remote viewing system

    Energy Technology Data Exchange (ETDEWEB)

    Tourtellott, J.A.; Wagner, J.F. [Mechanical Technology Incorporated, Latham, NY (United States)


    Remediation activities such as decontamination and decommissioning (D&D) typically involve materials and activities hazardous to humans. Robots are an attractive way to conduct such remediation, but for efficiency they need a good three-dimensional (3-D) computer model of the task space where they are to function. This model can be created from engineering plans and architectural drawings and from empirical data gathered by various sensors at the site. The model is used to plan robotic tasks and verify that selected paths are clear of obstacles. This report describes the development of an Interactive Computer-Enhanced Remote Viewing System (ICERVS), a software system to provide a reliable geometric description of a robotic task space, and enable robotic remediation to be conducted more effectively and more economically.

  16. Cloud Computing Security in Business Information Systems

    CERN Document Server

    Ristov, Sasko; Kostoska, Magdalena


    Cloud computing providers' and customers' services are not only exposed to existing security risks but, due to multi-tenancy, outsourcing of applications and data, and virtualization, are exposed to emergent risks as well. Therefore, both cloud providers and customers must establish an information security system and mutual trust, extending to end users. In this paper we analyze the main international and industrial standards targeting information security and their conformity with cloud computing security challenges. We find that almost all main cloud service providers (CSPs) are, at minimum, ISO 27001:2005 certified. As a result, we propose an extension to the ISO 27001:2005 standard with a new control objective on virtualization, so that the standard remains generic, regardless of a company's type, size and nature, while also being applicable to cloud systems, where virtualization is the baseline. We also define a quantitative metric and evaluate the importance factor of ISO 27001:2005 control objecti...

  17. Thermoelectric property measurements with computer controlled systems (United States)

    Chmielewski, A. B.; Wood, C.


    A joint JPL-NASA program to develop an automated system to measure the thermoelectric properties of newly developed materials is described. Consideration is given to the difficulties created by signal drift in measurements of Hall voltage and the Large Delta T Seebeck coefficient. The benefits of a computerized system were examined with respect to error reduction and time savings for human operators. It is shown that the time required to measure Hall voltage can be reduced by a factor of 10 when a computer is used to fit a curve to the ratio of the measured signal and its standard deviation. The accuracy of measurements of the Large Delta T Seebeck coefficient and thermal diffusivity was also enhanced by the use of computers.
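    The tenfold time saving described above can be pictured as an SNR-driven stopping rule: keep averaging noisy readings only until the ratio of the running mean to its uncertainty reaches a target, instead of integrating for a fixed worst-case time. This is a hypothetical sketch of that idea, not the JPL system's actual algorithm:

```python
# Hypothetical illustration: stop averaging noisy Hall-voltage readings
# once the ratio of the running mean to its standard error reaches a
# target SNR, rather than always integrating for a fixed long time.
import random
import statistics

def measure_until_snr(read, target_snr=10.0, max_reads=10000):
    samples = []
    while len(samples) < max_reads:
        samples.append(read())
        if len(samples) >= 3:
            mean = statistics.fmean(samples)
            sem = statistics.stdev(samples) / len(samples) ** 0.5
            if sem > 0 and abs(mean) / sem >= target_snr:
                break                    # enough confidence: stop early
    return statistics.fmean(samples), len(samples)

# Simulated drifting/noisy signal around 1.0 with sigma = 0.5
random.seed(0)
value, n = measure_until_snr(lambda: 1.0 + random.gauss(0, 0.5))
# converges after a few dozen reads rather than a fixed long run
```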

  18. Checkpoint triggering in a computer system (United States)

    Cher, Chen-Yong


    According to an aspect, a method for triggering creation of a checkpoint in a computer system includes executing a task in a processing node of the computer system and determining whether it is time to read a monitor associated with a metric of the task. The monitor is read to determine a value of the metric based on determining that it is time to read the monitor. A threshold for triggering creation of the checkpoint is determined based on the value of the metric. Based on determining that the value of the metric has crossed the threshold, the checkpoint including state data of the task is created to enable restarting execution of the task upon a restart operation.
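    The read-monitor / compare-threshold / snapshot loop described in this record can be sketched in a few lines (a hypothetical illustration with invented names, not the patented implementation):

```python
# Hypothetical sketch: run the task, read a metric monitor at fixed
# intervals, and snapshot the task state when the metric crosses a
# threshold, so execution can restart from the checkpoint later.
def run_with_checkpoints(total_steps, read_interval, threshold):
    state = {"step": 0}
    metric = 0                 # e.g. accumulated work since last checkpoint
    checkpoints = []
    for step in range(1, total_steps + 1):
        state["step"] = step   # executing the task updates its state
        metric += 1            # ...and the monitored metric
        if step % read_interval == 0:        # time to read the monitor
            if metric >= threshold:          # metric crossed the threshold
                checkpoints.append(dict(state))  # snapshot the state data
                metric = 0
    return checkpoints

cps = run_with_checkpoints(total_steps=100, read_interval=10, threshold=30)
# with these parameters, checkpoints are taken at steps 30, 60 and 90
```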


    Directory of Open Access Journals (Sweden)

    Anis ISMAIL


    Full Text Available We present a new system architecture, a distributed framework designed to support pervasive computing applications. We propose an architecture consisting of a search engine and peripheral clients that addresses issues in scalability, data sharing, data transformation and inherent platform heterogeneity. Key features of our application are a type-aware data transport that is capable of extracting data and presenting it through handheld devices (PDAs (personal digital assistants), mobiles, etc.). Pervasive computing uses web technology, portable devices, wireless communications and nomadic or ubiquitous computing systems. The web, and the simple standard HTTP protocol on which it is based, facilitate this kind of ubiquitous access, which can be implemented on a variety of devices: PDAs, laptops, and information appliances such as digital cameras and printers. Mobile users get transparent access to resources outside their current environment. We discuss our system’s architecture and its implementation. Through experimental study, we show reasonable performance and adaptation of our system’s implementation for mobile devices.


    CERN Multimedia

    I. Fisk


    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently devoted to improving the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is on increasing the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and on the full implementation of the xrootd federation ...

  1. Music Genre Classification Systems - A Computational Approach


    Ahrendt, Peter; Hansen, Lars Kai


    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular...

  2. Research on Dynamic Distributed Computing System for Small and Medium-Sized Computer Clusters

    Institute of Scientific and Technical Information of China (English)

    Le Kang; Jianliang Xu; Feng Liu


      Distributed computing divides a complex task that requires a large amount of computation into small pieces that are calculated by more than one computer, with the final result assembled from each computer's results. This paper considers a distributed computing system running on small and medium-sized computer clusters to overcome the low efficiency of a single computer and improve the efficiency of large-scale computing. Experiments show that the system effectively improves efficiency and is a viable approach.
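    The divide-compute-combine scheme the paper describes can be illustrated with Python's standard multiprocessing pool (a generic sketch, not the authors' system):

```python
# Generic illustration of divide-and-combine: split the input into
# chunks, compute each chunk on a separate worker process, then combine
# the partial results into the final answer.
from multiprocessing import Pool

def partial_sum(chunk):
    """The small piece of work each worker computes."""
    return sum(x * x for x in chunk)

def distributed_sum_of_squares(n, workers=4):
    data = list(range(n))
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        partials = pool.map(partial_sum, chunks)   # one piece per worker
    return sum(partials)                           # combine the results

if __name__ == "__main__":
    print(distributed_sum_of_squares(1000))        # sum of squares 0..999
```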

  3. Performance evaluation of a computed radiography system

    Energy Technology Data Exchange (ETDEWEB)

    Roussilhe, J.; Fallet, E. [Carestream Health France, 71 - Chalon/Saone (France); Mango, St.A. [Carestream Health, Inc. Rochester, New York (United States)


    Computed radiography (CR) standards have been formalized and published in Europe and in the US. The CR system classification is defined in those standards by - minimum normalized signal-to-noise ratio (SNRN), and - maximum basic spatial resolution (SRb). Both the signal-to-noise ratio (SNR) and the contrast sensitivity of a CR system depend on the dose (exposure time and conditions) at the detector. Because of their wide dynamic range, the same storage phosphor imaging plate can qualify for all six CR system classes. The exposure characteristics from 30 to 450 kV, the contrast sensitivity, and the spatial resolution of the KODAK INDUSTREX CR Digital System have been thoroughly evaluated. This paper will present some of the factors that determine the system's spatial resolution performance. (authors)
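    The two classification quantities are linked numerically: the CR standards normalize the measured SNR to the basic spatial resolution SRb, commonly using an 88.6 µm base length (this factor is assumed here from the EN 14784-1 / ISO 16371 conventions; verify it against the edition in use):

```python
# Hedged sketch of the SNR normalization used to classify CR systems:
# dividing out the basic spatial resolution SRb (in micrometres) makes
# plates read at different resolutions comparable. The 88.6 um base
# length is taken from the EN 14784-1 / ISO 16371 convention.
def normalized_snr(snr_measured, srb_um):
    return snr_measured * 88.6 / srb_um

# A plate with measured SNR 130 at SRb = 100 um normalizes to ~115.2,
# which is then compared against the class thresholds in the standard.
```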

  4. Incorporating core hysteresis properties in three-dimensional computations of transformer inrush current forces (United States)

    Adly, A. A.; Hanafy, H. H.


    It is well known that transformer inrush currents depend upon the core properties, residual flux, switching instant, and the overall circuit parameters. Large transient inrush currents introduce abnormal electromagnetic forces which may destroy the transformer windings. This paper presents an approach through which core hysteresis may be incorporated in three-dimensional computations of transformer inrush current forces. Details of the approach, measurements, and simulations for a shell-type transformer are given in the paper.

  5. TMX-U computer system in evolution (United States)

    Casper, T. A.; Bell, H.; Brown, M.; Gorvad, M.; Jenkins, S.; Meyer, W.; Moller, J.; Perkins, D.


    Over the past three years, the total TMX-U diagnostic data base has grown to exceed 10 Mbytes from over 1300 channels; roughly triple the originally designed size. This acquisition and processing load has resulted in an experiment repetition rate exceeding 10 min per shot using the five original Hewlett-Packard HP-1000 computers with their shared disks. Our new diagnostics tend to be multichannel instruments, which, in our environment, can be more easily managed using local computers. For this purpose, we are using HP series 9000 computers for instrument control, data acquisition, and analysis. Fourteen such systems are operational with processed format output exchanged via a shared resource manager. We are presently implementing the necessary hardware and software changes to create a local area network allowing us to combine the data from these systems with our main data archive. The expansion of our diagnostic system using the parallel acquisition and processing concept allows us to increase our data base with a minimum of impact on the experimental repetition rate.

  6. Physical Optics Based Computational Imaging Systems (United States)

    Olivas, Stephen Joseph

    There is an ongoing demand on behalf of the consumer, medical and military industries to make lighter weight, higher resolution, wider field-of-view and extended depth-of-focus cameras. This leads to design trade-offs between performance and cost, be it size, weight, power, or expense. This has brought attention to finding new ways to extend the design space while adhering to cost constraints. Extending the functionality of an imager in order to achieve extraordinary performance is a common theme of computational imaging, a field of study which uses additional hardware along with tailored algorithms to formulate and solve inverse problems in imaging. This dissertation details four specific systems within this emerging field: a Fiber Bundle Relayed Imaging System, an Extended Depth-of-Focus Imaging System, a Platform Motion Blur Image Restoration System, and a Compressive Imaging System. The Fiber Bundle Relayed Imaging System is part of a larger project, where the work presented in this thesis was to use image processing techniques to mitigate problems inherent to fiber bundle image relay and then, form high-resolution wide field-of-view panoramas captured from multiple sensors within a custom state-of-the-art imager. The Extended Depth-of-Focus System goals were to characterize the angular and depth dependence of the PSF of a focal swept imager in order to increase the acceptably focused imaged scene depth. The goal of the Platform Motion Blur Image Restoration System was to build a system that can capture a high signal-to-noise ratio (SNR), long-exposure image which is inherently blurred while at the same time capturing motion data using additional optical sensors in order to deblur the degraded images. Lastly, the objective of the Compressive Imager was to design and build a system functionally similar to the Single Pixel Camera and use it to test new sampling methods for image generation and to characterize it against a traditional camera. 


  7. Large fluctuations of the macroscopic current in diffusive systems: A numerical test of the additivity principle (United States)

    Hurtado, Pablo I.; Garrido, Pedro L.


    Most systems, when pushed out of equilibrium, respond by building up currents of locally conserved observables. Understanding how microscopic dynamics determines the averages and fluctuations of these currents is one of the main open problems in nonequilibrium statistical physics. The additivity principle is a theoretical proposal that allows to compute the current distribution in many one-dimensional nonequilibrium systems. Using simulations, we validate this conjecture in a simple and general model of energy transport, both in the presence of a temperature gradient and in canonical equilibrium. In particular, we show that the current distribution displays a Gaussian regime for small current fluctuations, as prescribed by the central limit theorem, and non-Gaussian (exponential) tails for large current deviations, obeying in all cases the Gallavotti-Cohen fluctuation theorem. In order to facilitate a given current fluctuation, the system adopts a well-defined temperature profile different from that of the steady state and in accordance with the additivity hypothesis predictions. System statistics during a large current fluctuation is independent of the sign of the current, which implies that the optimal profile (as well as higher-order profiles and spatial correlations) are invariant upon current inversion. We also demonstrate that finite-time joint fluctuations of the current and the profile are well described by the additivity functional. These results suggest the additivity hypothesis as a general and powerful tool to compute current distributions in many nonequilibrium systems.
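    The two ingredients named in the abstract can be stated compactly (schematic forms; the precise rate functions depend on the transport coefficients of the model at hand):

```latex
% Gallavotti-Cohen fluctuation symmetry for the time-averaged current q
% over a long time \tau, with E the conjugate field of the model:
\lim_{\tau\to\infty} \frac{1}{\tau}\,
  \ln \frac{P_\tau(q)}{P_\tau(-q)} = E\, q ,
% and the additivity principle, which expresses the current
% large-deviation function as an optimization over the temperature
% profile T(x) sustaining the fluctuation (D: diffusivity, \sigma: mobility):
\mathcal{G}(q) = \min_{T(x)} \int_0^1
  \frac{\bigl[\, q + D\bigl(T(x)\bigr)\, T'(x) \,\bigr]^2}
       {2\,\sigma\bigl(T(x)\bigr)}\, \mathrm{d}x .
```

    The invariance of the optimal profile under current inversion reported in the paper follows from the quadratic dependence on q in the additivity functional.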

  8. Mediterranea Forecasting System: a focus on wave-current coupling (United States)

    Clementi, Emanuela; Delrosso, Damiano; Pistoia, Jenny; Drudi, Massimiliano; Fratianni, Claudia; Grandi, Alessandro; Pinardi, Nadia; Oddo, Paolo; Tonani, Marina


    The Mediterranean Forecasting System (MFS) is a numerical ocean prediction system that produces analyses, reanalyses and short-term forecasts for the entire Mediterranean Sea and its adjacent Atlantic Ocean areas. MFS became operational in the late 1990s, has been developed and continuously improved in the framework of a series of EU- and nationally-funded programs, and is now part of the Copernicus Marine Service. The MFS is composed of the hydrodynamic model NEMO (Nucleus for European Modelling of the Ocean) two-way coupled with the third-generation wave model WW3 (WaveWatch III), implemented in the Mediterranean Sea at 1/16° horizontal resolution and forced by ECMWF atmospheric fields. The model solutions are corrected by the data assimilation system (a 3D variational scheme adapted to the oceanic assimilation problem) with a daily assimilation cycle, using a background error correlation matrix that varies seasonally and across sub-regions of the Mediterranean Sea. The focus of this work is to present the latest modelling system upgrades and the improvements they achieved. To evaluate the performance of the coupled system, a set of experiments was built by coupling the wave and circulation models, which exchange the following fields hourly: the sea surface currents and the air-sea temperature difference are transferred from NEMO to WW3, modifying respectively the mean momentum transfer of waves and the wind-speed stability parameter, while the neutral drag coefficient computed by WW3 is passed to NEMO, which computes the turbulent component. To validate the modelling system, numerical results have been compared with in-situ and remote sensing data. This work suggests that a coupled model may be capable of a better description of wave-current interactions; in particular, feedback from the ocean to the waves may improve the prediction of wave characteristics, and it suggests proceeding toward a fully
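    The hourly exchange enumerated above amounts to a simple coupling contract between the two models. The sketch below uses stand-in stub classes purely to make the data flow explicit; none of these names are NEMO or WW3 APIs, and the values are made up:

```python
# Schematic of the hourly two-way exchange (stub models, invented names).
class OceanModel:                       # stands in for NEMO
    def __init__(self):
        self.surface_currents = 0.3     # m/s, made-up value
        self.air_sea_dT = -1.5          # K, made-up value
        self.neutral_drag = None
    def step(self):                     # advance circulation one hour
        pass

class WaveModel:                        # stands in for WW3
    def __init__(self):
        self.currents = None
        self.stability = None
    def neutral_drag_coefficient(self):
        return 1.2e-3                   # made-up neutral Cd
    def step(self):                     # advance wave spectrum one hour
        pass

def exchange_hourly(ocean, waves):
    # ocean -> waves: surface currents modify the mean wave momentum
    # transfer; air-sea dT adjusts the wind-speed stability parameter
    waves.currents = ocean.surface_currents
    waves.stability = ocean.air_sea_dT
    # waves -> ocean: neutral drag coefficient for the turbulent fluxes
    ocean.neutral_drag = waves.neutral_drag_coefficient()
    ocean.step()
    waves.step()

ocean, waves = OceanModel(), WaveModel()
exchange_hourly(ocean, waves)
```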

  9. Computer performance optimization systems, applications, processes

    CERN Document Server

    Osterhage, Wolfgang W


    Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting p

  10. Computational modeling of shallow geothermal systems

    CERN Document Server

    Al-Khoury, Rafid


    A Step-by-step Guide to Developing Innovative Computational Tools for Shallow Geothermal Systems Geothermal heat is a viable source of energy and its environmental impact in terms of CO2 emissions is significantly lower than conventional fossil fuels. Shallow geothermal systems are increasingly utilized for heating and cooling of buildings and greenhouses. However, their utilization is inconsistent with the enormous amount of energy available underneath the surface of the earth. Projects of this nature are not getting the public support they deserve because of the uncertainties associated with

  11. Prestandardisation Activities for Computer Based Safety Systems

    DEFF Research Database (Denmark)

    Taylor, J. R.; Bologna, S.; Ehrenberger, W.


    Questions of technical safety are becoming more and more important. Due to the higher complexity of their functions, computer-based safety systems pose special problems. Researchers, producers, licensing personnel and customers have met on a European basis to exchange knowledge and formulate positions. The Commission of the European Community supports the work. Major topics comprise hardware configuration and self-supervision; software design, verification and testing; documentation; system specification; and concurrent processing. Preliminary results have been used for the draft of an IEC standard and for some...

  12. Tools for Embedded Computing Systems Software (United States)


    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  13. Current practice in software development for computational neuroscience and how to improve it. (United States)

    Gewaltig, Marc-Oliver; Cannon, Robert


    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  14. Current practice in software development for computational neuroscience and how to improve it.

    Directory of Open Access Journals (Sweden)

    Marc-Oliver Gewaltig


    Full Text Available Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  15. Current Mode Data Converters for Sensor Systems

    DEFF Research Database (Denmark)

    Jørgensen, Ivan Herald Holger

    This thesis is mainly concerned with data conversion. Especially data conversion using current mode signal processing is treated.A tutorial chapter introducing D/A conversion is presented. In this chapter the effects that cause static and dynamic nonlinearities are discussed along with methods to...

  16. DAQ System of Current Based on MNSR

    Institute of Scientific and Technical Information of China (English)


    The flux or power should be acquired using the detector during the operation of MNSR. As usual, the detector signal is a current, and it spans a very wide range, from 10^-11 to 10^-6 A. It is hard to satisfy the linearity requirement when amplifying this signal with a fixed gain
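    The record's point is that no single fixed gain can map 10^-11 to 10^-6 A onto a measurable voltage linearly. One common remedy is autoranging: pick a per-decade gain so the output stays near full scale. A minimal sketch with made-up gain values:

```python
# Autoranging sketch (illustrative gain ladder, not the MNSR DAQ's):
# choose the largest transimpedance gain (V/A) that keeps the output
# voltage within the ADC's 0-10 V full-scale window, so both 10 pA and
# 1 uA currents land in a linear, well-resolved range.
GAINS = [1e6, 1e7, 1e8, 1e9, 1e10, 1e11]    # V/A, one per decade

def pick_gain(current_a, v_full_scale=10.0):
    for g in reversed(GAINS):               # try the highest gain first
        if abs(current_a) * g <= v_full_scale:
            return g
    return GAINS[0]                         # out of range: lowest gain

# 2 nA would be invisible at the gain needed for 1 uA; autoranged, it
# lands at 2 V on the 1e9 V/A range.
```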

  17. Computer Assisted Surgery and Current Trends in Orthopaedics Research and Total Joint Replacements (United States)

    Amirouche, Farid


    Musculoskeletal research has brought about revolutionary changes in our ability to perform high-precision surgery in joint replacement procedures. Recent advances in computer-assisted surgery, as well as better materials, have led to reduced wear and have greatly enhanced patients' quality of life. New surgical techniques that reduce the size of the incision and the damage to underlying structures have been the primary advance toward this goal; these techniques are known as MIS, or Minimally Invasive Surgery. Total hip and knee arthroplasties are at an all-time high, reaching 1.2 million surgeries per year in the USA. Primary joint failures are usually due to osteoarthritis, rheumatoid arthritis, osteonecrosis and other inflammatory arthritis conditions. The methods for THR and TKA are critical to the initial stability and longevity of the prostheses. This research aims at understanding the fundamental mechanics of joint arthroplasty and provides insight into current challenges in patient-specific fitting, fixation, and stability. Both experimental and analytical work will be presented. We will examine the success of cementless total hip arthroplasty over the last 10 years and the role computer-assisted navigation is playing in follow-up studies. Cementless total hip arthroplasty attains permanent fixation by the ingrowth of bone into a porous-coated surface. Loosening of an ingrown total hip arthroplasty occurs as a result of osteolysis of the periprosthetic bone and degradation of the bone-prosthesis interface. The osteolytic process occurs as a result of polyethylene wear particles produced by the metal-polyethylene articulation of the prosthesis. The total hip arthroplasty is a congruent joint, and the submicron wear particles produced are phagocytized by macrophages, initiating an inflammatory cascade that produces cytokines ultimately implicated in osteolysis. The resulting bone loss on both the acetabular and femoral sides eventually leads to component instability.

  18. Computer-Assisted Photo Interpretation System (United States)

    Niedzwiadek, Harry A.


    A computer-assisted photo interpretation research (CAPIR) system has been developed at the U.S. Army Engineer Topographic Laboratories (ETL), Fort Belvoir, Virginia. The system is based around the APPS-IV analytical plotter, a photogrammetric restitution device that was designed and developed by Autometric specifically for interactive, computerized data collection activities involving high-resolution, stereo aerial photographs. The APPS-IV is ideally suited for feature analysis and feature extraction, the primary functions of a photo interpreter. The APPS-IV is interfaced with a minicomputer and a geographic information system called AUTOGIS. The AUTOGIS software provides the tools required to collect or update digital data using an APPS-IV, construct and maintain a geographic data base, and analyze or display the contents of the data base. Although the CAPIR system is fully functional at this time, considerable enhancements are planned for the future.

  19. Computational systems biology in cancer brain metastasis. (United States)

    Peng, Huiming; Tan, Hua; Zhao, Weiling; Jin, Guangxu; Sharma, Sambad; Xing, Fei; Watabe, Kounosuke; Zhou, Xiaobo


    Brain metastases occur in 20-40% of patients with advanced malignancies. A better understanding of the mechanism of this disease will help us to identify novel therapeutic strategies. In this review, we will discuss the systems biology approaches used in this area, including bioinformatics and mathematical modeling. Bioinformatics has been used for identifying the molecular mechanisms driving brain metastasis, and mathematical modeling methods for analyzing the dynamics of a system and predicting optimal therapeutic strategies. We will illustrate the strategies, procedures, and computational techniques used for studying systems biology in cancer brain metastases, and give examples of how to use a systems biology approach to analyze a complex disease. Some of the approaches used to identify relevant networks, pathways, and possible biomarkers in metastasis will be reviewed in detail. Finally, certain challenges and possible future directions in this area will also be discussed.

  20. A computer-aided continuous assessment system

    Directory of Open Access Journals (Sweden)

    B. C.H. Turton


    Full Text Available Universities within the United Kingdom have had to cope with a massive expansion in undergraduate student numbers over the last five years (Committee of Scottish University Principals, 1993; CVCP Briefing Note, 1994). In addition, there has been a move towards modularization and a closer monitoring of a student's progress throughout the year. Since the price/performance ratio of computer systems has continued to improve, Computer-Assisted Learning (CAL) has become an attractive option (Fry, 1990; Benford et al, 1994; Laurillard et al, 1994). To this end, the Universities Funding Council (UFC) has funded the Teaching and Learning Technology Programme (TLTP). However, universities also have a duty to assess as well as to teach. This paper describes a Computer-Aided Assessment (CAA) system capable of assisting in grading students and providing feedback. In this particular case, a continuously assessed course (Low-Level Languages) of over 100 students is considered. Typically, three man-days are required to mark one assessed piece of coursework from the students in this class. Any feedback on how the questions were dealt with by the student is of necessity brief. Most of the feedback is provided in a tutorial session that covers the pitfalls encountered by the majority of the students.
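    The core of such a CAA tool is mechanical: compare each student answer against a key and accumulate a score plus per-question feedback. A minimal illustrative sketch follows; the question ids, answers, and normalisation rule are invented for illustration and are not from the paper's actual system.

```python
# Hypothetical sketch of automatic grading with feedback, the task a
# computer-aided assessment (CAA) system automates.

def normalise(answer: str) -> str:
    """Lower-case and collapse whitespace so trivial formatting
    differences do not cost marks."""
    return " ".join(answer.lower().split())

def grade(submission: dict, answer_key: dict) -> tuple:
    """Return (score, feedback) for one student's submission.

    submission and answer_key map question ids to answer strings;
    feedback lists the questions answered incorrectly."""
    score, feedback = 0, []
    for qid, expected in answer_key.items():
        given = submission.get(qid, "")
        if normalise(given) == normalise(expected):
            score += 1
        else:
            feedback.append(f"{qid}: expected '{expected}', got '{given or '(blank)'}'")
    return score, feedback

# Invented example data for a low-level-languages question set.
key = {"q1": "MOV AX, 1", "q2": "stack pointer"}
student = {"q1": "mov ax, 1", "q2": "program counter"}
score, feedback = grade(student, key)
```

In practice a real marker would need tolerant matching (regular expressions, numeric tolerance, or execution of submitted code), but the score-plus-feedback loop above is the shape of the automation.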


    Directory of Open Access Journals (Sweden)

    Nesterov G. D.


    Full Text Available The work is devoted to the topical issue of increasing the productivity of computers and has an experimental character; a description of a number of the tests carried out and an analysis of their results is therefore offered. The article first provides the basic characteristics of the computer's modules in the regular mode of functioning. It then describes the technique for regulating their parameters in the course of the experiment. Special attention is paid to maintaining the necessary thermal regime in order to avoid an undesirable overheating of the central processor, and the operability of the system under increased energy consumption is checked. The most critical step is tuning the central processor: as a result of the tests, its optimum voltage, frequency, and memory-read delays are found. The stability of the characteristics of the RAM, in particular the state of its buses, is analyzed in the course of the experiment. Since the executed tests stayed within the standard range of the modules' characteristics, and therefore did not use the margin of safety and capacity built into the computer, further experiments were made under extreme overclocking with air cooling. The results obtained are also given in the article.

  2. Evaluating Computer Technology Integration in a Centralized School System (United States)

    Eteokleous, N.


    The study evaluated the current situation in Cyprus elementary classrooms regarding computer technology integration in an attempt to identify ways of expanding teachers' and students' experiences with computer technology. It examined how Cypriot elementary teachers use computers, and the factors that influence computer integration in their…

  3. Energy efficient hybrid computing systems using spin devices (United States)

    Sharad, Mrigank

    Emerging spin-devices like magnetic tunnel junctions (MTJs), spin-valves and domain wall magnets (DWM) have opened new avenues for spin-based logic design. This work explored potential computing applications which can exploit such devices for higher energy-efficiency and performance. The proposed applications involve hybrid design schemes, where charge-based devices supplement the spin-devices, to gain large benefits at the system level. As an example, lateral spin valves (LSV) involve switching of nanomagnets using spin-polarized current injection through a metallic channel such as Cu. Such spin-torque based devices possess several interesting properties that can be exploited for ultra-low power computation. The analog characteristics of spin currents facilitate non-Boolean computation like majority evaluation that can be used to model a neuron. The magneto-metallic neurons can operate at an ultra-low terminal voltage of ~20mV, thereby resulting in small computation power. Moreover, since nano-magnets inherently act as memory elements, these devices can facilitate integration of logic and memory in interesting ways. The spin-based neurons can be integrated with CMOS and other emerging devices, leading to different classes of neuromorphic/non-Von-Neumann architectures. The spin-based designs involve `mixed-mode' processing and hence can provide very compact and ultra-low energy solutions for complex computation blocks, both digital as well as analog. Such low-power, hybrid designs can be suitable for various data processing applications like cognitive computing, associative memory, and current-mode on-chip global interconnects. Simulation results for these applications based on a device-circuit co-simulation framework predict more than ~100x improvement in computation energy as compared to state-of-the-art CMOS design, for optimal spin-device parameters.
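    The majority-evaluation neuron mentioned above reduces, at the logic level, to thresholding a weighted sum of bipolar inputs. The sketch below models only that logic (the device physics is abstracted away entirely); the function name and weights are illustrative assumptions, not from the thesis.

```python
# Logic-level model of a majority-evaluation neuron: a spin-based neuron
# effectively outputs the sign of the summed (weighted) input currents.

def majority_neuron(inputs, weights=None):
    """Fire (+1) if the weighted sum of bipolar (+1/-1) inputs is
    positive, else -1. With unit weights this is a majority gate."""
    weights = weights or [1] * len(inputs)
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s > 0 else -1

# Three-input majority gate: two of three inputs high -> output high.
out = majority_neuron([+1, +1, -1])
```

With non-unit weights the same primitive acts as a simple threshold neuron, which is why majority evaluation can serve as a building block for the neuromorphic architectures the abstract describes.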

  4. Highly versatile computer-controlled television detector system (United States)

    Kalata, K.


    A description is presented of a television detector system which has been designed to accommodate a wide range of applications. It is currently being developed for use in X-ray diffraction, X-ray astrophysics, and electron microscopy, but it is also well suited for astronomical observations. The image can be integrated in a large, high-speed memory system, in the memory of a computer system, or on the target of the TV tube or CCD array. The detector system consists of a continuously scanned, intensified SIT vidicon with scan and processing electronics which generate a digital image that is integrated in the detector memory. Attention is given to details regarding the camera system, scan control and image processing electronics, the memory system, and aspects of detector performance.

  5. Software Safety Risk in Legacy Safety-Critical Computer Systems (United States)

    Hill, Janice L.; Baggs, Rhoda


    Safety standards contain technical and process-oriented safety requirements. Technical requirements are those such as "must work" and "must not work" functions in the system. Process-oriented requirements are software engineering and safety management process requirements. Some standards address the system perspective and some cover just the software in the system; NASA-STD-8719.13B, the Software Safety Standard, is the current standard of interest. NASA programs/projects will have their own set of safety requirements derived from the standard. Safety cases: a) a documented demonstration that a system complies with the specified safety requirements; b) evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]; c) problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.

  6. An automated computer misuse detection system for UNICOS

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K.A.; Neuman, M.C.; Simmonds, D.D.; Stallings, C.A.; Thompson, J.L.; Christoph, G.G.


    An effective method for detecting computer misuse is the automatic monitoring and analysis of on-line user activity. This activity is reflected in the system audit record, in the system vulnerability posture, and in other evidence found through active testing of the system. During the last several years we have implemented an automatic misuse detection system at Los Alamos. This is the Network Anomaly Detection and Intrusion Reporter (NADIR). We are currently expanding NADIR to include processing of the Cray UNICOS operating system. This new component is called the UNICOS Realtime NADIR, or UNICORN. UNICORN summarizes user activity and system configuration in statistical profiles. It compares these profiles to expert rules that define security policy and improper or suspicious behavior. It reports suspicious behavior to security auditors and provides tools to aid in follow-up investigations. The first phase of UNICORN development is nearing completion, and will be operational in late 1994.
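    The profile-versus-rules idea at the heart of a system like UNICORN can be sketched in a few lines: summarize activity as counts, then flag whatever exceeds a policy threshold. The rule names and limits below are invented for illustration and are in no way NADIR's actual security policy.

```python
# Hypothetical sketch: compare a statistical activity profile against
# expert rules encoding security policy, as a misuse detector does.

PROFILE_RULES = {
    "failed_logins": 5,       # more than 5 failed logins is suspicious
    "privileged_cmds": 10,    # heavy privileged-command use
    "offhour_sessions": 2,    # repeated sessions outside working hours
}

def audit(profile: dict, rules: dict = PROFILE_RULES) -> list:
    """Return the names of the rules this activity profile violates."""
    return [name for name, limit in rules.items()
            if profile.get(name, 0) > limit]

# One user's summarized session activity (invented numbers).
alerts = audit({"failed_logins": 9, "privileged_cmds": 3, "offhour_sessions": 4})
```

A production detector would build the profiles from audit records and attach context (user, host, time window) to each alert for follow-up investigation, but the comparison step has this shape.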

  7. Highlights of the GURI hydroelectric plant computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Dal Monte, R.; Banakar, H.; Hoffman, R.; Lebeau, M.; Schroeder, R.


    The GURI power plant on the Caroni river in Venezuela has 20 generating units with a total capacity of 10,000 MW, the largest currently operating in the world. The GURI Computer Control System (GCS) provides for comprehensive operation management of the entire power plant and the adjacent switchyards. This article describes some highlights of the functions of the state-of-the-art system. The topics considered include the operating modes of the remote terminal units (RTUs), automatic start/stop of generating units, RTU closed-loop control, automatic generation and voltage control, unit commitment, the operator training simulator, and maintenance management.

  8. Analysis of Sqp current systems by using corrected geomagnetic coordinates

    Institute of Scientific and Technical Information of China (English)


    The Spq equivalent current system of the quiet-day geomagnetic variation in the polar region is very complicated. It is composed of several currents, such as the ionospheric dynamo current and the auroral electrojet caused by the field-aligned current. Spq is asymmetric between the two polar regions. In this paper, the Spq current systems are analyzed in corrected geomagnetic (CGM) coordinates instead of the conventional geomagnetic (GM) coordinates, and the symmetries of the Spq currents in the different coordinate systems are compared. The causes of the Spq asymmetry in GM coordinates are then discussed, and the effects of each component of Spq are determined.

  9. Visual computing model for immune system and medical system. (United States)

    Gong, Tao; Cao, Xinxue; Xiong, Qin


    The natural immune system is an intelligent, self-organizing, and adaptive system, which has a variety of immune cells with different types of immune mechanisms. The mutual cooperation between the immune cells shows the intelligence of this immune system, and modeling it has important significance in medical science and engineering. In order to build a model of this immune system that is easier to understand through visualization than the traditional mathematical model, this paper proposes a visual computing model of the immune system and uses it to design a medical system. Visual simulations of the immune system were made to test the visual effect. The experimental results of the simulations show that the visual modeling approach can provide a more effective way of analyzing this immune system than traditional mathematical equations alone.

  10. The role of the computer in science fair projects: Current status and potential

    Energy Technology Data Exchange (ETDEWEB)

    Trainor, M.S.


    The need for more students to enter the field of science is acute in the nation, and science fair projects provide a motivational mechanism to entice students into pursuing scientific careers. Computers play a major role in science today. Because computers are a major source of entertainment for our children, one would expect them to play a significant role in many science fair projects. This study investigated current and potential uses of computers in science fair projects and incorporated an informal case study of scientists, teachers, and students involved in science fair projects from a highly scientific community. Interviews, a survey, and observations were conducted. Results indicated that most projects either do not use or inadequately use computers and that a significant potential for more effective use of computers for science fair projects exists.

  11. Visual computing scientific visualization and imaging systems

    CERN Document Server


    This volume aims to stimulate discussions on research involving the use of data and digital images as an understanding approach for analysis and visualization of phenomena and experiments. The emphasis is put not only on graphically representing data as a way of increasing its visual analysis, but also on the imaging systems which contribute greatly to the comprehension of real cases. Scientific Visualization and Imaging Systems encompass multidisciplinary areas, with applications in many knowledge fields such as Engineering, Medicine, Material Science, Physics, Geology, Geographic Information Systems, among others. This book is a selection of 13 revised and extended research papers presented in the International Conference on Advanced Computational Engineering and Experimenting -ACE-X conferences 2010 (Paris), 2011 (Algarve), 2012 (Istanbul) and 2013 (Madrid). The examples were particularly chosen from materials research, medical applications, general concepts applied in simulations and image analysis and ot...

  12. Epilepsy analytic system with cloud computing. (United States)

    Shen, Chia-Ping; Zhou, Weizhi; Lin, Feng-Seng; Sung, Hsiao-Ya; Lam, Yan-Yu; Chen, Wei; Lin, Jeng-Wei; Pan, Ming-Kai; Chiu, Ming-Jang; Lai, Feipei


    Biomedical data analytic systems have played an important role in clinical diagnosis for several decades, and analyzing these big data to provide decision support for physicians is an emerging research area. This paper presents a parallelized web-based tool with a cloud computing service architecture to analyze epilepsy. Several modern analytic functions are cascaded in the system: wavelet transform, genetic algorithm (GA), and support vector machine (SVM). To demonstrate the effectiveness of the system, it has been verified with two kinds of electroencephalography (EEG) data: short-term EEG and long-term EEG. The results reveal that our approach achieves a total classification accuracy higher than 90%. In addition, the training time is accelerated by a factor of about 4.66, and the prediction time also meets real-time requirements.
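    The paper's pipeline cascades wavelet features, GA feature selection, and an SVM. As a much simpler stand-in that shows the overall shape of such an EEG classification pipeline, the sketch below uses two crude hand-picked features and a nearest-centroid classifier; the features, centroids, and signals are all invented for illustration, not the paper's method.

```python
# Illustrative stand-in for an EEG classification pipeline:
# extract features from a signal, then classify in feature space.
import math

def features(signal):
    """Two crude features: mean absolute amplitude and mean absolute
    first difference (a rough proxy for high-frequency content)."""
    n = len(signal)
    amp = sum(abs(x) for x in signal) / n
    diff = sum(abs(b - a) for a, b in zip(signal, signal[1:])) / (n - 1)
    return (amp, diff)

def nearest_centroid(sample, centroids):
    """Return the label of the closest class centroid in feature space."""
    f = features(sample)
    return min(centroids, key=lambda lbl: math.dist(f, centroids[lbl]))

# Invented class centroids in (amplitude, difference) feature space.
centroids = {"normal": (0.5, 0.1), "seizure": (3.0, 1.5)}
label = nearest_centroid([2.8, -3.1, 3.0, -2.9, 3.2, -3.0], centroids)
```

A real system would replace the two hand-picked features with wavelet coefficients, let a GA select among them, and train an SVM on labeled recordings, but the extract-then-classify structure is the same.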

  13. Artificial Tutoring Systems: What Computers Can and Can't Know. (United States)

    Frick, Theodore W.


    Maccia's epistemology of intelligent natural systems implies that computer systems must develop qualitative intelligence before knowledge representation and natural language understanding can be achieved. Emotion and sensation--capabilities which computers do not currently possess--are vital to the growth of the mind (Stanley I. Greenspan).…

  14. Current dental adhesives systems. A narrative review. (United States)

    Milia, Egle; Cumbo, Enzo; Cardoso, Rielson Jose A; Gallina, Giuseppe


    Adhesive dentistry is based on the development of materials which establish an effective bond with the tooth tissues. In this context, adhesive systems have attracted considerable research interest in recent years. Successful adhesive bonding depends on the chemistry of the adhesive, on appropriate clinical handling of the material, and on knowledge of the morphological changes caused in dental tissue by different bonding procedures. This paper outlines the status of contemporary adhesive systems, with particular emphasis on the chemical characteristics and mode of interaction of the adhesives with enamel and dentinal tissues. Dental adhesives are used for several clinical applications, and they can be classified by clinical regimen into "etch-and-rinse adhesives" and "self-etch adhesives". Other important considerations concern the different anatomical characteristics of enamel and dentine involved in the bonding procedures, which also have implications for the technique used as well as for the quality of the bond. Etch-and-rinse adhesive systems generally perform better on enamel than self-etching systems, which may be more suitable for bonding to dentine. In order to avoid a possible loss of the restoration, secondary caries, or pulp damage due to bacterial penetration or to cytotoxic effects of eluted adhesive components, careful consideration of several factors is essential in selecting the suitable bonding procedure and adhesive system for the individual patient's situation.

  15. 10 CFR 35.457 - Therapy-related computer systems. (United States)


    10 CFR 35.457 Therapy-related computer systems. The licensee shall perform acceptance testing on the treatment planning system of therapy-related computer systems in accordance with published protocols accepted by...

  16. Computer-controlled stapling system for lung surgery. (United States)

    Gossot, Dominique; Nana, Albert


    Current disposable hand-actuated staplers may pose reliability problems, especially with respect to the measurement of tissue thickness. We have evaluated a newly developed stapler with computer-controlled placement of staples. The SurgAssist system (Power Medical Interventions, New Hope, PA) comprises a console that houses a computer, a remote control unit, a flexible shaft, and a cartridge. The remote control unit has two uses: (1) controlling the accurate placement of the cartridge by orienting the tip of the flexible shaft, and (2) controlling the closure of the stapler and the firing. Each cartridge contains a programmed electronic device that triggers the activation of the appropriate program in the main microprocessor. The compression level on the tissue is determined by the computer. The system was used in a consecutive series of 38 patients, 26 times during open lung surgery and 12 times during video-assisted thoracic surgery. The following open procedures were performed: three pneumonectomies, 15 lobectomies, three segmentectomies, and five wedge resections. The following video-assisted thoracic surgery procedures were performed: eight wedge resections and four bullectomies for pneumothorax. There was no stapling failure and no complication related to the use of the stapler. During video-assisted thoracic surgery, some ergonomic problems were encountered that will be overcome by redesign. The computer-controlled stapling system may significantly improve tissue approximation during open and video-assisted thoracic surgery.

  17. Knowledge and intelligent computing system in medicine. (United States)

    Pandey, Babita; Mishra, R B


    Knowledge-based systems (KBS) and intelligent computing systems have been used in medical planning, diagnosis, and treatment. KBS consists of rule-based reasoning (RBR), case-based reasoning (CBR), and model-based reasoning (MBR), whereas intelligent computing methods (ICM) encompass the genetic algorithm (GA), artificial neural network (ANN), fuzzy logic (FL), and others. The combinations of methods within KBS are CBR-RBR, CBR-MBR, and RBR-CBR-MBR, and the combinations within ICM are ANN-GA, fuzzy-ANN, fuzzy-GA, and fuzzy-ANN-GA. The combinations of methods across KBS and ICM are RBR-ANN, CBR-ANN, RBR-CBR-ANN, fuzzy-RBR, fuzzy-CBR, and fuzzy-CBR-ANN. In this paper, we have made a study of different singular and combined methods (185 in number) applicable to the medical domain from the mid-1970s to 2008. The study is presented in tabular form, showing the methods and their salient features, processes, and application areas in the medical domain (diagnosis, treatment, and planning). It is observed that most of the methods are used in medical diagnosis, very few are used for planning, and a moderate number in treatment. The study and its presentation in this context should be helpful for novice researchers in the area of medical expert systems.
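    Rule-based reasoning (RBR), the simplest of the KBS methods surveyed, amounts to forward chaining over if-then rules until no new conclusions fire. A tiny sketch follows; the rules and findings are invented examples, not drawn from any clinical source.

```python
# Minimal forward-chaining rule-based reasoning (RBR) sketch.
# Each rule: (set of premises, conclusion to add when all premises hold).

RULES = [
    ({"fever", "stiff_neck"}, "suspect_meningitis"),
    ({"suspect_meningitis"}, "order_lumbar_puncture"),
]

def forward_chain(findings: set, rules=RULES) -> set:
    """Repeatedly fire rules whose premises are satisfied until no new
    conclusion is added, returning the closure of the findings."""
    derived = set(findings)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

result = forward_chain({"fever", "stiff_neck"})
```

The hybrid combinations the survey catalogs (e.g., fuzzy-RBR or RBR-ANN) graft other machinery onto this loop, for instance replacing the crisp premise test with fuzzy membership degrees or letting a trained network propose the facts.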

  18. Systems, methods and computer-readable media for modeling cell performance fade of rechargeable electrochemical devices (United States)

    Gering, Kevin L


    A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware periodically samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics of the electrochemical cell. The computing system also develops a mechanistic level model of the electrochemical cell to determine performance fade characteristics of the cell, and analyzes the mechanistic level model to estimate performance fade characteristics over the aging of a similar electrochemical cell. The mechanistic level model uses first constant-current pulses applied to the electrochemical cell at a first aging period and at three or more current values bracketing a first exchange current density. The mechanistic level model is also based on second constant-current pulses applied to the electrochemical cell at a second aging period and at three or more current values bracketing the second exchange current density.
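    To see why pulses bracketing the exchange current density are useful, note that for small overpotentials the Butler-Volmer relation linearizes to i = i0 * (F / (R*T)) * eta, so i0 can be estimated from the slope of measured current-overpotential pairs. The sketch below illustrates only that textbook linearization on synthetic numbers; it is not the patent's mechanistic model, and all values are invented.

```python
# Hedged sketch: estimate exchange current density i0 from small
# constant-current pulses via the linearized Butler-Volmer relation.
import math

F, R = 96485.0, 8.314   # Faraday constant (C/mol), gas constant (J/mol/K)

def estimate_i0(currents, overpotentials, temperature):
    """Least-squares slope through the origin of i vs eta,
    converted to i0 via slope = i0 * F / (R * T)."""
    slope = (sum(i * e for i, e in zip(currents, overpotentials))
             / sum(e * e for e in overpotentials))
    return slope * R * temperature / F

# Synthetic pulse data generated with a known i0 of 2.0 A/m^2 at 298.15 K.
T, i0_true = 298.15, 2.0
etas = [0.002, 0.004, 0.006]                       # overpotentials (V)
currents = [i0_true * (F / (R * T)) * e for e in etas]
i0_est = estimate_i0(currents, etas, T)
```

Tracking how the estimated i0 drifts between the first and second aging periods is one way a fade characteristic could be quantified, which is the spirit of the pulses-at-two-aging-periods scheme the abstract describes.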

  19. An Applet-based Anonymous Distributed Computing System. (United States)

    Finkel, David; Wills, Craig E.; Ciaraldi, Michael J.; Amorin, Kevin; Covati, Adam; Lee, Michael


    Defines anonymous distributed computing systems and focuses on the specifics of a Java, applet-based approach for large-scale, anonymous, distributed computing on the Internet. Explains the possibility of a large number of computers participating in a single computation and describes a test of the functionality of the system. (Author/LRW)

  20. Final Report on the Automated Computer Science Education System. (United States)

    Danielson, R. L.; And Others

    At the University of Illinois at Urbana, a computer based curriculum called Automated Computer Science Education System (ACSES) has been developed to supplement instruction in introductory computer science courses or to assist individuals interested in acquiring a foundation in computer science through independent study. The system, which uses…

  1. Neural circuits as computational dynamical systems. (United States)

    Sussillo, David


    Many recent studies of neurons recorded from cortex reveal complex temporal dynamics. How such dynamics embody the computations that ultimately lead to behavior remains a mystery. Approaching this issue requires developing plausible hypotheses couched in terms of neural dynamics. A tool ideally suited to aid in this question is the recurrent neural network (RNN). RNNs straddle the fields of nonlinear dynamical systems and machine learning and have recently seen great advances in both theory and application. I summarize recent theoretical and technological advances and highlight an example of how RNNs helped to explain perplexing high-dimensional neurophysiological data in the prefrontal cortex.
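    The RNN referred to here is, at its simplest, a discrete-time dynamical system whose hidden state evolves as h_t = tanh(W_rec h_{t-1} + W_in x_t). A minimal plain-Python sketch of those dynamics follows; the weight values are arbitrary toy numbers, not a trained model.

```python
# Minimal vanilla RNN: the hidden state is a nonlinear dynamical system
# driven by the input sequence.
import math

def rnn_step(h, x, w_rec, w_in):
    """One update h -> tanh(W_rec h + W_in x) for a vector state h
    and a scalar input x."""
    return [math.tanh(sum(w_rec[i][j] * h[j] for j in range(len(h))) + w_in[i] * x)
            for i in range(len(h))]

def run(inputs, h0, w_rec, w_in):
    """Iterate the dynamics over an input sequence; return the state
    trajectory (including the initial state)."""
    traj, h = [h0], h0
    for x in inputs:
        h = rnn_step(h, x, w_rec, w_in)
        traj.append(h)
    return traj

w_rec = [[0.5, -0.3], [0.2, 0.4]]   # toy recurrent weights
w_in = [1.0, -1.0]                  # toy input weights
traj = run([1.0, 0.0, 0.0], [0.0, 0.0], w_rec, w_in)
```

In the studies the review summarizes, networks like this (far larger, and trained) are fit to a task, and the resulting state trajectories are analyzed with dynamical-systems tools to generate hypotheses about recorded neural population activity.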

  2. Controlling Energy Demand in Mobile Computing Systems

    CERN Document Server

    Ellis, Carla


    This lecture provides an introduction to the problem of managing the energy demand of mobile devices. Reducing energy consumption, primarily with the goal of extending the lifetime of battery-powered devices, has emerged as a fundamental challenge in mobile computing and wireless communication. The focus of this lecture is on a systems approach where software techniques exploit state-of-the-art architectural features rather than relying only upon advances in lower-power circuitry or the slow improvements in battery technology to solve the problem. Fortunately, there are many opportunities to i

  3. Current status of the TSensor systems roadmap

    NARCIS (Netherlands)

    Walsh, Steven Thomas; Bryzek, Janusz; Pisano, Albert P.


    We apply our work from the contemporary pharmaceutical industry to generate a third generation-style technology roadmap for TSensor Systems. First we identify drivers and consortia. We then identify relevant technology components, namely multiple root technologies, multiple unit cells, multiple crit


  6. Upgrading NASA/DOSE laser ranging system control computers (United States)

    Ricklefs, Randall L.; Cheek, Jack; Seery, Paul J.; Emenheiser, Kenneth S.; Hanrahan, William P., III; Mcgarry, Jan F.


    Laser ranging systems now managed by the NASA Dynamics of the Solid Earth (DOSE) program and operated by the Bendix Field Engineering Corporation, the University of Hawaii, and the University of Texas have produced a wealth of interdisciplinary scientific data over the last three decades. Despite upgrades to most of the ranging station subsystems, the control computers remain a mix of 1970s-vintage minicomputers. These encompass a wide range of vendors, operating systems, and languages, making hardware and software support increasingly difficult. Current technology allows replacement of the controller computers at a relatively low cost while maintaining excellent processing power and a friendly operating environment. The new controller systems are now being designed using IBM-PC-compatible 80486-based microcomputers, a real-time Unix operating system (LynxOS), and X-windows/Motif; IB and serial interfaces have been chosen. This design minimizes short- and long-term costs by relying on proven standards for both hardware and software components. Currently, the project is in the design and prototyping stage, with the first systems targeted for production in mid-1993.

  7. Computation and brain processes, with special reference to neuroendocrine systems. (United States)

    Toni, Roberto; Spaletta, Giulia; Casa, Claudia Della; Ravera, Simone; Sandri, Giorgio


    The development of neural networks and brain automata has made neuroscientists aware that the performance limits of these brain-like devices lie, at least in part, in their computational power. The computational basis of a standard cybernetic design, in fact, refers to that of a discrete and finite state machine or Turing Machine (TM). In contrast, it has been suggested that a number of human cerebral activities, from feedback controls up to mental processes, rely on a mixture of both finitary, digital-like and infinitary, continuous-like procedures. Therefore, the central nervous system (CNS) of man would exploit a form of computation going beyond that of a TM. This "non-conventional" computation has been called hybrid computation. Some basic structures for hybrid brain computation are believed to be the brain computational maps, in which both Turing-like (digital) computation and continuous (analog) forms of calculus might occur. The cerebral cortex and brain stem appear to be primary candidates for this processing. However, neuroendocrine structures like the hypothalamus are also believed to exhibit hybrid computational processes, and might give rise to computational maps. Current theories on neural activity, including wiring and volume transmission, neuronal group selection, and dynamic evolving models of brain automata, lend support to the existence of natural hybrid computation, stressing a cooperation between discrete and continuous forms of communication in the CNS. In addition, the recent advent of neuromorphic chips, like those used to restore activity in damaged retina and visual cortex, suggests that assuming a discrete-continuum polarity in designing biocompatible neural circuitries is crucial for their ensuing performance. In these bionic structures, in fact, a correspondence exists between the original anatomical architecture and the synthetic wiring of the chip, resulting in a correspondence between natural and cybernetic neural activity. Thus, chip "form

  8. Uniform physical theory of diffraction equivalent edge currents for implementation in general computer codes

    DEFF Research Database (Denmark)

    Johansen, Peter Meincke


    New uniform closed-form expressions for physical theory of diffraction equivalent edge currents are derived for truncated incremental wedge strips. In contrast to previously reported expressions, the new expressions are well-behaved for all directions of incidence and observation and take a finite value for zero strip length. Consequently, the new equivalent edge currents are, to the knowledge of the author, the first that are well-suited for implementation in general computer codes.

  9. The Spartan attitude control system - Ground support computer (United States)

    Schnurr, R. G., Jr.


    The Spartan Attitude Control System (ACS) contains a command and control computer. This computer is optimized for flight activities and contains very little human-interface hardware and software. The ground support computer system provides the technicians testing the Spartan ACS with a convenient command-oriented interface to the flight ACS computer. The system also decodes and time-tags data automatically sent out by the flight computer as key events occur. The duration and magnitude of all system maneuvers are also derived and displayed by this system. The ground support computer is also the primary ground support equipment for the flight sequencer, which controls all payload maneuvers and long-term program timing.

  10. Current status of dentin adhesive systems. (United States)

    Leinfelder, K F


    Undoubtedly, dentin bonding agents have undergone a major evolution during the last several years. The shear bond strength of composite resin to the surface of dentin is actually greater than the inherent strength of the dentin itself under well-controlled conditions. No longer must the clinician depend only upon bonding to enamel as the sole bonding mechanism. Bonding to both types of dental structure permits even better reinforcement of the tooth itself. Perhaps even more important than the high level of bonding exhibited by the current dentin adhesives is their ability to seal the dentin. So effective is this sealing capability that it is now possible to protect the pulpal tissue from microbial invasion through the dentinal tubules. Further, by enclosing the odontoblastic processes and preventing fluid flow, the potential for postoperative sensitivity is diminished considerably. In fact, so revolutionary is the concept of bonding that the procedures associated with the restoration of teeth have changed dramatically. Undoubtedly, far greater improvements can be anticipated in the future.

  11. Development of BSCCO persistent current system

    Energy Technology Data Exchange (ETDEWEB)

Joo, Jin Ho; Nah, Wan Soo; Kang, Hyung Koo; Yoo, Jung Hoon [Sungkyunkwan University, Seoul (Korea)]


We have developed a temperature-variable critical current measurement device for high-Tc superconducting wires. To this end, a vacuum shroud was designed and fabricated, and both signal lines and power lines into the vacuum shroud were installed on it. Secondly, the design procedures for the PCS were established for the high-Tc superconducting wires, based on electrical circuit analyses during energization. We have also evaluated mechanical properties such as hardness, strength, and elongation of sheath alloys made by adding Cu, Mg, Ti, Zr, and Ni to an Ag matrix using an induction melting furnace. Hardness and strength were observed to improve as the additive content increased from 0.05 to 0.2 at.%. Specifically, the increase in strength was relatively greater for alloys made with Mg, Cu, and Zr additions than for those made with Ni and Ti additions. On the other hand, elongation was significantly reduced for the former sheath alloy materials. (author). 12 refs., 13 figs., 4 tabs.

  12. NADIR: A Flexible Archiving System Current Development (United States)

    Knapic, C.; De Marco, M.; Smareglia, R.; Molinaro, M.


The New Archiving Distributed InfrastructuRe (NADIR) is under development at the Italian center for Astronomical Archives (IA2) to increase the performance of the current archival software tools at the data center. Traditional software packages usually offer simple and robust solutions for data archiving and distribution but are awkward to adapt and reuse in projects with different purposes. Evolution of the data in terms of data model, format, publication policy, versioning, and metadata content is the main threat to reuse. NADIR, built on stable and mature framework features, addresses these challenging issues. Its main characteristics are a configuration database; a multi-threading, multi-language environment (C++, Java, Python); special features that guarantee high scalability, modularity, robustness, and error tracking; and tools to monitor with confidence the status of each project at each archiving site. In this contribution, the development of the core components is presented, with comments on performance and innovative features (multicast and publisher-subscriber paradigms). NADIR is planned to be kept as simple as possible, with default configurations for every project, first of all for LBT and other IA2 projects.

  13. Engineering Control Systems and Computing in the 1990s


    Casti, J.L.


    The relationship between computing hardware/software and engineering control systems is projected into the next decade, and conjectures are made as to the areas of control and system theory that will most benefit from various types of computing advances.

  14. Computer Based Information Systems and the Middle Manager. (United States)

Why do some computer-based information systems succeed while others fail? It concludes with eleven recommended areas that middle management must ... understand in order to use computer-based information systems effectively. (Modified author abstract)


    Directory of Open Access Journals (Sweden)



Full Text Available Argumentation is nowadays seen both as a skill that people use in various aspects of their lives and as an educational technique that can support the transfer or creation of knowledge, thus aiding the development of other skills (e.g. communication or critical thinking) or attitudes. However, teaching argumentation and teaching with argumentation is still a rare practice, mostly due to the lack of resources such as time or expert human tutors specialized in argumentation. Intelligent computer systems (i.e. systems that implement an inner representation of particular knowledge and try to emulate the behavior of humans) could allow more people to understand the purpose, techniques and benefits of argumentation. The proposed paper investigates state-of-the-art concepts of computer-based argumentation used in education and develops a conceptual map showing benefits, limitations and relations between various concepts, focusing on the duality “learning to argue – arguing to learn”.

  16. Computational System For Rapid CFD Analysis In Engineering (United States)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.


    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  17. 3D computation of non-linear eddy currents: Variational method and superconducting cubic bulk (United States)

    Pardo, Enric; Kapolka, Milan


Computing the electric eddy currents in non-linear materials, such as superconductors, is not straightforward. The design of superconducting magnets and power applications requires electromagnetic computer modeling, which is in many cases a three-dimensional (3D) problem. Since 3D problems require long computing times, novel time-efficient modeling tools are highly desirable. This article presents a novel computational modeling method based on a variational principle. The self-programmed implementation uses an original minimization method that divides the sample into sectors. This speeds up the computations with no loss of accuracy, while enabling efficient parallelization. The method could also be applied to model transients in linear materials or networks of non-linear electrical elements. As an example, we analyze the magnetization currents of a cubic superconductor. This 3D situation remained unsolved, despite being often met in material characterization and bulk applications. We find that, below the penetration field and in part of the sample, current flux lines are not rectangular and bend significantly in the direction parallel to the applied field. In conclusion, the presented numerical method can time-efficiently solve fully 3D situations without loss of accuracy.

  18. DNA-enabled integrated molecular systems for computation and sensing. (United States)

    LaBoda, Craig; Duschl, Heather; Dwyer, Chris L


    CONSPECTUS: Nucleic acids have become powerful building blocks for creating supramolecular nanostructures with a variety of new and interesting behaviors. The predictable and guided folding of DNA, inspired by nature, allows designs to manipulate molecular-scale processes unlike any other material system. Thus, DNA can be co-opted for engineered and purposeful ends. This Account details a small portion of what can be engineered using DNA within the context of computer architectures and systems. Over a decade of work at the intersection of DNA nanotechnology and computer system design has shown several key elements and properties of how to harness the massive parallelism created by DNA self-assembly. This work is presented, naturally, from the bottom-up beginning with early work on strand sequence design for deterministic, finite DNA nanostructure synthesis. The key features of DNA nanostructures are explored, including how the use of small DNA motifs assembled in a hierarchical manner enables full-addressability of the final nanostructure, an important property for building dense and complicated systems. A full computer system also requires devices that are compatible with DNA self-assembly and cooperate at a higher level as circuits patterned over many, many replicated units. Described here is some work in this area investigating nanowire and nanoparticle devices, as well as chromophore-based circuits called resonance energy transfer (RET) logic. The former is an example of a new way to bring traditional silicon transistor technology to the nanoscale, which is increasingly problematic with current fabrication methods. RET logic, on the other hand, introduces a framework for optical computing at the molecular level. This Account also highlights several architectural system studies that demonstrate that even with low-level devices that are inferior to their silicon counterparts and a substrate that harbors abundant defects, self-assembled systems can still

  19. Feasibility Implementation of Voltage-Current Waveform Telemetry System in Power Delivery System (United States)

    Furukawa, Tatsuya; Akagi, Keita; Fukumoto, Hisao; Itoh, Hideaki; Wakuya, Hiroshi; Hirata, Kenji; Ohchi, Masashi

Electric power is indispensable to modern life. However, harmonic disturbance is a problem when harmonic power flows into electronic devices. Overcoming this problem and realizing a stable supply of electric power is an important issue. In this study, we have developed a voltage-current waveform telemetry system for the remote measurement of harmonics in power delivery lines. The system consists of sensors, preamplifiers, a single-board computer, and power collectors. Improvements are made to all of these components except the sensors. The power collector is a coil that can be placed around the same power line that we measure. We have designed the power collector by the finite element method (FEM) so that it can provide enough electricity for the computer to work properly. Thus, no power source other than the secondary rechargeable battery used for recovery is necessary at the measurement site. The preamplifier in the new system is a single-supply differential amplifier circuit, and the single-board computer has an inexpensive SH-3 CPU. Through experiments, we have confirmed that the power collector can provide sufficient electricity and that the new system can successfully measure the waveforms and the harmonics in power delivery systems.

  20. Computation of magnetic fields within source regions of ionospheric and magnetospheric currents

    DEFF Research Database (Denmark)

    Engels, U.; Olsen, Nils


A general method of computing the magnetic effect caused by a predetermined three-dimensional external current density is presented. It takes advantage of the representation of solenoidal vector fields in terms of toroidal and poloidal modes expressed by two independent series of spherical harmonics...

  1. Electrical safety in spinal cord stimulation: current density analysis by computer modeling

    NARCIS (Netherlands)

    Wesselink, W.A.; Holsheimer, J.


    The possibility of tissue damage in spinal cord stimulation was investigated in a computer modeling study. A decrease of the electrode area in monopolar stimulation resulted in an increase of the current density at the electrode surface. When comparing the modeling results with experimental data

  2. Harmonic Analysis of Currents and Voltages Obtained in the Result of Computational Experiment

    Directory of Open Access Journals (Sweden)

    I. V. Novash


Full Text Available The paper considers a methodology for performing a harmonic analysis of current and voltage numerical values obtained as the result of a computational experiment and saved in an external data file. The harmonic analysis has been carried out in the Mathcad mathematical package environment.
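The workflow the abstract describes, sampled waveform values read from a data file and decomposed into harmonics, can be sketched outside Mathcad with a discrete Fourier transform. The 50 Hz signal and sampling parameters below are illustrative, not the paper's data.

```python
import numpy as np

def harmonic_magnitudes(samples, sample_rate, fundamental_hz, n_harmonics=5):
    """Single-sided amplitudes of the fundamental and its harmonics
    for a uniformly sampled waveform (e.g. values read from a data file)."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    amps = 2.0 * np.abs(spectrum) / len(samples)
    return {k: amps[np.argmin(np.abs(freqs - k * fundamental_hz))]
            for k in range(1, n_harmonics + 1)}

# Illustrative 50 Hz current with a 20% third harmonic
fs = 10_000
t = np.arange(0, 0.2, 1 / fs)
i = 10.0 * np.sin(2 * np.pi * 50 * t) + 2.0 * np.sin(2 * np.pi * 150 * t)
mags = harmonic_magnitudes(i, fs, 50)   # mags[1] ≈ 10 A, mags[3] ≈ 2 A
```

Because the 0.2 s window holds an integer number of 50 Hz cycles, each harmonic lands on an exact FFT bin and no windowing is needed; real measured data would normally be windowed first.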

  3. Computer-Aided Prototyping Systems (CAPS) within the software acquisition process: a case study


    Ellis, Mary Kay


Approved for public release; distribution is unlimited. This thesis provides a case study which examines the benefits derived from the practice of computer-aided prototyping within the software acquisition process. An experimental prototyping system currently under research is the Computer Aided Prototyping System (CAPS), managed by the Computer Science Department of the Naval Postgraduate School, Monterey, California. This thesis determines the qualitative value which may be realized by ...


    Directory of Open Access Journals (Sweden)

    N. A. Gorban


Full Text Available The authors present the proceedings of the 2005 First International Society of Urological Pathology Consensus Conference and the basic provisions that distinguish the modified Gleason grading system from its original interpretation. In particular, Gleason grade 1 (or 1 + 1 = 2) should be abandoned when assessing needle biopsy specimens. Contrary to the recommendations of Gleason himself, the conference decided to apply stringent criteria for assigning Gleason grades 3 and 4. This is because these grades are of special prognostic value, so it is important to have clear criteria defining each Gleason grade. Notions such as secondary and tertiary Gleason patterns are considered, and detailed recommendations are given on the lesion extent sufficient to diagnose these components.

  5. Multiaxis, Lightweight, Computer-Controlled Exercise System (United States)

    Haynes, Leonard; Bachrach, Benjamin; Harvey, William


The multipurpose, multiaxial, isokinetic dynamometer (MMID) is a computer-controlled system of exercise machinery that can serve as a means for quantitatively assessing a subject's muscle coordination, range of motion, strength, and overall physical condition with respect to a wide variety of forces, motions, and exercise regimens. The MMID is easily reconfigurable and compactly stowable and, in comparison with prior computer-controlled exercise systems, it weighs less, costs less, and offers more capabilities. Whereas a typical prior isokinetic exercise machine is limited to operation in only one plane, the MMID can operate along any path. In addition, the MMID is not limited to the isokinetic (constant-speed) mode of operation. The MMID provides for control and/or measurement of position, force, and/or speed of exertion in as many as six degrees of freedom simultaneously; hence, it can accommodate more complex, more nearly natural combinations of motions and, in so doing, offers greater capabilities for physical conditioning and evaluation. The MMID (see figure) includes as many as eight active modules, each of which can be anchored to a floor, wall, ceiling, or other fixed object. A cable is paid out from a reel in each module to a bar or other suitable object that is gripped and manipulated by the subject. The reel is driven by a DC brushless motor or other suitable electric motor via a gear reduction unit. The motor can be made to function as either a driver or an electromagnetic brake, depending on the required nature of the interaction with the subject. The module includes a force sensor and a displacement sensor for real-time monitoring of the tension in and displacement of the cable, respectively. In response to commands from a control computer, the motor can be operated to generate a required tension in the cable, to displace the cable a required distance, or to reel the cable in or out at a required speed. The computer can be programmed, either locally or via

  6. 14 CFR 415.123 - Computing systems and software. (United States)


... 14 Aeronautics and Space (2010) ... Launch Vehicle From a Non-Federal Launch Site § 415.123 Computing systems and software. (a) An applicant's safety review document must describe all computing systems and software that perform a safety...

  7. Robot-assisted and computer-enhanced therapies for children with cerebral palsy: current state and clinical implementation. (United States)

    Meyer-Heim, Andreas; van Hedel, Hubertus J A


The field of pediatric neurorehabilitation has rapidly evolved with the introduction of technological advancements over recent years. Rehabilitation robotics and computer-assisted systems can complement conventional physiotherapy or occupational therapy. These systems appear promising, especially in children, where exciting and challenging virtual reality scenarios could increase the motivation to train intensely in a playful therapeutic environment. Despite promising experience and broad acceptance by patients and parents, only a few therapy systems have so far been evaluated in children, and well-designed randomized controlled studies in this field are still lacking. This narrative review aims to provide an overview of the robot-assisted and computer-based therapies available to date and their current level of evidence, and to share the authors' experience of the clinical implementation of these new technologies for children with cerebral palsy.

  8. A superconducting transformer system for high current cable testing. (United States)

    Godeke, A; Dietderich, D R; Joseph, J M; Lizarazo, J; Prestemon, S O; Miller, G; Weijers, H W


    This article describes the development of a direct-current (dc) superconducting transformer system for the high current test of superconducting cables. The transformer consists of a core-free 10,464 turn primary solenoid which is enclosed by a 6.5 turn secondary. The transformer is designed to deliver a 50 kA dc secondary current at a dc primary current of about 50 A. The secondary current is measured inductively using two toroidal-wound Rogowski coils. The Rogowski coil signal is digitally integrated, resulting in a voltage signal that is proportional to the secondary current. This voltage signal is used to control the secondary current using a feedback loop which automatically compensates for resistive losses in the splices to the superconducting cable samples that are connected to the secondary. The system has been commissioned up to 28 kA secondary current. The reproducibility in the secondary current measurement is better than 0.05% for the relevant current range up to 25 kA. The drift in the secondary current, which results from drift in the digital integrator, is estimated to be below 0.5 A/min. The system's performance is further demonstrated through a voltage-current measurement on a superconducting cable sample at 11 T background magnetic field. The superconducting transformer system enables fast, high resolution, economic, and safe tests of the critical current of superconducting cable samples.
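The inductive measurement chain described, a coil voltage proportional to dI/dt followed by digital integration to recover the secondary current, can be sketched as follows. The mutual inductance and current profile are hypothetical stand-ins, not the instrument's actual parameters.

```python
import numpy as np

M = 1e-6                 # hypothetical Rogowski-coil mutual inductance (V·s/A)
fs = 100_000             # sampling rate (Hz)
t = np.arange(0, 0.1, 1 / fs)

# Illustrative secondary current ramping toward 25 kA
i_true = 25_000.0 * (1 - np.exp(-t / 0.02))

# The coil outputs a voltage proportional to the current's rate of change
v_coil = M * np.gradient(i_true, t)

# Digital integration (trapezoidal rule) recovers the current waveform
i_est = np.concatenate(
    ([0.0], np.cumsum((v_coil[1:] + v_coil[:-1]) / 2.0) / fs)
) / M
```

In the real system the integrated signal closes a feedback loop that regulates the primary supply; this sketch only checks that integration reproduces the current to well within the quoted 0.05% reproducibility.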

  9. Comparative radiopacity of six current adhesive systems. (United States)

    de Moraes Porto, Isabel Cristina Celerino; Honório, Naira Cândido; Amorim, Dayse Annie Nicácio; de Melo Franco, Aurea Valéria; Penteado, Luiz Alexandre Moura; Parolia, Abhishek


The radiopacity of contemporary adhesive systems has been mentioned as an indication for replacement of restorations due to misinterpretation of radiographic images. This study aimed to evaluate the radiopacity of contemporary bonding agents and to compare their radiodensities with those of enamel and dentin. To measure the radiopacity, eight specimens were fabricated from Clearfil SE Bond (CF), Xeno V (XE), Adper SE Bond (ASE), Magic Bond (MB), Single Bond 2 (SB), Scotchbond Multipurpose (SM), and gutta-percha (positive control). The optical densities of enamel, dentin, the bonding agents, gutta-percha, and an aluminium (Al) step wedge were obtained from radiographic images using image analysis software. The radiographic density data were analyzed statistically by analysis of variance and Tukey's test (α = 0.05). Significant differences were found between ASE and all other groups tested, and between XE and CF. No statistical difference was observed between the radiodensity of 1 mm of Al and 1 mm of dentin, between 2 mm of Al and enamel, or between 5 mm of Al and gutta-percha. Five of the six adhesive resins had radiopacity values below that of dentin, whereas the radiopacity of ASE was greater than that of dentin but below that of enamel. This investigation demonstrates that only ASE presented a radiopacity within the values of dentin and enamel. The CF, XE, MB, SB, and SM adhesives are all radiolucent and require alterations to their composition to facilitate their detection in radiographic images.

  10. Intelligent Computer Vision System for Automated Classification (United States)

    Jordanov, Ivan; Georgieva, Antoniya


    In this paper we investigate an Intelligent Computer Vision System applied for recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (features number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: combination of feature generation techniques; application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and use of suitable NN design and learning method.
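The dimensionality-reduction step mentioned above (PCA preprocessing ahead of the neural-network classifier) can be sketched with an SVD. The feature matrix below is synthetic stand-in data, not the cork-tile features from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 12))      # synthetic: 60 tiles x 12 texture features
X = X - X.mean(axis=0)             # centre features before PCA

# PCA via SVD: keep the fewest components explaining >= 95% of the variance
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / np.sum(s**2)
k = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1
X_reduced = X @ Vt[:k].T           # lower-dimensional input for the classifier
```

The rows of `X_reduced` would then feed the NN training stage; the 95% threshold is an illustrative choice, not the paper's setting.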

  11. Computational dynamics of acoustically driven microsphere systems. (United States)

    Glosser, Connor; Piermarocchi, Carlo; Li, Jie; Dault, Dan; Shanker, B


    We propose a computational framework for the self-consistent dynamics of a microsphere system driven by a pulsed acoustic field in an ideal fluid. Our framework combines a molecular dynamics integrator describing the dynamics of the microsphere system with a time-dependent integral equation solver for the acoustic field that makes use of fields represented as surface expansions in spherical harmonic basis functions. The presented approach allows us to describe the interparticle interaction induced by the field as well as the dynamics of trapping in counter-propagating acoustic pulses. The integral equation formulation leads to equations of motion for the microspheres describing the effect of nondissipative drag forces. We show (1) that the field-induced interactions between the microspheres give rise to effective dipolar interactions, with effective dipoles defined by their velocities and (2) that the dominant effect of an ultrasound pulse through a cloud of microspheres gives rise mainly to a translation of the system, though we also observe both expansion and contraction of the cloud determined by the initial system geometry.

  12. Beam Current Measurement and Adjustment System on AMS

    Institute of Scientific and Technical Information of China (English)

WU Shao-yong; HE Ming; SU Sheng-yong; WANG Zhen-jun; JIANG Shan


The beam current measurement and adjustment system of the HI-13 tandem accelerator mass spectrometry detector system consists of a Faraday cup, a fluorescent target, and a series of adjustable vertical slits (Fig. 1). The old system's operation was very complicated and its transmission was low. A new system is installed for improvement. We put the adjustable vertical slit, Faraday cup.

  13. Digital data management for CAD/CAM technology. An update of current systems. (United States)

    Andreiotelli, M; Kamposiora, P; Papavasiliou, G


Computer-aided design/computer-aided manufacturing (CAD/CAM) technology continues to evolve rapidly in the dental community. This review article provides an overview of the operational components and methodologies used in some of the CAD/CAM systems. Future trends are also discussed. While these systems show great promise, the quality of performance varies among them. No single system currently acquires data directly in the oral cavity and produces restorations using all available materials. Further refinement of these CAD/CAM technologies may increase their capabilities, but further special training will be required for their effective use.

  14. Computing the Moore-Penrose Inverse of a Matrix with a Computer Algebra System (United States)

    Schmidt, Karsten


In this paper "Derive" functions are provided for the computation of the Moore-Penrose inverse of a matrix, as well as for solving systems of linear equations by means of the Moore-Penrose inverse. This makes it possible to compute the Moore-Penrose inverse easily with one of the most commonly used Computer Algebra Systems, and to have the blueprint…
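The same computation is available outside Derive; a minimal sketch with NumPy's pseudoinverse on a rank-deficient system (the matrix is an arbitrary example, not one from the paper):

```python
import numpy as np

# Rank-deficient system: A has rank 1, so no classical inverse exists
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

A_pinv = np.linalg.pinv(A)   # Moore-Penrose inverse (computed via SVD)
x = A_pinv @ b               # minimum-norm least-squares solution

# First Penrose condition: A @ A+ @ A == A
assert np.allclose(A @ A_pinv @ A, A)
```

Here every solution satisfies x1 + 2·x2 = 1, and the pseudoinverse picks the one of minimum Euclidean norm, x = (0.2, 0.4), which is exactly the behavior the article exploits for solving linear systems.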

  15. Taper Preparation Variability Compared to Current Taper Standards Using Computed Tomography

    Directory of Open Access Journals (Sweden)

    Richard Gergi


Full Text Available Introduction. The purpose of this study was to compare the taper variation in root canal preparations between Twisted Files and PathFiles-ProTaper .08 tapered rotary files against current standards. Methods. 60 root canals with severe angles of curvature (between 25° and 35°) and short radii (<10 mm) were selected. The canals were divided randomly into two groups of 30 each. After preparation with Twisted Files and PathFiles-ProTaper to size 25, taper .08, the diameter was measured using computed tomography (CT) at 1, 3, and 16 mm. Canal taper preparation was calculated at the apical third and at the middle-cervical third. Results. Both file systems fell within the ±.05 taper variability. All preparations demonstrated variability when compared to the nominal taper .08. In the apical third, mean taper differed significantly between TF and PathFiles-ProTaper (P < 0.0001; independent t-test), being significantly higher with PathFile-ProTaper. In the middle-cervical third, mean taper was significantly higher with TF (P = 0.015; independent t-test). Conclusion. Taper preparations of the investigated size 25, taper .08 files were favorable but different from the nominal taper.
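The taper values compared above are simply the change in canal diameter per millimetre of canal length between two CT measurement levels; a minimal sketch with illustrative diameters (not the study's measurements):

```python
def canal_taper(d_larger_mm, d_smaller_mm, distance_mm):
    """Taper = change in canal diameter per millimetre of canal length."""
    return (d_larger_mm - d_smaller_mm) / distance_mm

# Illustrative diameters measured 1 mm and 3 mm from the apex,
# i.e. 2 mm apart along the apical third
apical_taper = canal_taper(0.41, 0.25, distance_mm=2.0)   # 0.08, the nominal taper
```

A measured taper of 0.03 to 0.13 for these files would fall inside the ±.05 band around the nominal .08 that the study uses.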

  16. Design of BEPC Ⅱ bunch current monitor system

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lei; MA Hui-Zhou; YUE Jun-Hui; LEI Ge; CAO Jian-She; MA Li


BEPC Ⅱ is an electron-positron collider designed to run under multi-bunch, high-beam-current conditions. The accelerator consists of an electron ring, a positron ring and a linear injector. In order to achieve the target luminosity and implement equal-bunch-charge injection, the Bunch Current Monitor (BCM) system was built on BEPC Ⅱ. The BCM system consists of three parts: the front-end circuit, the bunch current acquisition system and the bucket selection system. The control software of the BCM is based on VxWorks and EPICS. With the help of the BCM system, the bunch current in each bucket can be monitored in the Central Control Room. The BEPC Ⅱ timing system can also use the bunch current database to decide which bucket needs refilling to implement "top-off" injection.

  17. Computation of reduced energy input current stimuli for neuron phase models. (United States)

    Anyalebechi, Jason; Koelling, Melinda E; Miller, Damon A


A regularly spiking neuron can be studied using a phase model. The effect of an input stimulus current on the phase time derivative is captured by a phase response curve. This paper adapts a technique that was previously applied to conductance-based models to discover optimal input stimulus currents for phase models. First, the neuron phase response θ(t) due to an input stimulus current i(t) is computed using a phase model. The resulting θ(t) is taken to be a reference phase r(t). Second, an optimal input stimulus current i*(t) is computed to minimize a weighted sum of the square-integral 'energy' of i*(t) and the tracking error between the reference phase r(t) and the phase response due to i*(t). The balance between the conflicting requirements of energy and tracking-error minimization is controlled by a single parameter. The generated optimal current i*(t) is then compared to the input current i(t) which was used to generate the reference phase r(t). This technique was applied to two neuron phase models; in each case, the current i*(t) generates a phase response similar to the reference phase r(t), and the optimal current i*(t) has a lower 'energy' than the square-integral of i(t). For constant i(t), the optimal current i*(t) need not be constant in time. In fact, i*(t) is large (possibly even larger than i(t)) in regions where the phase response curve indicates a stronger sensitivity to the input stimulus current, and smaller in regions of reduced sensitivity.
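The paper's exact procedure is not reproduced here, but its energy/tracking trade-off can be sketched under a simplifying assumption: if the phase response curve Z is evaluated along the reference phase, the discretized phase becomes linear in the current samples, and the weighted objective α·Σi² + Σ(θ − r)² has a closed-form minimizer. Z, ω, α and the reference construction below are all hypothetical.

```python
import numpy as np

omega = 2 * np.pi                       # hypothetical intrinsic frequency (rad/s)
Z = lambda th: 1.0 + 0.5 * np.sin(th)   # hypothetical phase response curve
alpha = 1e-3                            # energy vs. tracking-error weight

dt = 1e-3
t = np.arange(0.0, 1.0, dt)
n = len(t)

# Linearized phase model: theta = omega*t + A @ i, where
# A[j, k] = Z(omega*t_k) * dt for k <= j (a running integral of Z*i)
A = np.tril(np.ones((n, n))) * dt * Z(omega * t)[None, :]

# Reference phase generated by a constant current i0, so an input with
# zero tracking error is known to exist
i0 = 0.3
r = omega * t + A @ (i0 * np.ones(n))

# Minimize alpha*||i||^2 + ||A i - (r - omega*t)||^2 in closed form
y = r - omega * t
i_opt = np.linalg.solve(alpha * np.eye(n) + A.T @ A, A.T @ y)
```

Because the constant current i0 already tracks r exactly, the minimizer can only trade tracking error for energy, so its 'energy' never exceeds that of i0, mirroring the paper's observation that i*(t) reshapes the input toward phases of high PRC sensitivity.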

  18. Current opinion on computer-aided surgical navigation and robotics: role in the treatment of sports-related injuries. (United States)

    Musahl, Volker; Plakseychuk, Anton; Fu, Freddie H


Computer-assisted surgery (CAS) may allow surgeons to be more precise and minimally invasive, in addition to being an excellent research tool. Medical imaging, such as magnetic resonance and computed tomography, is not only an important diagnostic tool but also a necessary planning tool. In orthopaedic sports medicine, precision is needed when placing tunnels for soft tissue fixation of replacement grafts. Two types of CAS systems, passive and active, have been developed. Passive systems, or surgical navigation systems, provide the surgeon with additional information prior to and during the surgical procedure (in real time). Active systems can perform certain surgical steps autonomously. Both active and passive CAS systems are currently the subject of basic science and clinical investigations and are discussed and commented on in this article. In summary, passive navigation systems can provide additional information to the surgeon and can therefore lead to more precise tunnel placement. Active robotic technology appears accurate and feasible, with promising initial results from Europe. However, active and passive CAS can only be as precise as the surgeon who plans the procedure. Therefore, future studies must focus on integrating arthroscopy, 3-D image-enhanced computer navigation, and virtual kinematics, as well as on increasing precision in surgical techniques.

  19. A City Parking Integration System Combined with Cloud Computing Technologies and Smart Mobile Devices (United States)

    Yeh, Her-Tyan; Chen, Bing-Chang; Wang, Bo-Xun


    The current study applied cloud computing technology and smart mobile devices combined with a streaming server for parking lots to plan a city parking integration system. It is also equipped with a parking search system, parking navigation system, parking reservation service, and car retrieval service. With this system, users can quickly find…

  20. The fundamentals of computational intelligence system approach

    CERN Document Server

    Zgurovsky, Mikhail Z


This monograph is dedicated to the systematic presentation of the main trends, technologies and methods of computational intelligence (CI). The book pays great attention to an important novel CI technology: fuzzy logic (FL) systems and fuzzy neural networks (FNN). Different FNN, including a new class of FNN, cascade neo-fuzzy neural networks, are considered, and their training algorithms are described and analyzed. The applications of FNN to forecasting in macroeconomics and at stock markets are examined. The book presents the problem of portfolio optimization under uncertainty, a novel theory of fuzzy portfolio optimization free of the drawbacks of the classical Markowitz model, as well as an application to portfolio optimization at Ukrainian, Russian and American stock exchanges. The book also presents the problem of forecasting corporate bankruptcy risk under incomplete and fuzzy information, as well as new methods based on fuzzy set theory and fuzzy neural networks and results of their application for bankruptcy ris...

  1. Computational Modeling of Biological Systems From Molecules to Pathways

    CERN Document Server


    Computational modeling is emerging as a powerful new approach for studying and manipulating biological systems. Many diverse methods have been developed to model, visualize, and rationally alter these systems at various length scales, from atomic resolution to the level of cellular pathways. Processes taking place at larger time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. Computational Modeling of Biological Systems: From Molecules to Pathways provides an overview of established computational methods for the modeling of biologically and medically relevant systems. It is suitable for researchers and professionals working in the fields of biophysics, computational biology, systems biology, and molecular medicine.

  2. Computational intelligence approaches for pattern discovery in biological systems. (United States)

    Fogel, Gary B


    Biology, chemistry, and medicine face tremendous challenges caused by an overwhelming amount of data and the need for rapid interpretation. Computational intelligence (CI) approaches such as artificial neural networks, fuzzy systems, and evolutionary computation are being used with increasing frequency to contend with this problem, in light of noise, non-linearity, and temporal dynamics in the data. Such methods can be used to develop robust models of processes, either on their own or in combination with standard statistical approaches. This is especially true for database mining, where modeling is a key component of scientific understanding. This review provides an introduction to current CI methods and their application to biological problems, and concludes with a commentary on the anticipated impact of these approaches in bioinformatics.

  3. A computing system for LBB considerations

    Energy Technology Data Exchange (ETDEWEB)

    Ikonen, K.; Miettinen, J.; Raiko, H.; Keskinen, R.


    A computing system has been developed at VTT Energy for making efficient leak-before-break (LBB) evaluations of piping components. The system consists of fracture mechanics and leak rate analysis modules which are linked via an interactive user interface, LBBCAL. The system enables quick tentative analysis of standard geometric and loading situations by means of fracture mechanics estimation schemes such as the R6, FAD, EPRI J, Battelle, plastic limit load, and moment methods. Complex situations are handled with a separate in-house finite-element code, EPFM3D, which uses 20-noded isoparametric solid elements, automatic mesh generators, and advanced color graphics. Analytical formulas and numerical procedures are available for leak area evaluation. A novel contribution to leak rate analysis is the CRAFLO code, which is based on a nonequilibrium two-phase flow model with phase slip. Its predictions are essentially comparable with those of the well-known SQUIRT2 code; additionally, it provides outputs for temperature, pressure, and velocity distributions in the crack depth direction. An illustrative application to a circumferentially cracked elbow indicates, as expected, that a small margin relative to the saturation temperature of the coolant reduces the leak rate and is likely to influence the LBB implementation for intermediate-diameter (300 mm) primary circuit piping of BWR plants.

  4. Computer vision for driver assistance systems (United States)

    Handmann, Uwe; Kalinke, Thomas; Tzomakas, Christos; Werner, Martin; von Seelen, Werner


    Systems for automated image analysis are useful for a variety of tasks, and their importance is still increasing due to technological advances and growing social acceptance. Especially in the field of driver assistance systems, scientific progress has reached a level of high performance. Fully or partly autonomously guided vehicles, particularly for road-based traffic, pose high demands on the development of reliable algorithms due to the conditions imposed by natural environments. At the Institut für Neuroinformatik, methods for analyzing driving-relevant scenes by computer vision are developed in cooperation with several partners from the automobile industry. We introduce a system which extracts the important information from an image taken by a CCD camera installed at the rear-view mirror of a car. The approach consists of sequential and parallel sensor and information processing. Three main tasks, namely initial segmentation (object detection), object tracking, and object classification, are realized by integration in the sequential branch and by fusion in the parallel branch. The main benefit of this approach lies in the integrative coupling of different algorithms providing partly redundant information.

  5. Advances in Future Computer and Control Systems v.2

    CERN Document Server

    Lin, Sally; 2012 International Conference on Future Computer and Control Systems (FCCS2012)


    FCCS2012 is an integrated conference focusing on Future Computer and Control Systems. “Advances in Future Computer and Control Systems” presents the proceedings of the 2012 International Conference on Future Computer and Control Systems (FCCS2012), held April 21-22, 2012, in Changsha, China, including recent research results on Future Computer and Control Systems from researchers around the world.

  6. Advances in Future Computer and Control Systems v.1

    CERN Document Server

    Lin, Sally; 2012 International Conference on Future Computer and Control Systems (FCCS2012)


    FCCS2012 is an integrated conference focusing on Future Computer and Control Systems. “Advances in Future Computer and Control Systems” presents the proceedings of the 2012 International Conference on Future Computer and Control Systems (FCCS2012), held April 21-22, 2012, in Changsha, China, including recent research results on Future Computer and Control Systems from researchers around the world.

  7. Reachability computation for hybrid systems with Ariadne

    NARCIS (Netherlands)

    L. Benvenuti; D. Bresolin; A. Casagrande; P.J. Collins (Pieter); A. Ferrari; E. Mazzi; T. Villa; A. Sangiovanni-Vincentelli


    Ariadne is an in-progress open environment for designing algorithms that compute with hybrid automata. It relies on a rigorous computable analysis theory to represent geometric objects, in order to achieve provable approximation bounds along the computations. In this paper we discuss the

  8. Medical computing in the 1980s. Operating system and programming language issues. (United States)

    Greenes, R A


    Operating systems and programming languages differ widely in their suitability for particular applications. The diversity of medical computing needs demands a diversity of solutions. Compounding this diversity is the decentralization caused by the evolution of local computing systems for local needs. Relevant current trends in computing include increased emphasis on decentralization, growing capabilities for interconnection of diverse systems, and development of common database and file server capabilities. In addition, standardization and hardware independence of operating systems, as well as programming languages and the development of programmerless systems, continue to widen application opportunities.

  9. Coronary computed tomography angiography: overview of technical aspects, current concepts, and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Chartrand-Lefebvre, C.; Cadrin-Chenevert, A. [Univ. of Montreal Medical Centre, Radiology Dept., Montreal, Quebec (Canada)]. E-mail:; Bordeleau, E. [Univ. of Montreal Medical Centre, Radiology Dept., Montreal, Quebec (Canada); Hopital Laval, St. Foy, Quebec (Canada); Ugolini, P.; Ouellet, R. [Montreal Inst. of Cardiology, Montreal, Quebec (Canada); Sablayrolles, J.-L. [Centre Cardiologique du Nord, Paris (France); Prenovault, J. [Univ. of Montreal Medical Centre, Radiology Dept., Montreal, Quebec (Canada)


    Multidetector-row electrocardiogram (ECG)-gated cardiac computed tomography (CT) will probably be a major noninvasive imaging option in the near future. Recent developments indicate that this new technology is improving rapidly. This article presents an overview of the current concepts, perspectives, and technical capabilities in coronary CT angiography (CTA). We have reviewed the recent literature on the different applications of this technology; of particular note are the many studies that have demonstrated the high negative predictive value (NPV) of coronary CTA, when performed under optimal conditions, for significant stenoses in native coronary arteries. This technology's level of performance allows it to be used to evaluate the presence of calcified plaques, coronary bypass graft patency, and the origin and course of congenital coronary anomalies. Despite a high NPV, the robustness of the technology is limited by arrhythmias, the requirement of low heart rates, and calcium-related artifacts. Some improvements are needed in the imaging of coronary stents, especially smaller stents, and in the detection and characterization of noncalcified plaques. Further studies are needed to determine more precisely the role of CTA in various symptomatic and asymptomatic patient groups. Clinical testing of 64-slice scanners has recently begun. As the technology improves, so does the spatial and temporal resolution. To date, this is being achieved through the development of systems with an increased number of detectors and shorter gantry rotation times, systems equipped with 2 X-ray tubes, and the eventual development of flat-panel technology. Thus further improvement of image quality is expected. (author)

  10. 2002 Computing and Interdisciplinary Systems Office Review and Planning Meeting (United States)

    Lytle, John; Follen, Gregory; Lopez, Isaac; Veres, Joseph; Lavelle, Thomas; Sehra, Arun; Freeh, Josh; Hah, Chunill


    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with NASA Glenn's Propulsion program, NASA Ames, industry, academia, and other government agencies. Large-scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process, before a commitment is made to a specific design. This year's review meeting describes the current status of the NPSS and the Object Oriented Development Kit, with specific emphasis on the progress made over the past year on air-breathing propulsion applications for aeronautics and space transportation. Major accomplishments include the first 3-D simulation of the primary flow path of a large turbofan engine in less than 15 hours, and the formal release of NPSS Version 1.5, which includes elements of rocket engine systems and a visual-based syntax layer. NPSS and the Development Kit are managed by the Computing and Interdisciplinary Systems Office (CISO) at the NASA Glenn Research Center and financially supported in fiscal year 2002 by the Computing, Networking and Information Systems (CNIS) project managed at NASA Ames, the Glenn Aerospace Propulsion and Power Program, and the Advanced Space Transportation Program.

  11. A data management system for engineering and scientific computing (United States)

    Elliot, L.; Kunii, H. S.; Browne, J. C.


    Data elements and relationship definition capabilities for this data management system are explicitly tailored to the needs of engineering and scientific computing. System design was based upon studies of data management problems currently being handled through explicit programming. The system-defined data element types include real scalar numbers, vectors, arrays and special classes of arrays such as sparse arrays and triangular arrays. The data model is hierarchical (tree structured). Multiple views of data are provided at two levels. Subschemas provide multiple structural views of the total data base and multiple mappings for individual record types are supported through the use of a REDEFINES capability. The data definition language and the data manipulation language are designed as extensions to FORTRAN. Examples of the coding of real problems taken from existing practice in the data definition language and the data manipulation language are given.
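A minimal sketch of the kind of tree-structured (hierarchical) data model with science-oriented element types that the record describes. All names, types, and the `subschema` view function are illustrative assumptions, not the original system's data definition language.

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    kind: str          # e.g. "scalar", "vector", "array", "sparse", "triangular"
    value: object = None

@dataclass
class Record:
    name: str
    elements: dict = field(default_factory=dict)
    children: list = field(default_factory=list)   # hierarchical (tree) structure

    def add(self, elem: Element):
        self.elements[elem.name] = elem

def subschema(record: Record, names: set) -> Record:
    """A structural view exposing only a subset of the tree (cf. subschemas)."""
    view = Record(record.name)
    view.elements = {k: v for k, v in record.elements.items() if k in names}
    view.children = [subschema(c, names) for c in record.children]
    return view

mesh = Record("mesh")
mesh.add(Element("n_nodes", "scalar", 4))
mesh.add(Element("stiffness", "triangular", [1.0, 0.5, 2.0]))  # packed lower triangle
view = subschema(mesh, {"n_nodes"})
print(sorted(view.elements))   # only the selected element is visible
```

The subschema mechanism mirrors the record's "multiple structural views of the total data base": the underlying tree is stored once, and each view filters what a given program sees.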

  12. Systems approaches to computational modeling of the oral microbiome

    Directory of Open Access Journals (Sweden)

    Dimiter V. Dimitrov


    Full Text Available Current microbiome research has generated tremendous amounts of data providing snapshots of molecular activity in a variety of organisms, environments, and cell types. However, turning this knowledge into a whole-system-level understanding of pathways and processes has proven to be a challenging task. In this review we highlight the applicability of bioinformatics and visualization techniques to large collections of data in order to better understand the information contained in diet – oral microbiome – host mucosal transcriptome interactions. In particular we focus on the systems biology of Porphyromonas gingivalis in the context of high-throughput computational methods tightly integrated with translational systems medicine. These approaches have applications both for basic research, where we can direct specific laboratory experiments in model organisms and cell cultures, and for human disease, where we can validate new mechanisms and biomarkers for the prevention and treatment of chronic disorders.

  13. Genost: A System for Introductory Computer Science Education with a Focus on Computational Thinking (United States)

    Walliman, Garret

    Computational thinking, the creative thought process behind algorithmic design and programming, is a crucial introductory skill for both computer scientists and the population in general. In this thesis I perform an investigation into introductory computer science education in the United States and find that computational thinking is not effectively taught at either the high school or the college level. To remedy this, I present a new educational system intended to teach computational thinking called Genost. Genost consists of a software tool and a curriculum based on teaching computational thinking through fundamental programming structures and algorithm design. Genost's software design is informed by a review of eight major computer science educational software systems. Genost's curriculum is informed by a review of major literature on computational thinking. In two educational tests of Genost utilizing both college and high school students, Genost was shown to significantly increase computational thinking ability with a large effect size.

  14. A computer control system using a virtual keyboard (United States)

    Ejbali, Ridha; Zaied, Mourad; Ben Amar, Chokri


    This work is in the field of human-computer communication, namely gestural communication. The objective was to develop a system for gesture recognition. This system is used to control a computer without a keyboard. The idea consists in using a visual panel printed on ordinary paper to communicate with the computer.

  15. 10 CFR 35.657 - Therapy-related computer systems. (United States)


    ... 10 Energy 1 2010-01-01 2010-01-01 false Therapy-related computer systems. 35.657 Section 35.657... Units, Teletherapy Units, and Gamma Stereotactic Radiosurgery Units § 35.657 Therapy-related computer... computer systems in accordance with published protocols accepted by nationally recognized bodies. At...

  16. 17 CFR 270.2a-4 - Definition of “current net asset value” for use in computing periodically the current price of... (United States)


    ... 17 Commodity and Securities Exchanges 3 2010-04-01 2010-04-01 false Definition of “current net asset value” for use in computing periodically the current price of redeemable security. 270.2a-4... AND REGULATIONS, INVESTMENT COMPANY ACT OF 1940 § 270.2a-4 Definition of “current net asset value” for...

  17. Factory automation management computer system and its applications. FA kanri computer system no tekiyo jirei

    Energy Technology Data Exchange (ETDEWEB)

    Maeda, M. (Meidensha Corp., Tokyo (Japan))


    A plurality of NC composite lathes used in a breaker manufacturing and processing line were integrated into a system mainly comprising the industrial computer μPORT, a dedicated LAN, and material handling robots. This paper describes this flexible manufacturing system (FMS), which operates on an unmanned basis from process control to material distribution and processing. The system has achieved the following results: efficiency improvement in lines producing a great variety of products in small quantities and in mixed-flow production lines; enhancement of facility operating rates by means of group management of NC machine tools; orientation toward development into integrated production systems; expansion of processing capacity; reduction in the number of processes; and reduction in management and indirect manpower. The system allocates the production control plans transmitted from the production control system operated by a host computer to the processes on a daily basis and by machine, using the μPORT. This FMS exploits the multi-task processing function of the μPORT and its ultra-high-speed real-time BASIC. The system simultaneously processes process management (machining programs and processing results), processing data management, and the operation control of a plurality of machines, achieving systematized machining processes. 6 figs., 2 tabs.

  18. Proceedings: Workshop on Advanced Mathematics and Computer Science for Power Systems Analysis

    Energy Technology Data Exchange (ETDEWEB)



    EPRI's Office of Exploratory Research sponsors a series of workshops that explore how to apply recent advances in mathematics and computer science to the problems of the electric utility industry. In this workshop, participants identified research objectives that may significantly improve the mathematical methods and computer architecture currently used for power system analysis.

  19. Distributed computing system with dual independent communications paths between computers and employing split tokens (United States)

    Rasmussen, Robert D. (Inventor); Manning, Robert M. (Inventor); Lewis, Blair F. (Inventor); Bolotin, Gary S. (Inventor); Ward, Richard S. (Inventor)


    This is a distributed computing system providing flexible fault tolerance; ease of software design and concurrency specification; and dynamic load balancing. The system comprises a plurality of computers, each having a first input/output interface and a second input/output interface for interfacing to communications networks, each second input/output interface including a bypass for bypassing the associated computer. A global communications network interconnects the first input/output interfaces, giving each computer the ability to broadcast messages simultaneously to the remainder of the computers. A meshwork communications network interconnects the second input/output interfaces, giving each computer the ability to establish a communications link with another of the computers while bypassing the remainder. Each computer is controlled by a resident copy of a common operating system. Communication between computers is by means of split tokens, each having a moving first portion which is sent from computer to computer and a resident second portion which is disposed in the memory of at least one of the computers; the location of the second portion is part of the first portion. The split tokens represent both functions to be executed by the computers and data to be employed in the execution of those functions. The first input/output interfaces each include logic for detecting a collision between messages and for terminating the broadcasting of a message, whereby collisions between messages are detected and avoided.
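The split-token idea can be sketched in a few lines: a small moving portion travels between computers carrying the function to execute plus the location of a resident portion held in some node's memory. This is an illustrative toy model of that data structure, not the patented implementation; all names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ResidentPortion:
    data: list                      # data employed in executing the function

@dataclass
class MovingPortion:
    function: str                   # function to be executed
    home_node: int                  # which computer holds the resident portion
    key: str                        # where in that node's memory it lives

# each node's memory maps keys to resident portions
memory = {0: {"job-42": ResidentPortion(data=[3, 4, 5])}}

def execute(token: MovingPortion):
    """Follow the pointer carried by the moving portion, then run the function."""
    resident = memory[token.home_node][token.key]
    if token.function == "sum":
        return sum(resident.data)

token = MovingPortion(function="sum", home_node=0, key="job-42")
print(execute(token))   # 12
```

The key property the sketch preserves is that only the small first portion moves over the network, while the bulky second portion stays resident and is addressed indirectly.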

  20. New computing systems, future computing environment, and their implications on structural analysis and design (United States)

    Noor, Ahmed K.; Housner, Jerrold M.


    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  1. Evaluation of automatic exposure control systems in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Reina, Thamiris Rosado


    The development of computed tomography (CT) technology has brought wider possibilities to diagnostic medicine. It is a non-invasive method to see the human body in detail. As CT use increases, so does concern about patient dose, because of the higher dose levels imparted compared with other diagnostic imaging modalities. The radiology community (radiologists, medical physicists, and manufacturers) is working together to find the lowest dose level possible without compromising diagnostic image quality. The greatest and relatively new advance toward lowering patient dose is the automatic exposure control (AEC) system in CT. These systems are designed to weight the dose distribution along the patient scan and between patients, taking into account their sizes and irradiated tissue densities. Because of the CT scanning geometry, AEC systems are very complex and their functioning is not yet fully understood. This work aims to evaluate the clinical performance of AEC systems and their susceptibilities, to assist in possible patient dose optimizations. The approach used to evaluate the AEC systems of three of the leading CT manufacturers in Brazil (General Electric, Philips, and Toshiba) was the extraction of tube current modulation data from the DICOM standard image sequences, measurement and analysis of the image noise of those image sequences, and measurement of the dose distribution along the scan length on the surface and inside of two different phantom configurations. The tube current modulation of each CT scanner, associated with the resulting image quality, characterizes the performance of the AEC system. The dose distribution measurements provide the dose profile due to the tube current modulation. Dose measurements with the AEC system ON and OFF were made to quantify the impact of these systems on patient dose. The results attained give rise to optimizations in AEC system use and, by consequence, decreases the patient dose without
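A back-of-the-envelope sketch of why AEC modulates tube current: to keep image noise roughly constant, the detected signal must stay constant, so the tube current (mA) must rise roughly exponentially with patient attenuation. The attenuation coefficient and reference values below are illustrative assumptions, not any vendor's algorithm.

```python
import math

MU_WATER = 0.02   # assumed effective attenuation coefficient, 1/mm

def modulated_mA(reference_mA: float, reference_diameter_mm: float,
                 patient_diameter_mm: float) -> float:
    """mA needed so the detector sees the same signal as at the reference size."""
    extra = patient_diameter_mm - reference_diameter_mm
    return reference_mA * math.exp(MU_WATER * extra)

# a patient 50 mm wider than the reference needs about e^1 ~ 2.7x the current
print(round(modulated_mA(100.0, 300.0, 350.0), 1))   # 271.8
```

This simple scaling is why the extracted tube current modulation curves track patient cross-section so closely along the scan length.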

  2. User-Computer Interface Technology: An Assessment of the Current State of the Art (United States)


    a model of the current state of the world. A similar effort is underway at the Computing Research Laboratory at New Mexico State University. One…process by validating or modifying requirements. Rapid prototyping techniques such as storyboarding offer relatively low-cost approaches to bringing the…Research Laboratory, New Mexico State University. Bannon, Liam J. (1986). Issues in design: some notes. In D. A. Norman and S. W. Draper (Eds.), User

  3. Applications of membrane computing in systems and synthetic biology

    CERN Document Server

    Gheorghe, Marian; Pérez-Jiménez, Mario


    Membrane Computing was introduced as a computational paradigm in Natural Computing. The models introduced, called Membrane (or P) Systems, provide a coherent platform to describe and study living cells as computational systems. Membrane Systems have been investigated for their computational aspects and employed to model problems in other fields, such as computer science, linguistics, biology, economics, computer graphics, and robotics. Their inherent parallelism, heterogeneity, and intrinsic versatility allow them to model a broad range of processes and phenomena, making them also an efficient means to solve and analyze problems in a novel way. Membrane Computing has been used to model biological systems, becoming over time a thorough modeling paradigm comparable, in its modeling and predicting capabilities, to more established models in this area. This book is the result of the need to collect, in an organic way, different facets of this paradigm. The chapters of this book, together with the web pages accompanying th...

  4. Brake Performance Analysis of ABS for Eddy Current and Electrohydraulic Hybrid Brake System

    Directory of Open Access Journals (Sweden)

    Ren He


    Full Text Available This paper introduces an eddy current and electro-hydraulic hybrid brake system to address problems such as wear, thermal failure, and slow response in traditional vehicle brake systems. A mathematical model was built to calculate the torque of the eddy current brake system and the hydraulic brake system and to analyze the braking force distribution between the two. A fuzzy controller based on LabVIEW and Matlab, running on a personal computer, was designed, and a hardware-in-the-loop rig was constructed to validate and analyze the performance of the hybrid brake system. In extensive experiments on dry and wet asphalt roads, the hybrid brake system performed well on the experimental bench: it reduces abrasion and the temperature of the brake disk, response speed is clearly enhanced, the fuzzy controller maintains a high utilization coefficient thanks to optimal slip ratio regulation, and the total brake time decreases slightly compared with a traditional hydraulic brake system.
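The slip-ratio regulation at the heart of such a controller can be sketched in a few lines. The proportional rule below is a crude stand-in for the paper's fuzzy rule base, and the optimal slip value and gain are illustrative assumptions.

```python
OPTIMAL_SLIP = 0.2   # assumed optimal slip ratio on dry asphalt

def slip_ratio(vehicle_speed: float, wheel_speed: float) -> float:
    """Fraction by which the wheel lags the vehicle; 1.0 means a locked wheel."""
    return (vehicle_speed - wheel_speed) / vehicle_speed

def adjust_torque(torque: float, slip: float, gain: float = 500.0) -> float:
    """Stand-in for the fuzzy rule base: too much slip releases brake torque,
    too little slip applies more."""
    return max(0.0, torque - gain * (slip - OPTIMAL_SLIP))

torque = 800.0
slip = slip_ratio(20.0, 14.0)          # 0.3: wheel tending toward lock-up
torque = adjust_torque(torque, slip)   # controller backs off the torque
print(round(torque, 1))                # 750.0
```

A real fuzzy controller replaces the single proportional rule with membership functions over slip error and its rate of change, but the regulation target, holding slip near the friction peak, is the same.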


    Directory of Open Access Journals (Sweden)



    Full Text Available The article presents a computer algorithm for estimating the operating costs of a rail bus. This computer application compares the cost of using a locomotive and wagons, the cost of using locomotives, and the cost of using a rail bus. Intensive growth in passenger railway traffic has increased the demand for modern computer systems to manage means of transportation. The described application operates on the basis of selected operating parameters of rail buses.

  6. Computers as Components Principles of Embedded Computing System Design

    CERN Document Server

    Wolf, Wayne


    This book was the first to bring essential knowledge on embedded systems technology and techniques under a single cover. This second edition has been updated to the state of the art by reworking and expanding performance analysis with more examples and exercises, and coverage of electronic systems now focuses on the latest applications. Researchers, students, and savvy professionals schooled in hardware or software design will value Wayne Wolf's integrated engineering design approach. The second edition gives a more comprehensive view of multiprocessors, including VLIW and superscalar archite

  7. Using computer graphics to enhance astronaut and systems safety (United States)

    Brown, J. W.


    Computer graphics is being employed at the NASA Johnson Space Center as a tool to perform rapid, efficient, and economical analyses for man-machine integration, flight operations development, and systems engineering. The Operator Station Design System (OSDS), a computer-based facility featuring a highly flexible and versatile interactive software package, PLAID, is described. This unique evaluation tool, with its expanding database of Space Shuttle elements, various payloads, experiments, crew equipment, and man models, supports a multitude of technical evaluations, including spacecraft and workstation layout, definition of astronaut visual access, flight techniques development, cargo integration, and crew training. As OSDS is applied to the Space Shuttle, Orbiter payloads (including the European Space Agency's Spacelab), and future space vehicles and stations, astronaut and systems safety are being enhanced. Typical OSDS examples are presented. By performing physical and operational evaluations during early conceptual phases, supporting systems verification for flight readiness, and applying its capabilities to real-time mission support, the OSDS provides the wherewithal to satisfy a growing need of current and future space programs for efficient, economical analyses.

  8. An operating system for future aerospace vehicle computer systems (United States)

    Foudriat, E. C.; Berman, W. J.; Will, R. W.; Bynum, W. L.


    The requirements for future aerospace vehicle computer operating systems are examined in this paper. The computer architecture is assumed to be distributed, with a local area network connecting the nodes. Each node is assumed to provide a specific functionality. The network provides for communication so that the overall tasks of the vehicle are accomplished. The O/S structure is based upon the concept of objects. The mechanisms for integrating node-unique objects with node-common objects, in order to implement both autonomy and cooperation between nodes, are developed. The requirements for time-critical performance and for reliability and recovery are discussed. Time-critical performance impacts all parts of the distributed operating system: its structure, the functional design of its objects, the language structure, etc. Throughout the paper the tradeoffs - concurrency, language structure, object recovery, binding, file structure, communication protocol, programmer freedom, etc. - are considered to arrive at a feasible, maximum-performance design. Reliability of the network system is considered. A parallel multipath bus structure is proposed for controlling the delivery time of time-critical messages. The architecture also supports immediate recovery of the time-critical message system after a communication failure.

  9. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming (United States)

    Philip A. Araman


    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  10. Current development of UAV sense and avoid system (United States)

    Zhahir, A.; Razali, A.; Mohd Ajir, M. R.


    As unmanned aerial vehicles (UAVs) gain increasing interest from the civil and commercial market, the automatic sense and avoid (SAA) system is currently one of the essential features in the UAV research spotlight. Several sensor types employed in current SAA research, and sensor fusion technology that offers a great opportunity to improve detection and tracking, are presented here. The purpose of this paper is to provide an overview of SAA system development in general, as well as the current challenges facing UAV researchers and designers.

  11. Lightness computation by the human visual system (United States)

    Rudd, Michael E.


    A model of achromatic color computation by the human visual system is presented, which is shown to account in an exact quantitative way for a large body of appearance-matching data collected with simple visual displays. The model equations are closely related to those of the original Retinex model of Land and McCann. However, the present model differs in important ways from Land and McCann's theory in that it invokes additional biological and perceptual mechanisms, including contrast gain control, different inherent neural gains for incremental and decremental luminance steps, and two types of top-down influence on the perceptual weights applied to local luminance steps in the display: edge classification and spatial integration windowing. Arguments are presented to support the claim that these various visual processes must be instantiated by a particular underlying neural architecture. By pointing to correspondences between the architecture of the model and findings from visual neurophysiology, this paper suggests that edge classification involves a top-down gating of neural edge responses in early visual cortex (cortical areas V1 and/or V2), while spatial integration windowing occurs in cortical area V4 or beyond.
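    The Retinex-lineage computation the abstract refers to can be sketched as a weighted sum of log luminance ratios across the edges crossed along a path from background to target. The weights here are arbitrary placeholders standing in for the model's gain-control and edge-classification mechanisms; function names and numbers are illustrative assumptions, not the paper's fitted model.

```python
import math

def log_edge_ratios(luminances):
    """Log ratio across each successive edge along a path of luminances."""
    return [math.log(b / a) for a, b in zip(luminances, luminances[1:])]

def lightness(luminances, weights):
    """Weighted sum of log edge ratios (one weight per edge)."""
    return sum(w * r for w, r in zip(weights, log_edge_ratios(luminances)))

# Path: background (100 cd/m^2) -> surround (50) -> target (25)
path = [100.0, 50.0, 25.0]
equal = lightness(path, [1.0, 1.0])          # both edges fully integrated
down_weighted = lightness(path, [0.5, 1.0])  # remote edge partially gated
assert down_weighted > equal  # gating a remote decremental edge raises lightness
```

The comparison at the end illustrates how top-down reweighting of a remote edge changes the computed lightness, which is the kind of effect the model uses to fit matching data.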

  12. Quantum Computing in Fock Space Systems (United States)

    Berezin, Alexander A.


    A Fock space system (FSS) has an unfixed number (N) of particles and/or degrees of freedom. In quantum computing (QC) the main requirement is the sustainability of coherent Q-superpositions. This is normally favoured by a low-noise environment, and the high excitation/high temperature (T) limit is hence discarded as unfeasible for QC. Conversely, if N is itself a quantized variable, the dimensionality of the Hilbert basis for qubits may increase faster (say, N-exponentially) than thermal noise (likely, in powers of N and T). Hence coherency may win over T-randomization. For this type of QC the speed (S) of factorization of long integers (with D digits) may increase with D (for 'ordinary' QC, speed polynomially decreases with D). This (apparent) paradox rests on non-monotonic bijectivity (cf. Georg Cantor's diagonal counting of rational numbers). This brings the entire aleph-null structurality ("Babylonian Library" of the infinite informational content of the integer field) into the superposition determining the state of a quantum analogue of the Turing machine head. The structure of integer infinitude (e.g. the distribution of primes) results in a direct "Platonic pressure" resembling the semi-virtual Casimir effect (pressure of cut-off vibrational modes). This "effect", the embodiment of the Pythagorean "Number is everything", renders the Gödelian barrier arbitrarily thin, and hence FSS-based QC can in principle be unlimitedly efficient (e.g. D/S may tend to zero when D tends to infinity).

  13. Context-aware computing and self-managing systems

    CERN Document Server

    Dargie, Waltenegus


    Bringing together an extensively researched area with an emerging research issue, Context-Aware Computing and Self-Managing Systems presents the core contributions of context-aware computing in the development of self-managing systems, including devices, applications, middleware, and networks. The expert contributors reveal the usefulness of context-aware computing in developing autonomous systems that have practical application in the real world.The first chapter of the book identifies features that are common to both context-aware computing and autonomous computing. It offers a basic definit

  14. Brain computer interfaces for neurorehabilitation – its current status as a rehabilitation strategy post-stroke. (United States)

    van Dokkum, L E H; Ward, T; Laffont, I


    The idea of using brain computer interfaces (BCI) for rehabilitation emerged relatively recently. Basically, BCI for neurorehabilitation involves the recording and decoding of local brain signals generated by the patient as he/she tries to perform a particular task (even if imperfectly), or during a mental imagery task. The main objective is to promote the recruitment of the selected brain areas involved and to facilitate neural plasticity. The recorded signal can be used in several ways: (i) to objectify and strengthen motor imagery-based training, by providing the patient feedback on the imagined motor task, for example, in a virtual environment; (ii) to generate a desired motor task via functional electrical stimulation or rehabilitative robotic orthoses attached to the patient's limb, encouraging and optimizing task execution as well as "closing" the disrupted sensorimotor loop by giving the patient the appropriate sensory feedback; (iii) to understand cerebral reorganization after lesion, in order to influence or even quantify plasticity-induced changes in brain networks. For example, applying cerebral stimulation to re-equilibrate the inter-hemispheric imbalance revealed by functional recording of brain activity during movement may help recovery. The potential usefulness of BCI for a patient population has been demonstrated on various levels, and its diversity of interface applications makes it adaptable to a large population. The position and status of these very new rehabilitation systems should now be considered with respect to our current, more or less validated, traditional methods, as well as in the light of the wide range of possible brain damage. The heterogeneity in post-damage expression inevitably complicates the decoding of brain signals and thus their use in pathological conditions, calling for controlled clinical trials.

  15. Time computations in anuran auditory systems

    Directory of Open Access Journals (Sweden)

    Gary J Rose


    Full Text Available Temporal computations are important in the acoustic communication of anurans. In many cases, calls between closely related species are nearly identical spectrally but differ markedly in temporal structure. Depending on the species, calls can differ in pulse duration, shape and/or rate (i.e., amplitude modulation), direction and rate of frequency modulation, and overall call duration. Also, behavioral studies have shown that anurans are able to discriminate between calls that differ in temporal structure. In the peripheral auditory system, temporal information is coded primarily in the spatiotemporal patterns of activity of auditory-nerve fibers. However, major transformations in the representation of temporal information occur in the central auditory system. In this review I summarize recent advances in understanding how temporal information is represented in the anuran midbrain, with particular emphasis on mechanisms that underlie selectivity for pulse duration and pulse rate (i.e., the intervals between onsets of successive pulses). Two types of neurons have been identified that show selectivity for pulse rate: long-interval cells respond well to slow pulse rates but fail to spike or respond phasically to fast pulse rates; conversely, interval-counting neurons respond to intermediate or fast pulse rates, but only after a threshold number of pulses, presented at optimal intervals, have occurred. Duration selectivity is manifest as short-pass, band-pass or long-pass tuning. Whole-cell patch recordings, in vivo, suggest that excitation and inhibition are integrated in diverse ways to generate temporal selectivity. In many cases, activity-related enhancement or depression of excitatory or inhibitory processes appears to contribute to selective responses.
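    The interval-counting behavior described in the review can be captured in a toy model: the cell tallies pulses arriving at near-optimal inter-pulse intervals and fires once the tally reaches a threshold, while a single off-interval pulse resets it. All parameter values below are illustrative, not fitted to physiological data.

```python
# Hypothetical toy model of an interval-counting midbrain neuron.
def interval_counter(pulse_times, optimal=20.0, tol=5.0, threshold=4):
    """Return times (ms) at which the model neuron spikes."""
    spikes, count = [], 1  # the first pulse starts the tally
    for prev, cur in zip(pulse_times, pulse_times[1:]):
        if abs((cur - prev) - optimal) <= tol:
            count += 1
        else:
            count = 1  # an off-interval pulse resets the tally
        if count == threshold:
            spikes.append(cur)
    return spikes

fast = [0, 20, 40, 60, 80, 100]        # optimal 20 ms intervals
slow = [0, 100, 200, 300, 400]         # too slow: tally never accumulates
assert interval_counter(fast) == [60]  # fires on the 4th well-timed pulse
assert interval_counter(slow) == []
```

The reset-on-bad-interval rule is what makes the unit selective for pulse *rate* rather than mere pulse count, mirroring the threshold behavior the review describes.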

  16. Modelling, abstraction, and computation in systems biology: A view from computer science. (United States)

    Melham, Tom


    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. A Heterogeneous High-Performance System for Computational and Computer Science (United States)


    The views, opinions and/or findings contained in this report are those of the author(s) and should not be construed as an official Department of the... Report title: A Heterogeneous High-Performance System for Computational and Computer Science. This DoD HBC/MI Equipment/Instrumentation grant was awarded in October 2014 for the purchase... Computing (HPC) course taught in the department of computer science, so as to attract more graduate students from the many disciplines where their research...

  18. Computer controlled vent and pressurization system (United States)

    Cieslewicz, E. J.


    The Centaur space launch vehicle airborne computer, which was primarily used to perform guidance, navigation, and sequencing tasks, was further used to monitor and control inflight pressurization and venting of the cryogenic propellant tanks. Computer software flexibility also provided a failure detection and correction capability necessary to adopt and operate redundant hardware techniques and enhance the overall vehicle reliability.

  19. Generalised Computability and Applications to Hybrid Systems

    DEFF Research Database (Denmark)

    Korovina, Margarita V.; Kudinov, Oleg V.


    We investigate the concept of generalised computability of operators and functionals defined on the set of continuous functions, firstly introduced in [9]. By working in the reals, with equality and without equality, we study properties of generalised computable operators and functionals. Also we...

  20. The hack attack - Increasing computer system awareness of vulnerability threats (United States)

    Quann, John; Belford, Peter


    The paper discusses the electronic vulnerability of computer-based systems supporting NASA Goddard Space Flight Center (GSFC) to unauthorized users. To test the security of the system and increase security awareness, NYMA, Inc. employed computer 'hackers' to attempt to infiltrate the system(s) under controlled conditions. Penetration procedures, methods, and descriptions are detailed in the paper. The procedure increased the security consciousness of GSFC management regarding the electronic vulnerability of the system(s).

  1. Diffusion current in a system of coupled Josephson junctions (United States)

    Shukrinov, Yu. M.; Rahmonov, I. R.


    The role of a diffusion current in the phase dynamics of a system of coupled Josephson junctions (JJs) has been analyzed. It is shown that, by studying the temporal dependences of the superconducting, quasi-particle, diffusion, and displacement currents and the dependences of average values of these currents on the total current, it is possible to explain the main features of the current-voltage characteristic (CVC) of the system. The effect of a diffusion current on the character of CVC branching in the vicinity of a critical current and in the region of hysteresis, as well as on the part of CVC branch corresponding to a parametric resonance in the system is demonstrated. A clear interpretation of the differences in the character of CVC branching in a model of capacitively coupled JJs (CCJJ model) and a model of capacitive coupling with diffusion current (CCJJ+DC model) is proposed. It is shown that a decrease in the diffusion current in a JJ leads to the switching of this junction to an oscillating state. The results of model calculations are qualitatively consistent with the experimental data.
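    The current-voltage characteristics discussed above can be illustrated, in a much simpler setting, with the standard RCSJ (resistively and capacitively shunted junction) equation for a single junction: beta_c * phi'' + phi' + sin(phi) = I in dimensionless units, where the time-averaged phase velocity gives the DC voltage. This is not the coupled CCJJ+DC model of the paper, which adds inter-junction coupling and the diffusion-current term; parameter values below are illustrative only.

```python
import math

def avg_voltage(I, beta_c=0.2, dt=0.01, transient=20000, steps=80000):
    """One point of the CVC: time-averaged phase velocity <phi'> at bias I."""
    phi, v = 0.0, 0.0  # phase and phase velocity (voltage)
    acc = 0.0
    for n in range(transient + steps):
        a = (I - v - math.sin(phi)) / beta_c  # phi'' from the RCSJ equation
        v += a * dt
        phi += v * dt
        if n >= transient:  # discard the transient before averaging
            acc += v
    return acc / steps

# Below the critical current (I < 1) the junction stays superconducting
# (V ~ 0); above it, the junction switches to a finite-voltage running state.
assert abs(avg_voltage(0.5)) < 0.05
assert avg_voltage(1.5) > 0.5
```

Sweeping I up and down with this kind of integrator traces out the hysteretic branches whose structure the CCJJ and CCJJ+DC models refine for junction stacks.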

  2. Diffusion current in a system of coupled Josephson junctions

    Energy Technology Data Exchange (ETDEWEB)

    Shukrinov, Yu. M.; Rahmonov, I. R. [Joint Institute for Nuclear Research (Russian Federation)]


    The role of a diffusion current in the phase dynamics of a system of coupled Josephson junctions (JJs) has been analyzed. It is shown that, by studying the temporal dependences of the superconducting, quasi-particle, diffusion, and displacement currents and the dependences of average values of these currents on the total current, it is possible to explain the main features of the current-voltage characteristic (CVC) of the system. The effect of a diffusion current on the character of CVC branching in the vicinity of a critical current and in the region of hysteresis, as well as on the part of CVC branch corresponding to a parametric resonance in the system is demonstrated. A clear interpretation of the differences in the character of CVC branching in a model of capacitively coupled JJs (CCJJ model) and a model of capacitive coupling with diffusion current (CCJJ+DC model) is proposed. It is shown that a decrease in the diffusion current in a JJ leads to the switching of this junction to an oscillating state. The results of model calculations are qualitatively consistent with the experimental data.

  3. Spin currents and magnetization dynamics in multilayer systems

    NARCIS (Netherlands)

    van der Bijl, E.


    In this Thesis the interplay between spin currents and magnetization dynamics is investigated theoretically. With the help of a simple model the relevant physical phenomena are introduced. From this model it can be deduced that in systems with small spin-orbit coupling, current-induced torques on


    Brown, J. W.


    PLAID is a three-dimensional Computer Aided Design (CAD) system which enables the user to interactively construct, manipulate, and display sets of highly complex geometric models. PLAID was initially developed by NASA to assist in the design of Space Shuttle crewstation panels and the detection of payload object collisions. It has evolved into a more general program for convenient use in many engineering applications. Special effort was made to incorporate CAD techniques and features which minimize the user's workload in designing and managing PLAID models. PLAID consists of three major modules: the Primitive Object Generator (BUILD), the Composite Object Generator (COG), and the DISPLAY Processor. The BUILD module provides a means of constructing simple geometric objects called primitives. The primitives are created from polygons which are defined either explicitly by vertex coordinates, or graphically by use of terminal crosshairs or a digitizer. Solid objects are constructed by combining, rotating, or translating the polygons. Corner rounding, hole punching, milling, and contouring are special features available in BUILD. The COG module hierarchically organizes and manipulates primitives and other previously defined COG objects to form complex assemblies. The composite object is constructed by applying transformations to simpler objects. The transformations which can be applied are scalings, rotations, and translations. These transformations may be defined explicitly or defined graphically using the interactive COG commands. The DISPLAY module enables the user to view COG assemblies from arbitrary viewpoints (inside or outside the object) in both wireframe and hidden-line renderings. The PLAID projection of a three-dimensional object can be either orthographic or perspective. A conflict analysis option enables detection of spatial conflicts or collisions. DISPLAY provides camera functions to simulate a view of the model through different lenses. Other
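    The COG-style hierarchy described above, in which composites apply scalings, rotations, and translations to simpler objects and composites nest, can be sketched with homogeneous transformation matrices. The class and function names are illustrative; this is not PLAID's actual interface.

```python
import numpy as np

def transform(scale=1.0, angle=0.0, translate=(0.0, 0.0, 0.0)):
    """4x4 homogeneous matrix: uniform scale, rotation about Z, then translation."""
    c, s = np.cos(angle), np.sin(angle)
    m = np.eye(4)
    m[:3, :3] = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]]) * scale
    m[:3, 3] = translate
    return m

class Composite:
    def __init__(self, children):
        self.children = children  # list of (matrix, primitive-or-Composite) pairs

    def points(self):
        """All vertices of the assembly, expressed in this composite's frame."""
        out = []
        for m, child in self.children:
            pts = child.points() if isinstance(child, Composite) else child
            for p in pts:
                out.append((m @ np.append(p, 1.0))[:3])  # apply the child's transform
        return out

panel = [np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])]  # a primitive
asm = Composite([(transform(translate=(0, 0, 5)), panel)])       # raise the panel
root = Composite([(transform(scale=2.0), asm)])                  # scale the assembly
assert np.allclose(root.points()[1], [2.0, 0.0, 10.0])
```

Because transforms compose down the tree, editing one matrix repositions an entire sub-assembly, which is the payoff of the hierarchical organization the abstract describes.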

  5. West Coast Observing System (WCOS) ADCP Currents Data (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The West Coast Observing System (WCOS) project provides access to temperature and currents data collected at four of the five National Marine Sanctuary sites,...

  6. Population vulnerability of marine birds within the California Current System (United States)

    U.S. Geological Survey, Department of the Interior — Six metrics were used to determine Population Vulnerability: global population size, annual occurrence in the California Current System (CCS), percent of the...

  7. Population vulnerability of marine birds within the California Current System (United States)

    U.S. Geological Survey, Department of the Interior — Six metrics were used to determine Population Vulnerability: global population size, annual occurrence in the California Current System (CCS), percent of the...

  8. Research on Integrated Monitoring and Prevention System for Stray Current in Metro

    Institute of Scientific and Technical Information of China (English)

    李威; 严旭


    Based on an analysis of the influencing factors and harmful effects of stray current, and a discussion of the problems with existing stray-current monitoring and prevention systems, an integrated monitoring and prevention system for stray current in the metro was developed. A distributed computer network for monitoring was set up. It can monitor the distribution of stray current in the metro and the corrosion of the metal structure along the whole line. Based on the monitored conditions, it can also control the drainage of its tank to achieve the best effect and eliminate the negative effects of polarity drainage. By using a new type of unilateral electric device, the problem of the rail being burned by electric arcs is avoided. The unilateral electric device can be connected directly to the monitoring network to realize in-line monitoring and improve the reliability of the device.

  9. Integrated Computational System for Aerodynamic Steering and Visualization (United States)

    Hesselink, Lambertus


    In February of 1994, an effort by the Fluid Dynamics and Information Sciences Divisions at NASA Ames Research Center with McDonnell Douglas Aerospace Company and Stanford University was initiated to develop, demonstrate, validate and disseminate automated software for numerical aerodynamic simulation. The goal of the initiative was to develop a tri-discipline approach encompassing CFD, Intelligent Systems, and Automated Flow Feature Recognition to improve the utility of CFD in the design cycle. This approach would then be represented through an intelligent computational system which could accept an engineer's definition of a problem and construct an optimal and reliable CFD solution. Stanford University's role focused on developing technologies that advance visualization capabilities for analysis of CFD data, extract specific flow features useful for the design process, and compare CFD data with experimental data. During the years 1995-1997, Stanford University focused on developing techniques in the area of tensor visualization and flow feature extraction. Software libraries were created enabling feature extraction and exploration of tensor fields. As a proof of concept, a prototype system called the Integrated Computational System (ICS) was developed to demonstrate the CFD design cycle. The current research effort focuses on finding a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison; this is often a problem with many data-comparison techniques. In addition, since only topology-based information is stored and compared for each field, there is a significant compression of information that enables large databases to be quickly searched.
    This report will (1) briefly review the technologies developed during 1995-1997, (2) describe current technologies in the area of comparison techniques, and (3) describe the theory of our new
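    The idea of comparing vector fields by topology rather than by pointwise values can be sketched very crudely: characterize each 2-D field by the grid cells that bracket a zero of both components (candidate critical points) and compare the resulting sets. The report's actual method is only summarized above; everything here is an illustrative simplification.

```python
import numpy as np

def critical_cells(u, v):
    """Grid cells whose corner values bracket a zero of both components."""
    cells = set()
    for i in range(u.shape[0] - 1):
        for j in range(u.shape[1] - 1):
            cu = u[i:i+2, j:j+2]
            cv = v[i:i+2, j:j+2]
            if cu.min() < 0 < cu.max() and cv.min() < 0 < cv.max():
                cells.add((i, j))
    return cells

def similarity(f1, f2):
    """Jaccard overlap of critical-cell sets: 1.0 means identical signatures."""
    a, b = critical_cells(*f1), critical_cells(*f2)
    return len(a & b) / len(a | b) if a | b else 1.0

# Two samplings of a saddle field v = (x, -y) on an 8x8 grid
x, y = np.meshgrid(np.linspace(-1, 1, 8), np.linspace(-1, 1, 8), indexing="ij")
saddle = (x, -y)
shifted = (x - 0.5, -y)   # saddle moved: the topological signature differs
assert similarity(saddle, saddle) == 1.0
assert similarity(saddle, shifted) < 1.0
```

Note that no grid matching or vector alignment is needed: only the compact set of critical cells is stored per field, which is the compression property the abstract highlights.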

  10. Overview of ASC Capability Computing System Governance Model

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott W. [Los Alamos National Laboratory


    This document contains a description of the Advanced Simulation and Computing Program's Capability Computing System Governance Model. Objectives of the Governance Model are to ensure that the capability system resources are allocated on a priority-driven basis according to the Program requirements; and to utilize ASC Capability Systems for the large capability jobs for which they were designed and procured.

  11. High-Speed Computer-Controlled Switch-Matrix System (United States)

    Spisz, E.; Cory, B.; Ho, P.; Hoffman, M.


    High-speed computer-controlled switch-matrix system developed for communication satellites. Satellite system controlled by onboard computer and all message-routing functions between uplink and downlink beams handled by newly developed switch-matrix system. Message requires only 2-microsecond interconnect period, repeated every millisecond.

  12. Granular computing analysis and design of intelligent systems

    CERN Document Server

    Pedrycz, Witold


    Information granules, as encountered in natural language, are implicit in nature. To make them fully operational so they can be effectively used to analyze and design intelligent systems, information granules need to be made explicit. An emerging discipline, granular computing focuses on formalizing information granules and unifying them to create a coherent methodological and developmental environment for intelligent system design and analysis. Granular Computing: Analysis and Design of Intelligent Systems presents the unified principles of granular computing along with its comprehensive algo

  13. Current status and prospects of computational resources for natural product dereplication: a review. (United States)

    Mohamed, Ahmed; Nguyen, Canh Hao; Mamitsuka, Hiroshi


    Research in natural products has always enhanced drug discovery by providing new and unique chemical compounds. However, recently, drug discovery from natural products is slowed down by the increasing chance of re-isolating known compounds. Rapid identification of previously isolated compounds in an automated manner, called dereplication, steers researchers toward novel findings, thereby reducing the time and effort for identifying new drug leads. Dereplication identifies compounds by comparing processed experimental data with those of known compounds, and so, diverse computational resources such as databases and tools to process and compare compound data are necessary. Automating the dereplication process through the integration of computational resources has always been an aspired goal of natural product researchers. To increase the utilization of current computational resources for natural products, we first provide an overview of the dereplication process, and then list useful resources, categorizing into databases, methods and software tools and further explaining them from a dereplication perspective. Finally, we discuss the current challenges to automating dereplication and proposed solutions.
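    The core dereplication step described above, comparing processed experimental data against known-compound records, can be sketched generically with cosine similarity over binned mass-spectral peaks. The compound names, peak values, and cutoff are invented for illustration; the review covers many more sophisticated resources.

```python
import math

def cosine(a, b):
    """Cosine similarity between two {mz_bin: intensity} spectra."""
    shared = set(a) & set(b)
    dot = sum(a[k] * b[k] for k in shared)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def dereplicate(query, database, cutoff=0.7):
    """Known compounds whose spectra match the query above the cutoff, best first."""
    hits = [(name, cosine(query, spec)) for name, spec in database.items()]
    return sorted([h for h in hits if h[1] >= cutoff], key=lambda h: -h[1])

db = {"quercetin": {303: 100, 153: 40}, "rutin": {611: 100, 303: 55}}
query = {303: 95, 153: 42}
assert dereplicate(query, db)[0][0] == "quercetin"
```

A confident hit means the isolate is likely a known compound and can be set aside, which is exactly how dereplication steers effort toward novel chemistry.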

  14. A computational model of the ionic currents, Ca2+ dynamics and action potentials underlying contraction of isolated uterine smooth muscle.

    Directory of Open Access Journals (Sweden)

    Wing-Chiu Tong

    Full Text Available Uterine contractions during labor are discretely regulated by rhythmic action potentials (AP) of varying duration and form that serve to determine calcium-dependent force production. We have employed a computational biology approach to develop a fuller understanding of the complexity of excitation-contraction (E-C) coupling of uterine smooth muscle cells (USMC). Our overall aim is to establish a mathematical platform of sufficient biophysical detail to quantitatively describe known uterine E-C coupling parameters and thereby inform future empirical investigations of the physiological and pathophysiological mechanisms governing normal and dysfunctional labors. From published and unpublished data we construct mathematical models for fourteen ionic currents of USMCs: Ca2+ currents (L- and T-type), a Na+ current, a hyperpolarization-activated current, three voltage-gated K+ currents, two Ca2+-activated K+ currents, a Ca2+-activated Cl- current, a non-specific cation current, the Na+-Ca2+ exchanger, the Na+-K+ pump and a background current. The magnitudes and kinetics of each current system in a spindle-shaped single cell with a specified surface area:volume ratio are described by differential equations, in terms of maximal conductances, electrochemical gradient, voltage-dependent activation/inactivation gating variables, and temporal changes in intracellular Ca2+ computed from known Ca2+ fluxes. These quantifications are validated by the reconstruction of the individual experimental ionic currents obtained under voltage clamp. Phasic contraction is modeled in relation to the time constant of changing [Ca2+]i. This integrated model is validated by its reconstruction of the different USMC AP configurations (spikes, plateau and bursts of spikes), of the change from bursting to plateau-type AP produced by estradiol, and of simultaneous experimental recordings of spontaneous AP, [Ca2+]i and phasic force. In summary, our advanced mathematical model provides a powerful tool to
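    Each current system in such a model follows the generic Hodgkin-Huxley formalism: I = g_max x (gating variables) x (V - E_rev), with each gating variable relaxing toward a voltage-dependent steady state. The sketch below shows that formalism for a single activation gate; the parameter values are placeholders, not the paper's fitted constants.

```python
import math

def steady_state(v, v_half, k):
    """Boltzmann activation curve m_inf(V)."""
    return 1.0 / (1.0 + math.exp(-(v - v_half) / k))

def step_gate(m, v, dt, v_half=-30.0, k=8.0, tau=5.0):
    """One Euler step of dm/dt = (m_inf(V) - m) / tau (time in ms)."""
    return m + dt * (steady_state(v, v_half, k) - m) / tau

def current(v, m, g_max=1.0, e_rev=60.0):
    """Ionic current, e.g. an L-type Ca2+-like current, in arbitrary units."""
    return g_max * m * (v - e_rev)

# Depolarize from rest and let the gate equilibrate over 50 ms
v, m, dt = 0.0, 0.0, 0.01
for _ in range(5000):
    m = step_gate(m, v, dt)
assert abs(m - steady_state(0.0, -30.0, 8.0)) < 1e-3  # gate has equilibrated
assert current(v, m) < 0  # inward current below the reversal potential
```

The full model couples fourteen such current equations to a Ca2+ balance and a force relation; simulating the system then reproduces the spike, plateau, and bursting AP configurations listed in the abstract.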

  15. Computational Modeling of Flow Control Systems for Aerospace Vehicles Project (United States)

    National Aeronautics and Space Administration — Clear Science Corp. proposes to develop computational methods for designing active flow control systems on aerospace vehicles with the primary objective of...

  16. Simulation model of load balancing in distributed computing systems (United States)

    Botygin, I. A.; Popov, V. N.; Frolov, S. G.


    The availability of high-performance computing, high-speed data transfer over the network, and the widespread use of software for design and pre-production in mechanical engineering mean that, at the present time, large industrial enterprises and small engineering companies alike implement complex computer systems for the efficient solution of production and management tasks. Such computer systems are generally built on the basis of distributed heterogeneous computer systems. The analytical problems solved by such systems are the key research models, but the system-wide problems of efficiently distributing (balancing) the computational load and of placing the input, intermediate and output databases are no less important. The main tasks of this balancing system are load and condition monitoring of compute nodes, and the selection of a node to which the user's request is routed in accordance with a predetermined algorithm. Load balancing is one of the most widely used methods of increasing the productivity of distributed computing systems through the optimal allocation of tasks between the computer system nodes. Therefore, the development of methods and algorithms for computing an optimal schedule in a distributed system that dynamically changes its infrastructure is an important task.
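    The dispatch policy sketched in the abstract, monitor node load and route each request to the least-loaded node, can be illustrated with a small heap-based balancer. The class name and the "load = number of assigned requests" metric are simplifying assumptions.

```python
import heapq

class Balancer:
    def __init__(self, node_names):
        # Heap of (current_load, node) pairs; the least-loaded node is on top.
        self.heap = [(0, n) for n in sorted(node_names)]
        heapq.heapify(self.heap)

    def dispatch(self, request):
        """Assign the request to the least-loaded node and update its load."""
        load, node = heapq.heappop(self.heap)
        heapq.heappush(self.heap, (load + 1, node))
        return node

b = Balancer(["node-a", "node-b"])
assigned = [b.dispatch(f"req-{i}") for i in range(4)]
assert sorted(assigned) == ["node-a", "node-a", "node-b", "node-b"]
```

A production balancer would update loads from real monitoring data and handle nodes joining or leaving, which is the dynamically changing infrastructure the abstract emphasizes.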

  17. Evolutionary Computing for Intelligent Power System Optimization and Control

    DEFF Research Database (Denmark)

    This new book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexities into simple formulations, thus largely reducing development efforts. The book begins with an overview of optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.

  18. Top 10 Threats to Computer Systems Include Professors and Students (United States)

    Young, Jeffrey R.


    User awareness is growing in importance when it comes to computer security. Not long ago, keeping college networks safe from cyberattackers mainly involved making sure computers around campus had the latest software patches. New computer worms or viruses would pop up, taking advantage of some digital hole in the Windows operating system or in…

  19. Top 10 Threats to Computer Systems Include Professors and Students (United States)

    Young, Jeffrey R.


    User awareness is growing in importance when it comes to computer security. Not long ago, keeping college networks safe from cyberattackers mainly involved making sure computers around campus had the latest software patches. New computer worms or viruses would pop up, taking advantage of some digital hole in the Windows operating system or in…

  20. The use of heat pipes in thermal control system for electronics: current situation and prospects

    Directory of Open Access Journals (Sweden)

    Khairnasov S. M.


    Full Text Available Today, the widespread application of cooling systems based on heat pipes makes a significant contribution to solving the thermal control problems of electronic equipment. The use of heat pipes as heat-transfer devices and heat-exchanging equipment allows the creation of efficient new-generation heat sinks. Nowadays, heat pipes are widely used in the following areas: electronic equipment, special-application computer equipment (from small computers to large data centres), and high-power electronics. The article provides an analysis of the current state and prospects of heat pipe applications in thermal control systems for ground-based electronic equipment.

  1. Bringing the CMS distributed computing system into scalable operations

    CERN Document Server

    Belforte, S; Fisk, I; Flix, J; Hernández, J M; Kress, T; Letts, J; Magini, N; Miccio, V; Sciabà, A


    Establishing efficient and scalable operations of the CMS distributed computing system critically relies on the proper integration, commissioning and scale testing of the data and workload management tools, the various computing workflows and the underlying computing infrastructure, located at more than 50 computing centres worldwide and interconnected by the Worldwide LHC Computing Grid. Computing challenges periodically undertaken by CMS in the past years with increasing scale and complexity have revealed the need for a sustained effort on computing integration and commissioning activities. The Processing and Data Access (PADA) Task Force was established at the beginning of 2008 within the CMS Computing Program with the mandate of validating the infrastructure for organized processing and user analysis including the sites and the workload and data management tools, validating the distributed production system by performing functionality, reliability and scale tests, helping sites to commission, configure an...

  2. A Survey of Civilian Dental Computer Systems. (United States)


    ...marketplace, the orthodontic community continued to pioneer clinical automation through diagnosis, treat... (1) patient registration, identification... profession." New York State Dental Journal 34:76, 1968. 17. Ehrlich, A., The Role of Computers in Dental Practice Management. Champaign, IL: Colwell... 18. Council on Dental Practice. Report: Dental Computer Vendors. 1984. 19. ...military dental clinic. Medical Bulletin of the US Army Europe 39:14-16, 1982.

  3. Development of wide range current signal data acquisition system for reactivity meter using Keithley electrometers

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, S. H.; Kim, H. K.; Chio, Y. S.; Kim, M. J.; Woo, J. S. [KAERI, Taejon (Korea, Republic of)


    The reactivity worth of control rods is measured to ensure safety during every refueling phase in HANARO, the research reactor at KAERI. Two compensated ion chambers (CICs) are installed around the outer core to measure the reactor power. The signals from the CICs enter the reactivity computer system. The reactivity computer system, which runs on MS-DOS, was developed during the commissioning phase. However, it is not user-friendly, it is outdated, and spare parts are difficult to acquire. Hence we decided to upgrade the system to use the MS-Windows™ operating system and an object-oriented visual programming language. This paper describes the data acquisition system developed for the new reactivity computer system running on the MS-Windows™ operating system. This data acquisition system uses electrometers to convert the low current signal to a voltage, and it measures the current signal accurately even when the electrometer changes its output range automatically. We verified that the system was stable and acquired the input signals accurately.

  4. Distributed computing environments for future space control systems (United States)

    Viallefont, Pierre


    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  5. A computational design system for rapid CFD analysis (United States)

    Ascoli, E. P.; Barson, S. L.; Decroix, M. E.; Sindir, Munir M.


A computational design system (CDS) is described in which tools for computational analysis are integrated in a modular fashion. This CDS ties together four key areas of computational analysis: description of geometry, grid generation, computational codes, and postprocessing. Integrating improved computational fluid dynamics (CFD) analysis tools with the CDS has made a significant positive impact on the use of CFD for engineering design problems. Complex geometries are now analyzed on a frequent basis and with greater ease.

  6. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    Energy Technology Data Exchange (ETDEWEB)

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.


For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac-generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.

  7. Safety Metrics for Human-Computer Controlled Systems (United States)

    Leveson, Nancy G; Hatanaka, Iwao


The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis proposes a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  8. Computational system identification of continuous-time nonlinear systems using approximate Bayesian computation (United States)

    Krishnanathan, Kirubhakaran; Anderson, Sean R.; Billings, Stephen A.; Kadirkamanathan, Visakan


    In this paper, we derive a system identification framework for continuous-time nonlinear systems, for the first time using a simulation-focused computational Bayesian approach. Simulation approaches to nonlinear system identification have been shown to outperform regression methods under certain conditions, such as non-persistently exciting inputs and fast-sampling. We use the approximate Bayesian computation (ABC) algorithm to perform simulation-based inference of model parameters. The framework has the following main advantages: (1) parameter distributions are intrinsically generated, giving the user a clear description of uncertainty, (2) the simulation approach avoids the difficult problem of estimating signal derivatives as is common with other continuous-time methods, and (3) as noted above, the simulation approach improves identification under conditions of non-persistently exciting inputs and fast-sampling. Term selection is performed by judging parameter significance using parameter distributions that are intrinsically generated as part of the ABC procedure. The results from a numerical example demonstrate that the method performs well in noisy scenarios, especially in comparison to competing techniques that rely on signal derivative estimation.
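The ABC rejection step the abstract describes can be sketched in a few lines. The toy model dx/dt = -theta*x, the uniform prior, the tolerance, and the noise level below are our own assumptions for illustration, not the paper's setup:

```python
import numpy as np

# Sketch of ABC rejection sampling for a continuous-time model, here the toy
# linear system dx/dt = -theta * x rather than the paper's nonlinear models.
rng = np.random.default_rng(0)

def simulate(theta, x0=1.0, dt=0.01, n=200):
    """Forward-Euler simulation of dx/dt = -theta * x."""
    x = np.empty(n)
    x[0] = x0
    for k in range(n - 1):
        x[k + 1] = x[k] + dt * (-theta * x[k])
    return x

true_theta = 2.0
data = simulate(true_theta) + rng.normal(0.0, 0.01, size=200)  # noisy observations

accepted = []
for _ in range(2000):
    theta = rng.uniform(0.0, 5.0)                   # draw from the prior
    dist = np.sqrt(np.mean((simulate(theta) - data) ** 2))
    if dist < 0.05:                                 # keep if simulation is close
        accepted.append(theta)

posterior_mean = float(np.mean(accepted))
```

The list `accepted` is an approximate sample from the posterior, so the parameter distribution (and hence the uncertainty description the abstract mentions) comes for free, with no signal-derivative estimation anywhere in the loop.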

  9. Tidal current turbine based on hydraulic transmission system

    Institute of Scientific and Technical Information of China (English)

    Hong-wei LIU; Wei LI; Yong-gang LIN; Shun MA


Tidal current turbines (TCTs) are newly developed electricity-generating devices. Aiming at stabilizing the power output of TCTs, this paper introduces hydraulic transmission technologies into TCTs. The hydrodynamics of the turbine was analyzed first and its power output characteristics were predicted. A hydraulic power transmission system and a hydraulic pitch-control system were designed, and related simulations were conducted. Finally, a TCT prototype was manufactured and tested in the workshop. The test results confirm the correctness of the current design and the feasibility of installing the hydraulic system in TCTs.
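The power-output prediction mentioned above rests on the standard hydrokinetic relation P = ½·ρ·Cp·A·v³. A back-of-envelope sketch, where the rotor diameter, current speed, and power coefficient are assumed values rather than the prototype's parameters:

```python
import math

# Back-of-envelope power estimate for a tidal current turbine. The rotor
# diameter, power coefficient, and current speed are assumed values.
RHO_SEAWATER = 1025.0  # kg/m^3

def turbine_power(diameter_m, speed_m_s, cp=0.4):
    """Hydrodynamic power captured: P = 0.5 * rho * Cp * A * v**3 (watts)."""
    area = math.pi * (diameter_m / 2.0) ** 2
    return 0.5 * RHO_SEAWATER * cp * area * speed_m_s ** 3

# A 2 m rotor in a 2 m/s current captures roughly 5.2 kW at Cp = 0.4.
print(round(turbine_power(2.0, 2.0) / 1000.0, 1))
```

The cubic dependence on current speed is what makes the power output fluctuate strongly with the tide, motivating the hydraulic transmission used for stabilization.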

  10. Design technologies for green and sustainable computing systems

    CERN Document Server

    Ganguly, Amlan; Chakrabarty, Krishnendu


This book provides a comprehensive guide to the design of green and sustainable computing (GSC) systems. Coverage includes important breakthroughs in various aspects of GSC, including multi-core architectures, interconnection technology, data centers, high-performance computing (HPC), and sensor networks. The authors address the challenges of power efficiency and sustainability in various contexts, including system design, computer architecture, programming languages, compilers, and networking. The book offers readers a single-source reference for addressing the challenges of power efficiency and sustainability in embedded computing systems; provides in-depth coverage of the key underlying design technologies for green and sustainable computing; and covers a wide range of topics, from chip-level design to architectures, computing systems, and networks.

  11. A comparison of queueing, cluster and distributed computing systems (United States)

    Kaplan, Joseph A.; Nelson, Michael L.


Using workstation clusters for distributed computing has become popular with the proliferation of inexpensive, powerful workstations. Workstation clusters offer both a cost-effective alternative to batch processing and an easy entry into parallel computing. However, a number of workstations on a network does not constitute a cluster; cluster management software is necessary to harness the collective computing power. A variety of cluster management and queuing systems are compared: Distributed Queueing System (DQS), Condor, Load Leveler, Load Balancer, Load Sharing Facility (LSF - formerly Utopia), Distributed Job Manager (DJM), Computing in Distributed Networked Environments (CODINE), and NQS/Exec. The systems differ in their design philosophy and implementation. Based on published reports on the different systems and conversations with the systems' developers and vendors, a comparison of the systems is made on the integral issues of clustered computing.

  12. Modeling and strain gauging of eddy current repulsion deicing systems (United States)

    Smith, Samuel O.


Work described in this paper confirms and extends work done by Zumwalt et al. on a variety of in-flight deicing systems that use eddy current repulsion for repelling ice. Two such systems are known as electro-impulse deicing (EIDI) and the eddy current repulsion deicing strip (EDS). Mathematical models for these systems are discussed with regard to their capabilities and limitations. The author duplicates a particular model of the EDS, and theoretical voltage, current, and force results are compared directly to experimental results. Dynamic strain measurement results are presented for the EDS system. Dynamic strain measurements near EDS or EIDI coils are complicated by the high magnetic fields in the vicinity of the coils, which induce false voltage signals in the gages.

  13. Computer programs for the acquisition and analysis of eddy-current array probe data

    Energy Technology Data Exchange (ETDEWEB)

    Pate, J.R.; Dodd, C.V.


The objective of the Improved Eddy-Current ISI (in-service inspection) for Steam Generators Tubing program is to upgrade and validate eddy-current inspections, including probes, instrumentation, and data processing techniques for ISI of new, used, and repaired steam generator tubes; to improve defect detection, classification, and characterization as affected by diameter and thickness variations, denting, probe wobble, tube sheet, tube supports, and copper and sludge deposits, even when defect types and other variables occur in combination; and to transfer this advanced technology to NRC's mobile NDE laboratory and staff. This report documents computer programs that were developed for acquisition of eddy-current data from specially designed 16-coil array probes. Complete code as well as instructions for use are provided.

  14. Nanocrystalline material in toroidal cores for current transformer: analytical study and computational simulations

    Directory of Open Access Journals (Sweden)

    Benedito Antonio Luciano


Full Text Available Based on electrical and magnetic properties such as saturation magnetization, initial permeability, and coercivity, this work presents some considerations about possible applications of nanocrystalline alloys in toroidal cores for current transformers. It discusses how the magnetic characteristics of the core material affect the performance of the current transformer. From the magnetic characterization and computational simulations using the finite element method (FEM), it has been verified that, at the typical CT operating value of flux density, the properties of nanocrystalline alloys reinforce the hypothesis that using these materials in measurement CT cores can reduce the ratio and phase errors and can also improve the accuracy class.
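The ratio and phase errors discussed above are simple functions of the primary and secondary current phasors. A sketch of that computation, where the 100:1 turns ratio and the phasor values are made-up numbers rather than measurements of a nanocrystalline-core CT:

```python
import cmath
import math

# Illustrative computation of a current transformer's ratio and phase errors
# from primary and secondary current phasors. All numeric values are made up.
def ct_errors(i_primary, i_secondary, turns_ratio):
    """Return (ratio_error_percent, phase_error_arcminutes)."""
    referred = turns_ratio * i_secondary          # secondary referred to primary
    ratio_error = (abs(referred) - abs(i_primary)) / abs(i_primary) * 100.0
    phase_error = math.degrees(cmath.phase(referred / i_primary)) * 60.0
    return ratio_error, phase_error

# 100 A primary; the secondary is 0.2% low in magnitude and leads by 0.1 deg.
i_p = 100.0 + 0j
i_s = (1.0 - 0.002) * cmath.exp(1j * math.radians(0.1))
ratio_err, phase_err = ct_errors(i_p, i_s, 100.0)   # -0.2 %, 6.0 arcmin
```

A core material with higher permeability and lower coercivity draws less magnetizing current, which shrinks both error terms and thus improves the accuracy class.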

  15. An improved current potential method for fast computation of stellarator coil shapes

    CERN Document Server

    Landreman, Matt


    Several fast methods for computing stellarator coil shapes are compared, including the classical NESCOIL procedure [Merkel, Nucl. Fusion 27, 867 (1987)], its generalization using truncated singular value decomposition, and a Tikhonov regularization approach we call REGCOIL in which the squared current density is included in the objective function. Considering W7-X and NCSX geometries, and for any desired level of regularization, we find the REGCOIL approach simultaneously achieves lower surface-averaged and maximum values of both current density (on the coil winding surface) and normal magnetic field (on the desired plasma surface). This approach therefore can simultaneously improve the free-boundary reconstruction of the target plasma shape while substantially increasing the minimum distances between coils, preventing collisions between coils while improving access for ports and maintenance. The REGCOIL method also allows finer control over the level of regularization, and it eliminates two pathologies of NE...
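The objective function described in the abstract is a Tikhonov-regularized least squares problem, which has a closed-form solution. A minimal sketch with random toy data standing in for the real field-to-current operator (the matrix, vector, and regularization values below are our assumptions, not a real stellarator geometry):

```python
import numpy as np

# Minimal sketch of the Tikhonov-regularized least squares underlying a
# REGCOIL-style method: minimize |A x - b|^2 + lam * |x|^2, where x holds
# current-potential amplitudes, A maps them to the normal field on the plasma
# surface, and lam penalizes current density. A and b are random toy data.
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 10))
b = rng.normal(size=50)

def regcoil_solve(A, b, lam):
    """Closed-form minimizer of |A x - b|^2 + lam * |x|^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

x_weak = regcoil_solve(A, b, 1e-8)    # weak regularization: best field fit
x_strong = regcoil_solve(A, b, 1e2)   # strong regularization: small currents
```

Sweeping lam traces exactly the trade-off the abstract describes: larger lam lowers the surrogate for current density (the norm of x) at the cost of a larger normal-field residual, and the continuous knob gives the finer control over regularization that truncated-SVD approaches lack.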

  16. Computer Generated Hologram System for Wavefront Measurement System Calibration (United States)

    Olczak, Gene


    Computer Generated Holograms (CGHs) have been used for some time to calibrate interferometers that require nulling optics. A typical scenario is the testing of aspheric surfaces with an interferometer placed near the paraxial center of curvature. Existing CGH technology suffers from a reduced capacity to calibrate middle and high spatial frequencies. The root cause of this shortcoming is as follows: the CGH is not placed at an image conjugate of the asphere due to limitations imposed by the geometry of the test and the allowable size of the CGH. This innovation provides a calibration system where the imaging properties in calibration can be made comparable to the test configuration. Thus, if the test is designed to have good imaging properties, then middle and high spatial frequency errors in the test system can be well calibrated. The improved imaging properties are provided by a rudimentary auxiliary optic as part of the calibration system. The auxiliary optic is simple to characterize and align to the CGH. Use of the auxiliary optic also reduces the size of the CGH required for calibration and the density of the lines required for the CGH. The resulting CGH is less expensive than the existing technology and has reduced write error and alignment error sensitivities. This CGH system is suitable for any kind of calibration using an interferometer when high spatial resolution is required. It is especially well suited for tests that include segmented optical components or large apertures.

  17. The Cc1 Project – System For Private Cloud Computing

    Directory of Open Access Journals (Sweden)

    J Chwastowski


Full Text Available The main features of the Cloud Computing system developed at IFJ PAN are described. The project is financed from the structural resources provided by the European Commission and the Polish Ministry of Science and Higher Education (Innovative Economy, National Cohesion Strategy). The system delivers a solution for carrying out computer calculations on a Private Cloud computing infrastructure. It consists of an intuitive Web-based user interface, a module for user and resource administration, and a standard EC2 interface implementation. Thanks to its distributed character, the system allows for the integration of a geographically distant federation of computer clusters within a uniform user environment.

  18. National electronic medical records integration on cloud computing system. (United States)

    Mirza, Hebah; El-Masri, Samir


Few healthcare providers have an advanced level of Electronic Medical Record (EMR) adoption; others have a low level, and most have no EMR at all. Cloud computing is a new emerging technology that has been used in other industries with great success. Despite its great features, cloud computing has not yet been widely utilized in the healthcare industry. This study presents an innovative healthcare cloud computing system for integrating Electronic Health Records (EHR). The proposed system applies cloud computing technology to the EHR system to present a comprehensive integrated EHR environment.

  19. Alternative treatment technology information center computer database system

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, D. [Environmental Protection Agency, Edison, NJ (United States)


The Alternative Treatment Technology Information Center (ATTIC) computer database system was developed pursuant to the 1986 Superfund law amendments. It provides up-to-date information on innovative treatment technologies to clean up hazardous waste sites. ATTIC v2.0 provides access to several independent databases as well as a mechanism for retrieving full-text documents of key literature. It can be accessed with a personal computer and modem 24 hours a day, and there are no user fees. ATTIC provides "one-stop shopping" for information on alternative treatment options by accessing several databases: (1) the treatment technology database, which contains abstracts from the literature on all types of treatment technologies, including biological, chemical, physical, and thermal methods; the best literature as viewed by experts is highlighted; (2) the treatability study database, which provides performance information on technologies to remove contaminants from wastewaters and soils, derived from treatability studies; this database is available through ATTIC or separately as a disk that can be mailed to you; (3) the underground storage tank database, which presents information on underground storage tank corrective actions, surface spills, emergency response, and remedial actions; (4) the oil/chemical spill database, which provides abstracts on treatment and disposal of spilled oil and chemicals. In addition to these separate databases, ATTIC allows immediate access to other disk-based systems such as the Vendor Information System for Innovative Treatment Technologies (VISITT) and the Bioremediation in the Field Search System (BFSS). The user may download these programs to their own PC via a high-speed modem. Also via modem, users are able to download entire documents through the ATTIC system. Currently, about fifty publications are available, including Superfund Innovative Technology Evaluation (SITE) program documents.

  20. A new compensation current real-time computing method for power active filter based on double linear construction algorithm

    Institute of Scientific and Technical Information of China (English)

    LI; Zicheng; SUN; Yukun


Based on the detection principle that "when the load current is periodic, the integral over one cycle of the absolute value of the load current minus the fundamental active current is minimal", harmonic current real-time detection methods for power active filters are proposed based on direct computation, a simple iterative algorithm, and an optimal iterative algorithm. With the direct computation method, the amplitude of the fundamental active current can be accurately calculated when the load current is in a stable state. The simple iterative algorithm and the optimal iterative algorithm provide a way to judge the state of the load current. Building on the direct computation method, the two iterative algorithms, and precise definitions of basic concepts such as the true amplitude of the fundamental active current when the load current is in a varying state, the double linear construction idea is proposed: the amplitude of the fundamental active current at the sampling moment is accurately calculated using the first linear construction, and the condition for processing the next sample is created using the second linear construction. On this basis, a harmonic current real-time detection method for power active filters based on the double linear construction algorithm is proposed. The method features a small computational load, good real-time performance, and accurate calculation of the fundamental active current amplitude.
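The "direct computation" step can be illustrated by correlating one cycle of a periodic load current with the voltage-synchronized sine, which isolates the fundamental active amplitude. This is a hypothetical discrete sketch, not the paper's algorithm; the sample count and test waveform are our own:

```python
import math

# Sketch: recovering the fundamental active current amplitude of a periodic
# load current by correlating one cycle with the voltage-synchronized sine.
# Harmonic and reactive components are orthogonal to it and drop out.
N = 1000  # samples per cycle

def fundamental_active_amplitude(samples):
    """I1 = (2/N) * sum of i[k] * sin(2*pi*k/N) over one full cycle."""
    return 2.0 / N * sum(i * math.sin(2 * math.pi * k / N)
                         for k, i in enumerate(samples))

# Load current: 10 A active fundamental + 3 A third harmonic + 2 A reactive.
load = [10 * math.sin(2 * math.pi * k / N)
        + 3 * math.sin(3 * 2 * math.pi * k / N)
        + 2 * math.cos(2 * math.pi * k / N) for k in range(N)]

print(round(fundamental_active_amplitude(load), 3))  # → 10.0
```

Subtracting this fundamental active component from the measured load current leaves the compensation (harmonic plus reactive) current that the active filter must inject.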

  1. Mechanisms of protection of information in computer networks and systems

    Directory of Open Access Journals (Sweden)

    Sergey Petrovich Evseev


    Full Text Available Protocols of information protection in computer networks and systems are investigated. The basic types of threats of infringement of the protection arising from the use of computer networks are classified. The basic mechanisms, services and variants of realization of cryptosystems for maintaining authentication, integrity and confidentiality of transmitted information are examined. Their advantages and drawbacks are described. Perspective directions of development of cryptographic transformations for the maintenance of information protection in computer networks and systems are defined and analyzed.

  2. An integrated computer control system for the ANU linac (United States)

    Davidson, P. M.; Foote, G. S.


    One facet of the installation of the superconducting linac at the ANU is the need for computer control of a variety of systems, such as beam transport, resonator RF, cryogenics and others. To accommodate this, a number of control interfaces (for example, analogue signals and RS232 serial lines) must be employed. Ideally, all of the systems should be able to be controlled from a central location, remote from the actual devices. To this end a system based around VAX computers and VME crates has been designed and is currently being developed and implemented. A VAXstation is used to issue control messages and perform high-level functions, while VME crates containing appropriate modules (primarily DACs, ADCs and digital I/O boards) control the devices. The controllers in the VME crates are AEON rtVAX modules running a real-time operating system. Communication with the VAXstation is via DECnet, on a private ethernet to allow communication rates unaffected by unrelated network activity and potentially increasing the security of the system by providing a possible network isolation point. Also on this ethernet are a number of terminal servers to control RS232 devices. A central database contains all device control and monitoring parameters. The main control process running on the VAXstation is responsible for maintaining the current values of the parameters in the database and for dispatching control messages to the appropriate VME crate or RS232 serial line. Separate graphical interface processes allow the operator to interact with the control process, communicating through shared memory. Many graphics processes can be active simultaneously, displaying either on a single or on multiple terminals. Software running on the rtVAX controllers handles the low-level device-specific control by translating messages from the main control process to VME commands which set hardware outputs on VME modules. Similarly, requests for the value of a parameter result in the rtVAX program

  3. Sensor fusion control system for computer integrated manufacturing

    CSIR Research Space (South Africa)

    Kumile, CM


    Full Text Available of products in unpredictable quantities. Computer Integrated Manufacturing (CIM) systems plays an important role towards integrating such flexible systems. This paper presents a methodology of increasing flexibility and reusability of a generic CIM cell...

  4. Entrepreneurial Health Informatics for Computer Science and Information Systems Students (United States)

    Lawler, James; Joseph, Anthony; Narula, Stuti


    Corporate entrepreneurship is a critical area of curricula for computer science and information systems students. Few institutions of computer science and information systems have entrepreneurship in the curricula however. This paper presents entrepreneurial health informatics as a course in a concentration of Technology Entrepreneurship at a…

  5. On the Computation of Lyapunov Functions for Interconnected Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer


    This paper addresses the computation of additively separable Lyapunov functions for interconnected systems. The presented results can be applied to reduce the complexity of the computations associated with stability analysis of large scale systems. We provide a necessary and sufficient condition...
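As a concrete illustration of an additively separable Lyapunov function, consider a toy interconnection of two scalar subsystems (our example, not one from the paper):

```python
# Toy check of an additively separable Lyapunov function for an
# interconnected system:
#   subsystem 1: x1' = -x1 + 0.1*x2,   subsystem 2: x2' = 0.1*x1 - x2,
#   candidate:   V(x) = V1(x1) + V2(x2) = x1**2 + x2**2.
def f(x1, x2):
    return (-x1 + 0.1 * x2, 0.1 * x1 - x2)

def vdot(x1, x2):
    """Derivative of V along trajectories: grad V dotted with f(x)."""
    dx1, dx2 = f(x1, x2)
    return 2 * x1 * dx1 + 2 * x2 * dx2

# Expanding gives vdot = -2*x1**2 - 2*x2**2 + 0.4*x1*x2, which is negative
# definite because the coupling term 0.4 is dominated by the decay terms 2.
print(vdot(1.0, 1.0) < 0)  # → True
```

Because V splits into per-subsystem terms, each Vi can be computed from the subsystem alone and the cross term only needs to be dominated, which is what makes the approach scale to large interconnections.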

  6. Software For Computer-Aided Design Of Control Systems (United States)

    Wette, Matthew


Computer Aided Engineering System (CAESY) software was developed to provide a means to evaluate methods for dealing with users' needs in computer-aided design of control systems. It is an interpreter program for performing engineering calculations that incorporates features of both Ada and MATLAB and is designed to be flexible and powerful. It includes internally defined functions and procedures and provides for definition of functions and procedures by the user. Written in C language.

  7. 3-D Signal Processing in a Computer Vision System (United States)

    Dongping Zhu; Richard W. Conners; Philip A. Araman


    This paper discusses the problem of 3-dimensional image filtering in a computer vision system that would locate and identify internal structural failure. In particular, a 2-dimensional adaptive filter proposed by Unser has been extended to 3-dimension. In conjunction with segmentation and labeling, the new filter has been used in the computer vision system to...

  8. Exploiting Self-organization in Bioengineered Systems: A Computational Approach. (United States)

    Davis, Delin; Doloman, Anna; Podgorski, Gregory J; Vargis, Elizabeth; Flann, Nicholas S


    The productivity of bioengineered cell factories is limited by inefficiencies in nutrient delivery and waste and product removal. Current solution approaches explore changes in the physical configurations of the bioreactors. This work investigates the possibilities of exploiting self-organizing vascular networks to support producer cells within the factory. A computational model simulates de novo vascular development of endothelial-like cells and the resultant network functioning to deliver nutrients and extract product and waste from the cell culture. Microbial factories with vascular networks are evaluated for their scalability, robustness, and productivity compared to the cell factories without a vascular network. Initial studies demonstrate that at least an order of magnitude increase in production is possible, the system can be scaled up, and the self-organization of an efficient vascular network is robust. The work suggests that bioengineered multicellularity may offer efficiency improvements difficult to achieve with physical engineering approaches.

  9. New Computer Account Management System on 22 November

    CERN Multimedia

    IT Department


    On 22 November, the current management system called CRA was replaced by a new self-service tool available on a Web Portal. The End-Users can now manage their computer accounts and resources themselves through this Web Portal. The ServiceDesk will provide help or forward requests to the appropriate support line in case of specific requests. Account management tools The Account Management Portal allows you to: Manage your primary account; Change your password; Create and manage secondary and service accounts; Manage application and resource authorizations and settings; Find help and documentation concerning accounts and resources. Get Help In the event of any questions or problems, please contact the ServiceDesk (phone +41 22 767 8888 or The Account Management Team

  10. Experiments and simulation models of a basic computation element of an autonomous molecular computing system. (United States)

    Takinoue, Masahiro; Kiga, Daisuke; Shohda, Koh-Ichiroh; Suyama, Akira


    Autonomous DNA computers have been attracting much attention because of their ability to integrate into living cells. Autonomous DNA computers can process information through DNA molecules and their molecular reactions. We have already proposed an idea of an autonomous molecular computer with high computational ability, which is now named Reverse-transcription-and-TRanscription-based Autonomous Computing System (RTRACS). In this study, we first report an experimental demonstration of a basic computation element of RTRACS and a mathematical modeling method for RTRACS. We focus on an AND gate, which produces an output RNA molecule only when two input RNA molecules exist, because it is one of the most basic computation elements in RTRACS. Experimental results demonstrated that the basic computation element worked as designed. In addition, its behaviors were analyzed using a mathematical model describing the molecular reactions of the RTRACS computation elements. A comparison between experiments and simulations confirmed the validity of the mathematical modeling method. This study will accelerate construction of various kinds of computation elements and computational circuits of RTRACS, and thus advance the research on autonomous DNA computers.
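The AND-gate behaviour and its mathematical model can be mimicked with a minimal mass-action kinetic sketch: output is produced only when both inputs are present, at a rate proportional to their product. The rate constants and integration settings below are illustrative, not RTRACS measurements:

```python
# Minimal kinetic sketch of a molecular AND gate: the output RNA is produced
# at rate k*[A]*[B] (so only when both inputs are present) and decays
# linearly. All rate constants are illustrative.
def simulate_and_gate(a0, b0, k=1.0, deg=0.1, dt=0.01, steps=1000):
    a, b, out = a0, b0, 0.0
    for _ in range(steps):              # forward-Euler integration
        production = k * a * b
        out += dt * (production - deg * out)
        a -= dt * production            # inputs are consumed by the reaction
        b -= dt * production
    return out

both = simulate_and_gate(1.0, 1.0)      # both inputs present -> output made
single = simulate_and_gate(1.0, 0.0)    # one input missing   -> no output
```

Comparing such simulated trajectories against measured output RNA concentrations is the essence of the model-validation step the abstract describes.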

  11. Mechatronic sensory system for computer integrated manufacturing

    CSIR Research Space (South Africa)

    Kumile, CM


    Full Text Available (CIM) systems plays an important role towards integrating such flexible systems. The requirement of fast and cheap design and redesign of manufacturing systems therefore is gaining in importance, considering not only the products and the physical...

  12. MALDI imaging mass spectrometry: statistical data analysis and current computational challenges

    Directory of Open Access Journals (Sweden)

    Alexandrov Theodore


Full Text Available Matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) imaging mass spectrometry, also called MALDI-imaging, is a label-free bioanalytical technique used for spatially-resolved chemical analysis of a sample. Usually, MALDI-imaging is exploited for analysis of a specially prepared tissue section thaw-mounted onto a glass slide. A tremendous development of the MALDI-imaging technique has been observed during the last decade. Currently, it is one of the most promising innovative measurement techniques in biochemistry and a powerful and versatile tool for spatially-resolved chemical analysis of diverse sample types ranging from biological and plant tissues to bio- and polymer thin films. In this paper, we outline computational methods for analyzing MALDI-imaging data with an emphasis on multivariate statistical methods, discuss their pros and cons, and give recommendations on their application. The methods of unsupervised data mining as well as supervised classification methods for biomarker discovery are elucidated. We also present a high-throughput computational pipeline for interpretation of MALDI-imaging data using spatial segmentation. Finally, we discuss current challenges associated with the statistical analysis of MALDI-imaging data.
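Spatial segmentation of the kind mentioned at the end of the abstract amounts to clustering pixels by spectral similarity. A toy sketch with synthetic data, where the two "tissue regions", the three m/z channels, and plain k-means are our own simplifications of a real pipeline:

```python
import numpy as np

# Toy spatial-segmentation sketch: every pixel carries a short spectrum and
# k-means groups pixels with similar spectra. The two synthetic "regions"
# and the 3 m/z channels are our own construction, not MALDI data.
rng = np.random.default_rng(0)
region_a = rng.normal(loc=[10.0, 0.0, 5.0], scale=0.5, size=(100, 3))
region_b = rng.normal(loc=[0.0, 10.0, 5.0], scale=0.5, size=(100, 3))
spectra = np.vstack([region_a, region_b])     # 200 pixels x 3 channels

def kmeans(x, k=2, iters=20):
    idx = np.linspace(0, len(x) - 1, k).astype(int)
    centers = x[idx]                          # deterministic initialization
    for _ in range(iters):
        d = ((x[:, None, :] - centers[None]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([x[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(spectra)    # pixels from the same region share a label
```

Mapping each label back to its pixel coordinates yields the segmentation map; real pipelines add peak picking, normalization, and denoising before this clustering step.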

  13. Impact of new computing systems on computational mechanics and flight-vehicle structures technology (United States)

    Noor, A. K.; Storaasli, O. O.; Fulton, R. E.


    Advances in computer technology which may have an impact on computational mechanics and flight vehicle structures technology were reviewed. The characteristics of supersystems, highly parallel systems, and small systems are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario for future hardware/software environment and engineering analysis systems is presented. Research areas with potential for improving the effectiveness of analysis methods in the new environment are identified.

  14. Anesthesia information management systems marketplace and current vendors. (United States)

    Stonemetz, Jerry


    This article addresses the brief history of anesthesia information management systems (AIMS) and discusses the vendors that currently market AIMS. The current market penetration based on the information provided by these vendors is presented and the rationale for the purchase of AIMS is discussed. The considerations to be evaluated when making a vendor selection are also discussed. Copyright © 2011 Elsevier Inc. All rights reserved.


    Institute of Scientific and Technical Information of China (English)

    Yang Binfeng; Luo Feilu; Cao Xiongheng; Xu Xiaojie


    A theoretical model is established to describe the voltage-current response function. The peak amplitude and the zero-crossing time of the transient signal are extracted as imaging features, and array pulsed eddy current (PEC) imaging is proposed to detect corrosion. The test results show that this system has the advantages of fast scanning speed, multiple imaging modes, and quantitative detection, giving it broad application in aviation nondestructive testing.
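    A minimal sketch of the two imaging features named above, peak amplitude and zero-crossing time, extracted from a synthetic transient (the damped-sinusoid test signal and sampling grid are illustrative assumptions, not the paper's data):

    ```python
    import numpy as np

    def pec_features(t, signal):
        """Extract the two PEC imaging features from a transient:
        the peak amplitude and the first zero-crossing time after the peak
        (refined by linear interpolation between bracketing samples)."""
        i_pk = int(np.argmax(np.abs(signal)))
        peak = signal[i_pk]
        after = signal[i_pk:]
        crossings = np.nonzero(np.signbit(after[1:]) != np.signbit(after[:-1]))[0]
        if len(crossings) == 0:
            return peak, None
        j = crossings[0]
        t0, t1 = t[i_pk + j], t[i_pk + j + 1]
        y0, y1 = after[j], after[j + 1]
        t_zero = t0 - y0 * (t1 - t0) / (y1 - y0)
        return peak, t_zero

    # synthetic damped oscillatory response at 3 kHz
    t = np.linspace(0.0, 1e-3, 1000)
    sig = np.exp(-t / 2e-4) * np.sin(2 * np.pi * 3000 * t)
    amp, tz = pec_features(t, sig)
    ```

    Applying this per scan position over the array of probes yields the two feature images used for corrosion mapping.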

  16. Data systems and computer science programs: Overview (United States)

    Smith, Paul H.; Hunter, Paul


    An external review of the Integrated Technology Plan for the Civil Space Program is presented. The topics are presented in viewgraph form and include the following: onboard memory and storage technology; advanced flight computers; special purpose flight processors; onboard networking and testbeds; information archive, access, and retrieval; visualization; neural networks; software engineering; and flight control and operations.

  17. Central Computer IMS Processing System (CIMS). (United States)

    Wolfe, Howard

    As part of the IMS Version 3 tryout in 1971-72, software was developed to enable data submitted by IMS users to be transmitted to the central computer, which acted on the data to create IMS reports and to update the Pupil Data Base with criterion exercise and class roster information. The program logic is described, and the subroutines and…

  18. Cloud Computing Based E-Learning System (United States)

    Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.


    Cloud computing technologies, although in their early stages, have managed to change the way applications are developed and accessed. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Microsoft office applications, such as word processing, excel spreadsheet, access database…

  20. Distributed Computation Resources for Earth System Grid Federation (ESGF) (United States)

    Duffy, D.; Doutriaux, C.; Williams, D. N.


    The Intergovernmental Panel on Climate Change (IPCC), prompted by the United Nations General Assembly, published its Fifth Assessment Report (AR5), a series of papers on the processes, impacts, and mitigation of climate change, in 2013. The science used in these reports was generated by an international group of domain experts who studied various scenarios of climate change using highly complex computer models that simulate the Earth's climate over long periods of time. The resulting data, approximately five petabytes in total, are stored in a distributed data grid known as the Earth System Grid Federation (ESGF). Through the ESGF, consumers of the data can find and download data, with limited capabilities for server-side processing. The Sixth Assessment Report (AR6) is already in the planning stages and is estimated to create as much as two orders of magnitude more data than the AR5 distributed archive. It is clear that the data analysis capabilities currently in use will be inadequate for the necessary science to be done with AR6 data; the data will simply be too big. A major paradigm shift must occur: instead of downloading data to local systems for analysis, the analysis routines must move to the data, with the computations performed on distributed platforms. In preparation for this need, the ESGF has started a Compute Working Team (CWT) to create solutions that allow users to perform distributed, high-performance data analytics on the AR6 data. The team will design and develop a general Application Programming Interface (API) to enable highly parallel, server-side processing throughout the ESGF data grid. This API will be integrated with multiple analysis and visualization tools, such as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT), the netCDF Operator (NCO), and others. This presentation will provide an update on the ESGF CWT's overall approach toward enabling the necessary storage proximal computational

  1. The current status of the development of the technology on 3D computer simulation in Japan

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hee Reyoung; Park, Seung Kook; Chung, Un Soo; Jung, Ki Jung


    The development background and properties of COSIDA, the 3D computer simulation system for analyzing the dismantling procedures of nuclear facilities in Japan, are reviewed. The functions of its subsystems for work-area visualization, kinematics analysis, and dismantling-scenario analysis have been investigated. In the work-area visualization subsystem, the physical, geometrical, and radiological properties are modelled in 2D or 3D. In the kinematics analysis subsystem, a command set covering the basic work procedures for controlling the motion of the models in a cyber space was derived. The suitability of the command set was evaluated by applying COSIDA to programming the motions of remote dismantling tools for dismantling components of nuclear facilities in cyber space.

  2. Nonequilibrium Microscopic Distribution of Thermal Current in Particle Systems

    KAUST Repository

    Yukawa, Satoshi


    A nonequilibrium distribution function of microscopic thermal current is studied by direct numerical simulation in a thermally conducting steady state of particle systems. Two characteristic temperatures of the thermal current are investigated on the basis of the distribution. It is confirmed that the temperature depends on the current direction: the temperature parallel to the heat flux is higher than the antiparallel one. The difference between the parallel and antiparallel temperatures is proportional to the macroscopic temperature gradient. ©2009 The Physical Society of Japan.
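    The abstract does not give the exact estimator, but the idea of direction-dependent temperatures can be illustrated by splitting the velocity components along the heat-flux direction into forward and backward populations and computing an effective temperature for each; the half-Gaussian toy data and unit constants below are assumptions:

    ```python
    import numpy as np

    def directional_temperatures(v_par, mass=1.0, kB=1.0):
        """Effective temperatures of the particle populations moving parallel
        (v > 0) and antiparallel (v < 0) to the heat flux, from the velocity
        variance of each population (units with mass = kB = 1 assumed)."""
        fwd = v_par[v_par > 0]
        bwd = v_par[v_par < 0]
        T_par = mass * np.var(fwd) / kB
        T_anti = mass * np.var(bwd) / kB
        return T_par, T_anti

    # toy steady state: the forward-moving population is drawn slightly hotter
    rng = np.random.default_rng(0)
    fwd = np.abs(rng.normal(0.0, 1.1, 100_000))
    bwd = -np.abs(rng.normal(0.0, 1.0, 100_000))
    T_par, T_anti = directional_temperatures(np.concatenate([fwd, bwd]))
    ```

    In a heat-conducting steady state, T_par exceeds T_anti, mirroring the asymmetry reported in the paper.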

  3. Evaluation of computer-based ultrasonic inservice inspection systems

    Energy Technology Data Exchange (ETDEWEB)

    Harris, R.V. Jr.; Angel, L.J.; Doctor, S.R.; Park, W.R.; Schuster, G.J.; Taylor, T.T. [Pacific Northwest Lab., Richland, WA (United States)


    This report presents the principles, practices, terminology, and technology of computer-based ultrasonic testing for inservice inspection (UT/ISI) of nuclear power plants, with extensive use of drawings, diagrams, and UT images. The presentation is technical but assumes limited specific knowledge of ultrasonics or computers. The report is divided into 9 sections covering conventional UT, computer-based UT, and evaluation methodology. Conventional UT topics include coordinate axes, scanning, instrument operation, RF and video signals, and A-, B-, and C-scans. Computer-based topics include sampling, digitization, signal analysis, image presentation, SAFT, ultrasonic holography, transducer arrays, and data interpretation. An evaluation methodology for computer-based UT/ISI systems is presented, including questions, detailed procedures, and test block designs. Brief evaluations of several computer-based UT/ISI systems are given; supplementary volumes will provide detailed evaluations of selected systems.

  4. Output Current Ripple Reduction Algorithms for Home Energy Storage Systems

    Directory of Open Access Journals (Sweden)

    Jin-Hyuk Park


    Full Text Available This paper proposes an output current ripple reduction algorithm using a proportional-integral (PI) controller for an energy storage system (ESS). In single-phase systems, the DC-link voltage of the DC/AC inverter pulsates at a second-order harmonic, twice the grid frequency. The output current of the DC/DC converter therefore contains a ripple component caused by the DC-link voltage ripple. This second-order harmonic adversely affects the battery lifetime. The proposed algorithm has the advantage of reducing the second-order harmonic of the output current in a variable-frequency system. The proposed algorithm is verified through PSIM simulation and experiments with a 3 kW ESS model.
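    A toy closed-loop simulation illustrates how a PI current controller attenuates a second-order harmonic disturbance relative to open-loop operation; the first-order plant model, gains, and 50 Hz grid below are illustrative assumptions, not the paper's 3 kW system:

    ```python
    import numpy as np

    # illustrative parameters: a first-order converter model with a 100 Hz
    # disturbance standing in for the second-order DC-link voltage ripple
    dt, tau = 1e-5, 0.01      # simulation step and plant time constant [s]
    kp, ki = 50.0, 2000.0     # PI gains, hand-tuned for this toy plant
    i_ref, t_end = 5.0, 0.2   # current reference [A] and simulated time [s]
    n = int(t_end / dt)
    t = np.arange(n) * dt
    d = 0.5 * np.sin(2 * np.pi * 100 * t)   # twice a 50 Hz grid frequency

    def run(closed_loop):
        """Simulate the output current with or without the PI correction."""
        i, integ = 0.0, 0.0
        out = np.empty(n)
        for k in range(n):
            e = i_ref - i
            integ += e * dt
            u = kp * e + ki * integ if closed_loop else i_ref
            i += dt / tau * (u - i + d[k])   # explicit Euler plant update
            out[k] = i
        return out

    open_i, closed_i = run(False), run(True)

    def ripple(x):
        """Half the peak-to-peak excursion over the settled last quarter."""
        tail = x[-n // 4:]
        return (tail.max() - tail.min()) / 2
    ```

    With these gains the closed loop tracks the reference while shrinking the 100 Hz ripple well below the open-loop level.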

  5. Computer graphics application in the engineering design integration system (United States)

    Glatt, C. R.; Abel, R. W.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Stewart, W. A.


    The computer graphics aspect of the Engineering Design Integration (EDIN) system and its application to design problems are discussed. Three basic types of computer graphics may be used with the EDIN system for evaluating preliminary designs of aerospace vehicles: offline graphics systems using vellum-inking or photographic processes; online graphics systems characterized by directly coupled, low-cost storage-tube terminals with limited interactive capabilities; and a minicomputer-based refresh terminal offering highly interactive capabilities. The offline systems are characterized by high quality (resolution better than 0.254 mm) and slow turnaround (one to four days). The online systems are characterized by low cost, instant visualization of computed results, slow line speed (300 baud), poor hard copy, and early limitations on vector graphic input capabilities. The recent acquisition of the Adage 330 Graphic Display system has greatly enhanced the potential for interactive computer-aided design.

  6. Security for small computer systems a practical guide for users

    CERN Document Server

    Saddington, Tricia


    Security for Small Computer Systems: A Practical Guide for Users is a guidebook for security concerns for small computers. The book provides security advice for the end-users of small computers in different aspects of computing security. Chapter 1 discusses the security and threats, and Chapter 2 covers the physical aspect of computer security. The text also talks about the protection of data, and then deals with the defenses against fraud. Survival planning and risk assessment are also encompassed. The last chapter tackles security management from an organizational perspective. The bo

  7. Fast high-resolution computer-generated hologram computation using multiple graphics processing unit cluster system. (United States)

    Takada, Naoki; Shimobaba, Tomoyoshi; Nakayama, Hirotaka; Shiraki, Atsushi; Okada, Naohisa; Oikawa, Minoru; Masuda, Nobuyuki; Ito, Tomoyoshi


    To overcome the computational complexity of a computer-generated hologram (CGH), we implement an optimized CGH computation in our multi-graphics processing unit cluster system. Our system can calculate a CGH of 6,400×3,072 pixels from a three-dimensional (3D) object composed of 2,048 points in 55 ms. Furthermore, in the case of a 3D object composed of 4096 points, our system is 553 times faster than a conventional central processing unit (using eight threads).
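    The underlying point-source CGH computation that such GPU clusters accelerate can be sketched as a superposition of spherical waves; the grid size, wavelength, and pixel pitch below are illustrative assumptions (the paper's system computes 6,400×3,072 pixels):

    ```python
    import numpy as np

    wavelength = 532e-9   # assumed green laser wavelength [m]
    pitch = 8e-6          # assumed hologram pixel pitch [m]
    nx, ny = 256, 128     # small grid for illustration only

    def cgh(points, amps):
        """Point-source CGH: superpose a spherical wave from each 3D object
        point and keep the cosine (real) part as the hologram pattern."""
        x = (np.arange(nx) - nx / 2) * pitch
        y = (np.arange(ny) - ny / 2) * pitch
        X, Y = np.meshgrid(x, y)          # (ny, nx) sample coordinates
        k = 2 * np.pi / wavelength
        H = np.zeros((ny, nx))
        for (px, py, pz), a in zip(points, amps):
            r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
            H += a * np.cos(k * r)        # one fringe pattern per point
        return H

    pts = [(0.0, 0.0, 0.1), (1e-4, 0.0, 0.12)]   # two object points [m]
    H = cgh(pts, [1.0, 1.0])
    ```

    The cost grows as pixels × object points, which is why thousands of points at megapixel resolution call for multi-GPU parallelism.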

  8. Computational fluid dynamics for turbomachinery internal air systems. (United States)

    Chew, John W; Hills, Nicholas J


    Considerable progress in development and application of computational fluid dynamics (CFD) for aeroengine internal flow systems has been made in recent years. CFD is regularly used in industry for assessment of air systems, and the performance of CFD for basic axisymmetric rotor/rotor and stator/rotor disc cavities with radial throughflow is largely understood and documented. Incorporation of three-dimensional geometrical features and calculation of unsteady flows are becoming commonplace. Automation of CFD, coupling with thermal models of the solid components, and extension of CFD models to include both air system and main gas path flows are current areas of development. CFD is also being used as a research tool to investigate a number of flow phenomena that are not yet fully understood. These include buoyancy-affected flows in rotating cavities, rim seal flows and mixed air/oil flows. Large eddy simulation has shown considerable promise for the buoyancy-driven flows and its use for air system flows is expected to expand in the future.

  9. Evaluation of Current Controllers for Distributed Power Generation Systems

    DEFF Research Database (Denmark)

    Timbus, Adrian; Liserre, Marco; Teodorescu, Remus


    This paper discusses the evaluation of different current controllers employed in grid-connected distributed power generation systems with variable input power, such as wind turbines and photovoltaic systems. The focus is mainly on linear controllers such as proportional… First, in steady-state conditions, the contribution of the controllers to the total harmonic distortion of the grid current is assessed. Further on, the behavior of the controllers under transient conditions such as input power variations and grid voltage faults is also examined. Experimental results…

  10. Return-current formation in the electron beam – plasma system

    Directory of Open Access Journals (Sweden)

    M. Bárta


    Full Text Available Using a 3-D electromagnetic particle-in-cell model, the evolution of the electron distribution function in a beam-plasma system with a return current is computed. It was found that the resulting electron distribution function depends on the magnetic field assumed along the beam-propagation direction. While for small magnetic fields the electron distribution function becomes broad in the direction perpendicular to the beam propagation due to the Weibel (filamentation) instability and the return current is formed by a shifted bulk distribution, for stronger magnetic fields the distribution, especially on the return-current side, is extended in the beam-propagation direction. To better understand the instabilities influencing these processes, the dispersion diagrams are computed and discussed.

  11. Software fault tolerance in computer operating systems (United States)

    Iyer, Ravishankar K.; Lee, Inhwan


    This chapter provides data and analysis of the dependability and fault tolerance for three operating systems: the Tandem/GUARDIAN fault-tolerant system, the VAX/VMS distributed system, and the IBM/MVS system. Based on measurements from these systems, basic software error characteristics are investigated. Fault tolerance in operating systems resulting from the use of process pairs and recovery routines is evaluated. Two levels of models are developed to analyze error and recovery processes inside an operating system and interactions among multiple instances of an operating system running in a distributed environment. The measurements show that the use of process pairs in Tandem systems, which was originally intended for tolerating hardware faults, allows the system to tolerate about 70% of defects in system software that result in processor failures. The loose coupling between processors which results in the backup execution (the processor state and the sequence of events occurring) being different from the original execution is a major reason for the measured software fault tolerance. The IBM/MVS system fault tolerance almost doubles when recovery routines are provided, in comparison to the case in which no recovery routines are available. However, even when recovery routines are provided, there is almost a 50% chance of system failure when critical system jobs are involved.

  12. Iterative reconstruction for quantitative computed tomography analysis of emphysema: consistent results using different tube currents

    Directory of Open Access Journals (Sweden)

    Yamashiro T


    Full Text Available Tsuneo Yamashiro,1 Tetsuhiro Miyara,1 Osamu Honda,2 Noriyuki Tomiyama,2 Yoshiharu Ohno,3 Satoshi Noma,4 Sadayuki Murayama1 On behalf of the ACTIve Study Group 1Department of Radiology, Graduate School of Medical Science, University of the Ryukyus, Nishihara, Okinawa, Japan; 2Department of Radiology, Osaka University Graduate School of Medicine, Suita, Osaka, Japan; 3Department of Radiology, Kobe University Graduate School of Medicine, Kobe, Hyogo, Japan; 4Department of Radiology, Tenri Hospital, Tenri, Nara, Japan Purpose: To assess the advantages of iterative reconstruction for quantitative computed tomography (CT) analysis of pulmonary emphysema. Materials and methods: Twenty-two patients with pulmonary emphysema underwent chest CT imaging using identical scanners with three different tube currents: 240, 120, and 60 mA. Scan data were converted to CT images using Adaptive Iterative Dose Reduction using Three Dimensional Processing (AIDR3D) and a conventional filtered-back-projection mode. Thus, six scans with and without AIDR3D were generated per patient. All other scanning and reconstruction settings were fixed. The percent low attenuation area (LAA%; < -950 Hounsfield units) and the lung density 15th percentile were automatically measured using a commercial workstation. Comparisons of LAA% and 15th percentile results between scans with and without AIDR3D were made by Wilcoxon signed-rank tests. Associations between body weight and measurement errors among these scans were evaluated by Spearman rank correlation analysis. Results: Overall, scan series without AIDR3D had higher LAA% and lower 15th percentile values than those with AIDR3D at each tube current (P<0.0001). For scan series without AIDR3D, lower tube currents resulted in higher LAA% values and lower 15th percentiles. The extent of emphysema was significantly different between each pair among scans when not using AIDR3D (LAA%, P<0.0001; 15th percentile, P<0.01), but was not
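    The two quantitative emphysema measures used in this study, LAA% (voxels below -950 HU) and the lung-density 15th percentile, are straightforward to compute from the voxel histogram; the synthetic HU values below are purely illustrative:

    ```python
    import numpy as np

    def emphysema_indices(hu, threshold=-950.0, pct=15):
        """Quantitative CT emphysema measures: percent low-attenuation area
        (LAA%, fraction of voxels below the HU threshold, as a percentage)
        and the given percentile of the lung density histogram."""
        hu = np.asarray(hu, float)
        laa = 100.0 * np.mean(hu < threshold)
        p = np.percentile(hu, pct)
        return laa, p

    # synthetic lung histogram: mostly ~-870 HU plus an emphysematous tail
    rng = np.random.default_rng(0)
    hu = np.concatenate([rng.normal(-870, 30, 9000),
                         rng.normal(-975, 15, 1000)])
    laa, p15 = emphysema_indices(hu)
    ```

    Noisier reconstructions widen the histogram, which is why lower tube currents without iterative reconstruction inflate LAA% and depress the 15th percentile.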

  13. TRL Computer System User’s Guide

    Energy Technology Data Exchange (ETDEWEB)

    Engel, David W.; Dalton, Angela C.


    We have developed a wiki-based graphical user-interface system that implements our technology readiness level (TRL) uncertainty models. This document contains the instructions for using this wiki-based system.

  14. Computer Sciences and Data Systems, volume 1 (United States)


    Topics addressed include: software engineering; university grants; institutes; concurrent processing; sparse distributed memory; distributed operating systems; intelligent data management processes; expert system for image analysis; fault tolerant software; and architecture research.

  15. OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems (United States)

    Kao, David L.; Chan, William M.


    Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an html file for archival purposes. The current implementation has been targeted for the OVERFLOW flow solver, which is used to obtain a flow solution on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.


    Directory of Open Access Journals (Sweden)

    Sunil Kr Singh


    Full Text Available With the emergence of ubiquitous computing, the whole landscape of computing has changed, affecting many interdisciplinary fields. This paper envisions the impact of ubiquitous computing on video surveillance systems. With increasing population and highly specific security areas, intelligent monitoring is a major requirement of the modern world. The paper describes the evolution of surveillance systems from analog to multi-sensor ubiquitous systems and notes the demand for context-based architectures. It draws out the benefits of merging cloud computing into surveillance systems to boost capability while reducing cost and maintenance, and it analyzes several surveillance system architectures designed for ubiquitous deployment. It also outlines major challenges and opportunities for researchers to make surveillance systems highly efficient and seamlessly embedded in our environments.

  17. Information Hiding based Trusted Computing System Design (United States)


    This report studies how to build trust into computing systems from intrinsic properties of the system (silicon PUFs) and of the environment where the system operates (electrical network frequency, ENF, signals), and how to improve the trust in a wireless sensor network… Related invited talk: "Designing Trusted Energy-Efficient Circuits and Systems," Harbin Institute of Technology, Shenzhen, China, May 26, 2013 (host: Prof. Aijiao Cui).

  18. Computer Aided-Diagnosis of Prostate Cancer on Multiparametric MRI: A Technical Review of Current Research

    Directory of Open Access Journals (Sweden)

    Shijun Wang


    Full Text Available Prostate cancer (PCa) is the most commonly diagnosed cancer among men in the United States. In this paper, we survey computer aided-diagnosis (CADx) systems that use multiparametric magnetic resonance imaging (MP-MRI) for detection and diagnosis of prostate cancer. We review and list mainstream techniques that are commonly utilized in image segmentation, registration, feature extraction, and classification. The performances of 15 state-of-the-art prostate CADx systems are compared through the area under their receiver operating characteristic curves (AUC). Challenges and potential directions to further the research of prostate CADx are discussed in this paper. Further improvements should be investigated to make prostate CADx systems useful in clinical practice.
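    The AUC used to compare the 15 CADx systems is equivalent to the Mann-Whitney U statistic: the probability that a randomly chosen positive case is scored above a randomly chosen negative case. A minimal sketch (the toy labels and scores are illustrative):

    ```python
    import numpy as np

    def auc(labels, scores):
        """Area under the ROC curve via pairwise comparison of positive
        and negative scores (ties count one half)."""
        labels = np.asarray(labels, bool)
        scores = np.asarray(scores, float)
        pos, neg = scores[labels], scores[~labels]
        greater = (pos[:, None] > neg[None, :]).sum()
        ties = (pos[:, None] == neg[None, :]).sum()
        return (greater + 0.5 * ties) / (len(pos) * len(neg))

    y = [1, 1, 1, 0, 0, 0]
    s = [0.9, 0.8, 0.3, 0.7, 0.2, 0.1]
    a = auc(y, s)   # 8 of 9 positive/negative pairs correctly ordered
    ```

    The pairwise form is O(n²); rank-based implementations achieve the same value in O(n log n) for large evaluation sets.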

  19. Computer aided-diagnosis of prostate cancer on multiparametric MRI: a technical review of current research. (United States)

    Wang, Shijun; Burtt, Karen; Turkbey, Baris; Choyke, Peter; Summers, Ronald M


    Prostate cancer (PCa) is the most commonly diagnosed cancer among men in the United States. In this paper, we survey computer aided-diagnosis (CADx) systems that use multiparametric magnetic resonance imaging (MP-MRI) for detection and diagnosis of prostate cancer. We review and list mainstream techniques that are commonly utilized in image segmentation, registration, feature extraction, and classification. The performances of 15 state-of-the-art prostate CADx systems are compared through the area under their receiver operating characteristic curves (AUC). Challenges and potential directions to further the research of prostate CADx are discussed in this paper. Further improvements should be investigated to make prostate CADx systems useful in clinical practice.

  20. Alternating Current All-electrical Gun Control System in Tanks

    Directory of Open Access Journals (Sweden)

    Zang Kemao


    Full Text Available The AC all-electrical gun control system is composed of permanent-magnet synchronous machine drive control systems and a ball screw, replacing the complicated electrohydraulic systems. At the same time, a variable-structure control scheme with sliding modes gives the gun control system higher performance using only the rate flexure gyroscope; the vehicle hull gyroscope and angular gyroscope are thereby eliminated. Compared with current gun control systems, the new AC all-electrical gun control system is reduced by 40 percent in weight and 30 percent in volume, increased by 35 percent in efficiency, and has three times the service life.

  1. A new on-line leakage current monitoring system of ZnO surge arresters

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Bok-Hee; Kang, Sung-Man [Research Center for Next-Generation High Voltage and Power Technology, Inha University, 253 Yonghyun-dong, Nam-ku, Incheon 402-751 (Korea, Republic of)]


    This paper presents a new on-line leakage current monitoring system for zinc oxide (ZnO) surge arresters. To effectively diagnose the deterioration of ZnO surge arresters, a new algorithm and an on-line leakage current detection device, based on the time-delay addition method for discriminating the resistive and capacitive currents, were developed for use in aging tests and durability evaluation of ZnO arrester blocks. As a computer-based measurement system for the resistive leakage current, the on-line monitoring device can accurately detect the leakage currents flowing through ZnO surge arresters under power-frequency AC applied voltages. The proposed on-line leakage current monitoring device is more sensitive and gives a more linear response than existing devices based on detection of the third-harmonic leakage current. Therefore, it can be useful for predicting defects and performance deterioration of ZnO surge arresters in power system applications.
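    The abstract does not detail the time-delay addition algorithm itself, so as an illustration of the resistive/capacitive split it performs, the sketch below uses standard synchronous detection against the applied-voltage phase; the waveform amplitudes and frequency are assumed:

    ```python
    import numpy as np

    def split_leakage(t, i_total, freq):
        """Decompose the total arrester leakage current into the resistive
        component (in phase with the applied voltage, assumed ~ sin(wt)) and
        the capacitive component (90 degrees leading), by synchronous
        detection over an integer number of cycles."""
        w = 2 * np.pi * freq
        i_r = 2 * np.mean(i_total * np.sin(w * t))   # in-phase peak amplitude
        i_c = 2 * np.mean(i_total * np.cos(w * t))   # quadrature peak amplitude
        return i_r, i_c

    # synthetic 60 Hz waveform: 1 mA resistive + 3 mA capacitive, 5 cycles
    f = 60.0
    t = np.linspace(0.0, 5 / f, 5000, endpoint=False)
    i = 1e-3 * np.sin(2 * np.pi * f * t) + 3e-3 * np.cos(2 * np.pi * f * t)
    Ir, Ic = split_leakage(t, i, f)
    ```

    Monitoring the growth of the recovered resistive component over time is what flags arrester deterioration.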

  2. Current status of nuclear power plant I and C systems (2000)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Young; Park, J.H.; Lee, J.S. and others


    Analog I and C systems of nuclear power plants are being replaced by digital systems because of aging problems in the existing I and C systems. New NPPs have adopted computer-based digital I and C systems because their economic efficiency and usability have become higher than those of analog I and C systems. However, digital I and C systems have not been widely applied to NPPs because the reliability of computer systems and software has not been fully validated; research on this reliability is being performed in many institutions. In this study, we reviewed the current status of I and C systems for advanced NPPs developed in Korea as well as in other countries up to this year. We hope to use the results of this study in planning the localization of NPP I and C systems. The I and C systems of advanced reactors such as AP600 and NUPLEX 80+ of the USA, CANDU 9 of Canada, APWRs and ABWRs of Japan, N4 of France, and KNGR, KALIMER, and SMART of Korea were reviewed. We also reviewed the nuclear policies of the USA and Europe and the NPP digital I and C systems developed at many international research institutes. Using these results, we extracted items to be researched and classified them by reactor type. We then established a localization strategy for NPP digital I and C systems.

  3. Evaluation of Computer-Based Procedure System Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Johanna Oxstrand; Katya Le Blanc; Seth Hays


    This research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which is a research and development (R&D) program sponsored by Department of Energy (DOE), performed in close collaboration with industry R&D programs, to provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. The LWRS program serves to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. The introduction of advanced technology in existing nuclear power plants may help to manage the effects of aging systems, structures, and components. In addition, the incorporation of advanced technology in the existing LWR fleet may entice the future workforce, who will be familiar with advanced technology, to work for these utilities rather than more newly built nuclear power plants. Advantages are being sought by developing and deploying technologies that will increase safety and efficiency. One significant opportunity for existing plants to increase efficiency is to phase out the paper-based procedures (PBPs) currently used at most nuclear power plants and replace them, where feasible, with computer-based procedures (CBPs). PBPs have ensured safe operation of plants for decades, but limitations in paper-based systems do not allow them to reach the full potential for procedures to prevent human errors. The environment in a nuclear power plant is constantly changing depending on current plant status and operating mode. PBPs, which are static by nature, are being applied to a constantly changing context. This constraint often results in PBPs that are written in a manner that is intended to cover many potential operating scenarios. Hence, the procedure layout forces the operator to search through a large amount of irrelevant information to locate the pieces of information

  4. Current fluctuations in stochastic systems with long-range memory

    Energy Technology Data Exchange (ETDEWEB)

    Harris, R J; Touchette, H [School of Mathematical Sciences, Queen Mary University of London, Mile End Road, London, E1 4NS (United Kingdom)]


    We propose a method to calculate the large deviations of current fluctuations in a class of stochastic particle systems with history-dependent rates. Long-range temporal correlations are seen to alter the speed of the large deviation function in analogy with long-range spatial correlations in equilibrium systems. We give some illuminating examples and discuss the applicability of the Gallavotti-Cohen fluctuation theorem. (fast track communication)

  5. On the Computational Capabilities of Physical Systems. Part 1; The Impossibility of Infallible Computation (United States)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)


    In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out any computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, which is a 'physical computation

  6. Automated fermentation equipment. 2. Computer-fermentor system

    Energy Technology Data Exchange (ETDEWEB)

    Nyeste, L.; Szigeti, L.; Veres, A.; Pungor, E. Jr.; Kurucz, I.; Hollo, J.


An inexpensive computer-operated system suitable for data collection and steady-state optimum control of fermentation processes is presented. With this system, minimum generation time has been determined as a function of temperature and pH in the turbidostat cultivation of a yeast strain. The applicability of the computer-fermentor system is also demonstrated by the determination of the dynamic kLa value (volumetric oxygen transfer coefficient).

  7. Managing trust in information systems by using computer simulations


    Zupančič, Eva


The human factor is increasingly important in new information systems and should be taken into consideration when developing new systems. Trust issues, which are tightly tied to the human factor, are becoming an important topic in computer science. In this work we research trust in IT systems and present computer-based trust management solutions. After a review of qualitative and quantitative methods for trust management, a precise description of a simulation tool for trust management ana...

  8. Personal Computer System for Automatic Coronary Venous Flow Measurement


    Dew, Robert B.


We developed an automated system based on an IBM PC/XT personal computer to measure coronary venous blood flow during cardiac catheterization. Flow is determined by a thermodilution technique in which a cold saline solution is infused through a catheter into the coronary venous system. Regional temperature fluctuations sensed by the catheter are used to determine great cardiac vein and coronary sinus blood flow. The computer system replaces manual methods of acquiring and analyzing temperatur...
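
The continuous-infusion thermodilution calculation that such a system automates can be sketched as follows. This is the standard heat-balance estimate only, not the paper's own code; the function name, variable names, and the 1.08 correction factor (approximate ratio of density times specific heat of saline to that of blood) are illustrative assumptions:

```python
def thermodilution_flow(f_inj, t_blood, t_inj, t_mix, k=1.08):
    """Estimate coronary venous blood flow (same units as f_inj).

    f_inj   -- infusion rate of the cold saline indicator (e.g. ml/min)
    t_blood -- blood temperature upstream of the infusion site (deg C)
    t_inj   -- temperature of the injected saline (deg C)
    t_mix   -- mixed blood/indicator temperature at the sensing thermistor
    k       -- indicator-to-blood (density * specific heat) ratio, ~1.08 for saline
    """
    # Heat balance: heat gained by the cold indicator equals heat lost by
    # the blood, so flow follows from the temperature ratio alone.
    return k * f_inj * ((t_blood - t_inj) / (t_blood - t_mix) - 1.0)
```

A computer replaces the manual version of exactly this arithmetic, applied continuously to the sampled thermistor signal.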

  9. Improving the safety features of general practice computer systems


    Anthony Avery; Boki Savelyich; Sheila Teasdale


    General practice computer systems already have a number of important safety features. However, there are problems in that general practitioners (GPs) have come to rely on hazard alerts when they are not foolproof. Furthermore, GPs do not know how to make best use of safety features on their systems. There are a number of solutions that could help to improve the safety features of general practice computer systems and also help to improve the abilities of healthcare professionals to use these ...

  10. Multiple-User, Multitasking, Virtual-Memory Computer System (United States)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.


    Computer system designed and programmed to serve multiple users in research laboratory. Provides for computer control and monitoring of laboratory instruments, acquisition and anlaysis of data from those instruments, and interaction with users via remote terminals. System provides fast access to shared central processing units and associated large (from megabytes to gigabytes) memories. Underlying concept of system also applicable to monitoring and control of industrial processes.

  11. Computational dosimetry for grounded and ungrounded human models due to contact current (United States)

    Chan, Kwok Hung; Hattori, Junya; Laakso, Ilkka; Hirata, Akimasa; Taki, Masao


    This study presents the computational dosimetry of contact currents for grounded and ungrounded human models. The uncertainty of the quasi-static (QS) approximation of the in situ electric field induced in a grounded/ungrounded human body due to the contact current is first estimated. Different scenarios of cylindrical and anatomical human body models are considered, and the results are compared with the full-wave analysis. In the QS analysis, the induced field in the grounded cylindrical model is calculated by the QS finite-difference time-domain (QS-FDTD) method, and compared with the analytical solution. Because no analytical solution is available for the grounded/ungrounded anatomical human body model, the results of the QS-FDTD method are then compared with those of the conventional FDTD method. The upper frequency limit for the QS approximation in the contact current dosimetry is found to be 3 MHz, with a relative local error of less than 10%. The error increases above this frequency, which can be attributed to the neglect of the displacement current. The QS or conventional FDTD method is used for the dosimetry of induced electric field and/or specific absorption rate (SAR) for a contact current injected into the index finger of a human body model in the frequency range from 10 Hz to 100 MHz. The in situ electric fields or SAR are compared with the basic restrictions in the international guidelines/standards. The maximum electric field or the 99th percentile value of the electric fields appear not only in the fat and muscle tissues of the finger, but also around the wrist, forearm, and the upper arm. Some discrepancies are observed between the basic restrictions for the electric field and SAR and the reference levels for the contact current, especially in the extremities. These discrepancies are shown by an equation that relates the current density, tissue conductivity, and induced electric field in the finger with a cross-sectional area of 1 cm2.
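
The relation mentioned in the closing sentence can be written out explicitly. Assuming a quasi-static regime and a uniform current density over the finger cross-section (an idealization of the setup described above), Ohm's law gives the induced in situ field:

```latex
E = \frac{J}{\sigma} = \frac{I}{\sigma A}, \qquad A = 1~\mathrm{cm}^2
```

so for a given contact current $I$, the local field scales inversely with the tissue conductivity $\sigma$ and the conducting cross-sectional area $A$, which is why the small cross-sections of the extremities produce the discrepancies noted.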

  12. Performance Models for Split-execution Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; McCaskey, Alex [ORNL; Schrock, Jonathan [ORNL; Seddiqi, Hadayat [ORNL; Britt, Keith A [ORNL; Imam, Neena [ORNL


    Split-execution computing leverages the capabilities of multiple computational models to solve problems, but splitting program execution across different computational models incurs costs associated with the translation between domains. We analyze the performance of a split-execution computing system developed from conventional and quantum processing units (QPUs) by using behavioral models that track resource usage. We focus on asymmetric processing models built using conventional CPUs and a family of special-purpose QPUs that employ quantum computing principles. Our performance models account for the translation of a classical optimization problem into the physical representation required by the quantum processor while also accounting for hardware limitations and conventional processor speed and memory. We conclude that the bottleneck in this split-execution computing system lies at the quantum-classical interface and that the primary time cost is independent of quantum processor behavior.
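
A minimal behavioral model of such a pipeline can be sketched as below. The decomposition into translation (problem embedding), anneal, and readout stages and all timing values are illustrative assumptions, not figures from the paper:

```python
def split_execution_time(t_translate, t_anneal, t_readout, n_samples):
    """Wall-clock time for one problem on a hybrid CPU/QPU pipeline.

    t_translate -- one-off classical cost of mapping the optimization problem
                   onto the quantum hardware (the quantum-classical interface)
    t_anneal    -- per-sample quantum annealing time
    t_readout   -- per-sample measurement/readback time
    n_samples   -- number of samples drawn from the QPU
    """
    return t_translate + n_samples * (t_anneal + t_readout)

# With translation on the order of seconds and per-sample costs on the order
# of microseconds, the interface term dominates total runtime:
total = split_execution_time(t_translate=2.0, t_anneal=20e-6,
                             t_readout=120e-6, n_samples=1000)
```

Under these assumed numbers the fixed translation cost outweighs the entire quantum sampling phase, illustrating the interface bottleneck the abstract describes.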

  13. Intelligent decision support systems for sustainable computing paradigms and applications

    CERN Document Server

    Abraham, Ajith; Siarry, Patrick; Sheng, Michael


This unique book discusses the latest research, innovative ideas, challenges and computational intelligence (CI) solutions in sustainable computing. It presents novel, in-depth fundamental research on achieving a sustainable lifestyle for society, either from a methodological or from an application perspective. Sustainable computing has expanded to become a significant research area covering the fields of computer science and engineering, electrical engineering and other engineering disciplines, and there has been an increase in the amount of literature on aspects of sustainable computing such as energy efficiency and natural resources conservation that emphasizes the role of ICT (information and communications technology) in achieving system design and operation objectives. The energy impact/design of more efficient IT infrastructures is a key challenge in realizing new computing paradigms. The book explores the uses of computational intelligence (CI) techniques for intelligent decision support that can be explo...

  14. Resource requirements for digital computations on electrooptical systems. (United States)

    Eshaghian, M M; Panda, D K; Kumar, V K


In this paper we study the resource requirements of electrooptical organizations in performing digital computing tasks. We define a generic model of parallel computation using optical interconnects, called the optical model of computation (OMC). In this model, computation is performed in digital electronics and communication is performed using free space optics. Using this model we derive relationships between information transfer and computational resources in solving a given problem. To illustrate our results, we concentrate on a computationally intensive operation, 2-D digital image convolution. Irrespective of the input/output scheme and the order of computation, we show a lower bound of Ω(nw) on the optical volume required for convolving a w x w kernel with an n x n image, if the input bits are given to the system only once.
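
The convolution workload underlying this bound can be made concrete with a direct (all-electronic) implementation. The code below is a plain NumPy sketch of valid-mode 2-D convolution of a w x w kernel with an n x n image, costing O(n²w²) multiplications; it illustrates the problem being bounded, not the electrooptical architecture itself:

```python
import numpy as np

def convolve2d_valid(image, kernel):
    """Direct 2-D convolution over the 'valid' region of an n x n image."""
    n = image.shape[0]
    w = kernel.shape[0]
    flipped = kernel[::-1, ::-1]          # convolution flips the kernel
    out = np.empty((n - w + 1, n - w + 1))
    for i in range(n - w + 1):
        for j in range(n - w + 1):
            # One inner product of w*w terms per output pixel.
            out[i, j] = np.sum(image[i:i + w, j:j + w] * flipped)
    return out
```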

  15. Resource requirements for digital computations on electrooptical systems (United States)

    Eshaghian, Mary M.; Panda, Dhabaleswar K.; Kumar, V. K. Prasanna


The resource requirements of electrooptical organizations in performing digital computing tasks are studied via a generic model of parallel computation using optical interconnects, called the 'optical model of computation' (OMC). In this model, computation is performed in digital electronics and communication is performed using free space optics. Relationships between information transfer and computational resources in solving a given problem are derived. A computationally intensive operation, two-dimensional digital image convolution, is undertaken. Irrespective of the input/output scheme and the order of computation, a lower bound of Ω(nw) is obtained on the optical volume required for convolving a w x w kernel with an n x n image, if the input bits are given to the system only once.

  16. 3-dimensional current collection model. [of Tethered Satellite System 1 (United States)

    Hwang, Kai-Shen; Shiah, A.; Wu, S. T.; Stone, N.


A three-dimensional, time-dependent current collection model of a satellite has been developed for the TSS-1 system. The system has been simulated particularly for the Research on Orbital Plasma Electrodynamics (ROPE) experiment. Maxwellian-distributed particles with geomagnetic field effects are applied in this numerical simulation. The preliminary results indicate that a ring current surrounds the satellite in the equatorial plane. This ring current is found between the plasma sheath and the satellite surface and oscillates on a time scale of approximately 1 microsecond, corresponding to the electron plasma frequency. An hourglass-shaped electron distribution is observed when the viewing direction is perpendicular to the equatorial plane. This result is consistent with previous findings from Linson (1969) and Antoniades et al. (1990). The electron current collected by the satellite from the background ionosphere is limited, as indicated by Parker and Murphy (1967).
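
The stated link between the ~1 microsecond oscillation and the electron plasma frequency can be checked with the standard cold-plasma formula; the density quoted in the closing comment is back-calculated from that formula, not a value taken from the paper:

```python
import math

def electron_plasma_frequency(n_e):
    """Electron plasma frequency f_pe in Hz for electron density n_e in m^-3."""
    e = 1.602176634e-19      # elementary charge, C
    eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
    m_e = 9.1093837015e-31   # electron mass, kg
    omega_pe = math.sqrt(n_e * e**2 / (eps0 * m_e))  # angular frequency, rad/s
    return omega_pe / (2.0 * math.pi)

# A 1-microsecond oscillation corresponds to f_pe on the order of 1 MHz,
# i.e. an electron density of roughly 1.2e10 m^-3 by this formula.
```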

  17. 14 CFR 417.123 - Computing systems and software. (United States)


    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Computing systems and software. 417.123... systems and software. (a) A launch operator must document a system safety process that identifies the... systems and software. (b) A launch operator must identify all safety-critical functions associated with...

  18. Design of Computer Fault Diagnosis and Troubleshooting System ...

    African Journals Online (AJOL)



    Dec 1, 2013 ... We model our system using Object-Oriented Analysis and Design. (OOAD) and UML ... high-level concept of a system. ... on the design of an expert system for computer .... opened distributed application, has rich type system ...

  19. Systems, methods and computer-readable media to model kinetic performance of rechargeable electrochemical devices (United States)

    Gering, Kevin L.


    A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics. The computing system also analyzes the cell information of the electrochemical cell with a Butler-Volmer (BV) expression modified to determine exchange current density of the electrochemical cell by including kinetic performance information related to pulse-time dependence, electrode surface availability, or a combination thereof. A set of sigmoid-based expressions may be included with the modified-BV expression to determine kinetic performance as a function of pulse time. The determined exchange current density may be used with the modified-BV expression, with or without the sigmoid expressions, to analyze other characteristics of the electrochemical cell. Model parameters can be defined in terms of cell aging, making the overall kinetics model amenable to predictive estimates of cell kinetic performance along the aging timeline.
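
The unmodified Butler-Volmer relation at the core of this analysis can be sketched as below. This is the textbook form only: the paper's modifications for pulse-time dependence and electrode surface availability, and the sigmoid-based terms, are not reproduced, and the default parameter values are illustrative:

```python
import math

F = 96485.33212  # Faraday constant, C/mol
R = 8.314462618  # molar gas constant, J/(mol K)

def butler_volmer(i0, eta, T=298.15, alpha_a=0.5, alpha_c=0.5):
    """Net current density (same units as i0) at overpotential eta (V).

    i0      -- exchange current density, the quantity the system extracts
    alpha_a -- anodic charge-transfer coefficient
    alpha_c -- cathodic charge-transfer coefficient
    """
    f = F / (R * T)
    # Anodic branch minus cathodic branch; zero net current at eta = 0.
    return i0 * (math.exp(alpha_a * f * eta) - math.exp(-alpha_c * f * eta))
```

Fitting measured current-overpotential pairs to this expression is what lets the exchange current density i0 be recovered from sampled cell data.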

  20. LLNL current meter array--concept and system description

    Energy Technology Data Exchange (ETDEWEB)

    Mantrom, D.D. [Lawrence Livermore National Lab., CA (United States)


A measurement capability using a horizontal array of 10 S4 current meters mounted on a stiff floating structure with a 35 m aperture has been developed to support interpretation of radar imaging of surface effects associated with internal waves. This system has been fielded three times and, most recently, has collected data alongside the sea-surface footprint of a land-fixed radar imaging ship-generated internal waves. The underlying need for this measurement capability is described. The specifications resulting from this need are presented, and the engineering design and deployment procedures of the platform and systems that resulted are described. The current meter data are multiplexed along with meteorological and system status data on board the floating platform and are telemetered to a shore station and on to a data acquisition system. The raw data are recorded and then processed to form space-time images of current and strain rate (a spatial derivative of the current field). Examples of raw and processed data associated with ship-generated internal waves are presented.