WorldWideScience

Sample records for automatic capture management

  1. Gestalt perceptual organization of visual stimuli captures attention automatically: Electrophysiological evidence

    Directory of Open Access Journals (Sweden)

    Francesco Marini

    2016-08-01

    The visual system leverages organizational regularities of perceptual elements to create meaningful representations of the world. One clear example of this function, formalized in the Gestalt psychology principles, is the perceptual grouping of simple visual elements (e.g., lines and arcs) into unitary objects (e.g., forms and shapes). The present study sought to characterize automatic attentional capture and the related cognitive processing of Gestalt-like visual stimuli at the psychophysiological level using event-related potentials (ERPs). We measured ERPs during a simple visual reaction time task with bilateral presentations of physically matched elements with or without a Gestalt organization. Results showed that Gestalt (vs. non-Gestalt) stimuli are characterized by a larger N2pc together with enhanced ERP amplitudes of non-lateralized components (N1, N2, P3) starting around 150 ms post-stimulus onset. We conclude that Gestalt stimuli capture attention automatically and entail characteristic psychophysiological signatures at both early and late processing stages.
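
    The N2pc reported above is conventionally computed as a contralateral-minus-ipsilateral difference wave over posterior electrodes within a fixed time window. The sketch below illustrates that calculation on hypothetical ERP arrays; the electrode pairing, time window, and sampling rate are assumptions, not values taken from the study.

    ```python
    import numpy as np

    def n2pc_amplitude(erp_contra, erp_ipsi, times, window=(0.200, 0.300)):
        """Mean contralateral-minus-ipsilateral difference in a time window.

        erp_contra, erp_ipsi : 1-D arrays (microvolts) averaged over posterior
        electrodes (e.g., PO7/PO8) relative to target side -- hypothetical data.
        times : 1-D array of sample times in seconds.
        """
        diff = erp_contra - erp_ipsi
        mask = (times >= window[0]) & (times <= window[1])
        return diff[mask].mean()

    # Illustrative use with fabricated waveforms (250 Hz sampling, -0.1 to 0.5 s).
    times = np.arange(-0.1, 0.5, 1 / 250)
    contra = np.random.randn(times.size) * 0.5 - 1.0 * ((times > 0.2) & (times < 0.3))
    ipsi = np.random.randn(times.size) * 0.5
    print(f"N2pc amplitude: {n2pc_amplitude(contra, ipsi, times):.2f} uV")
    ```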

  2. Evaluating automatic attentional capture by self-relevant information.

    Science.gov (United States)

    Ocampo, Brenda; Kahan, Todd A

    2016-01-01

    Our everyday decisions and memories are inadvertently influenced by self-relevant information. For example, we are faster and more accurate at making perceptual judgments about stimuli associated with ourselves, such as our own face or name, as compared with familiar non-self-relevant stimuli. Humphreys and Sui propose a "self-attention network" to account for these effects, wherein self-relevant stimuli automatically capture our attention and subsequently enhance the perceptual processing of self-relevant information. We propose that the masked priming paradigm and continuous flash suppression represent two ways to experimentally examine these controversial claims.

  3. Automatic Capture Verification in Pacemakers (Autocapture) – Utility and Problems

    Directory of Open Access Journals (Sweden)

    Ruth Kam

    2004-04-01

    The concept of a closed-loop feedback system that would automatically assess the pacing threshold and self-adjust the pacing output to ensure consistent myocardial capture has many appeals. Enhancing patient safety in cases of an unexpected rise in threshold, reducing current drain (hence prolonging battery longevity), and reducing the amount of physician intervention required are just some of the advantages. Autocapture (AC) is a proprietary algorithm developed by St Jude Medical CRMD, Sylmar, CA, USA (SJM), which was the first to commercially provide these automatic functions in a single chamber pacemaker (Microny and Regency), and subsequently in a dual chamber pacemaker (the Affinity, Entity and Identity family of pacemakers). This article reviews the conditions necessary for AC verification and performance and the problems encountered in clinical practice.

  4. Development of automatic techniques for GPS data management

    International Nuclear Information System (INIS)

    Park, Pil Ho

    2001-06-01

    For a GPS centre, automation is necessary for effective management of a GPS network, including data gathering, data transformation, data backup, data transmission to the IGS (International GPS Service for Geodynamics), and retrieval of precise ephemerides. The operating programs of GPS centres have been adopted at KCSC (Korea Cadastral Survey Corporation), NGI (National Geography Institute), and MOMAF (Ministry of Maritime Affairs and Fisheries) without in-house development of the core techniques. Automatic management of a GPS network consists of GPS data management and data processing. It is also a fundamental capability that every GPS centre should possess. Therefore, this study analysed the Japanese GPS centre, which has accomplished module-based automation, considering its applicability to domestic GPS centres.

  5. Effective speed management through automatic enforcement.

    NARCIS (Netherlands)

    Oei, H.-l.

    1994-01-01

    This paper analyses several aspects of the Dutch experience of speed enforcement, and presents the results of some speed management experiments in The Netherlands, using automatic warning of speeders and enforcement of speeding. Traditional approaches to manage speed there have not resulted in

  6. Sustainable Capture: Concepts for Managing Stream-Aquifer Systems.

    Science.gov (United States)

    Davids, Jeffrey C; Mehl, Steffen W

    2015-01-01

    Most surface water bodies (i.e., streams, lakes, etc.) are connected to the groundwater system to some degree, so that changes to surface water bodies (either diversions or importations) can change flows in aquifer systems, and pumping from an aquifer can reduce discharge to, or induce additional recharge from, streams, springs, and lakes. The timescales of these interactions are often very long (decades), making sustainable management of these systems difficult if relying only on observations of system responses. Instead, management scenarios are often analyzed based on numerical modeling. In this paper we propose a framework and metrics that can be used to relate the Theis concepts of capture to sustainable measures of stream-aquifer systems. We introduce four concepts: Sustainable Capture Fractions, Sustainable Capture Thresholds, Capture Efficiency, and Sustainable Groundwater Storage that can be used as the basis for developing metrics for sustainable management of stream-aquifer systems. We demonstrate their utility on a hypothetical stream-aquifer system where pumping captures both streamflow and discharge to phreatophytes at different amounts based on pumping location. In particular, Capture Efficiency (CE) can be easily understood by both scientists and non-scientists alike, and readily identifies vulnerabilities to sustainable stream-aquifer management when its value exceeds 100%. © 2014, National Ground Water Association.
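
    The abstract presents Capture Efficiency (CE) as a metric that flags unsustainable configurations when it exceeds 100%. One plausible formulation, assumed here for illustration rather than taken from the paper, is total capture (streamflow depletion plus captured phreatophyte discharge) expressed as a percentage of a sustainable capture threshold for the system.

    ```python
    def capture_efficiency(stream_capture, phreatophyte_capture, sustainable_capture_threshold):
        """Capture Efficiency (%) under an assumed definition: total capture
        relative to a sustainable capture threshold. All rates in the same
        units (e.g., m^3/day); values above 100% indicate capture beyond the
        threshold."""
        total_capture = stream_capture + phreatophyte_capture
        return 100.0 * total_capture / sustainable_capture_threshold

    # Hypothetical model outputs for one pumping location:
    # 600 m^3/day of streamflow capture and 450 m^3/day of phreatophyte
    # capture against a 1000 m^3/day threshold -> CE = 105%, flagging the
    # configuration as vulnerable.
    print(capture_efficiency(600.0, 450.0, 1000.0))
    ```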

  7. The Associate Principal Astronomer for AI Management of Automatic Telescopes

    Science.gov (United States)

    Henry, Gregory W.

    1998-01-01

    This research program in scheduling and management of automatic telescopes had the following objectives: 1. To field test the 1993 Automatic Telescope Instruction Set (ATIS93) programming language, which was specifically developed to allow real-time control of an automatic telescope via an artificial intelligence scheduler running on a remote computer. 2. To develop and test the procedures for two-way communication between a telescope controller and remote scheduler via the Internet. 3. To test various concepts in AI scheduling being developed at NASA Ames Research Center on an automatic telescope operated by Tennessee State University at the Fairborn Observatory site in southern Arizona; and 4. To develop a prototype software package, dubbed the Associate Principal Astronomer, for the efficient scheduling and management of automatic telescopes.

  8. Semi-Automatic Science Workflow Synthesis for High-End Computing on the NASA Earth Exchange

    Data.gov (United States)

    National Aeronautics and Space Administration — Enhance capabilities for collaborative data analysis and modeling in Earth sciences. Develop components for automatic workflow capture, archiving and management....

  9. Capturing the Interpersonal Implications of Evolved Preferences? Frequency of Sex Shapes Automatic, but Not Explicit, Partner Evaluations.

    Science.gov (United States)

    Hicks, Lindsey L; McNulty, James K; Meltzer, Andrea L; Olson, Michael A

    2016-06-01

    A strong predisposition to engage in sexual intercourse likely evolved in humans because sex is crucial to reproduction. Given that meeting interpersonal preferences tends to promote positive relationship evaluations, sex within a relationship should be positively associated with relationship satisfaction. Nevertheless, prior research has been inconclusive in demonstrating such a link, with longitudinal and experimental studies showing no association between sexual frequency and relationship satisfaction. Crucially, though, all prior research has utilized explicit reports of satisfaction, which reflect deliberative processes that may override the more automatic implications of phylogenetically older evolved preferences. Accordingly, capturing the implications of sexual frequency for relationship evaluations may require implicit measurements that bypass deliberative reasoning. Consistent with this idea, one cross-sectional and one 3-year study of newlywed couples revealed a positive association between sexual frequency and automatic partner evaluations but not explicit satisfaction. These findings highlight the importance of automatic measurements to understanding interpersonal relationships. © The Author(s) 2016.

  10. Knowledge Capture and Management for Space Flight Systems

    Science.gov (United States)

    Goodman, John L.

    2005-01-01

    The incorporation of knowledge capture and knowledge management strategies early in the development phase of an exploration program is necessary for safe and successful missions of human and robotic exploration vehicles over the life of a program. Following the transition from the development to the flight phase, loss of the underlying theory and rationale governing design and requirements occurs through a number of mechanisms. This degrades the quality of engineering work, resulting in increased life cycle costs and risk to mission success and safety of flight. Due to budget constraints, concerned personnel in legacy programs often have to improvise methods for knowledge capture and management using existing, but often sub-optimal, information technology and archival resources. Application of advanced information technology to perform knowledge capture and management would be most effective if program-wide requirements are defined at the beginning of a program.

  11. Automatic Rail Extraction and Clearance Check with a Point Cloud Captured by MLS in a Railway

    Science.gov (United States)

    Niina, Y.; Honma, R.; Honma, Y.; Kondo, K.; Tsuji, K.; Hiramatsu, T.; Oketani, E.

    2018-05-01

    Recently, MLS (Mobile Laser Scanning) has been successfully used in road maintenance. In this paper, we present the application of MLS to the inspection of clearance along railway tracks of West Japan Railway Company. Point clouds around the track are captured by MLS mounted on a bogie, and the rail position is determined by matching the shape of an ideal rail head to the point cloud with the ICP algorithm. A clearance check is then executed automatically with a virtual clearance model laid along the extracted rail. As a result of the evaluation, the error in extracting rail positions is less than 3 mm. With respect to the automatic clearance check, objects inside the clearance and those related to the contact line are successfully detected, as confirmed visually.
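
    The rail-extraction step above relies on ICP (iterative closest point) registration of an ideal rail-head template against the captured point cloud. A minimal rigid-ICP sketch in 3-D is shown below; it uses a KD-tree for nearest-neighbour matching and the SVD-based (Kabsch) solution for the rigid transform. The template and cloud here are placeholders, so this is a generic illustration of the technique, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def best_rigid_transform(src, dst):
        """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # avoid reflections
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = dst_c - R @ src_c
        return R, t

    def icp(template, cloud, iterations=30):
        """Align a rail-head template (N x 3) to a point cloud (M x 3)."""
        tree = cKDTree(cloud)
        current = template.copy()
        for _ in range(iterations):
            _, idx = tree.query(current)          # closest cloud point per template point
            R, t = best_rigid_transform(current, cloud[idx])
            current = current @ R.T + t
        return current

    # Hypothetical usage: 'rail_template' would come from the ideal rail-head
    # geometry and 'mls_cloud' from the MLS scan.
    rail_template = np.random.rand(200, 3)
    mls_cloud = rail_template + np.array([0.05, -0.02, 0.01]) + 0.001 * np.random.randn(200, 3)
    aligned = icp(rail_template, mls_cloud)
    ```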

  12. Automatic Management of Parallel and Distributed System Resources

    Science.gov (United States)

    Yan, Jerry; Ngai, Tin Fook; Lundstrom, Stephen F.

    1990-01-01

    Viewgraphs on automatic management of parallel and distributed system resources are presented. Topics covered include: parallel applications; intelligent management of multiprocessing systems; performance evaluation of parallel architecture; dynamic concurrent programs; compiler-directed system approach; lattice gaseous cellular automata; and sparse matrix Cholesky factorization.

  13. A new method for automatic tracking of facial landmarks in 3D motion captured images (4D).

    Science.gov (United States)

    Al-Anezi, T; Khambay, B; Peng, M J; O'Leary, E; Ju, X; Ayoub, A

    2013-01-01

    The aim of this study was to validate the automatic tracking of facial landmarks in 3D image sequences. 32 subjects (16 males and 16 females) aged 18-35 years were recruited. 23 anthropometric landmarks were marked on the face of each subject with non-permanent ink using a 0.5 mm pen. The subjects were asked to perform three facial animations (maximal smile, lip purse and cheek puff) from rest position. Each animation was captured by the 3D imaging system. A single operator manually digitised the landmarks on the 3D facial models and their locations were compared with those of the automatically tracked ones. To investigate the accuracy of manual digitisation, the operator re-digitised the same set of 3D images of 10 subjects (5 male and 5 female) at a 1-month interval. The discrepancies in x, y and z coordinates between the 3D positions of the manually digitised landmarks and those of the automatically tracked facial landmarks were within 0.17 mm. The mean distance between the manually digitised and the automatically tracked landmarks using the tracking software was within 0.55 mm. The automatic tracking of facial landmarks demonstrated satisfactory accuracy, which would facilitate the analysis of the dynamic motion during facial animations. Copyright © 2012 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  14. Development of a digital automatic control law for steep glideslope capture and flare

    Science.gov (United States)

    Halyo, N.

    1977-01-01

    A longitudinal digital guidance and control law for steep glideslopes using MLS (Microwave Landing System) data is developed for CTOL aircraft using modern estimation and control techniques. The control law covers the final approach phases of glideslope capture, glideslope tracking, and flare to touchdown for automatic landings under adverse weather conditions. The control law uses a constant gain Kalman filter to process MLS and body-mounted accelerometer data to form estimates of flight path errors and wind velocities including wind shear. The flight path error estimates and wind estimates are used for feedback in generating control surface commands. Results of a digital simulation of the aircraft dynamics and the guidance and control law are presented for various wind conditions.
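
    The control law above processes MLS and accelerometer measurements with a constant-gain Kalman filter to estimate flight-path errors and winds. The fragment below sketches the generic structure of such a filter (propagate with a discrete-time model, correct with a fixed gain K); the state layout, matrices, and gain values are placeholders, not values from the report.

    ```python
    import numpy as np

    class ConstantGainKalman:
        """Discrete-time estimator x+ = A x + B u, corrected by a fixed gain K."""

        def __init__(self, A, B, H, K, x0):
            self.A, self.B, self.H, self.K = A, B, H, K
            self.x = x0

        def step(self, u, z):
            x_pred = self.A @ self.x + self.B @ u               # propagate with the model
            self.x = x_pred + self.K @ (z - self.H @ x_pred)    # correct with the measurement
            return self.x

    # Placeholder 2-state example (e.g., path error and its rate), 20 Hz update.
    dt = 0.05
    A = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([[0.0], [dt]])
    H = np.array([[1.0, 0.0]])              # only the path error is measured
    K = np.array([[0.3], [0.8]])            # pre-computed constant gain (assumed)
    kf = ConstantGainKalman(A, B, H, K, x0=np.zeros(2))
    state = kf.step(u=np.array([0.1]), z=np.array([0.02]))
    ```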

  15. Automatic road traffic safety management system in urban areas

    Directory of Open Access Journals (Sweden)

    Oskarbski Jacek

    2017-01-01

    Traffic incidents and accidents reduce the reliability and safety of the transport system. Traffic management and emergency systems on the road, using, among others, automatic detection, video surveillance, communication technologies and institutional solutions, improve the coordination of the various departments involved in traffic and safety management. Automation of incident management helps to reduce the time of a rescue operation as well as the time needed to restore normal traffic flow after a rescue operation is completed, which also reduces the risk of secondary accidents and their severity. The paper presents the possibility of including city traffic departments in the process of incident management. The results of research on automatic incident detection in cities are also presented.

  16. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Science.gov (United States)

    Park, J. W.; Jeong, H. H.; Kim, J. S.; Choi, C. U.

    2016-06-01

    Recently, aerial photography with unmanned aerial vehicle (UAV) systems has used UAVs remotely controlled through a ground control system over a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, the existing RF-modem method has limitations in long-distance communication. In this study, the smart camera's LTE (long-term evolution), Bluetooth, and Wi-Fi were used to implement a UAV communication module, and close-range aerial photogrammetry with automatic shooting was carried out. The automatic shooting system consists of an image-capturing device for the drone over the area that needs imaging and software for loading and managing the smart camera. The system is composed of automatic shooting using the sensors of the smart camera and shooting catalogue management, which manages the captured images and their information. The UAV imagery was processed with Open Drone Map. This study examined the feasibility of using the smart camera as the payload for a photogrammetric UAV system. The open source tools used were Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  17. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Directory of Open Access Journals (Sweden)

    J. W. Park

    2016-06-01

    Recently, aerial photography with unmanned aerial vehicle (UAV) systems has used UAVs remotely controlled through a ground control system over a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, the existing RF-modem method has limitations in long-distance communication. In this study, the smart camera's LTE (long-term evolution), Bluetooth, and Wi-Fi were used to implement a UAV communication module, and close-range aerial photogrammetry with automatic shooting was carried out. The automatic shooting system consists of an image-capturing device for the drone over the area that needs imaging and software for loading and managing the smart camera. The system is composed of automatic shooting using the sensors of the smart camera and shooting catalogue management, which manages the captured images and their information. The UAV imagery was processed with Open Drone Map. This study examined the feasibility of using the smart camera as the payload for a photogrammetric UAV system. The open source tools used were Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  18. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: administrators carry a heavy workload, and much time must be spent on the management and maintenance of such systems. The nodes of a large-scale cluster easily fall into disorder; with thousands of nodes placed in large machine rooms, administrators can easily confuse machines. How can a large-scale cluster system be managed accurately and effectively? This article introduces ELFms for large-scale cluster systems and proposes how to realize automatic management of such systems. (authors)

  19. Development of an optimal automatic control law and filter algorithm for steep glideslope capture and glideslope tracking

    Science.gov (United States)

    Halyo, N.

    1976-01-01

    A digital automatic control law to capture a steep glideslope and track the glideslope to a specified altitude is developed for the longitudinal/vertical dynamics of a CTOL aircraft using modern estimation and control techniques. The control law uses a constant gain Kalman filter to process guidance information from the microwave landing system, and acceleration from body mounted accelerometer data. The filter outputs navigation data and wind velocity estimates which are used in controlling the aircraft. Results from a digital simulation of the aircraft dynamics and the control law are presented for various wind conditions.

  20. Automatic generation of a subject-specific model for accurate markerless motion capture and biomechanical applications.

    Science.gov (United States)

    Corazza, Stefano; Gambaretto, Emiliano; Mündermann, Lars; Andriacchi, Thomas P

    2010-04-01

    A novel approach for the automatic generation of a subject-specific model consisting of morphological and joint location information is described. The aim is to address the need for efficient and accurate model generation for markerless motion capture (MMC) and biomechanical studies. The algorithm applied and expanded on previous work on human shape space by embedding location information for ten joint centers in a subject-specific free-form surface. The optimal locations of joint centers in the 3-D mesh were learned through linear regression over a set of nine subjects whose joint centers were known. The model was shown to be sufficiently accurate for both kinematic (joint centers) and morphological (shape of the body) information to allow accurate tracking with MMC systems. The automatic model generation algorithm was applied to 3-D meshes of different quality and resolution such as laser scans and visual hulls. The complete method was tested using nine subjects of different gender, body mass index (BMI), age, and ethnicity. Experimental training error and cross-validation errors were 19 and 25 mm, respectively, on average over the joints of the ten subjects analyzed in the study.
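
    The joint-center localization described above was learned by linear regression from known joint centers over a small training set. A schematic version of that idea is sketched below: each joint center is regressed on a low-dimensional shape descriptor of the subject-specific mesh. The descriptor (assumed here to be coefficients in a shape space, e.g., PCA coefficients) and the array shapes are illustrative assumptions only.

    ```python
    import numpy as np

    def fit_joint_regressors(shape_descriptors, joint_centers):
        """Least-squares mapping from mesh shape descriptors to 3-D joint centers.

        shape_descriptors : (n_subjects, n_features) array (assumed shape-space coefficients).
        joint_centers     : (n_subjects, n_joints * 3) array of known joint centers.
        """
        X = np.hstack([shape_descriptors, np.ones((shape_descriptors.shape[0], 1))])
        W, *_ = np.linalg.lstsq(X, joint_centers, rcond=None)
        return W

    def predict_joint_centers(W, descriptor):
        """Predict (n_joints, 3) joint locations for one new subject."""
        x = np.append(descriptor, 1.0)
        return (x @ W).reshape(-1, 3)

    # Hypothetical data: 9 training subjects, 15-dimensional shape space, 10 joints.
    rng = np.random.default_rng(0)
    descriptors = rng.normal(size=(9, 15))
    centers = rng.normal(size=(9, 10 * 3))
    W = fit_joint_regressors(descriptors, centers)
    predicted = predict_joint_centers(W, descriptors[0])   # (10, 3) joint locations
    ```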

  1. SAFETY MANAGEMENT FOR WOMEN THROUGH AUTOMATIC GPS LOCATION TRACKER

    OpenAIRE

    P. Nivetha, S. Kiruthika & J. B. Kavitha

    2018-01-01

    The project “SAFETY MANAGEMENT FOR WOMEN THROUGH AUTOMATIC GPS LOCATION TRACKER” is designed on the standard Android 4.0.3 platform. The platform used to develop the application is the Eclipse IDE (Mars) with Java 1.6 Standard Edition. It is an Android app intended to help people in critical situations. For example, if a person is in trouble and needs help, he/she should be able to contact someone to help them by clicking a single button, which will automatically s...

  2. Semi Automatic Ontology Instantiation in the domain of Risk Management

    Science.gov (United States)

    Makki, Jawad; Alquier, Anne-Marie; Prince, Violaine

    One of the challenging tasks in the context of Ontological Engineering is to automatically or semi-automatically support the process of Ontology Learning and Ontology Population from semi-structured documents (texts). In this paper we describe a Semi-Automatic Ontology Instantiation method from natural language text, in the domain of Risk Management. The method is composed of three steps: 1) annotation with part-of-speech tags, 2) extraction of semantic relation instances, 3) ontology instantiation. It is based on combined NLP techniques and uses human intervention between steps 2 and 3 for control and validation. Since it relies heavily on linguistic knowledge, it is not domain dependent, which is a good feature for portability between the different fields of risk management application. The proposed methodology uses the ontology of the PRIMA project (supported by the European Community) as a Generic Domain Ontology and populates it via an available corpus. A first validation of the approach is done through an experiment with Chemical Fact Sheets from the Environmental Protection Agency.
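
    Step 1 of the pipeline (part-of-speech annotation) can be illustrated with standard NLP tooling; the snippet below uses NLTK as a stand-in, since the abstract does not name the tools actually used, and the example sentence and the crude relation heuristic are invented for illustration.

    ```python
    import nltk
    # Requires the NLTK tokenizer and POS-tagger resources to be installed
    # (e.g., via nltk.download); resource names vary by NLTK version.

    sentence = "A chemical spill increases the risk of groundwater contamination."
    tokens = nltk.word_tokenize(sentence)          # tokenisation
    tagged = nltk.pos_tag(tokens)                  # step 1: part-of-speech annotation

    # A crude stand-in for step 2: treat noun-verb-noun triples as candidate
    # semantic relation instances (the real method is far richer).
    nouns = [w for w, t in tagged if t.startswith("NN")]
    verbs = [w for w, t in tagged if t.startswith("VB")]
    candidate = (nouns[0], verbs[0], nouns[1]) if len(nouns) > 1 and verbs else None
    print(tagged)
    print("candidate relation:", candidate)
    ```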

  3. Brand and automaticity

    OpenAIRE

    Liu, J.

    2008-01-01

    A presumption of most consumer research is that consumers endeavor to maximize the utility of their choices and are in complete control of their purchasing and consumption behavior. However, everyday life experience suggests that many of our choices are not all that reasoned or conscious. Indeed, automaticity, one facet of behavior, is indispensable to complete the portrait of consumers. Despite its importance, little attention is paid to how the automatic side of behavior can be captured and...

  4. Improvement of automatic fish feeder machine design

    Science.gov (United States)

    Chui Wei, How; Salleh, S. M.; Ezree, Abdullah Mohd; Zaman, I.; Hatta, M. H.; Zain, B. A. Md; Mahzan, S.; Rahman, M. N. A.; Mahmud, W. A. W.

    2017-10-01

    The National Plan of Action for the management of fishing targets an efficient, equitable and transparent management of fishing capacity in marine capture fisheries by 2018. However, several factors that influence fishery production and the efficiency of the marine system, such as the automatic fish feeder machine, could be taken into consideration. Two of the latest fish feeder machines were chosen as references for this study. Based on observation, it was found that both machines were made with heavy structures and materials of low water and temperature resistance. The objective of this research is to develop an automatic feeder machine that increases the efficiency of fish feeding. An experiment was conducted to test the new machine design. The new machine has a maximum storage of 5 kg and operates with two DC motors. It is able to distribute 500 grams of pellets within 90 seconds over a longest distance of 4.7 meters. A higher speed could reduce the time needed and increase the distance as well. The minimum speed settings for the two motors are 110 and 120, with the same full-speed setting of 255.

  5. Nearly automatic motion capture system for tracking octopus arm movements in 3D space.

    Science.gov (United States)

    Zelman, Ido; Galun, Meirav; Akselrod-Ballin, Ayelet; Yekutieli, Yoram; Hochner, Binyamin; Flash, Tamar

    2009-08-30

    Tracking animal movements in 3D space is an essential part of many biomechanical studies. The most popular technique for human motion capture uses markers placed on the skin which are tracked by a dedicated system. However, this technique may be inadequate for tracking animal movements, especially when it is impossible to attach markers to the animal's body either because of its size or shape or because of the environment in which the animal performs its movements. Attaching markers to an animal's body may also alter its behavior. Here we present a nearly automatic markerless motion capture system that overcomes these problems and successfully tracks octopus arm movements in 3D space. The system is based on three successive tracking and processing stages. The first stage uses a recently presented segmentation algorithm to detect the movement in a pair of video sequences recorded by two calibrated cameras. In the second stage, the results of the first stage are processed to produce 2D skeletal representations of the moving arm. Finally, the 2D skeletons are used to reconstruct the octopus arm movement as a sequence of 3D curves varying in time. Motion tracking, segmentation and reconstruction are especially difficult problems in the case of octopus arm movements because of the deformable, non-rigid structure of the octopus arm and the underwater environment in which it moves. Our successful results suggest that the motion-tracking system presented here may be used for tracking other elongated objects.
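
    The final stage described above reconstructs the arm as 3-D curves from the pair of 2-D skeletons produced by the two calibrated cameras. A generic way to do this is linear (DLT) triangulation of corresponding skeleton points given each camera's 3x4 projection matrix, as sketched below; the projection matrices and point correspondences are placeholders, so this illustrates the principle rather than the authors' code.

    ```python
    import numpy as np

    def triangulate_point(P1, P2, x1, x2):
        """Linear (DLT) triangulation of one 3-D point from two views.

        P1, P2 : 3x4 camera projection matrices (assumed known from calibration).
        x1, x2 : corresponding 2-D skeleton points (pixels) in each view.
        """
        A = np.vstack([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]
        return X[:3] / X[3]

    def reconstruct_curve(P1, P2, skel1, skel2):
        """Triangulate every matched skeleton sample into a 3-D curve."""
        return np.array([triangulate_point(P1, P2, a, b) for a, b in zip(skel1, skel2)])
    ```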

  6. Can wireless technology enable new diabetes management tools?

    Science.gov (United States)

    Hedtke, Paul A

    2008-01-01

    Mobile computing and communications technology embodied in the modern cell phone device can be employed to improve the lives of diabetes patients by giving them better tools for self-management. Several companies are working on the development of diabetes management tools that leverage the ubiquitous cell phone to bring self-management tools to the hand of the diabetes patient. Integration of blood glucose monitoring (BGM) technology with the cell phone platform adds a level of convenience for the person with diabetes, but, more importantly, allows BGM data to be automatically captured, logged, and processed in near real time in order to provide the diabetes patient with assistance in managing their blood glucose levels. Other automatic measurements can estimate physical activity, and information regarding medication events and food intake can be captured and analyzed in order to provide the diabetes patient with continual assistance in managing their therapy and behaviors in order to improve glycemic control. The path to realization of such solutions is not, however, without obstacles.

  7. Customer relationship management captures intellectual capital for increased competitiveness

    Directory of Open Access Journals (Sweden)

    C. R. Van Zyl

    2005-12-01

    Today, with regard to tangible assets, the corporate playing field has become more or less level, with competing organisations producing very similar products and services. The key differentiator for an organisation's offerings now depends upon an organisation's ability to capture and leverage intellectual capital (IC), and especially customer IC. Customers are an invaluable source of two kinds of IC: transactional and innovative. An organisation must implement customer relationship management (CRM) initiatives in order to develop and maintain good relationships with customers and, in so doing, be able to capture IC. This IC will enable an organisation to be more responsive to new and changing customer needs and preferences and to be better able to customize products and services according to more specific customer profiles: ultimately leading to increased market share, profitability and overall strategic competitiveness. The purpose of this article is to determine how good customer relationships allow for the capture and subsequent leveraging of customer IC for increased competitiveness. In order to fulfill this purpose, the concept of CRM is explored, as well as how CRM allows for the capture of both transactional and innovative capital. The strategic benefits of the application of customer IC are then explored, together with an exposition of the CRM implementation challenges facing those organisations that wish to implement a CRM program to capture and leverage customer IC for increased competitiveness. This exploration involved an examination of contemporary literature, theories and business cases and subsequently revealed that CRM is a vital discipline/philosophy that must be implemented by any organisation wishing to achieve greater market efficiency and competitiveness. This competitiveness can only be achieved through the carefully managed unlocking, sharing and leveraging of both transactional and innovative customer intellectual capital.

  8. 77 FR 19408 - Dynamic Mobility Applications and Data Capture Management Programs; Notice of Public Meeting

    Science.gov (United States)

    2012-03-30

    ... DEPARTMENT OF TRANSPORTATION Dynamic Mobility Applications and Data Capture Management Programs... stakeholders an update on the Data Capture and Management (DCM) and Dynamic Mobility Applications (DMA... critical issues designed to garner stakeholder feedback. About the Dynamic Mobility Application and Data...

  9. Manual editing of automatically recorded data in an anesthesia information management system.

    Science.gov (United States)

    Wax, David B; Beilin, Yaakov; Hossain, Sabera; Lin, Hung-Mo; Reich, David L

    2008-11-01

    Anesthesia information management systems allow automatic recording of physiologic and anesthetic data. The authors investigated the prevalence of such data modification in an academic medical center. The authors queried their anesthesia information management system database of anesthetics performed in 2006 and tabulated the counts of data points for automatically recorded physiologic and anesthetic parameters as well as the subset of those data that were manually invalidated by clinicians (both with and without alternate values manually appended). Patient, practitioner, data source, and timing characteristics of recorded values were also extracted to determine their associations with editing of various parameters in the anesthesia information management system record. A total of 29,491 cases were analyzed, 19% of which had one or more data points manually invalidated. Among 58 attending anesthesiologists, each invalidated data in a median of 7% of their cases when working as a sole practitioner. A minority of invalidated values were manually appended with alternate values. Pulse rate, blood pressure, and pulse oximetry were the most commonly invalidated parameters. Data invalidation usually resulted in a decrease in parameter variance. Factors independently associated with invalidation included extreme physiologic values, American Society of Anesthesiologists physical status classification, emergency status, timing (phase of the procedure/anesthetic), presence of an intraarterial catheter, resident or certified registered nurse anesthetist involvement, and procedure duration. Editing of physiologic data automatically recorded in an anesthesia information management system is a common practice and results in decreased variability of intraoperative data. Further investigation may clarify the reasons for and consequences of this behavior.

  10. Automatic rebalancing of data in ATLAS distributed data management

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00473067; The ATLAS collaboration; Serfon, Cedric; Garonne, Vincent; Lassnig, Mario; Beermann, Thomas; Javurek, Tomas

    2017-01-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration, has now been successfully operated for two years. However, with the increasing workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only affects the data management system itself, but in consequence...
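
    The rebalancing extension described above first detects storage elements approaching their capacity limit and then selects data to move elsewhere while respecting distribution policies. The sketch below shows that selection logic in schematic form; the endpoint records, thresholds, and naive draining policy are invented placeholders, not Rucio's actual interfaces or rules.

    ```python
    from dataclasses import dataclass

    @dataclass
    class StorageElement:
        name: str
        capacity_tb: float
        used_tb: float

        @property
        def fill_ratio(self) -> float:
            return self.used_tb / self.capacity_tb

    def plan_rebalancing(elements, threshold=0.9, target_fill=0.8):
        """Return (source, volume_tb, destination) moves for over-full elements.

        A naive policy placeholder: drain each element above `threshold` down to
        `target_fill`, sending the excess to the least-full element.
        """
        moves = []
        for se in sorted(elements, key=lambda s: s.fill_ratio, reverse=True):
            if se.fill_ratio <= threshold:
                continue
            excess = se.used_tb - target_fill * se.capacity_tb
            destination = min(elements, key=lambda s: s.fill_ratio)
            moves.append((se.name, round(excess, 1), destination.name))
        return moves

    endpoints = [
        StorageElement("SITE-A_DATADISK", 1000, 960),
        StorageElement("SITE-B_DATADISK", 800, 400),
        StorageElement("SITE-C_DATADISK", 1200, 700),
    ]
    print(plan_rebalancing(endpoints))   # [('SITE-A_DATADISK', 160.0, 'SITE-B_DATADISK')]
    ```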

  11. Electronic data capture and DICOM data management in multi-center clinical trials

    Science.gov (United States)

    Haak, Daniel; Page, Charles-E.; Deserno, Thomas M.

    2016-03-01

    By providing eligibility, efficacy and safety evaluation through quantitative and qualitative disease findings, medical imaging has become increasingly important in clinical trials. Subjects' data are today captured in electronic case report forms (eCRFs), which are offered by electronic data capture (EDC) systems. However, integration of subjects' medical image data into eCRFs is insufficiently supported. Neither integration of subjects' digital imaging and communications in medicine (DICOM) data, nor communication with picture archiving and communication systems (PACS), is possible. This aggravates the workflow of the study personnel, especially for studies with distributed data capture across multiple sites. Hence, in this work, a system architecture is presented that connects an EDC system, a PACS and a DICOM viewer via the web access to DICOM objects (WADO) protocol. The architecture is implemented using the open source tools OpenClinica, DCM4CHEE and Weasis. The eCRF forms the primary endpoint for the study personnel, where the subject's image data is stored and retrieved. Background communication with the PACS is completely hidden from the users. Data privacy and consistency are ensured by automatic de-identification and by re-labelling of DICOM data with context information (e.g. study and subject identifiers), respectively. The system is exemplarily demonstrated in a clinical trial, where computed tomography (CT) data is de-centrally captured from the subjects and centrally read by a chief radiologist to decide on inclusion of the subjects in the trial. Errors, latency and costs in the EDC workflow are reduced, while a research database is implicitly built up in the background.
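
    Two pieces of the architecture above lend themselves to a small illustration: retrieving a DICOM object through a WADO-URI request and de-identifying/re-labelling it with trial context before storage. The sketch below uses the `requests` and `pydicom` libraries as stand-ins; the server URL, UIDs, and tag choices are assumptions, and the real system's behaviour may differ.

    ```python
    import requests
    import pydicom
    from io import BytesIO

    PACS_WADO = "https://pacs.example.org/wado"   # hypothetical WADO-URI endpoint

    def fetch_dicom(study_uid, series_uid, object_uid):
        """Retrieve one DICOM object via a WADO-URI request."""
        params = {
            "requestType": "WADO",
            "studyUID": study_uid,
            "seriesUID": series_uid,
            "objectUID": object_uid,
            "contentType": "application/dicom",
        }
        r = requests.get(PACS_WADO, params=params, timeout=30)
        r.raise_for_status()
        return pydicom.dcmread(BytesIO(r.content))

    def relabel(ds, study_id, subject_id):
        """Minimal de-identification plus re-labelling with trial context."""
        ds.PatientName = ""                 # strip identifying demographics
        ds.PatientBirthDate = ""
        ds.PatientID = subject_id           # replace with the trial subject identifier
        ds.StudyID = study_id               # record the clinical-trial study identifier
        return ds
    ```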

  12. Automatic creation of simulation configuration. The SIPA workshop: SWORD

    International Nuclear Information System (INIS)

    Oudot, G.; Valembois, A.

    1994-01-01

    SWORD (Software Workshop Oriented towards Research and Development) is not only a software management system but also, and mainly, a software development system. The SWORD workshop is organised in hierarchical levels: (1) the automatic or manual creation of elementary models based on the FORTRAN ANSI standard language; these models have interface variables structured in so-called connection points. Automatic model generators are used for the simulation of standard, repeated equipment: HYTHERNET covers the simulation of hydraulics, thermal behaviour, chemistry and activity; CONTRONET covers the simulation of I and C systems, i.e. logic, protection and control systems. The capture of the system topology for both generators is carried out on a graphic workstation under a CAD system. (2) The model assembly generator, in charge of linking models (via connection points) and organizing their calling sequence in order to create a simulation application. (3) The configurations, in charge of creating the external environment and the links between the model assembly and the external environment (connection with the control desk, plant computer system, safety parameter display, etc.). (4) The configuration generator, which exports the simulation configuration to the target machine and generates the appropriate commands for compilation and link editing. The workshop administration ensures management; consistency checks are carried out at each step, with warnings generated when applicable, and automatic chaining of the appropriate commands according to engineer requests is available. (orig.) (4 refs., 4 figs.)

  13. Automatic sign language recognition inspired by human sign perception

    NARCIS (Netherlands)

    Ten Holt, G.A.

    2010-01-01

    Automatic sign language recognition is a relatively new field of research (since ca. 1990). Its objectives are to automatically analyze sign language utterances. There are several issues within the research area that merit investigation: how to capture the utterances (cameras, magnetic sensors,

  14. Automatic rebalancing of data in ATLAS distributed data management

    CERN Document Server

    Barisits, Martin-Stefan; The ATLAS collaboration; Garonne, Vincent; Lassnig, Mario; Beermann, Thomas

    2016-01-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration, has now been successfully operated for over a year. However, with the forthcoming start of run-2 and its expected workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing data imbalances as well as storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only ...

  15. The automatic component of habit in health behavior: habit as cue-contingent automaticity.

    Science.gov (United States)

    Orbell, Sheina; Verplanken, Bas

    2010-07-01

    Habit might be usefully characterized as a form of automaticity that involves the association of a cue and a response. Three studies examined habitual automaticity in regard to different aspects of the cue-response relationship characteristic of unhealthy and healthy habits. In each study, habitual automaticity was assessed by the Self-Report Habit Index (SRHI). In Study 1 SRHI scores correlated with attentional bias to smoking cues in a Stroop task. Study 2 examined the ability of a habit cue to elicit an unwanted habit response. In a prospective field study, habitual automaticity in relation to smoking when drinking alcohol in a licensed public house (pub) predicted the likelihood of cigarette-related action slips 2 months later after smoking in pubs had become illegal. In Study 3 experimental group participants formed an implementation intention to floss in response to a specified situational cue. Habitual automaticity of dental flossing was rapidly enhanced compared to controls. The studies provided three different demonstrations of the importance of cues in the automatic operation of habits. Habitual automaticity assessed by the SRHI captured aspects of a habit that go beyond mere frequency or consistency of the behavior. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  16. Transportation management center data capture for performance and mobility measures guidebook.

    Science.gov (United States)

    2013-03-01

    The Guide to Transportation Management Center (TMC) Data Capture for Performance and Mobility Measures is a two-volume document consisting of this summary Guidebook and a Reference Manual. These documents provide technical guidance and recommended pr...

  17. Optimising the application of multiple-capture traps for invasive species management using spatial simulation.

    Science.gov (United States)

    Warburton, Bruce; Gormley, Andrew M

    2015-01-01

    Internationally, invasive vertebrate species pose a significant threat to biodiversity, agricultural production and human health. To manage these species a wide range of tools, including traps, are used. In New Zealand, brushtail possums (Trichosurus vulpecula), stoats (Mustela erminea), and ship rats (Rattus rattus) are invasive and there is an ongoing demand for cost-effective non-toxic methods for controlling these pests. Recently, traps with multiple-capture capability have been developed which, because they do not require regular operator-checking, are purported to be more cost-effective than traditional single-capture traps. However, when pest populations are being maintained at low densities (as is typical of orchestrated pest management programmes) it remains uncertain whether it is more cost-effective to use fewer multiple-capture traps or more single-capture traps. To address this uncertainty, we used an individual-based spatially explicit modelling approach to determine the likely maximum animal-captures per trap, given stated pest densities and defined times traps are left between checks. In the simulation, single- or multiple-capture traps were spaced according to best practice pest-control guidelines. For possums with maintenance densities set at the lowest level (i.e. 0.5/ha), 98% of all simulated possums were captured with only a single capacity trap set at each site. When possum density was increased to moderate levels of 3/ha, having a capacity of three captures per trap caught 97% of all simulated possums. Results were similar for stoats, although only two potential captures per site were sufficient to capture 99% of simulated stoats. For rats, which were simulated at their typically higher densities, even a six-capture capacity per trap site only resulted in 80% kill. Depending on target species, prevailing density and extent of immigration, the most cost-effective strategy for pest control in New Zealand might be to deploy several single-capture
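
    The study above uses an individual-based, spatially explicit simulation to ask how many captures per trap site are needed at a given pest density. A stripped-down version of that idea is sketched below: animals are scattered at a set density, each is caught by the nearest trap within its home-range radius, and traps stop catching once their capacity is reached. The densities, trap spacing, and home-range radius are illustrative placeholders, not the study's parameter values.

    ```python
    import numpy as np

    def simulate_kill_fraction(density_per_ha, trap_spacing_m, trap_capacity,
                               home_range_m=100.0, area_m=1000.0, seed=0):
        """Fraction of simulated animals removed by capacity-limited traps."""
        rng = np.random.default_rng(seed)
        n_animals = rng.poisson(density_per_ha * (area_m / 100.0) ** 2)
        animals = rng.uniform(0, area_m, size=(n_animals, 2))
        grid = np.arange(trap_spacing_m / 2, area_m, trap_spacing_m)
        traps = np.array([(x, y) for x in grid for y in grid])
        remaining = np.full(len(traps), trap_capacity)

        killed = 0
        for pos in animals:
            d = np.linalg.norm(traps - pos, axis=1)
            for idx in np.argsort(d):              # try the closest traps first
                if d[idx] > home_range_m:
                    break                          # no reachable trap left
                if remaining[idx] > 0:
                    remaining[idx] -= 1
                    killed += 1
                    break
        return killed / max(n_animals, 1)

    # Illustrative comparison of single- vs. multiple-capture traps at low density.
    for capacity in (1, 3, 6):
        print(capacity, round(simulate_kill_fraction(0.5, 200.0, capacity), 2))
    ```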

  18. Failure of the extended contingent attentional capture account in multimodal settings

    NARCIS (Netherlands)

    van der Lubbe, Robert Henricus Johannes; van der Helden, J.

    2006-01-01

    Sudden changes in our environment like sound bursts or light flashes are thought to automatically attract our attention thereby affecting responses to subsequent targets, although an alternative view (the contingent attentional capture account) holds that stimuli only capture our attention when they

  19. [OISO, automatic treatment of patients management in oncogenetics].

    Science.gov (United States)

    Guien, Céline; Fabre, Aurélie; Lagarde, Arnaud; Salgado, David; Gensollen-Thiriez, Catherine; Zattara, Hélène; Beroud, Christophe; Olschwang, Sylviane

    Oncogenetics is a long-term process that requires a close relationship between patients and medical teams, with good familial links allowing lifetime follow-up. Numerous documents are exchanged within the medical team, which has to interact frequently. We present here a new tool conceived specifically for this management. The tool has been developed according to a model-view-controller approach with the relational system PostgreSQL 9.3. The web site uses the PHP 5.3, HTML5 and CSS3 languages, completed with JavaScript and jQuery-AJAX functions and two additional modules, FPDF and PHPMailer. The tool allows multiple interactions, clinical data management, mailing and emailing, and follow-up planning. Queries make it possible to follow all patients and plannings automatically, to send information to a large number of patients or physicians, and to report activity. The tool has been designed for oncogenetics and adapted to its different aspects. The CNIL delivered an authorization for its use. Secured web access allows management at a regional level. Its simple concept makes it evolutive, following the constant updates of the genetic and clinical management of patients. Copyright © 2017 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.

  20. Team collaborative innovation management based on primary pipes automatic welding project

    International Nuclear Information System (INIS)

    Li Jing; Wang Dong; Zhang Ke

    2012-01-01

    The welding quality of the primary pipe directly affects the safe operation of nuclear power plants. Primary pipe automatic welding, the first of its kind in China, is a complex systematic project involving many facets, such as design, manufacturing, materials, and on-site construction. An R and D team was formed by China Guangdong Nuclear Power Engineering Co., Ltd. (CNPEC) together with other domestic nuclear power design institutes and manufacturing and construction enterprises. According to the characteristics of nuclear power plant construction, and adopting a team collaborative innovation management mode, through project coordination, resource allocation and the building of a production, education and research collaborative innovation platform, CNPEC successfully developed the primary pipe automatic welding technique, which has been widely applied to the construction of nuclear power plants, creating considerable economic benefits. (authors)

  1. ModelMage: a tool for automatic model generation, selection and management.

    Science.gov (United States)

    Flöttmann, Max; Schaber, Jörg; Hoops, Stephan; Klipp, Edda; Mendes, Pedro

    2008-01-01

    Mathematical modeling of biological systems usually involves implementing, simulating, and discriminating several candidate models that represent alternative hypotheses. Generating and managing these candidate models is a tedious and difficult task and can easily lead to errors. ModelMage is a tool that facilitates management of candidate models. It is designed for the easy and rapid development, generation, simulation, and discrimination of candidate models. The main idea of the program is to automatically create a defined set of model alternatives from a single master model. The user provides only one SBML-model and a set of directives from which the candidate models are created by leaving out species, modifiers or reactions. After generating models the software can automatically fit all these models to the data and provides a ranking for model selection, in case data is available. In contrast to other model generation programs, ModelMage aims at generating only a limited set of models that the user can precisely define. ModelMage uses COPASI as a simulation and optimization engine. Thus, all simulation and optimization features of COPASI are readily incorporated. ModelMage can be downloaded from http://sysbio.molgen.mpg.de/modelmage and is distributed as free software.
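
    The core idea above (deriving a defined family of candidate models from one master model by leaving out species, modifiers or reactions) can be shown schematically. The snippet below simply enumerates combinations of optional reactions to drop; this is an assumption about the mechanism for illustration only, not ModelMage's actual SBML handling or directive syntax.

    ```python
    from itertools import combinations

    def candidate_models(master_reactions, optional):
        """Yield (name, reactions) pairs: the master model minus each subset of optional reactions."""
        for k in range(len(optional) + 1):
            for removed in combinations(optional, k):
                kept = [r for r in master_reactions if r not in removed]
                name = "full" if not removed else "without_" + "_".join(removed)
                yield name, kept

    # Hypothetical master model with two reactions marked as optional hypotheses.
    master = ["synthesis", "degradation", "feedback_inhibition", "dimerization"]
    for name, reactions in candidate_models(master, optional=["feedback_inhibition", "dimerization"]):
        print(name, reactions)
    ```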

  2. Transportation management center data capture for performance and mobility measures reference manual.

    Science.gov (United States)

    2013-03-01

    The Guide to Transportation Management Center (TMC) Data Capture for Performance and Mobility Measures is a two-volume document consisting of a summary Guidebook and this Reference Manual. These documents provide technical guidance and recommended pr...

  3. Black rhinoceros Diceros bicornis capture, transportation and boma management by the Natal Parks Board

    Directory of Open Access Journals (Sweden)

    R.R. Henwood

    1989-10-01

    Selected Papers from the Rhinoceros Conservation Workshop, Skukuza, Kruger National Park, 31 August – 4 September 1988. The procedure used by the Natal Parks Board in the capture of the black rhinoceros Diceros bicornis minor (Drummond, 1876) is outlined. It is emphasised that a successful capture operation requires careful planning and should not be attempted by the uninitiated or by parties who have little or no experience. Dosages of drugs are given, the darting and actual capture procedures are highlighted, and aspects of transport and practical boma management are described.

  4. Statistical Methods in Assembly Quality Management of Multi-Element Products on Automatic Rotor Lines

    Science.gov (United States)

    Pries, V. V.; Proskuriakov, N. E.

    2018-04-01

    To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the devices and systems of an automatic rotor line, there is always a real probability that defective (incomplete) products enter the output process stream. Therefore, continuous sampling control of product completeness, based on statistical methods, remains an important element in managing the assembly quality of multi-element mass products on automatic rotor lines. A feature of continuous sampling control of completeness during assembly is its destructive sorting, which excludes the possibility of returning component parts to the process stream after sampling control and leads to a decrease in the actual productivity of the assembly equipment. Therefore, the use of statistical procedures for continuous sampling control of completeness on automatic rotor lines requires sampling plans that ensure a minimum size of control samples. Comparison of the limits of the average outgoing defect level for the continuous sampling plan (CSP-1) and for the automated continuous sampling plan (ACSP-1) shows that lower limit values of the average outgoing defect level can be provided using ACSP-1. The average sample size when using the ACSP-1 plan is also smaller than when using the CSP-1 plan. Thus, the application of statistical methods in the assembly quality management of multi-element products on automatic rotor lines, involving the proposed plans and methods for continuous sampling control, allows automating the sampling control procedures and ensuring the required quality level of assembled products while minimizing the sample size.
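
    The CSP-1 plan referred to above is the classic continuous sampling plan: inspect every item until i consecutive conforming items are found, then inspect only a random fraction f of items, reverting to 100% inspection when a defect is found. A small Monte-Carlo sketch of that rule is given below; the parameters i, f, and the incoming defect rate are illustrative, and the authors' ACSP-1 modification is not reproduced here.

    ```python
    import random

    def simulate_csp1(p_defect, i=20, f=0.1, n_items=100_000, seed=1):
        """Estimate the average outgoing defect level under a CSP-1 plan.

        p_defect : incoming fraction defective.
        i        : clearance number (consecutive good items before sampling mode).
        f        : sampling fraction while in sampling mode.
        Defects found at inspection are assumed to be removed from the stream.
        """
        rng = random.Random(seed)
        screening, run, passed_defects, passed_total = True, 0, 0, 0
        for _ in range(n_items):
            defective = rng.random() < p_defect
            inspected = screening or rng.random() < f
            if inspected:
                if defective:
                    screening, run = True, 0    # defect found: back to 100% inspection
                    continue                    # defective item removed from the stream
                run += 1
                if screening and run >= i:
                    screening = False           # clearance reached: enter sampling mode
            passed_total += 1
            passed_defects += defective
        return passed_defects / passed_total

    print(f"average outgoing defect level: {simulate_csp1(0.02):.4f}")
    ```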

  5. Automatic supervision and fault detection of PV systems based on power losses analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chouder, A.; Silvestre, S. [Electronic Engineering Department, Universitat Politecnica de Catalunya, C/Jordi Girona 1-3, Campus Nord UPC, 08034 Barcelona (Spain)

    2010-10-15

    In this work, we present a new automatic supervision and fault detection procedure for PV systems, based on power losses analysis. This automatic supervision system has been developed in the Matlab and Simulink environment. It includes parameter extraction techniques to calculate the main PV system parameters from monitoring data under real working conditions, taking into account the evolution of environmental irradiance and module temperature, allowing simulation of the PV system behaviour in real time. The automatic supervision method analyses the output power losses present on the DC side of the PV generator (capture losses). Two new power losses indicators are defined: thermal capture losses (L_ct) and miscellaneous capture losses (L_cm). The processing of these indicators allows the supervision system to generate a faulty signal as an indicator of fault detection in the PV system operation. Two new indicators of the deviation of the DC variables with respect to the simulated ones have also been defined: the current and voltage ratios, R_C and R_V. By analysing both the faulty signal and the current/voltage ratios, the type of fault can be identified. The automatic supervision system has been successfully tested experimentally. (author)
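
    The indicators above are ratios and differences between monitored and simulated DC quantities. A schematic computation is sketched below; the exact definitions of the thermal and miscellaneous capture losses used by the authors are not given in the abstract, so the formulas here (capture losses as reference yield minus array yield, split by a temperature correction, plus simple current/voltage ratios) are assumptions for illustration.

    ```python
    def capture_loss_indicators(g_meas_wm2, t_mod_c, p_dc_w, p_stc_w,
                                i_dc_a, v_dc_v, i_sim_a, v_sim_v,
                                gamma_per_c=-0.004, t_ref_c=25.0, g_stc=1000.0):
        """Rough capture-loss and deviation indicators from one monitoring sample."""
        y_r = g_meas_wm2 / g_stc                       # reference yield
        y_a = p_dc_w / p_stc_w                         # array yield
        l_c = y_r - y_a                                # total capture losses
        # Assumed split: the part explained by module temperature vs. the rest.
        l_ct = y_r * (-gamma_per_c) * (t_mod_c - t_ref_c)
        l_cm = l_c - l_ct
        r_c = i_dc_a / i_sim_a                         # measured / simulated current
        r_v = v_dc_v / v_sim_v                         # measured / simulated voltage
        return {"L_c": l_c, "L_ct": l_ct, "L_cm": l_cm, "R_C": r_c, "R_V": r_v}

    # Hypothetical sample: 800 W/m2, hot module, output below the simulated values.
    print(capture_loss_indicators(800, 55, 1400, 2000, 5.5, 260, 6.4, 265))
    ```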

  6. Automatic supervision and fault detection of PV systems based on power losses analysis

    International Nuclear Information System (INIS)

    Chouder, A.; Silvestre, S.

    2010-01-01

    In this work, we present a new automatic supervision and fault detection procedure for PV systems, based on power losses analysis. This automatic supervision system has been developed in the Matlab and Simulink environment. It includes parameter extraction techniques to calculate the main PV system parameters from monitoring data under real working conditions, taking into account the evolution of environmental irradiance and module temperature, allowing simulation of the PV system behaviour in real time. The automatic supervision method analyses the output power losses present on the DC side of the PV generator (capture losses). Two new power losses indicators are defined: thermal capture losses (L ct) and miscellaneous capture losses (L cm). The processing of these indicators allows the supervision system to generate a faulty signal as an indicator of fault detection in the PV system operation. Two new indicators of the deviation of the DC variables with respect to the simulated ones have also been defined: the current and voltage ratios, R C and R V. By analysing both the faulty signal and the current/voltage ratios, the type of fault can be identified. The automatic supervision system has been successfully tested experimentally.

  7. Comparison of automatic traps to capture mosquitoes (Diptera: Culicidae) in rural areas in the tropical Atlantic rainforest

    Directory of Open Access Journals (Sweden)

    Ivy Luizi Rodrigues de Sa

    2013-12-01

    In several countries, surveillance of insect vectors is accomplished with automatic traps. This study addressed the performance of the Mosquito Magnet® Independence (MMI) in comparison with that of a CDC trap with CO2 and lactic acid (CDC-A) and a CDC light trap (CDC-LT). The collection sites were in a rural region located in a fragment of secondary tropical Atlantic rainforest, southeastern Brazil. Limatus durhami and Limatus flavisetosus were the dominant species in the MMI, whereas Ochlerotatus scapularis was most abundant in the CDC-A. Culex ribeirensis and Culex sacchettae were the dominant species in the CDC-LT. Comparisons among traps were based on diversity indices. Results from the diversity analyses showed that the MMI captured a higher abundance of mosquitoes and that the species richness estimated with it was higher than with the CDC-LT. In contrast, the difference between the MMI and CDC-A was not statistically significant. Consequently, the latter trap seems to be both an alternative to the MMI and complementary to it for ecological studies and entomological surveillance.

  8. Automatically high accurate and efficient photomask defects management solution for advanced lithography manufacture

    Science.gov (United States)

    Zhu, Jun; Chen, Lijun; Ma, Lantao; Li, Dejian; Jiang, Wei; Pan, Lihong; Shen, Huiting; Jia, Hongmin; Hsiang, Chingyun; Cheng, Guojie; Ling, Li; Chen, Shijie; Wang, Jun; Liao, Wenkui; Zhang, Gary

    2014-04-01

    Defect review is a time-consuming job, and human error makes the results inconsistent. Defects located in don't-care areas, such as defects in dark areas, do not hurt the yield and need not be reviewed. However, critical-area defects, such as defects in clear areas, can impact yield dramatically and need more attention during review. With decreasing integrated circuit dimensions, thousands of mask defects, or even more, are typically detected during an inspection. Traditional manual or simple classification approaches are unable to meet efficiency and accuracy requirements. This paper focuses on an automatic defect management and classification solution using the image output of Lasertec inspection equipment and Anchor pattern-centric image processing technology. Since the number of mask defects found during an inspection is usually in the range of thousands or more, this system is designed to handle large numbers of defects with quick and accurate classification. Our experiments include Die-to-Die and Single-Die modes, in which the classification accuracy reaches 87.4% and 93.3%, respectively. No critical or printable defects are missed in our test cases. The rates of defects missed by the classification are 0.25% and 0.24% in Die-to-Die and Single-Die modes, respectively. This kind of miss rate is encouraging and acceptable for application on a production line. The results can be output and reloaded back into the inspection machine for further review. This step helps users validate uncertain defects with clear, magnified images when the captured images cannot provide enough information to make a judgment. This system effectively reduces expensive inline defect review time. As a fully inline automated defect management solution, the system could be compatible with the current inspection approach and integrated with optical simulation, including a scoring function, to guide wafer-level defect inspection.

  9. A data management program for the Electra 800 automatic analyser.

    Science.gov (United States)

    Cambus, J P; Nguyen, F; de Graeve, J; Aragon, B; Valdiguie, P

    1994-10-01

    The Electra 800 automatic coagulation analyser rapidly performs most chronometric coagulation tests with high precision. To facilitate data handling, software, adaptable to any PC running under MS-DOS, was written to manage the analyser. Data are automatically collected via the RS232 interface or can be manually input. The software can handle 64 different analyses, all entirely 'user defined'. An 'electronic worksheet' presents the results in pages of ten patients. This enables the operator to assess the data and to perform verifications or complementary tests if necessary. All results outside a predetermined range can be flagged and results can be deleted, modified or added. A patient's previous files can be recalled as the data are archived at the end of the day. A 120 Mb disk can store approximately 130,000 patient files. A daily archive function can print the day's work in alphabetical order. A communication protocol allows connection to a mainframe computer. This program and the user's manual are available on request, free of charge, from the authors.

  10. Adoption of automatic identification systems by grocery retailers in the Johannesburg area

    Directory of Open Access Journals (Sweden)

    Christopher C. Darlington

    2011-11-01

    Full Text Available Retailers not only need the right data capture technology to meet the requirements of their applications, they must also decide on what the optimum technology is from the different symbologies that have been developed over the years. Automatic identification systems (AIS are a priority to decision makers as they attempt to obtain the best blend of equipment to ensure greater loss prevention and higher reliability in data capture. However there is a risk of having too simplistic a view of adopting AIS, since no one solution is applicable across an industry or business model. This problem is addressed through an exploratory, descriptive study, where the nature and value of AIS adoption by grocery retailers in the Johannesburg area is interrogated. Mixed empirical results indicate that, as retailers adopt AIS in order to improve their supply chain management systems, different types of applications are associated with various constraints and opportunities. Overall this study is in line with previous research that supports the notion that supply chain decisions are of a strategic nature even though efficient management of information is a day-to-day business operational decision.

  11. Development and preliminary validation of an automatic digital ...

    African Journals Online (AJOL)

    Amanda Chulayo

    2017-10-02

    Oct 2, 2017 ... based on an automatic digital analysis system (ADAS) that allows the capture of a series of real-time ... image analysing technology, with the accelerated advance of the hardware and software ..... Enables use of car battery.

  12. An Automatic Video Meteor Observation Using UFO Capture at the Showa Station

    Science.gov (United States)

    Fujiwara, Y.; Nakamura, T.; Ejiri, M.; Suzuki, H.

    2012-05-01

    The goal of our study is to clarify meteor activity in the southern hemisphere by continuous optical observations with video cameras, with automatic meteor detection and recording, at Syowa Station, Antarctica.

  13. Neural correlates of an early attentional capture by positive distractor words.

    Science.gov (United States)

    Hinojosa, José A; Mercado, Francisco; Albert, Jacobo; Barjola, Paloma; Peláez, Irene; Villalba-García, Cristina; Carretié, Luis

    2015-01-01

    Exogenous or automatic attention to emotional distractors has been observed for emotional scenes and faces. In the language domain, however, automatic attention capture by emotional words has been scarcely investigated. In the current event-related potentials study we explored distractor effects elicited by positive, negative and neutral words in a concurrent but distinct target distractor paradigm. Specifically, participants performed a digit categorization task in which task-irrelevant words were flanked by numbers. The results of both temporo-spatial principal component and source location analyses revealed the existence of early distractor effects that were specifically triggered by positive words. At the scalp level, task-irrelevant positive compared to neutral and negative words elicited larger amplitudes in an anterior negative component that peaked around 120 ms. Also, at the voxel level, positive distractor words increased activity in orbitofrontal regions compared to negative words. These results suggest that positive distractor words quickly and automatically capture attentional resources diverting them from the task where attention was voluntarily directed.

  14. Neural correlates of an early attentional capture by positive distractor words

    Directory of Open Access Journals (Sweden)

    José Antonio Hinojosa

    2015-01-01

    Full Text Available Exogenous or automatic attention to emotional distractors has been observed for emotional scenes and faces. In the language domain, however, automatic attention capture by emotional words has been scarcely investigated. In the current event-related potentials study we explored distractor effects elicited by positive, negative and neutral words in a concurrent but distinct target distractor paradigm. Specifically, participants performed a digit categorization task in which task-irrelevant words were flanked by numbers. The results of both temporo-spatial principal component and source location analyses revealed the existence of early distractor effects that were specifically triggered by positive words. At the scalp level, task-irrelevant positive compared to neutral and negative words elicited larger amplitudes in an anterior negative component that peaked around 120 ms. Also, at the voxel level, positive distractor words increased activity in orbitofrontal regions compared to negative words. These results suggest that positive distractor words quickly and automatically capture attentional resources diverting them from the task where attention was voluntarily directed.

  15. Voice Quality Measuring Setup with Automatic Voice over IP Call Generator and Lawful Interception Packet Analyzer

    Directory of Open Access Journals (Sweden)

    PLEVA Matus

    Full Text Available This paper describes a packet measuring laboratory setup, which could also be used for lawful interception applications, using a professional packet analyzer, a Voice over IP call generator, a free call server (an Asterisk Linux setup) and the appropriate software and hardware described below. The setup was used for measuring the quality of automatically generated VoIP calls under stressed network conditions, when the call manager server was flooded with high-bandwidth traffic near the bandwidth limit of the connected switch. The call generator places 30 calls simultaneously, and the packet capturer and analyzer can decode the VoIP traffic, extract the RTP session data, automatically analyze the voice quality using standardized MOS (Mean Opinion Score) values, and also identify the source of the voice degradation (jitter, packet loss, codec, delay, etc.).
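
    One way an analyzer can map measured RTP impairments to a MOS value is through the ITU-T E-model; the sketch below uses a heavily simplified R-factor calculation with illustrative coefficients, not the analyzer's exact algorithm.

```python
def r_factor(delay_ms, packet_loss_pct, jitter_ms):
    """Very simplified E-model: start from R0 = 93.2 and subtract delay and
    loss impairments (coefficients are illustrative, G.711-like codec)."""
    r = 93.2
    # Effective latency includes a jitter allowance.
    d = delay_ms + 2 * jitter_ms + 10
    r -= 0.024 * d
    if d > 177.3:
        r -= 0.11 * (d - 177.3)
    # Loss impairment.
    r -= 2.5 * packet_loss_pct
    return max(r, 0.0)

def mos_from_r(r):
    """Standard mapping from the E-model R-factor to a Mean Opinion Score."""
    if r < 0:
        return 1.0
    if r > 100:
        return 4.5
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

# Example: 80 ms delay, 1% packet loss, 15 ms jitter.
print(round(mos_from_r(r_factor(delay_ms=80, packet_loss_pct=1.0, jitter_ms=15)), 2))
```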

  16. USING AFFORDABLE DATA CAPTURING DEVICES FOR AUTOMATIC 3D CITY MODELLING

    Directory of Open Access Journals (Sweden)

    B. Alizadehashrafi

    2017-11-01

    Full Text Available In this research project, many videos of UTM Kolej 9, Skudai, Johor Bahru (see Figure 1) were taken by an AR.Drone 2.0. Since the AR.Drone 2.0 has a liquid lens, there were significant distortions and deformations in the frames converted from the videos captured in flight. Passive remote sensing (RS) applications based on image matching and epipolar lines, such as Agisoft PhotoScan, were tested to create the point clouds and mesh along with 3D models and textures. As the result was not acceptable (see Figure 2), the previous Dynamic Pulse Function, based on the Ruby programming language, was enhanced and utilized to create the 3D models automatically in LoD3. The accuracy of the final 3D model is approximately 10 to 20 cm. After rectification and parallel projection of the photos based on some tie points and targets, all the parameters were measured and used as input to the system to create the 3D model automatically in LoD3 with very high accuracy.

  17. Using Affordable Data Capturing Devices for Automatic 3d City Modelling

    Science.gov (United States)

    Alizadehashrafi, B.; Abdul-Rahman, A.

    2017-11-01

    In this research project, many videos of UTM Kolej 9, Skudai, Johor Bahru (see Figure 1) were taken by an AR.Drone 2.0. Since the AR.Drone 2.0 has a liquid lens, there were significant distortions and deformations in the frames converted from the videos captured in flight. Passive remote sensing (RS) applications based on image matching and epipolar lines, such as Agisoft PhotoScan, were tested to create the point clouds and mesh along with 3D models and textures. As the result was not acceptable (see Figure 2), the previous Dynamic Pulse Function, based on the Ruby programming language, was enhanced and utilized to create the 3D models automatically in LoD3. The accuracy of the final 3D model is approximately 10 to 20 cm. After rectification and parallel projection of the photos based on some tie points and targets, all the parameters were measured and used as input to the system to create the 3D model automatically in LoD3 with very high accuracy.

  18. An automatic, stagnation point based algorithm for the delineation of Wellhead Protection Areas

    Science.gov (United States)

    Tosco, Tiziana; Sethi, Rajandrea; di Molfetta, Antonio

    2008-07-01

    Time-related capture areas are usually delineated using the backward particle tracking method, releasing circles of equally spaced particles around each well. In this way, an accurate delineation often requires both a very high number of particles and a manual capture zone encirclement. The aim of this work was to propose an Automatic Protection Area (APA) delineation algorithm, which can be coupled with any model of flow and particle tracking. The computational time is reduced thanks to the use of a limited number of non-equally spaced particles. The particle starting positions are determined by coupling forward particle tracking from the stagnation point and backward particle tracking from the pumping well. The pathlines are post-processed for a completely automatic delineation of closed perimeters of time-related capture zones. The APA algorithm was tested for a two-dimensional geometry, in homogeneous and nonhomogeneous aquifers, steady state flow conditions, single and multiple wells. Results show that the APA algorithm is robust and able to automatically and accurately reconstruct protection areas with a very small number of particles, also in complex scenarios.
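
    The backward-tracking building block of such an algorithm can be sketched as follows; the analytic velocity field (uniform ambient flow plus a single pumping well), porosity, pumping rate and time horizon are all placeholder assumptions, and the stagnation-point seeding and perimeter post-processing of the APA algorithm are not reproduced here.

```python
import numpy as np
from scipy.integrate import solve_ivp

def velocity(t, xy, q_well=1e-3, well=(0.0, 0.0), u_ambient=(1e-5, 0.0), porosity=0.3):
    """Seepage velocity of uniform ambient flow plus an extraction well
    (2D, unit aquifer thickness); purely illustrative analytic field."""
    x, y = xy - np.asarray(well)
    r2 = x**2 + y**2 + 1e-12
    ux = u_ambient[0] - q_well * x / (2 * np.pi * r2)   # flow drawn toward the well
    uy = u_ambient[1] - q_well * y / (2 * np.pi * r2)
    return np.array([ux, uy]) / porosity

def backward_pathline(start, t_years=1.0):
    """Backward particle tracking: integrate the reversed velocity field from a
    point released near the well over the protection time horizon."""
    t_end = t_years * 365.25 * 86400.0
    sol = solve_ivp(lambda t, xy: -velocity(t, np.asarray(xy)), (0.0, t_end),
                    np.asarray(start, dtype=float), max_step=t_end / 200)
    return sol.y.T  # pathline vertices; their envelope approximates the capture zone

# A few particles released on a small circle around the well (illustrative seeding).
zone = [backward_pathline((0.5 * np.cos(a), 0.5 * np.sin(a)))
        for a in np.linspace(0, 2 * np.pi, 8, endpoint=False)]
```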

  19. Managing Returnable Containers Logistics - A Case Study Part II - Improving Visibility through Using Automatic Identification Technologies

    Directory of Open Access Journals (Sweden)

    Gretchen Meiser

    2011-05-01

    Full Text Available This case study is the result of a project conducted on behalf of a company that uses its own returnable containers to transport purchased parts from suppliers. The objective of this project was to develop a proposal to enable the company to more effectively track and manage its returnable containers. The research activities in support of this project included (1) the analysis and documentation of the physical flow and the information flow associated with the containers and (2) the investigation of new technologies to improve the automatic identification and tracking of containers. This paper explains the automatic identification technologies and important criteria for selection. A companion paper details the flow of information and containers within the logistics chain, and it identifies areas for improving the management of the containers.

  20. The Impact of Lecture Capture Presentations in a Distributed Learning Environment in Parks, Recreation, and Tourism Management

    Science.gov (United States)

    Vassar, Penny; Havice, Pamela A.; Havice, William L.; Brookover, Robert, IV

    2015-01-01

    Lecture capture technology allows instructors to record presentations and make them available to their students digitally. This study examined one program's implementation of lecture capture. Participants were undergraduate college students enrolled in Parks, Recreation, and Tourism Management courses at a public land grant university in the…

  1. Attention capture by contour onsets and offsets: no special role for onsets.

    Science.gov (United States)

    Watson, D G; Humphreys, G W

    1995-07-01

    In five experiments, we investigated the power of targets defined by the onset or offset of one of an object's parts (contour onsets and offsets) either to guide or to capture visual attention. In Experiment 1, search for a single contour onset target was compared with search for a single contour offset target against a static background of distractors; no difference was found between the efficiency with which each could be detected. In Experiment 2, onsets and offsets were compared for automatic attention capture, when both occurred simultaneously. Unlike in previous studies, the effects of overall luminance change, new-object creation, and number of onset and offset items were controlled. It was found that contour onset and offset items captured attention equally well. However, display size effects on both target types were also apparent. Such effects may have been due to competition for selection between multiple onset and offset stimuli. In Experiments 3 and 4, single onset and offset stimuli were presented simultaneously and pitted directly against one another among a background of static distractors. In Experiment 3, we examined "guided search," for a target that was formed either from an onset or from an offset among static items. In Experiment 4, the onsets and offsets were uncorrelated with the target location. Similar results occurred in both experiments: target onsets and offsets were detected more efficiently than static stimuli which needed serial search; there remained effects of display size on performance; but there was still no advantage for onsets. In Experiment 5, we examined automatic attention capture by single onset and offset stimuli presented individually among static distractors. Again, there was no advantage for onset over offset targets and a display size effect was also present. These results suggest that, both in isolation and in competition, onsets that do not form new objects neither guide nor gain automatic attention more efficiently

  2. A model to capture and manage tacit knowledge using a multiagent system

    Science.gov (United States)

    Paolino, Lilyam; Paggi, Horacio; Alonso, Fernando; López, Genoveva

    2014-10-01

    This article presents a model to capture and register business tacit knowledge from different sources, using an expert multiagent system that enables the entry of incidents and captures the tacit knowledge that could resolve them. The knowledge and its sources are evaluated through the application of trustworthiness algorithms, so that only the best of each is registered in the database. Through its intelligent software agents, the system interacts with the administrator, the users, the knowledge sources and any communities of practice that may exist in the business. Both the sources and the knowledge are constantly evaluated, before registration and afterwards, in order to decide whether their original weighting is kept or modified. When better knowledge becomes available, the new knowledge is registered in place of the old. This work is part of an ongoing investigation into knowledge management methodologies for managing tacit business knowledge, aimed at facilitating business competitiveness and fostering innovation and learning.

  3. Automatic control of commercial computer programs

    International Nuclear Information System (INIS)

    Rezvov, B.A.; Artem'ev, A.N.; Maevskij, A.G.; Demkiv, A.A.; Kirillov, B.F.; Belyaev, A.D.; Artem'ev, N.A.

    2010-01-01

    A method for the automatic control of commercial computer programs is presented. A connection was developed between the automation system of the EXAFS spectrometer (managed by a PC running DOS) and the commercial program controlling the CCD detector (managed by a PC running Windows). The described combined system is used to automate the processing of intermediate amplitude spectra in EXAFS spectrum measurements at the Kurchatov SR source

  4. Neural correlates of an early attentional capture by positive distractor words

    OpenAIRE

    Hinojosa, José A.; Mercado, Francisco; Albert, Jacobo; Barjola, Paloma; Peláez, Irene; Villalba-García, Cristina; Carretié, Luis

    2015-01-01

    Exogenous or automatic attention to emotional distractors has been observed for emotional scenes and faces. In the language domain, however, automatic attention capture by emotional words has been scarcely investigated. In the current event-related potentials study we explored distractor effects elicited by positive, negative and neutral words in a concurrent but distinct target distractor paradigm. Specifically, participants performed a digit categorization task in which task-irrelevant words...

  5. Neural correlates of an early attentional capture by positive distractor words

    OpenAIRE

    José Antonio Hinojosa; Francisco Mercado; Jacobo Albert; Paloma Barjola; Irene Peláez; Cristina Villalba-García; Luis Carretié

    2015-01-01

    Exogenous or automatic attention to emotional distractors has been observed for emotional scenes and faces. In the language domain, however, automatic attention capture by emotional words has been scarcely investigated. In the current event-related potentials study we explored distractor effects elicited by positive, negative and neutral words in a concurrent but distinct target distractor paradigm. Specifically, participants performed a digit categorization task in which task-irrelevant wor...

  6. Summer Student Work Project Report: SCADA Bridge Tool Development Automatically Capturing Data from SCADA to the Maintenance System

    CERN Document Server

    Alhambra-Moron, Alfonso

    2015-01-01

    The main purpose of this report is to summarize the work project I have been doing at CERN during the last 3 months as a Summer Student. My name is Alfonso Alhambra Morón and on the 8th of June 2015 I joined the EN-HE-LM team as a summer student supervised by Damien Lafarge in order to collaborate in the automation of the transfer of meter readings from SCADA to Infor EAM, the computerized maintenance management system at CERN. The main objective of my project was to enable automatic updates of meters in Infor EAM by fetching data from SCADA, so as to automate a process that was previously done manually and consumed resources: the meter had to be consulted physically, the information imported into Infor EAM by hand, and the errors that occur when doing all of this manually detected and corrected. This problem is shared by several other teams at CERN apart from the Lift Maintenance team, and for this reason the main targets I had when developing my solution were flexibility and scalability so as to make...

  7. Task-irrelevant own-race faces capture attention: eye-tracking evidence.

    Science.gov (United States)

    Cao, Rong; Wang, Shuzhen; Rao, Congquan; Fu, Jia

    2013-04-01

    To investigate attentional capture by face's race, the current study recorded saccade latencies of eye movement measurements in an inhibition of return (IOR) task. Compared to Caucasian (other-race) faces, Chinese (own-race) faces elicited longer saccade latency. This phenomenon disappeared when faces were inverted. The results indicated that own-race faces capture attention automatically with high-level configural processing. © 2013 The Authors. Scandinavian Journal of Psychology © 2013 The Scandinavian Psychological Associations.

  8. A case of malignant hyperthermia captured by an anesthesia information management system.

    Science.gov (United States)

    Maile, Michael D; Patel, Rajesh A; Blum, James M; Tremper, Kevin K

    2011-04-01

    Many cases of malignant hyperthermia triggered by volatile anesthetic agents have been described. However, to our knowledge, there has not been a report describing the precise changes in physiologic data of a human suffering from this process. Here we describe a case of malignant hyperthermia in which monitoring information was frequently and accurately captured by an anesthesia information management system.

  9. A Survey of Advances in Vision-Based Human Motion Capture and Analysis

    DEFF Research Database (Denmark)

    Moeslund, Thomas B.; Hilton, Adrian; Krüger, Volker

    2006-01-01

    This survey reviews advances in human motion capture and analysis from 2000 to 2006, following a previous survey of papers up to 2000. Human motion capture continues to be an increasingly active research area in computer vision, with over 350 publications over this period. A number of significant...... actions and behavior. This survey reviews recent trends in video-based human capture and analysis, as well as discussing open problems for future research to achieve automatic visual analysis of human movement....

  10. Automatic guidance of attention during real-world visual search.

    Science.gov (United States)

    Seidl-Rathkopf, Katharina N; Turk-Browne, Nicholas B; Kastner, Sabine

    2015-08-01

    Looking for objects in cluttered natural environments is a frequent task in everyday life. This process can be difficult, because the features, locations, and times of appearance of relevant objects often are not known in advance. Thus, a mechanism by which attention is automatically biased toward information that is potentially relevant may be helpful. We tested for such a mechanism across five experiments by engaging participants in real-world visual search and then assessing attentional capture for information that was related to the search set but was otherwise irrelevant. Isolated objects captured attention while preparing to search for objects from the same category embedded in a scene, as revealed by lower detection performance (Experiment 1A). This capture effect was driven by a central processing bottleneck rather than the withdrawal of spatial attention (Experiment 1B), occurred automatically even in a secondary task (Experiment 2A), and reflected enhancement of matching information rather than suppression of nonmatching information (Experiment 2B). Finally, attentional capture extended to objects that were semantically associated with the target category (Experiment 3). We conclude that attention is efficiently drawn towards a wide range of information that may be relevant for an upcoming real-world visual search. This mechanism may be adaptive, allowing us to find information useful for our behavioral goals in the face of uncertainty.

  11. Automatic guidance of attention during real-world visual search

    Science.gov (United States)

    Seidl-Rathkopf, Katharina N.; Turk-Browne, Nicholas B.; Kastner, Sabine

    2015-01-01

    Looking for objects in cluttered natural environments is a frequent task in everyday life. This process can be difficult, as the features, locations, and times of appearance of relevant objects are often not known in advance. A mechanism by which attention is automatically biased toward information that is potentially relevant may thus be helpful. Here we tested for such a mechanism across five experiments by engaging participants in real-world visual search and then assessing attentional capture for information that was related to the search set but was otherwise irrelevant. Isolated objects captured attention while preparing to search for objects from the same category embedded in a scene, as revealed by lower detection performance (Experiment 1A). This capture effect was driven by a central processing bottleneck rather than the withdrawal of spatial attention (Experiment 1B), occurred automatically even in a secondary task (Experiment 2A), and reflected enhancement of matching information rather than suppression of non-matching information (Experiment 2B). Finally, attentional capture extended to objects that were semantically associated with the target category (Experiment 3). We conclude that attention is efficiently drawn towards a wide range of information that may be relevant for an upcoming real-world visual search. This mechanism may be adaptive, allowing us to find information useful for our behavioral goals in the face of uncertainty. PMID:25898897

  12. Multi-level, automatic file management system using magnetic disk, mass storage system and magnetic tape

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1979-12-01

    A simple, effective file management system using magnetic disk, mass storage system (MSS) and magnetic tape is described. The following concepts and techniques are introduced in this file management system. (1) The distribution of files and the continuity character of file references are closely approximated by a memory retention function; a density function based on this retention function is thus defined. (2) A method of computing the cost/benefit lines for magnetic disk, MSS and magnetic tape is presented. (3) A decision process for an optimal organization of file facilities, incorporating the distribution of file demands to the respective file devices, is presented. (4) A method of simple, practical, effective, automatic file management, incorporating multi-level file management, space management and file migration control, is proposed. (author)
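
    A minimal sketch of a retention-driven tier decision in the spirit of the scheme above; the exponential decay stand-in for the memory retention function, the half-life and the thresholds are assumptions, not the paper's density function.

```python
import time

def retention_score(last_access_epoch, now=None, half_life_days=30.0):
    """Exponential 'memory retention' of a file: 1.0 right after access,
    decaying with a configurable half-life (a stand-in for the retention function)."""
    now = now or time.time()
    age_days = (now - last_access_epoch) / 86400.0
    return 0.5 ** (age_days / half_life_days)

def choose_tier(score, disk_threshold=0.5, mss_threshold=0.05):
    """Map the retention score onto the three-level storage hierarchy."""
    if score >= disk_threshold:
        return "magnetic disk"        # hot: keep on fast storage
    if score >= mss_threshold:
        return "mass storage system"  # warm: migrate to MSS
    return "magnetic tape"            # cold: archive to tape

# Hypothetical files with last-access times 2 and 200 days ago.
files = {"run_0423.dat": time.time() - 2 * 86400, "run_0101.dat": time.time() - 200 * 86400}
for name, last in files.items():
    print(name, choose_tier(retention_score(last)))
```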

  13. Terminal area automatic navigation, guidance and control research using the Microwave Landing System (MLS). Part 5: Design and development of a Digital Integrated Automatic Landing System (DIALS) for steep final approach using modern control techniques

    Science.gov (United States)

    Halyo, N.

    1983-01-01

    The design and development of a 3-D Digital Integrated Automatic Landing System (DIALS) for the Terminal Configured Vehicle (TCV) Research Aircraft, a B-737-100, is described. The system was designed using sampled data Linear Quadratic Gaussian (LQG) methods, resulting in a direct digital design with a modern control structure which consists of a Kalman filter followed by a control gain matrix, all operating at 10 Hz. DIALS uses Microwave Landing System (MLS) position, body-mounted accelerometers, as well as on-board sensors usually available on commercial aircraft, but does not use inertial platforms. The phases of the final approach considered are the localizer and glideslope capture, which may be performed simultaneously, localizer and steep glideslope track or hold, crab/decrab and flare to touchdown. DIALS captures, tracks and flares from steep glideslopes ranging from 2.5 deg to 5.5 deg, selected prior to glideslope capture. DIALS is the first modern-control-design automatic landing system to be successfully flight tested. The results of an initial nonlinear simulation are presented here.
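
    The control structure described, a Kalman filter followed by a control gain matrix running at 10 Hz, can be sketched generically as below; the state-space matrices and gains are small placeholder values, not the DIALS design.

```python
import numpy as np

class KalmanStateFeedback:
    """One 10 Hz cycle of 'Kalman filter followed by a control gain matrix'.
    A, B, C form a placeholder discrete-time model; L is a steady-state Kalman
    gain and K a state-feedback gain (illustrative values only)."""

    def __init__(self, A, B, C, L, K, x0):
        self.A, self.B, self.C, self.L, self.K = A, B, C, L, K
        self.x_hat = x0

    def step(self, y_meas):
        # Measurement update of the state estimate.
        self.x_hat = self.x_hat + self.L @ (y_meas - self.C @ self.x_hat)
        # Control law: state feedback on the estimate.
        u = -self.K @ self.x_hat
        # Time update (prediction) to the next 0.1 s sample using the applied control.
        self.x_hat = self.A @ self.x_hat + self.B @ u
        return u

# Placeholder 2-state example (e.g. glideslope deviation and its rate).
A = np.array([[1.0, 0.1], [0.0, 1.0]]); B = np.array([[0.005], [0.1]])
C = np.array([[1.0, 0.0]]); L = np.array([[0.3], [0.5]]); K = np.array([[2.0, 1.2]])
ctrl = KalmanStateFeedback(A, B, C, L, K, x0=np.zeros(2))
u = ctrl.step(y_meas=np.array([0.2]))
```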

  14. Algorithms for the automatic identification of MARFEs and UFOs in JET database of visible camera videos

    International Nuclear Information System (INIS)

    Murari, A.; Camplani, M.; Cannas, B.; Usai, P.; Mazon, D.; Delaunay, F.

    2010-01-01

    MARFE instabilities and UFOs leave clear signatures in JET fast visible camera videos. Given the potentially harmful consequences of these events, particularly as triggers of disruptions, it would be important to have the means of detecting them automatically. In this paper, the results of various algorithms to identify automatically the MARFEs and UFOs in JET visible videos are reported. The objective is to retrieve the videos which have captured these events, exploring the whole JET database of images, as a preliminary step to the development of real-time identifiers in the future. For the detection of MARFEs, a complete identifier has been finalized, using morphological operators and Hu moments. The final algorithm manages to identify the videos with MARFEs with a success rate exceeding 80%. Due to the lack of a complete statistics of examples, the UFO identifier is less developed, but a preliminary code can detect UFOs quite reliably. (authors)
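
    The image-processing building blocks named above (morphological operators and Hu moments) might be combined per frame roughly as sketched below with OpenCV; the threshold, structuring element and nearest-template rule are placeholders, not the JET identifier.

```python
import cv2
import numpy as np

def frame_features(frame_gray, thresh=60):
    """Segment bright emission with a threshold plus morphological closing, then
    describe the largest blob by its seven Hu moments (log-scaled)."""
    _, mask = cv2.threshold(frame_gray, thresh, 255, cv2.THRESH_BINARY)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    hu = cv2.HuMoments(cv2.moments(blob)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

def looks_like_marfe(features, template, tol=1.5):
    """Flag a frame when its Hu-moment vector is close to a MARFE template
    (the template and tolerance are illustrative)."""
    return features is not None and np.linalg.norm(features - template) < tol
```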

  15. KIT/KPS of Qinshan phase-II and a discussion on integrated information management and automatic control

    International Nuclear Information System (INIS)

    Yan Changhui

    2001-01-01

    The Centralized Data Processing and Safety Panel system (KIT/KPS) of the Qinshan Phase-II power project is described. The necessity of, and an engineering scheme for, integrated information management and automatic control to be achieved in the power plant are presented, based on the technical scheme and characteristics of KIT/KPS

  16. Automatic emotional expression analysis from eye area

    Science.gov (United States)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on the features that were automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis through discrete wavelet transformation were obtained from the eye area. Using these parameters, emotional expression analysis was performed through artificial intelligence techniques. As the result of the experimental studies, 6 universal emotions consisting of expressions of happiness, sadness, surprise, disgust, anger and fear were classified at a success rate of 84% using artificial neural networks.
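
    A sketch of the described feature/classifier chain, a discrete wavelet decomposition of the eye region followed by an artificial neural network, using PyWavelets and scikit-learn; the wavelet, decomposition level, network size and the synthetic training data are assumptions.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def eye_region_features(eye_img, wavelet="db4", level=2):
    """2-D discrete wavelet decomposition of a grayscale eye area; simple
    statistics of each sub-band serve as the feature vector."""
    coeffs = pywt.wavedec2(eye_img, wavelet=wavelet, level=level)
    bands = [coeffs[0]] + [b for detail in coeffs[1:] for b in detail]
    return np.array([stat(b) for b in bands for stat in (np.mean, np.std)])

# Synthetic stand-in data: random "eye images", labels = 6 basic emotions.
rng = np.random.default_rng(0)
X = np.vstack([eye_region_features(rng.random((32, 64))) for _ in range(60)])
y = np.repeat(np.arange(6), 10)  # happiness, sadness, surprise, disgust, anger, fear

clf = MLPClassifier(hidden_layer_sizes=(30,), max_iter=2000, random_state=0).fit(X, y)
print(clf.predict(X[:3]))
```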

  17. Data Capture Technique for High Speed Signaling

    Science.gov (United States)

    Barrett, Wayne Melvin; Chen, Dong; Coteus, Paul William; Gara, Alan Gene; Jackson, Rory; Kopcsay, Gerard Vincent; Nathanson, Ben Jesse; Vranas, Paylos Michael; Takken, Todd E.

    2008-08-26

    A data capture technique for high speed signaling to allow for optimal sampling of an asynchronous data stream. This technique allows for extremely high data rates and does not require that a clock be sent with the data as is done in source synchronous systems. The present invention also provides a hardware mechanism for automatically adjusting transmission delays for optimal two-bit simultaneous bi-directional (SiBiDi) signaling.

  18. S-ketamine influences strategic allocation of attention but not exogenous capture of attention.

    Science.gov (United States)

    Fuchs, Isabella; Ansorge, Ulrich; Huber-Huber, Christoph; Höflich, Anna; Lanzenberger, Rupert

    2015-09-01

    We investigated whether s-ketamine differentially affects strategic allocation of attention. In Experiment 1, (1) a less visible cue was weakly masked by the onsets of competing placeholders or (2) a better visible cue was not masked because it was presented in isolation. Both types of cue appeared more often opposite of the target (75%) than at target position (25%). With this setup, we tested for strategic attention shifts to the opposite side of the cues and for exogenous attentional capture toward the cue's side in a short cue-target interval, as well as for (reverse) cueing effects in a long cue-target interval after s-ketamine and after placebo treatment in a double-blind within-participant design. We found reduced strategic attention shifts after cues presented without placeholders for the s-ketamine compared to the placebo treatment in the short interval, indicating an early effect on the strategic allocation of attention. No differences between the two treatments were found for exogenous attentional capture by less visible cues, suggesting that s-ketamine does not affect exogenous attentional capture in the presence of competing distractors. Experiment 2 confirmed that the competing onsets of the placeholders prevented the strategic cueing effect. Taken together, the results indicate that s-ketamine affects strategic attentional capture, but not exogenous attentional capture. The findings point to a more prominent role of s-ketamine during top-down controlled forms of attention that require suppression of automatic capture than during automatic capture itself. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Automatic parallelization of while-Loops using speculative execution

    International Nuclear Information System (INIS)

    Collard, J.F.

    1995-01-01

    Automatic parallelization of imperative sequential programs has focused on nests of for-loops. The most recent techniques consist in finding an affine mapping with respect to the loop indices that simultaneously captures the temporal and spatial properties of the parallelized program. Such a mapping is usually called a "space-time transformation." This work describes an extension of these techniques to while-loops using speculative execution. We show that space-time transformations are a good framework for subsuming previous restructuring techniques for while-loops, such as pipelining. Moreover, we show that these transformations can be derived and applied automatically

  20. Knowledge management: an analysis of the tools of expert knowledge capture

    International Nuclear Information System (INIS)

    Fernandez Larcher, A.

    2009-01-01

    This work proposes a review of the strategies and tools used to elicit and capture expert knowledge, particularly those suggested by EPRI and the IAEA. The main objective of this paper is to examine the effectiveness and scope of the proposed methodologies, in order to apply them and adapt them to our institutional context. The article emphasizes the value and usefulness of interview methods, with the aim of applying some of them to the activities created and organized by the CNEA Nuclear Knowledge Management Group, especially the ConRRad Project. (author)

  1. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction

    Science.gov (United States)

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical properties of natural images, so its detection performance is not satisfying. PCA has been extended into kernel PCA in order to capture the higher-order statistics. However, thus far no researchers have explicitly proposed kernel FKT (KFKT) or studied its detection performance. For accurately detecting potential small targets from infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. Then a framework based on Kalman prediction and KFKT, which can automatically detect and track small targets, is developed. Experimental results show that KFKT outperforms FKT and that the proposed framework is capable of automatically detecting and tracking infrared point targets.

  2. Unconscious Attentional Capture Effect Can be Induced by Perceptual Learning

    Directory of Open Access Journals (Sweden)

    Zhe Qu

    2011-05-01

    Full Text Available Previous ERP studies have shown that N2pc serves as an index for salient stimuli that capture attention, even if they are task irrelevant. This study aims to investigate whether nonsalient stimuli can capture attention automatically and unconsciously after perceptual learning. Adult subjects were trained with a visual search task for eight to ten sessions. The training task was to detect whether the target (triangle with one particular direction was present or not. After training, an ERP session was performed, in which subjects were required to detect the presence of either the trained triangle (i.e., the target triangle in the training sessions or an untrained triangle. Results showed that, while the untrained triangle did not elicit an N2pc effect, the trained triangle elicited a significant N2pc effect regardless of whether it was perceived correctly or not, even when it was task irrelevant. Moreover, the N2pc effect for the trained triangle was completely retained 3 months later. These results suggest that, after perceptual learning, previously unsalient stimuli become more salient and can capture attention automatically and unconsciously. Once the facilitating process of the unsalient stimulus has been built up in the brain, it can last for a long time.

  3. Attention capture by abrupt onsets: re-visiting the priority tag model

    OpenAIRE

    Meera Mary Sunny; Adrian von Mühlenen

    2013-01-01

    Abrupt onsets have been shown to strongly attract attention in a stimulus-driven, bottom-up manner. However, the precise mechanism that drives capture by onsets is still debated. According to the new object account, abrupt onsets capture attention because they signal the appearance of a new object. Yantis and Johnson (1990) used a visual search task and showed that up to four onsets can be automatically prioritized. However, in their study the number of onsets co-varied with the total number ...

  4. Automatic digitization. Experience of magnum 8000 in automatic digitization in EA; Digitalizacion automatica. Experiencias obtenidas durante la utilizacion del sistema magnus 8000 para la digitalizacion automatica en EA

    Energy Technology Data Exchange (ETDEWEB)

    Munoz Garcia, M.

    1995-12-31

    The paper describes the life cycle to be followed for the automatic digitization of files containing rasterised (scanned) images for their conversion into vector files (processable using CAD tools). The main characteristics of each of the five phases: capture, cleaning, conversion, revision and post-processing, that form part of the life cycle, are described. Lastly, the paper gives a comparative analysis of the results obtained using the automatic digitization process and other more conventional methods. (Author)

  5. A preliminary architecture for building communication software from traffic captures

    Science.gov (United States)

    Acosta, Jaime C.; Estrada, Pedro

    2017-05-01

    Security analysts are tasked with identifying and mitigating network service vulnerabilities. A common problem associated with in-depth testing of network protocols is the availability of software that communicates across disparate protocols. Many times, the software required to communicate with these services is not publicly available. Developing this software is a time-consuming undertaking that requires expertise and understanding of the protocol specification. The work described in this paper aims at developing a software package that is capable of automatically creating communication clients by using packet capture (pcap) and TShark dissectors. Currently, our focus is on simple protocols with fixed fields. The methodologies developed as part of this work will extend to other complex protocols such as the Gateway Load Balancing Protocol (GLBP), Port Aggregation Protocol (PAgP), and Open Shortest Path First (OSPF). Thus far, we have architected a modular pipeline for an automatic traffic-based software generator. We start the transformation of captured network traffic by employing TShark to convert packets into a Packet Details Markup Language (PDML) file. The PDML file contains a parsed, textual, representation of the packet data. Then, we extract field data, types, along with inter and intra-packet dependencies. This information is then utilized to construct an XML file that encompasses the protocol state machine and field vocabulary. Finally, this XML is converted into executable code. Using our methodology, and as a starting point, we have succeeded in automatically generating software that communicates with other hosts using an automatically generated Internet Control Message Protocol (ICMP) client program.
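
    The first stage of the pipeline, dissecting a capture into PDML with TShark and pulling out field names and values, might look like the sketch below; the capture filename and the chosen protocol are placeholders, and the vocabulary/state-machine extraction and code generation stages are not shown.

```python
import subprocess
import xml.etree.ElementTree as ET

def pcap_to_pdml(pcap_path):
    """Run TShark to dissect a capture into PDML (XML) text."""
    out = subprocess.run(["tshark", "-r", pcap_path, "-T", "pdml"],
                         capture_output=True, text=True, check=True)
    return out.stdout

def extract_fields(pdml_text, proto="icmp"):
    """Collect (field name, value, size) tuples for one protocol from every packet."""
    root = ET.fromstring(pdml_text)
    records = []
    for packet in root.iter("packet"):
        for proto_node in packet.iter("proto"):
            if proto_node.get("name") == proto:
                records.append([(f.get("name"), f.get("show"), f.get("size"))
                                for f in proto_node.iter("field")])
    return records

# Example with a placeholder capture file; the extracted fields would later feed
# the field-vocabulary and state-machine XML described above.
# print(extract_fields(pcap_to_pdml("ping_session.pcap"))[0][:5])
```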

  6. Energy Management for Automatic Monitoring Stations in Arctic Regions

    Science.gov (United States)

    Pimentel, Demian

    Automatic weather monitoring stations deployed in arctic regions are usually installed in hard-to-reach locations. Most of the time they run unsupervised and they face severe environmental conditions: very low temperatures, ice riming, etc. It is usual practice to use a local energy source to power the equipment. There are three main ways to achieve this: (1) a generator whose fuel has to be transported to the location at regular intervals, (2) a battery, and (3) an energy harvesting generator that exploits a local energy source. Hybrid systems are very common. Polar nights and long winters are typical of arctic regions. Solar radiation reaching the ground during this season is very low or non-existent, depending on the geographical location. Therefore, solar power generation is not very effective. One straightforward, but expensive and inefficient solution is the use of a large bank of batteries that is recharged during sunny months and discharged during the winter. The main purpose of the monitoring stations is to collect meteorological data at regular intervals; interruptions due to a lack of electrical energy can be prevented with the use of an energy management subsystem. Keeping a balance between incoming and outgoing energy flows, while assuring the continuous operation of the station, is the delicate task of energy management strategies. This doctoral thesis explores alternate power generation solutions and intelligent energy management techniques for equipment deployed in the arctic. For instance, harvesting energy from the wind to complement solar generation is studied. Nevertheless, harvested energy is a scarce resource and needs to be used efficiently. Genetic algorithms, fuzzy logic, and common sense are used to efficiently manage energy flows within a simulated arctic weather station.

  7. Automatic Keyframe Summarization of User-Generated Video

    Science.gov (United States)

    2014-06-01

    over longer periods of space and time. Additionally, the storyline may be less crafted or coherent when compared to professional cinema. As such, shot... attention in videos, whether it be their presence, location, identity, actions, or relationships to other humans. In this regard, automatic human capture... among other things. A person AOC has an identity property. Properties of an AOC that a stakeholder considers important are called POCs.

  8. Automatic Tuning of the Superheat Controller in a Refrigeration Plant

    DEFF Research Database (Denmark)

    Rasmussen, Henrik; Thybo, Claus; Larsen, Lars F. S.

    2006-01-01

    This paper proposes an automatic tuning of the superheat control in a refrigeration system using a relay method. By means of a simple evaporator model that captures the important dynamics and non-linearities of the superheat a gain-scheduling that compensates for the variation of the process gain...
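
    The relay step referred to above can be sketched as follows: the relay amplitude and the measured limit cycle give an estimate of the ultimate gain and period, from which controller settings follow. The experiment values and the Ziegler-Nichols PI rule below are illustrative, not the paper's tuning rule.

```python
import math

def ultimate_from_relay(relay_amplitude, oscillation_amplitude, oscillation_period):
    """Describing-function estimate of the ultimate gain Ku and period Tu from a
    relay experiment on the superheat loop."""
    ku = 4.0 * relay_amplitude / (math.pi * oscillation_amplitude)
    return ku, oscillation_period

def pi_from_ultimate(ku, tu):
    """Ziegler-Nichols PI rule (one of several possible tuning rules)."""
    return {"Kp": 0.45 * ku, "Ti": tu / 1.2}

# Hypothetical relay experiment: +/-5% valve steps produce a 2 K superheat
# oscillation with a 40 s period.
ku, tu = ultimate_from_relay(relay_amplitude=5.0, oscillation_amplitude=2.0,
                             oscillation_period=40.0)
print(pi_from_ultimate(ku, tu))
```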

  9. Attention capture by abrupt onsets: re-visiting the priority tag model

    Directory of Open Access Journals (Sweden)

    Meera Mary Sunny

    2013-12-01

    Full Text Available Abrupt onsets have been shown to strongly attract attention in a stimulus-driven, bottom-up manner. However, the precise mechanism that drives capture by onsets is still debated. According to the new object account, abrupt onsets capture attention because they signal the appearance of a new object. Yantis and Johnson (1990) used a visual search task and showed that up to four onsets can be automatically prioritized. However, in their study the number of onsets co-varied with the total number of items in the display, allowing for a possible confound between these two variables. In the present study, display size was fixed at eight items while the number of onsets was systematically varied between zero and eight. Experiment 1 showed a systematic increase in reactions times with increasing number of onsets. This increase was stronger when the target was an onset than when it was a no-onset item, a result that is best explained by a model according to which only one onset is automatically prioritized. Even when the onsets were marked in red (Experiment 2), nearly half of the participants continued to prioritize only one onset item. Only when onset and no-onset targets were blocked (Experiment 3), participants started to search selectively through the set of only the relevant target type. These results further support the finding that only one onset captures attention. Many bottom-up models of attention capture, like masking or saliency accounts, can efficiently explain this finding.

  10. Attention capture by abrupt onsets: re-visiting the priority tag model.

    Science.gov (United States)

    Sunny, Meera M; von Mühlenen, Adrian

    2013-01-01

    Abrupt onsets have been shown to strongly attract attention in a stimulus-driven, bottom-up manner. However, the precise mechanism that drives capture by onsets is still debated. According to the new object account, abrupt onsets capture attention because they signal the appearance of a new object. Yantis and Johnson (1990) used a visual search task and showed that up to four onsets can be automatically prioritized. However, in their study the number of onsets co-varied with the total number of items in the display, allowing for a possible confound between these two variables. In the present study, display size was fixed at eight items while the number of onsets was systematically varied between zero and eight. Experiment 1 showed a systematic increase in reactions times with increasing number of onsets. This increase was stronger when the target was an onset than when it was a no-onset item, a result that is best explained by a model according to which only one onset is automatically prioritized. Even when the onsets were marked in red (Experiment 2), nearly half of the participants continued to prioritize only one onset item. Only when onset and no-onset targets were blocked (Experiment 3), participants started to search selectively through the set of only the relevant target type. These results further support the finding that only one onset captures attention. Many bottom-up models of attention capture, like masking or saliency accounts, can efficiently explain this finding.

  11. Radar automatic target recognition (ATR) and non-cooperative target recognition (NCTR)

    CERN Document Server

    Blacknell, David

    2013-01-01

    The ability to detect and locate targets by day or night, over wide areas, regardless of weather conditions has long made radar a key sensor in many military and civil applications. However, the ability to automatically and reliably distinguish different targets represents a difficult challenge. Radar Automatic Target Recognition (ATR) and Non-Cooperative Target Recognition (NCTR) captures material presented in the NATO SET-172 lecture series to provide an overview of the state-of-the-art and continuing challenges of radar target recognition. Topics covered include the problem as applied to th

  12. Do I Have My Attention? Speed of Processing Advantages for the Self-Face Are Not Driven by Automatic Attention Capture

    Science.gov (United States)

    Keyes, Helen; Dlugokencka, Aleksandra

    2014-01-01

    We respond more quickly to our own face than to other faces, but there is debate over whether this is connected to attention-grabbing properties of the self-face. In two experiments, we investigate whether the self-face selectively captures attention, and the attentional conditions under which this might occur. In both experiments, we examined whether different types of face (self, friend, stranger) provide differential levels of distraction when processing self, friend and stranger names. In Experiment 1, an image of a distractor face appeared centrally – inside the focus of attention – behind a target name, with the faces either upright or inverted. In Experiment 2, distractor faces appeared peripherally – outside the focus of attention – in the left or right visual field, or bilaterally. In both experiments, self-name recognition was faster than other name recognition, suggesting a self-referential processing advantage. The presence of the self-face did not cause more distraction in the naming task compared to other types of face, either when presented inside (Experiment 1) or outside (Experiment 2) the focus of attention. Distractor faces had different effects across the two experiments: when presented inside the focus of attention (Experiment 1), self and friend images facilitated self and friend naming, respectively. This was not true for stranger stimuli, suggesting that faces must be robustly represented to facilitate name recognition. When presented outside the focus of attention (Experiment 2), no facilitation occurred. Instead, we report an interesting distraction effect caused by friend faces when processing strangers’ names. We interpret this as a “social importance” effect, whereby we may be tuned to pick out and pay attention to familiar friend faces in a crowd. We conclude that any speed of processing advantages observed in the self-face processing literature are not driven by automatic attention capture. PMID:25338170

  13. Invited review: The impact of automatic milking systems on dairy cow management, behavior, health, and welfare.

    Science.gov (United States)

    Jacobs, J A; Siegford, J M

    2012-05-01

    Over the last 100 yr, the dairy industry has incorporated technology to maximize yield and profit. Pressure to maximize efficiency and lower inputs has resulted in novel approaches to managing and milking dairy herds, including implementation of automatic milking systems (AMS) to reduce labor associated with milking. Although AMS have been used for almost 20 yr in Europe, they have only recently become more popular in North America. Automatic milking systems have the potential to increase milk production by up to 12%, decrease labor by as much as 18%, and simultaneously improve dairy cow welfare by allowing cows to choose when to be milked. However, producers using AMS may not fully realize these anticipated benefits for a variety of reasons. For example, producers may not see a reduction in labor because some cows do not milk voluntarily or because they have not fully or efficiently incorporated the AMS into their management routines. Following the introduction of AMS on the market in the 1990s, research has been conducted examining AMS systems versus conventional parlors focusing primarily on cow health, milk yield, and milk quality, as well as on some of the economic and social factors related to AMS adoption. Additionally, because AMS rely on cows milking themselves voluntarily, research has also been conducted on the behavior of cows in AMS facilities, with particular attention paid to cow traffic around AMS, cow use of AMS, and cows' motivation to enter the milking stall. However, the sometimes contradictory findings resulting from different studies on the same aspect of AMS suggest that differences in management and farm-level variables may be more important to AMS efficiency and milk production than features of the milking system itself. Furthermore, some of the recommendations that have been made regarding AMS facility design and management should be scientifically tested to demonstrate their validity, as not all may work as intended. As updated AMS

  14. The Role of Automatic Obesity Stereotypes in Real Hiring Discrimination

    Science.gov (United States)

    Agerstrom, Jens; Rooth, Dan-Olof

    2011-01-01

    This study examined whether automatic stereotypes captured by the implicit association test (IAT) can predict real hiring discrimination against the obese. In an unobtrusive field experiment, job applications were sent to a large number of real job vacancies. The applications were matched on credentials but differed with respect to the applicant's…

  15. Improved Techniques for Automatic Chord Recognition from Music Audio Signals

    Science.gov (United States)

    Cho, Taemin

    2014-01-01

    This thesis is concerned with the development of techniques that facilitate the effective implementation of capable automatic chord transcription from music audio signals. Since chord transcriptions can capture many important aspects of music, they are useful for a wide variety of music applications and also useful for people who learn and perform…

  16. A framework for automatic segmentation in three dimensions of microstructural tomography data

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley; Hansen, Karin Vels; Larsen, Rasmus

    2010-01-01

    Routine use of quantitative three-dimensional analysis of material microstructure by, in particular, focused ion beam (FIB) serial sectioning is generally restricted by the time-consuming task of manually delineating structures within each image slice or by the quality of manual and automatic...... segmentation schemes. We present here a framework for performing automatic segmentation of complex microstructures using a level set method. The technique is based on numerical approximations to partial differential equations to evolve a 3D surface to capture the phase boundaries. Vector fields derived from...
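
    A sketch of evolving an implicit 3D surface to capture a phase boundary, using the morphological Chan-Vese implementation in scikit-image on a synthetic volume; the test volume, iteration count and smoothing are placeholders, and this is not the authors' solver.

```python
import numpy as np
from skimage.segmentation import morphological_chan_vese

# Synthetic two-phase "FIB stack": a bright ellipsoidal particle in a darker,
# noisy matrix (a stand-in for real serial-sectioning data).
z, y, x = np.mgrid[:40, :60, :60]
volume = ((z - 20)**2 / 15**2 + (y - 30)**2 / 20**2 + (x - 30)**2 / 20**2 < 1).astype(float)
volume += 0.3 * np.random.default_rng(0).normal(size=volume.shape)

# Evolve an implicit 3D surface toward the phase boundary (region-based level set).
segmentation = morphological_chan_vese(volume, 35, init_level_set="checkerboard",
                                       smoothing=2)
print("segmented voxels:", int(segmentation.sum()))
```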

  17. Musculoskeletal Simulation Model Generation from MRI Data Sets and Motion Capture Data

    Science.gov (United States)

    Schmid, Jérôme; Sandholm, Anders; Chung, François; Thalmann, Daniel; Delingette, Hervé; Magnenat-Thalmann, Nadia

    Today computer models and computer simulations of the musculoskeletal system are widely used to study the mechanisms behind human gait and its disorders. The common way of creating musculoskeletal models is to use a generic musculoskeletal model based on data derived from anatomical and biomechanical studies of cadaverous specimens. To adapt this generic model to a specific subject, the usual approach is to scale it. This scaling has been reported to introduce several errors because it does not always account for subject-specific anatomical differences. As a result, a novel semi-automatic workflow is proposed that creates subject-specific musculoskeletal models from magnetic resonance imaging (MRI) data sets and motion capture data. Based on subject-specific medical data and a model-based automatic segmentation approach, an accurate modeling of the anatomy can be produced while avoiding the scaling operation. This anatomical model coupled with motion capture data, joint kinematics information, and muscle-tendon actuators is finally used to create a subject-specific musculoskeletal model.

  18. The use of automatic programming techniques for fault tolerant computing systems

    Science.gov (United States)

    Wild, C.

    1985-01-01

    It is conjectured that the production of software for ultra-reliable computing systems such as required by Space Station, aircraft, nuclear power plants and the like will require a high degree of automation as well as fault tolerance. In this paper, the relationship between automatic programming techniques and fault tolerant computing systems is explored. Initial efforts in the automatic synthesis of code from assertions to be used for error detection as well as the automatic generation of assertions and test cases from abstract data type specifications is outlined. Speculation on the ability to generate truly diverse designs capable of recovery from errors by exploring alternate paths in the program synthesis tree is discussed. Some initial thoughts on the use of knowledge based systems for the global detection of abnormal behavior using expectations and the goal-directed reconfiguration of resources to meet critical mission objectives are given. One of the sources of information for these systems would be the knowledge captured during the automatic programming process.

  19. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which has successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  20. Automatic recognition of touch gestures in the corpus of social touch

    NARCIS (Netherlands)

    Jung, Merel Madeleine; Poel, Mannes; Poppe, Ronald Walter; Heylen, Dirk K.J.

    For an artifact such as a robot or a virtual agent to respond appropriately to human social touch behavior, it should be able to automatically detect and recognize touch. This paper describes the data collection of CoST: Corpus of Social Touch, a data set containing 7805 captures of 14 different

  1. Automatic focusing of attention on object size and shape

    Directory of Open Access Journals (Sweden)

    Cesar Galera

    2005-01-01

    In two experiments we investigated the automatic adjusting of the attentional focus to simple geometric shapes. The participants performed a visual search task with four stimuli (the target and three distractors) presented always around the fixation point, inside an outlined frame not related to the search task. A cue informed the subject only about the possible size and shape of the frame, not about the target. The results of the first experiment showed faster target detection in the valid cue trials, suggesting that attention was captured automatically by the cue shape. In the second experiment, we introduced a flanker stimulus (compatible or incompatible with the target) in order to determine whether attentional resources spread homogeneously inside and outside the frame. The results showed that performance depended both on cue validity and frame orientation. The flanker effect depended on compatibility and flanker position (vertical or horizontal meridian). The results of both experiments suggest that the form of an irrelevant object can capture attention despite participants’ intention, and the results of the second experiment suggest that attentional resources are more concentrated along the horizontal meridian.

  2. Methodology for Automatic Ontology Generation Using Database Schema Information

    Directory of Open Access Journals (Sweden)

    JungHyen An

    2018-01-01

    An ontology is a model language that supports the functions to integrate conceptually distributed domain knowledge and infer relationships among the concepts. Ontologies are developed based on the target domain knowledge. As a result, methodologies to automatically generate an ontology from metadata that characterize the domain knowledge are becoming important. However, existing methodologies to automatically generate an ontology using metadata require the domain metadata to be supplied in a predetermined template, and it is difficult to manage data that accumulate on the ontology itself when the domain OWL (Ontology Web Language) individuals continuously increase. The database schema captures features of the domain knowledge and provides structural functions to efficiently process the knowledge-based data. In this paper, we propose a methodology to automatically generate ontologies and manage the OWL individuals through an interaction of the database and the ontology. We describe the automatic ontology generation process with an example schema and demonstrate the effectiveness of the automatically generated ontology by comparing it with existing ontologies using an ontology quality score.
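
    As a rough illustration of the schema-to-ontology idea (not the paper's methodology), the sketch below reads table and column metadata from a SQLite schema and emits OWL classes and datatype properties with rdflib; the database file and namespace are hypothetical.

      # Hedged sketch: derive a tiny OWL ontology from database schema metadata.
      # Each table becomes an owl:Class and each column an owl:DatatypeProperty.
      # Illustrative only; the paper's generation and individual-management rules are richer.
      import sqlite3
      from rdflib import Graph, Literal, Namespace, RDF, RDFS, OWL

      EX = Namespace("http://example.org/ontology#")     # hypothetical namespace
      g = Graph()
      g.bind("ex", EX)

      conn = sqlite3.connect("domain.db")                # hypothetical source database
      tables = [r[0] for r in conn.execute(
          "SELECT name FROM sqlite_master WHERE type='table'")]

      for table in tables:
          g.add((EX[table], RDF.type, OWL.Class))
          for _, col, col_type, *_ in conn.execute(f"PRAGMA table_info({table})"):
              prop = EX[f"{table}_{col}"]
              g.add((prop, RDF.type, OWL.DatatypeProperty))
              g.add((prop, RDFS.domain, EX[table]))
              g.add((prop, RDFS.comment, Literal(f"column type: {col_type}")))

      g.serialize("generated.owl", format="xml")         # write the generated ontology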

  3. [Development of automatic urine monitoring system].

    Science.gov (United States)

    Wei, Liang; Li, Yongqin; Chen, Bihua

    2014-03-01

    An automatic urine monitoring system is presented to replace manual operation. The system is composed of a flow sensor, an MSP430f149 single-chip microcomputer, a human-computer interaction module, an LCD module, a clock module and a memory module. The signal of urine volume is captured when the urine flows through the flow sensor and is then displayed on the LCD after data processing. The experimental results suggest that the design of the monitor provides high stability, accurate measurement and good real-time performance, and meets the demands of clinical application.

  4. Tele-healthcare for diabetes management: A low cost automatic approach.

    Science.gov (United States)

    Benaissa, M; Malik, B; Kanakis, A; Wright, N P

    2012-01-01

    In this paper, a telemedicine system for the better management and care of diabetic patients is presented. The system is an end-to-end solution which relies on the integration of a front end (patient unit) and a back-end web server. A key feature of the developed system is its very low-cost automated approach. The front end of the system is capable of reading glucose measurements from any glucose meter and sending them automatically via existing networks to the back-end server. The back end is designed and developed using an n-tier web client architecture based on the model-view-controller design pattern using open-source technology, a cost-effective solution. The back end helps the health-care provider with data analysis, data visualization and decision support, and allows them to send feedback and therapeutic advice to patients from anywhere using a browser-enabled device. This system will be evaluated during trials which will be conducted in collaboration with a local hospital in a phased manner.

  5. Automatic management system for dose parameters in interventional radiology and cardiology

    International Nuclear Information System (INIS)

    Ten, J. I.; Fernandez, J. M.; Vano, E.

    2011-01-01

    The purpose of this work was to develop an automatic management system to archive and analyse the major study parameters and patient doses for fluoroscopy guided procedures performed in cardiology and interventional radiology systems. The X-ray systems used for this trial have the capability to export at the end of the procedure and via e-mail the technical parameters of the study and the patient dose values. An application was developed to query and retrieve from a mail server, all study reports sent by the imaging modality and store them on a Microsoft SQL Server data base. The results from 3538 interventional study reports generated by 7 interventional systems were processed. In the case of some technical parameters and patient doses, alarms were added to receive malfunction alerts so as to immediately take appropriate corrective actions. (authors)

  6. Automatic management system for dose parameters in interventional radiology and cardiology.

    Science.gov (United States)

    Ten, J I; Fernandez, J M; Vaño, E

    2011-09-01

    The purpose of this work was to develop an automatic management system to archive and analyse the major study parameters and patient doses for fluoroscopy guided procedures performed in cardiology and interventional radiology systems. The X-ray systems used for this trial have the capability to export at the end of the procedure and via e-mail the technical parameters of the study and the patient dose values. An application was developed to query and retrieve from a mail server, all study reports sent by the imaging modality and store them on a Microsoft SQL Server data base. The results from 3538 interventional study reports generated by 7 interventional systems were processed. In the case of some technical parameters and patient doses, alarms were added to receive malfunction alerts so as to immediately take appropriate corrective actions.
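
    A minimal sketch of the same pattern, assuming single-part plain-text report mails and a hypothetical field layout; sqlite3 stands in for the Microsoft SQL Server database used by the original system, and the mail server, credentials and alarm threshold are assumptions.

      # Hedged sketch: poll a mailbox for study reports exported by the X-ray systems,
      # parse a couple of fields and archive them, raising a simple alarm on high dose.
      import imaplib, email, sqlite3

      def parse_report(text):
          """Tiny stand-in parser; real exported reports are structured differently."""
          fields = dict(line.split(":", 1) for line in text.splitlines() if ":" in line)
          return fields.get("System", "unknown").strip(), float(fields.get("KAP", "0"))

      db = sqlite3.connect("dose_registry.db")
      db.execute("""CREATE TABLE IF NOT EXISTS reports
                    (uid TEXT PRIMARY KEY, system TEXT, kap_gy_cm2 REAL)""")

      box = imaplib.IMAP4_SSL("mail.example.org")        # hypothetical mail server
      box.login("dose-archive", "secret")
      box.select("INBOX")
      _, uids = box.search(None, "UNSEEN")

      for uid in uids[0].split():
          uid = uid.decode()
          _, data = box.fetch(uid, "(RFC822)")
          msg = email.message_from_bytes(data[0][1])
          if msg.is_multipart():                         # sketch assumes plain-text reports
              continue
          system, kap = parse_report(msg.get_payload(decode=True).decode(errors="replace"))
          db.execute("INSERT OR IGNORE INTO reports VALUES (?, ?, ?)", (uid, system, kap))
          if kap > 500:                                  # example alarm threshold (assumption)
              print("ALERT: high kerma-area product reported by", system)

      db.commit()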

  7. Improve data integration performance by employing metadata management utility

    Energy Technology Data Exchange (ETDEWEB)

    Wei, M.; Sung, A.H. [New Mexico Petroleum Recovery Research Center, Socorro, NM (United States)

    2005-07-01

    This conference paper explored ways of integrating petroleum and exploration data obtained from different sources in order to provide more comprehensive data for various analysis purposes and to improve the integrity and consistency of the integrated data. This paper proposes a methodology to enhance oil and gas industry data integration performance by employing data management utilities in the Microsoft SQL Server database management system (DBMS) for small-scale data integration without the support of commercial software. By semi-automatically capturing metadata, data sources are investigated in detail, data quality problems are partially cleansed, and the performance of data integration is improved. 20 refs., 7 tabs., 1 fig.

  8. Atmospheric Radiation Measurement's Data Management Facility captures metadata and uses visualization tools to assist in routine data management.

    Science.gov (United States)

    Keck, N. N.; Macduff, M.; Martin, T.

    2017-12-01

    The Atmospheric Radiation Measurement's (ARM) Data Management Facility (DMF) plays a critical support role in processing and curating data generated by the Department of Energy's ARM Program. Data are collected in near real time from hundreds of observational instruments spread out all over the globe. Data are then ingested hourly to provide time series data in NetCDF (network Common Data Format) and include standardized metadata. Based on automated processes and a variety of user reviews, the data may need to be reprocessed. Final data sets are then stored and accessed by users through the ARM Archive. Over the course of 20 years, a suite of data visualization tools has been developed to facilitate the operational processes to manage and maintain the more than 18,000 real-time events that move 1.3 TB of data each day through the various stages of the DMF's data system. This poster will present the resources and methodology used to capture metadata and the tools that assist in routine data management and discoverability.

  9. [The maintenance of automatic analysers and associated documentation].

    Science.gov (United States)

    Adjidé, V; Fournier, P; Vassault, A

    2010-12-01

    The maintenance of automatic analysers and the associated documentation, which form part of the requirements of the ISO 15189 standard as well as of French regulations, have to be defined in the laboratory policy. The management of periodic maintenance and its documentation shall be implemented and fulfilled. Corrective maintenance has to be organised so as to avoid interrupting the work of the laboratory. The recommendations concern the identification of equipment, including automatic analysers, the environmental conditions to take into account, the documentation provided by the manufacturer, and the documents prepared by the laboratory, including maintenance procedures.

  10. Video Capture of Plastic Surgery Procedures Using the GoPro HERO 3+

    Directory of Open Access Journals (Sweden)

    Steven Nicholas Graves, MA

    2015-02-01

    Conclusions: The GoPro HERO 3+ Black Edition camera enables high-quality, cost-effective video recording of plastic and reconstructive surgery procedures. When set to a narrow field of view and automatic white balance, the camera is able to sufficiently compensate for the contrasting light environment of the operating room and capture high-resolution, detailed video.

  11. Kajian Pendekatan Binary Log dalam Change Data Capture

    Directory of Open Access Journals (Sweden)

    Muhammad Febrian Rachmadhan Amri

    2017-08-01

    In the online business era, transactions occur so quickly that the information stored in the data warehouse rapidly becomes stale. Companies therefore need a robust, real-time system that can load data into a repository residing on different hosts in near real time. A data warehouse is a repository of data that is subject-oriented, integrated, time-variant, and non-volatile. A data warehouse can be given real-time management capabilities by exploiting Change Data Capture. Change Data Capture (CDC) is a technique that can be used to build real-time data warehousing (RTDW). The binary log approach to change data capture records every data manipulation activity that occurs at the OLTP level and processes it before it is stored into the data warehouse (the loading process). This improves the quality of data management and yields correct information, because the available information is always up to date. Testing shows that the binary log approach to Change Data Capture (BinlogCDC) is able to provide real-time data management, valid current information, dynamic communication between systems, and data management without losing any information from data manipulation.
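
    The sketch below illustrates the binary-log idea with the python-mysql-replication package rather than the paper's own BinlogCDC implementation; the connection settings, server id and the downstream handling are assumptions.

      # Hedged sketch of the binary-log approach: tail the OLTP server's binlog and
      # forward each row-level change toward the warehouse loading process.
      from pymysqlreplication import BinLogStreamReader
      from pymysqlreplication.row_event import (
          WriteRowsEvent, UpdateRowsEvent, DeleteRowsEvent)

      MYSQL = {"host": "oltp.example.org", "port": 3306,   # hypothetical OLTP server
               "user": "repl", "passwd": "secret"}

      stream = BinLogStreamReader(
          connection_settings=MYSQL,
          server_id=101,                       # unique replica id (assumption)
          blocking=True,
          resume_stream=True,
          only_events=[WriteRowsEvent, UpdateRowsEvent, DeleteRowsEvent])

      for event in stream:
          for row in event.rows:
              change = {"table": event.table,
                        "type": type(event).__name__,
                        "values": row.get("values") or row.get("after_values")}
              # A real system would hand this record to the warehouse loading process.
              print(change)

      stream.close()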

  12. Automation of a Beckman liquid scintillation counter for data capture and data-base management

    International Nuclear Information System (INIS)

    Neil, W.; Irwin, T.J.; Yang, J.J.

    1988-01-01

    A software package for the automation of a Beckman LS9000 liquid scintillation counter is presented. The package provides effective on-line data capture (with a Perkin Elmer 3230 32-bit minicomputer), data-base management, audit trail and archiving facilities. Key features of the package are rapid and flexible data entry, background subtraction, half-life correction, ability to queue several sample sets pending scintillation counting, and formatted report generation. A brief discussion is given on the development of customized data processing programs. (author)

  13. AuTom: a novel automatic platform for electron tomography reconstruction

    KAUST Repository

    Han, Renmin

    2017-07-26

    We have developed a software package towards automatic electron tomography (ET): Automatic Tomography (AuTom). The presented package has the following characteristics: accurate alignment modules for marker-free datasets containing substantial biological structures; fully automatic alignment modules for datasets with fiducial markers; wide coverage of reconstruction methods including a new iterative method based on the compressed-sensing theory that suppresses the “missing wedge” effect; and multi-platform acceleration solutions that support faster iterative algebraic reconstruction. AuTom aims to achieve fully automatic alignment and reconstruction for electron tomography and has already been successful for a variety of datasets. AuTom also offers user-friendly interface and auxiliary designs for file management and workflow management, in which fiducial marker-based datasets and marker-free datasets are addressed with totally different subprocesses. With all of these features, AuTom can serve as a convenient and effective tool for processing in electron tomography.

  14. FY 1983 annual report on the research and development of automatic sewing systems. System management/control techniques; 1983 nendo jido hosei system no kenkyu kaihatsu seika hokokusho. System kanri seigyo gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1984-03-01

    The automatic sewing system technique research association has been commissioned by the Agency of Industrial Science and Technology for (research and development of automatic sewing systems). This program covers R and D of the elementary techniques for total systems and sewing preparation/processing, sewing/assembling, cloth handling, and system management/control. This report describes the results of the R and D efforts for the system management/control techniques. The program for the system management/control techniques involves, first of all, the basic designs for the overall system management, centered by the optimum process configuration and control for reducing lead time of an automatic sewing plant by at least 50% from the current level, based on the production schedules of an apparel maker. The basic designs are then extended to cover examination of defective products at each step, examination and failure diagnosis for prediction, detection and exchange of failed machine parts, systemisation of necessary information to be provided for automatic operation of a sewing plant and providing control-related information, including selection of information media, and information recognition by processing images of cloth and surface conditions/shapes of machine parts. (NEDO)

  15. Database Capture of Natural Language Echocardiographic Reports: A Unified Medical Language System Approach

    OpenAIRE

    Canfield, K.; Bray, B.; Huff, S.; Warner, H.

    1989-01-01

    We describe a prototype system for semi-automatic database capture of free-text echocardiography reports. The system is very simple and uses a Unified Medical Language System compatible architecture. We use this system and a large body of texts to create a patient database and develop a comprehensive hierarchical dictionary for echocardiography.

  16. Study of structure of marine specialist activity in an ergative system on monitoring and managing automatic control parameters of safe navigation process

    Directory of Open Access Journals (Sweden)

    Kholichev S. N.

    2016-12-01

    For the first time in the theory of ergative systems, a study has been conducted of the common structural features and dynamics of the tuning circuit of a technical object that performs automatic adjustment of safe navigation parameters. The structure and functioning of an ergative system that includes an automatic control system for safe navigation conditions have been investigated. The function for selecting the signals that reconfigure the optimal control law of this system is given, and a sequence of marine specialist activities that allows the navigation safety problem to be solved is composed. The ergative system retargeted by the ship specialist has a two-tier hierarchy. The first level is the automatic control of the safe navigation parameter, and the second is the reconfiguration level, where the ship specialist changes the parameters of the regulation law. This two-level hierarchical representation of the ergative system for managing safe navigation settings makes it possible to treat reconfiguration at the regulation level as a ship specialist activity aimed at reducing uncertainty in the environment in which this layer operates. Such a reduction can be achieved by acting from the upper level, informed by the ship specialist's understanding of the regulation of the vessel's safe navigation parameters, on the lower level – the level of direct automatic control of the safe navigation parameter. From the study of the ship specialist's activities in the ergative system for monitoring and managing the automatic control parameters of the safe navigation process, it has been found that the main task of the ship specialist operating within the ergative system ensuring navigation safety is to monitor the input and output of the automatic control system and to decide on the choice of reconfiguration laws for the regulating signal on the basis of information about deviations and the

  17. Genetic Programming for Automatic Hydrological Modelling

    Science.gov (United States)

    Chadalawada, Jayashree; Babovic, Vladan

    2017-04-01

    One of the recent challenges for the hydrologic research community is the need for the development of coupled systems that involve the integration of hydrologic, atmospheric and socio-economic relationships. This poses a requirement for novel modelling frameworks that can accurately represent complex systems, given the limited understanding of underlying processes, increasing volumes of data and high levels of uncertainty. Each of the existing hydrological models varies in terms of conceptualization and process representation and is best suited to capture the environmental dynamics of a particular hydrological system. Data-driven approaches can be used in the integration of alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in the implementation of an integrated modelling framework that is influenced by prior understanding and data include the choice of the technique for the induction of knowledge from data, identification of alternative structural hypotheses, definition of rules and constraints for meaningful, intelligent combination of model component hypotheses, and definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs against a wide range of objective functions and evolves accurate and parsimonious models that capture dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using modelling decisions inspired by the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow-duration-curve based performance metrics. The collaboration between data-driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach
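
    As a toy stand-in for the GP framework described above, the sketch below randomly combines simple linear-reservoir components into alternative model structures and keeps the one with the best Nash-Sutcliffe efficiency; the forcing data, components and search strategy are all illustrative assumptions, not the authors' setup.

      # Hedged sketch: structural search over conceptual model components, scored by NSE.
      # A real GP would add crossover/mutation over tree-structured model configurations.
      import random
      import numpy as np

      rain = np.random.gamma(0.3, 8.0, 365)                        # synthetic forcing (stand-in)
      q_obs = np.convolve(rain, [0.2, 0.3, 0.2], "same") * 0.5     # synthetic "observed" flow

      def linear_reservoir(p, k):
          """Single linear store: inflow p, outflow S/k, explicit daily step."""
          s, q = 0.0, np.zeros_like(p)
          for t, pt in enumerate(p):
              s += pt
              q[t] = s / k
              s -= q[t]
          return q

      def nse(sim, obs):
          return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

      best = (-np.inf, None)
      for _ in range(200):                            # random structural/parameter search
          n_stores = random.choice([1, 2, 3])         # "model structure" decision
          ks = [random.uniform(2, 50) for _ in range(n_stores)]
          sim = rain
          for k in ks:                                # stores connected in series
              sim = linear_reservoir(sim, k)
          score = nse(sim, q_obs)
          if score > best[0]:
              best = (score, ks)

      print("best NSE %.3f with stores k=%s" % best)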

  18. Automatic River Network Extraction from LIDAR Data

    Science.gov (United States)

    Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.

    2016-06-01

    The National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to obtain an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated, a methodology was developed to automate each production phase (generation of hydrological terrain models with a 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network)), and finally the production was launched. The key points of this work have been managing a big data environment of more than 160,000 LiDAR data files; the infrastructure to store (up to 40 TB between results and intermediate files) and process the data, using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be obtained within 6 months; the stability of the software (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri); and finally, the management of human resources. The result of this production has been an accurate automatic river network extraction for the whole country with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.

  19. AUTOMATIC RIVER NETWORK EXTRACTION FROM LIDAR DATA

    Directory of Open Access Journals (Sweden)

    E. N. Maderal

    2016-06-01

    The National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to obtain an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated, a methodology was developed to automate each production phase (generation of hydrological terrain models with a 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network)), and finally the production was launched. The key points of this work have been managing a big data environment of more than 160,000 LiDAR data files; the infrastructure to store (up to 40 TB between results and intermediate files) and process the data, using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be obtained within 6 months; the stability of the software (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri); and finally, the management of human resources. The result of this production has been an accurate automatic river network extraction for the whole country with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.
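
    A toy numpy illustration of the hydrological criterion mentioned in the two records above (D8 flow accumulation on a small synthetic grid); the production workflow relies on national 2 m LiDAR-derived terrain models and commercial tooling, so this is only a sketch of the principle.

      # Hedged sketch: D8 flow accumulation on a tiny synthetic DEM.
      # High accumulation values trace the extracted channel network.
      import numpy as np

      dem = np.array([[9., 8., 7.],
                      [8., 6., 5.],
                      [7., 5., 3.]])                     # tiny synthetic elevation grid
      rows, cols = dem.shape
      acc = np.ones_like(dem)                            # each cell contributes itself

      # Visit cells from highest to lowest so upstream area is complete before use.
      order = np.argsort(dem, axis=None)[::-1]
      neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

      for flat in order:
          r, c = divmod(int(flat), cols)
          best, target = 0.0, None
          for dr, dc in neighbours:                      # steepest-descent (D8) receiver
              rr, cc = r + dr, c + dc
              if 0 <= rr < rows and 0 <= cc < cols:
                  drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                  if drop > best:
                      best, target = drop, (rr, cc)
          if target is not None:
              acc[target] += acc[r, c]

      print(acc)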

  20. Vital Recorder-a free research tool for automatic recording of high-resolution time-synchronised physiological data from multiple anaesthesia devices.

    Science.gov (United States)

    Lee, Hyung-Chul; Jung, Chul-Woo

    2018-01-24

    The current anaesthesia information management system (AIMS) has limited capability for the acquisition of high-quality vital signs data. We have developed a Vital Recorder program to overcome the disadvantages of AIMS and to support research. Physiological data of surgical patients were collected from 10 operating rooms using the Vital Recorder. The basic equipment used were a patient monitor, the anaesthesia machine, and the bispectral index (BIS) monitor. Infusion pumps, cardiac output monitors, regional oximeter, and rapid infusion device were added as required. The automatic recording option was used exclusively and the status of recording was frequently checked through web monitoring. Automatic recording was successful in 98.5% (4,272/4,335) cases during eight months of operation. The total recorded time was 13,489 h (3.2 ± 1.9 h/case). The Vital Recorder's automatic recording and remote monitoring capabilities enabled us to record physiological big data with minimal effort. The Vital Recorder also provided time-synchronised data captured from a variety of devices to facilitate an integrated analysis of vital signs data. The free distribution of the Vital Recorder is expected to improve data access for researchers attempting physiological data studies and to eliminate inequalities in research opportunities due to differences in data collection capabilities.

  1. Automatic Generation of Facial Expression Using Triangular Geometric Deformation

    OpenAIRE

    Jia-Shing Sheu; Tsu-Shien Hsieh; Ho-Nien Shou

    2014-01-01

    This paper presents an image deformation algorithm and constructs an automatic facial expression generation system to generate new facial expressions in neutral state. After the users input the face image in a neutral state into the system, the system separates the possible facial areas and the image background by skin color segmentation. It then uses the morphological operation to remove noise and to capture the organs of facial expression, such as the eyes, mouth, eyebrow, and nose. The fea...

  2. Transient flow conditions in probabilistic wellhead protection: importance and ways to manage spatial and temporal uncertainty in capture zone delineation

    Science.gov (United States)

    Enzenhoefer, R.; Rodriguez-Pretelin, A.; Nowak, W.

    2012-12-01

    "From an engineering standpoint, the quantification of uncertainty is extremely important not only because it allows estimating risk but mostly because it allows taking optimal decisions in an uncertain framework" (Renard, 2007). The most common way to account for uncertainty in the field of subsurface hydrology and wellhead protection is to randomize spatial parameters, e.g. the log-hydraulic conductivity or porosity. This enables water managers to take robust decisions in delineating wellhead protection zones with rationally chosen safety margins in the spirit of probabilistic risk management. Probabilistic wellhead protection zones are commonly based on steady-state flow fields. However, several past studies showed that transient flow conditions may substantially influence the shape and extent of catchments. Therefore, we believe they should be accounted for in the probabilistic assessment and in the delineation process. The aim of our work is to show the significance of flow transients and to investigate the interplay between spatial uncertainty and flow transients in wellhead protection zone delineation. To this end, we advance our concept of probabilistic capture zone delineation (Enzenhoefer et al., 2012) that works with capture probabilities and other probabilistic criteria for delineation. The extended framework is able to evaluate the time fraction that any point on a map falls within a capture zone. In short, we separate capture probabilities into spatial/statistical and time-related frequencies. This will provide water managers additional information on how to manage a well catchment in the light of possible hazard conditions close to the capture boundary under uncertain and time-variable flow conditions. In order to save computational costs, we take advantage of super-positioned flow components with time-variable coefficients. We assume an instantaneous development of steady-state flow conditions after each temporal change in driving forces, following

  3. Automatic real-time detection of endoscopic procedures using temporal features.

    Science.gov (United States)

    Stanek, Sean R; Tavanapong, Wallapak; Wong, Johnny; Oh, Jung Hwan; de Groen, Piet C

    2012-11-01

    Endoscopy is used for inspection of the inner surface of organs such as the colon. During endoscopic inspection of the colon, or colonoscopy, a tiny video camera generates a video signal, which is displayed on a monitor for interpretation in real-time by physicians. In practice, these images are not typically captured, which may be attributed to the lack of fully automated tools for capturing these images, analyzing important content, and quickly and easily retrieving that content. This paper presents the description and evaluation results of our novel software that uses new metrics based on image color and motion over time to automatically record all images of an individual endoscopic procedure into a single digitized video file. The software automatically discards out-patient video frames between different endoscopic procedures. We validated our software system on 2464 h of live video (over 265 million frames) from endoscopy units where colonoscopy and upper endoscopy were performed. Our previous classification method achieved a frame-based sensitivity of 100.00%, but only a specificity of 89.22%. Our new method achieved a frame-based sensitivity and specificity of 99.90% and 99.97%, a significant improvement. Our system is robust for day-to-day use in medical practice. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
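
    A hedged sketch of the general idea (not the authors' metrics): classify each frame from simple colour and inter-frame motion statistics with OpenCV; the input file and thresholds are assumptions.

      # Hedged sketch: label each video frame as "inside patient" or not from
      # a colour cue (dominant reddish hue) and a temporal cue (frame difference).
      import cv2
      import numpy as np

      cap = cv2.VideoCapture("endoscopy_feed.avi")       # hypothetical input video
      prev_gray, inside = None, []

      while True:
          ok, frame = cap.read()
          if not ok:
              break
          hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
          reddish = float(np.mean((hsv[..., 0] < 20) | (hsv[..., 0] > 160)))   # colour cue
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          motion = 0.0 if prev_gray is None else float(np.mean(cv2.absdiff(gray, prev_gray)))
          prev_gray = gray
          inside.append(reddish > 0.4 and motion > 2.0)   # toy decision rule (assumption)

      cap.release()
      print(f"{sum(inside)} of {len(inside)} frames classified as in-patient")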

  4. Will the energy crisis put an end to development of automatic gearboxes

    Energy Technology Data Exchange (ETDEWEB)

    Ward, D

    1979-12-13

    The power-absorbing automatic gearbox may disappear as fuel costs rise. Electronic controls will be at the center of improvements being developed for possible acceptable small-car automatic gearboxes. A total engine management package being developed by Lucas involves extending the microprocessor, which with current technology can control both the fuel injection system and the electronic ignition of a car engine, so that it also controls the automatic gearbox.

  5. Sustainable groundwater use, the capture principle, and adaptive ...

    African Journals Online (AJOL)

    Implications for using the capture principle in the implementation of the NWA are discussed, and adaptive management is proposed as an appropriate management approach. Implications for groundwater monitoring are also discussed. Case studies are described that support the need for adaptive management and the ...

  6. Quantitative analysis of the patellofemoral motion pattern using semi-automatic processing of 4D CT data.

    Science.gov (United States)

    Forsberg, Daniel; Lindblom, Maria; Quick, Petter; Gauffin, Håkan

    2016-09-01

    To present a semi-automatic method with minimal user interaction for quantitative analysis of the patellofemoral motion pattern. 4D CT data capturing the patellofemoral motion pattern of a continuous flexion and extension were collected for five patients prone to patellar luxation both pre- and post-surgically. For the proposed method, an observer would place landmarks in a single 3D volume, which then are automatically propagated to the other volumes in a time sequence. From the landmarks in each volume, the measures patellar displacement, patellar tilt and angle between femur and tibia were computed. Evaluation of the observer variability showed the proposed semi-automatic method to be favorable over a fully manual counterpart, with an observer variability of approximately 1.5° for the angle between femur and tibia, 1.5 mm for the patellar displacement, and 4.0°-5.0° for the patellar tilt. The proposed method showed that surgery reduced the patellar displacement and tilt at maximum extension by approximately 10-15 mm and 15°-20° for three patients but with less evident differences for two of the patients. A semi-automatic method suitable for quantification of the patellofemoral motion pattern as captured by 4D CT data has been presented. Its observer variability is on par with that of other methods but with the distinct advantage to support continuous motions during the image acquisition.
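
    Once landmarks are propagated to each time point, the reported measures reduce to simple vector geometry; the sketch below shows one plausible computation with entirely hypothetical landmark coordinates.

      # Hedged sketch: derive an angle and a displacement from propagated landmarks.
      # Landmark names and coordinates are hypothetical, not patient data.
      import numpy as np

      def angle_deg(v1, v2):
          cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
          return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

      # landmarks for one 3D volume of the 4D CT sequence (mm, scanner space)
      femur_axis   = np.array([0.0, 5.0, 95.0])     # distal-to-proximal femur direction
      tibia_axis   = np.array([0.0, -3.0, -90.0])   # proximal-to-distal tibia direction
      patella_ctr  = np.array([12.0, 40.0, 10.0])
      trochlea_ctr = np.array([2.0, 38.0, 8.0])

      flexion_angle = 180.0 - angle_deg(femur_axis, tibia_axis)   # femur-tibia angle
      displacement = np.linalg.norm(patella_ctr - trochlea_ctr)   # patellar displacement
      print(f"femur-tibia angle {flexion_angle:.1f} deg, displacement {displacement:.1f} mm")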

  7. Heightened attentional capture by visual food stimuli in anorexia nervosa.

    Science.gov (United States)

    Neimeijer, Renate A M; Roefs, Anne; de Jong, Peter J

    2017-08-01

    The present study was designed to test the hypothesis that anorexia nervosa (AN) patients are relatively insensitive to the attentional capture of visual food stimuli. Attentional avoidance of food might help AN patients to prevent more elaborate processing of food stimuli and the subsequent generation of craving, which might enable AN patients to maintain their strict diet. Participants were 66 restrictive AN spectrum patients and 55 healthy controls. A single-target rapid serial visual presentation task was used with food and disorder-neutral cues as critical distracter stimuli and disorder-neutral pictures as target stimuli. AN spectrum patients showed diminished task performance when visual food cues were presented in close temporal proximity of the to-be-identified target. In contrast to our hypothesis, results indicate that food cues automatically capture AN spectrum patients' attention. One explanation could be that the enhanced attentional capture of food cues in AN is driven by the relatively high threat value of food items in AN. Implications and suggestions for future research are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Controlled versus automatic processes: which is dominant to safety? The moderating effect of inhibitory control.

    Directory of Open Access Journals (Sweden)

    Yaoshan Xu

    This study explores the precursors of employees' safety behaviors based on a dual-process model, which suggests that human behaviors are determined by both controlled and automatic cognitive processes. Employees' responses to a self-reported survey on safety attitudes capture their controlled cognitive process, while the automatic association concerning safety measured by an Implicit Association Test (IAT) reflects employees' automatic cognitive processes about safety. In addition, this study investigates the moderating effects of inhibition on the relationship between self-reported safety attitude and safety behavior, and that between automatic associations towards safety and safety behavior. The results suggest significant main effects of self-reported safety attitude and automatic association on safety behaviors. Further, the interaction between self-reported safety attitude and inhibition and that between automatic association and inhibition each predict unique variances in safety behavior. Specifically, the safety behaviors of employees with lower level of inhibitory control are influenced more by automatic association, whereas those of employees with higher level of inhibitory control are guided more by self-reported safety attitudes. These results suggest that safety behavior is the joint outcome of both controlled and automatic cognitive processes, and the relative importance of these cognitive processes depends on employees' individual differences in inhibitory control. The implications of these findings for theoretical and practical issues are discussed at the end.

  9. Evaluation of bias associated with capture maps derived from nonlinear groundwater flow models

    Science.gov (United States)

    Nadler, Cara; Allander, Kip K.; Pohll, Greg; Morway, Eric D.; Naranjo, Ramon C.; Huntington, Justin

    2018-01-01

    The impact of groundwater withdrawal on surface water is a concern of water users and water managers, particularly in the arid western United States. Capture maps are useful tools to spatially assess the impact of groundwater pumping on water sources (e.g., streamflow depletion) and are being used more frequently for conjunctive management of surface water and groundwater. Capture maps have been derived using linear groundwater flow models and rely on the principle of superposition to demonstrate the effects of pumping in various locations on resources of interest. However, nonlinear models are often necessary to simulate head-dependent boundary conditions and unconfined aquifers. Capture maps developed using nonlinear models with the principle of superposition may over- or underestimate capture magnitude and spatial extent. This paper presents new methods for generating capture difference maps, which assess spatial effects of model nonlinearity on capture fraction sensitivity to pumping rate, and for calculating the bias associated with capture maps. The sensitivity of capture map bias to selected parameters related to model design and conceptualization for the arid western United States is explored. This study finds that the simulation of stream continuity, pumping rates, stream incision, well proximity to capture sources, aquifer hydraulic conductivity, and groundwater evapotranspiration extinction depth substantially affect capture map bias. Capture difference maps demonstrate that regions with large capture fraction differences are indicative of greater potential capture map bias. Understanding both spatial and temporal bias in capture maps derived from nonlinear groundwater flow models improves their utility and defensibility as conjunctive-use management tools.

  10. Video Capture of Plastic Surgery Procedures Using the GoPro HERO 3+.

    Science.gov (United States)

    Graves, Steven Nicholas; Shenaq, Deana Saleh; Langerman, Alexander J; Song, David H

    2015-02-01

    Significant improvements can be made in recording surgical procedures, particularly in capturing high-quality video recordings from the surgeons' point of view. This study examined the utility of the GoPro HERO 3+ Black Edition camera for high-definition, point-of-view recordings of plastic and reconstructive surgery. The GoPro HERO 3+ Black Edition camera was head-mounted on the surgeon and oriented to the surgeon's perspective using the GoPro App. The camera was used to record 4 cases: 2 fat graft procedures and 2 breast reconstructions. During cases 1-3, an assistant remotely controlled the GoPro via the GoPro App. For case 4, the GoPro was linked to a WiFi remote and controlled by the surgeon. Camera settings for case 1 were as follows: 1080p video resolution; 48 fps; Protune mode on; wide field of view; 16:9 aspect ratio. The lighting contrast due to the overhead lights resulted in limited washout of the video image. Camera settings were adjusted for cases 2-4 to a narrow field of view, which enabled the camera's automatic white balance to better compensate for bright lights focused on the surgical field. Cases 2-4 captured video sufficient for teaching or presentation purposes. The GoPro HERO 3+ Black Edition camera enables high-quality, cost-effective video recording of plastic and reconstructive surgery procedures. When set to a narrow field of view and automatic white balance, the camera is able to sufficiently compensate for the contrasting light environment of the operating room and capture high-resolution, detailed video.

  11. Customized recommendations for production management clusters of North American automatic milking systems.

    Science.gov (United States)

    Tremblay, Marlène; Hess, Justin P; Christenson, Brock M; McIntyre, Kolby K; Smink, Ben; van der Kamp, Arjen J; de Jong, Lisanne G; Döpfer, Dörte

    2016-07-01

    Automatic milking systems (AMS) are implemented in a variety of situations and environments. Consequently, there is a need to characterize individual farming practices and regional challenges to streamline management advice and objectives for producers. Benchmarking is often used in the dairy industry to compare farms by computing percentile ranks of the production values of groups of farms. Grouping for conventional benchmarking is commonly limited to the use of a few factors such as farms' geographic region or breed of cattle. We hypothesized that herds' production data and management information could be clustered in a meaningful way using cluster analysis and that this clustering approach would yield better peer groups of farms than benchmarking methods based on criteria such as country, region, breed, or breed and region. By applying mixed latent-class model-based cluster analysis to 529 North American AMS dairy farms with respect to 18 significant risk factors, 6 clusters were identified. Each cluster (i.e., peer group) represented unique management styles, challenges, and production patterns. When compared with peer groups based on criteria similar to the conventional benchmarking standards, the 6 clusters better predicted milk produced (kilograms) per robot per day. Each cluster represented a unique management and production pattern that requires specialized advice. For example, cluster 1 farms were those that recently installed AMS robots, whereas cluster 3 farms (the most northern farms) fed high amounts of concentrates through the robot to compensate for low-energy feed in the bunk. In addition to general recommendations for farms within a cluster, individual farms can generate their own specific goals by comparing themselves to farms within their cluster. This is very comparable to benchmarking but adds the specific characteristics of the peer group, resulting in better farm management advice. The improvement that cluster analysis allows for is
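
    As a stand-in for the mixed latent-class clustering used in the paper, the sketch below fits a Gaussian mixture with six components to standardised herd-level risk factors and summarises a benchmark per peer group; the file and column names are hypothetical.

      # Hedged sketch: data-driven peer groups for AMS farms via a Gaussian mixture.
      # The paper's mixed latent-class model is richer; this only illustrates the idea.
      import pandas as pd
      from sklearn.preprocessing import StandardScaler
      from sklearn.mixture import GaussianMixture

      farms = pd.read_csv("ams_farms.csv")               # one row per farm, 18 risk factors (hypothetical)
      features = farms.drop(columns=["farm_id", "milk_kg_per_robot_day"])
      X = StandardScaler().fit_transform(features)

      gmm = GaussianMixture(n_components=6, covariance_type="full", random_state=0)
      farms["cluster"] = gmm.fit_predict(X)

      # Benchmark the production outcome within each peer group rather than nationally.
      print(farms.groupby("cluster")["milk_kg_per_robot_day"].describe())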

  12. AN INVESTIGATION OF AUTOMATIC CHANGE DETECTION FOR TOPOGRAPHIC MAP UPDATING

    Directory of Open Access Journals (Sweden)

    P. Duncan

    2012-08-01

    Changes to the landscape are constantly occurring and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured, so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting changes and capturing these changes. These manual methods are time consuming and labour intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to do an investigation into a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The method investigated for detecting changes is through image classification as well as spatial analysis and is focussed on urban landscapes. The major data input into this study is high resolution aerial imagery and existing topographic vector data. Initial results indicate the traditional pixel-based image classification approaches are unsatisfactory for large scale land-use mapping and that object-orientated approaches hold more promise. Even in the instance of object-oriented image classification, generalization of techniques on a broad scale has provided inconsistent results. A solution may lie with a hybrid approach of pixel and object-oriented techniques.

  13. Toolbox for Research, or how to facilitate a central data management in small-scale research projects.

    Science.gov (United States)

    Bialke, Martin; Rau, Henriette; Thamm, Oliver C; Schuldt, Ronny; Penndorf, Peter; Blumentritt, Arne; Gött, Robert; Piegsa, Jens; Bahls, Thomas; Hoffmann, Wolfgang

    2018-01-25

    In most research projects budget, staff and IT infrastructures are limiting resources. Especially for small-scale registries and cohort studies, professional IT support and commercial electronic data capture systems are too expensive. Consequently, these projects use simple local approaches (e.g. Excel) for data capture instead of a central data management including web-based data capture and proper research databases. This leads to manual processes to merge, analyze and, if possible, pseudonymize research data of different study sites. To support multi-site data capture, storage and analyses in small-scale research projects, corresponding requirements were analyzed within the MOSAIC project. Based on the identified requirements, the Toolbox for Research was developed as a flexible software solution for various research scenarios. Additionally, the Toolbox facilitates data integration of research data as well as metadata by performing necessary procedures automatically. Also, Toolbox modules allow the integration of device data. Moreover, separation of personally identifiable information and medical data by using only pseudonyms for storing medical data ensures the compliance to data protection regulations. This pseudonymized data can then be exported in SPSS format in order to enable scientists to prepare reports and analyses. The Toolbox for Research was successfully piloted in the German Burn Registry in 2016 facilitating the documentation of 4350 burn cases at 54 study sites. The Toolbox for Research can be downloaded free of charge from the project website and automatically installed due to the use of Docker technology.

  14. Privacy-preserving screen capture: towards closing the loop for health IT usability.

    Science.gov (United States)

    Cooley, Joseph; Smith, Sean

    2013-08-01

    As information technology permeates healthcare (particularly provider-facing systems), maximizing system effectiveness requires the ability to document and analyze tricky or troublesome usage scenarios. However, real-world health IT systems are typically replete with privacy-sensitive data regarding patients, diagnoses, clinicians, and EMR user interface details; instrumentation for screen capture (capturing and recording the scenario depicted on the screen) needs to respect these privacy constraints. Furthermore, real-world health IT systems are typically composed of modules from many sources, mission-critical and often closed-source; any instrumentation for screen capture can rely neither on access to structured output nor access to software internals. In this paper, we present a tool to help solve this problem: a system that combines keyboard video mouse (KVM) capture with automatic text redaction (and interactively selectable unredaction) to produce precise technical content that can enrich stakeholder communications and improve end-user influence on system evolution. KVM-based capture makes our system both application-independent and OS-independent because it eliminates software-interface dependencies on capture targets. Using a corpus of EMR screenshots, we present empirical measurements of redaction effectiveness and processing latency to demonstrate system performances. We discuss how these techniques can translate into instrumentation systems that improve real-world health IT deployments. Copyright © 2013 Elsevier Inc. All rights reserved.
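
    A hedged sketch of the redaction step only: locate text with OCR and black it out. The real system operates on a KVM video stream and supports interactive unredaction; pytesseract and Pillow are stand-ins here, and the file names are assumptions.

      # Hedged sketch: redact every recognised word in a captured screen image.
      import pytesseract
      from PIL import Image, ImageDraw

      frame = Image.open("captured_screen.png")           # hypothetical KVM frame grab
      data = pytesseract.image_to_data(frame, output_type=pytesseract.Output.DICT)

      draw = ImageDraw.Draw(frame)
      for i, word in enumerate(data["text"]):
          if word.strip():                                # redact all text by default
              x, y = data["left"][i], data["top"][i]
              w, h = data["width"][i], data["height"][i]
              draw.rectangle([x, y, x + w, y + h], fill="black")

      frame.save("captured_screen_redacted.png")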

  15. Techniques for capturing bighorn sheep lambs

    Science.gov (United States)

    Smith, Joshua B.; Walsh, Daniel P.; Goldstein, Elise J.; Parsons, Zachary D.; Karsch, Rebekah C.; Stiver, Julie R.; Cain, James W.; Raedeke, Kenneth J.; Jenks, Jonathan A.

    2014-01-01

    Low lamb recruitment is a major challenge facing managers attempting to mitigate the decline of bighorn sheep (Ovis canadensis), and investigations into the underlying mechanisms are limited because of the inability to readily capture and monitor bighorn sheep lambs. We evaluated 4 capture techniques for bighorn sheep lambs: 1) hand-capture of lambs from radiocollared adult females fitted with vaginal implant transmitters (VITs), 2) hand-capture of lambs of intensively monitored radiocollared adult females, 3) helicopter net-gunning, and 4) hand-capture of lambs from helicopters. During 2010–2012, we successfully captured 90% of lambs from females that retained VITs to ≤1 day of parturition, although we noted differences in capture rates between an area of high road density in the Black Hills (92–100%) of South Dakota, USA, and less accessible areas of New Mexico (71%), USA. Retention of VITs was 78% with pre-partum expulsion the main cause of failure. We were less likely to capture lambs from females that expelled VITs ≥1 day of parturition (range = 80–83%) or females that were collared without VITs (range = 60–78%). We used helicopter net-gunning at several sites in 1999, 2001–2002, and 2011, and it proved a useful technique; however, at one site, attempts to capture lambs led to lamb predation by golden eagles (Aquila chrysaetos). We attempted helicopter hand-captures at one site in 1999, and they also were successful in certain circumstances and avoided risk of physical trauma from net-gunning; however, application was limited. In areas of low accessibility or if personnel lack the ability to monitor females and/or VITs for extended periods, helicopter capture may provide a viable option for lamb capture.

  16. Interconnecting smartphone, image analysis server, and case report forms in clinical trials for automatic skin lesion tracking in clinical trials

    Science.gov (United States)

    Haak, Daniel; Doma, Aliaa; Gombert, Alexander; Deserno, Thomas M.

    2016-03-01

    Today, subject's medical data in controlled clinical trials is captured digitally in electronic case report forms (eCRFs). However, eCRFs only insufficiently support integration of subject's image data, although medical imaging is looming large in studies today. For bed-side image integration, we present a mobile application (App) that utilizes the smartphone-integrated camera. To ensure high image quality with this inexpensive consumer hardware, color reference cards are placed in the camera's field of view next to the lesion. The cards are used for automatic calibration of geometry, color, and contrast. In addition, a personalized code is read from the cards that allows subject identification. For data integration, the App is connected to an communication and image analysis server that also holds the code-study-subject relation. In a second system interconnection, web services are used to connect the smartphone with OpenClinica, an open-source, Food and Drug Administration (FDA)-approved electronic data capture (EDC) system in clinical trials. Once the photographs have been securely stored on the server, they are released automatically from the mobile device. The workflow of the system is demonstrated by an ongoing clinical trial, in which photographic documentation is frequently performed to measure the effect of wound incision management systems. All 205 images, which have been collected in the study so far, have been correctly identified and successfully integrated into the corresponding subject's eCRF. Using this system, manual steps for the study personnel are reduced, and, therefore, errors, latency and costs decreased. Our approach also increases data security and privacy.

  17. A proposal for a course of Operations Management for the Degree in Electronics and Automatic

    Directory of Open Access Journals (Sweden)

    Pilar I. Vidal-Carreras

    2017-06-01

    In this work a methodology is proposed for a course in the discipline of Operations Management, with a focus on active methodologies, within the degree in Electronics and Automatic. The course combines lectures, group work, problem-based learning, project-based learning and the presentation of group work. Previous experience with the same course allows us to conclude that the lecture remains important in this environment, as it is the only course of the discipline in the whole degree. Providing feedback in project-based learning is not easy for groups as large as the one in this case study, suggesting the presentation of group work as a good solution to the problem.

  18. Very Portable Remote Automatic Weather Stations

    Science.gov (United States)

    John R. Warren

    1987-01-01

    Remote Automatic Weather Stations (RAWS) were introduced to Forest Service and Bureau of Land Management field units in 1978 following development, test, and evaluation activities conducted jointly by the two agencies. The original configuration was designed for semi-permanent installation. Subsequently, a need for a more portable RAWS was expressed, and one was...

  19. Nonlinear Synchronization for Automatic Learning of 3D Pose Variability in Human Motion Sequences

    Directory of Open Access Journals (Sweden)

    Mozerov M

    2010-01-01

    A dense matching algorithm that solves the problem of synchronizing prerecorded human motion sequences, which show different speeds and accelerations, is proposed. The approach is based on minimization of MRF energy and solves the problem by using Dynamic Programming. Additionally, an optimal sequence is automatically selected from the input dataset to be a time-scale pattern for all other sequences. The paper utilizes an action specific model which automatically learns the variability of 3D human postures observed in a set of training sequences. The model is trained using the public CMU motion capture dataset for the walking action, and a mean walking performance is automatically learnt. Additionally, statistics about the observed variability of the postures and motion direction are also computed at each time step. The synchronized motion sequences are used to learn a model of human motion for action recognition and full-body tracking purposes.
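
    The paper minimises an MRF energy with dynamic programming; a plain dynamic-time-warping sketch, shown below on synthetic pose features, captures the same core alignment idea without claiming to reproduce the authors' formulation.

      # Hedged sketch: dynamic-programming alignment of two motion sequences that
      # run at different speeds; pose frames are reduced to 2D feature vectors.
      import numpy as np

      def dtw_cost(a, b):
          """Cumulative alignment cost between frame sequences a and b."""
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = np.linalg.norm(a[i - 1] - b[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      t = np.linspace(0, 2 * np.pi, 60)
      walk_a = np.column_stack([np.sin(t), np.cos(t)])                     # reference performance
      walk_b = np.column_stack([np.sin(t * 1.5), np.cos(t * 1.5)])[:40]    # faster performance

      print("alignment cost:", round(float(dtw_cost(walk_a, walk_b)), 2))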

  20. Automatic capture of attention by conceptually generated working memory templates.

    Science.gov (United States)

    Sun, Sol Z; Shen, Jenny; Shaw, Mark; Cant, Jonathan S; Ferber, Susanne

    2015-08-01

    Many theories of attention propose that the contents of working memory (WM) can act as an attentional template, which biases processing in favor of perceptually similar inputs. While support has been found for this claim, it is unclear how attentional templates are generated when searching real-world environments. We hypothesized that in naturalistic settings, attentional templates are commonly generated from conceptual knowledge, an idea consistent with sensorimotor models of knowledge representation. Participants performed a visual search task in the delay period of a WM task, where the item in memory was either a colored disk or a word associated with a color concept (e.g., "Rose," associated with red). During search, we manipulated whether a singleton distractor in the array matched the contents of WM. Overall, we found that search times were impaired in the presence of a memory-matching distractor. Furthermore, the degree of impairment did not differ based on the contents of WM. Put differently, regardless of whether participants were maintaining a perceptually colored disk identical to the singleton distractor, or whether they were simply maintaining a word associated with the color of the distractor, the magnitude of attentional capture was the same. Our results suggest that attentional templates can be generated from conceptual knowledge, in the physical absence of the visual feature.

  1. U-Note: Capture the Class and Access it Everywhere

    DEFF Research Database (Denmark)

    Malacria, Sylvain; Pietrzak, Thomas; Tabard, Aurélien

    2011-01-01

    We present U-Note, an augmented teaching and learning system leveraging the advantages of paper while letting teachers and pupils benefit from the richness that digital media can bring to a lecture. U-Note provides automatic linking between the notes of the pupils’ notebooks and various events...... on three modules. U-Teach captures the context of the class: audio recordings, the whiteboard contents, together with the web pages, videos and slideshows displayed during the lesson. U-Study binds pupils’ paper notes (taken with an Anoto digital pen) with the data coming from U-Teach and lets pupils...

  2. Automatic Imitation

    Science.gov (United States)

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  3. Statistical analysis of quality control of automatic processor

    International Nuclear Information System (INIS)

    Niu Yantao; Zhao Lei; Zhang Wei; Yan Shulin

    2002-01-01

    Objective: To strengthen the scientific management of the automatic processor and promote QC by analyzing the processor's QC management chart with statistical methods and by evaluating and interpreting the data and trends of the chart. Method: The speed, contrast, and minimum density of the step-wedge film strip were measured every day and recorded on the QC chart. The mean (x-bar), standard deviation (s) and range (R) were calculated, and the data and working trend were evaluated and interpreted to support management decisions. Results: From the relative frequency distribution curve constructed from the measured data, the authors can judge whether the distribution is a symmetric bell-shaped curve. If it is not, a few extreme values beyond the control limits are probably pulling the curve to the left or right. If the distribution is normal, the standard deviation (s) is examined: when x-bar ± 2s lies within the upper and lower control limits of the relevant performance indexes, the processor is working in a stable state for that period. Conclusion: Guided by statistical methods, QC work becomes more scientific and quantitative. The authors can deepen their understanding and application of the trend chart and raise quality management to a new level
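
    As a small illustration of the x-bar ± 2s rule described above, the sketch below computes control-chart statistics for a series of daily speed-index readings; the sample values and limit settings are hypothetical.

```python
import statistics

def control_chart_summary(readings, lower_limit, upper_limit):
    """Compute mean, standard deviation and range for daily QC readings and
    check whether the x-bar +/- 2s band stays inside the control limits."""
    x_bar = statistics.mean(readings)
    s = statistics.stdev(readings)
    r = max(readings) - min(readings)
    stable = lower_limit <= x_bar - 2 * s and x_bar + 2 * s <= upper_limit
    return {"mean": x_bar, "stdev": s, "range": r, "stable": stable}

# Hypothetical daily speed-index readings and control limits.
speed_index = [1.02, 1.00, 0.98, 1.01, 0.99, 1.03, 1.00, 0.97, 1.01, 1.00]
print(control_chart_summary(speed_index, lower_limit=0.90, upper_limit=1.10))
```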

  4. Anesthetic management of Boron Neutron Capture Therapy for glioblastoma

    International Nuclear Information System (INIS)

    Shinomura, T.; Furutani, H.; Osawa, M.; Ono, K.; Fukuda, K.

    2000-01-01

    General anesthesia was given to twenty-seven patients who received Boron Neutron Capture Therapy (BNCT) under craniotomy at Kyoto University Research Reactor from 1991 to 1999. Special considerations are required for anesthesia. (author)

  5. Knowledge Capture and Acquisition Mechanisms at Kisii University

    Directory of Open Access Journals (Sweden)

    Nemwel Aming'a

    2015-07-01

    Full Text Available Knowledge management and knowledge assets have gained much prominence in recent years and are said to improve organizational performance. Knowledge capture and acquisition mechanisms enhance organizational memory and performance. However, knowledge capture and acquisition mechanisms in higher education institutions are not well known. The aim of this study was to investigate the knowledge capture and acquisition mechanisms at Kisii University. This was a case study in which data were collected through interviews and questionnaires. Purposive sampling was used to select interview participants, while questionnaire respondents were selected through stratified random sampling. Qualitative and quantitative data were analysed using SPSS® student version 14; the analysis revealed various knowledge capture and acquisition mechanisms at Kisii University. It was also established that the University encountered various challenges in knowledge capture and acquisition and lacked some essential knowledge capture and acquisition mechanisms. In this regard, this study proposed knowledge capture and acquisition guidelines that may be adopted by the University to enhance its organizational memory and performance.

  6. Collaborative Manufacturing Management in Networked Supply Chains

    Science.gov (United States)

    Pouly, Michel; Naciri, Souleiman; Berthold, Sébastien

    ERP systems provide information management and analysis to industrial companies and support their planning activities. They are currently based mostly on theoretical values (averages) of parameters rather than on actual shop floor data, which disturbs the planning algorithms. On the other hand, sharing data between manufacturers, suppliers and customers is becoming very important to ensure reactivity to market variability. This paper proposes software solutions to address these requirements and methods to automatically capture the necessary corresponding shop floor information. In order to share data produced by different legacy systems along the collaborative networked supply chain, we propose to use the Generic Product Model developed by Hitachi to extract, translate and store heterogeneous ERP data.

  7. US Spacesuit Knowledge Capture

    Science.gov (United States)

    Chullen, Cinda; Thomas, Ken; McMann, Joe; Dolan, Kristi; Bitterly, Rose; Lewis, Cathleen

    2011-01-01

    The ability to learn from both the mistakes and successes of the past is vital to assuring success in the future. Due to the close physical interaction between spacesuit systems and human beings as users, spacesuit technology and usage lends itself rather uniquely to the benefits realized from the skillful organization of historical information; its dissemination; the collection and identification of artifacts; and the education of those in the field. The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) Spacesuit Knowledge Capture since the beginning of space exploration. Avenues used to capture the knowledge have included publication of reports; conference presentations; specialized seminars; and classes usually given by veterans in the field. More recently the effort has been more concentrated and formalized whereby a new avenue of spacesuit knowledge capture has been added to the archives in which videotaping occurs engaging both current and retired specialists in the field presenting technical scope specifically for education and preservation of knowledge. With video archiving, all these avenues of learning can now be brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. Scope and topics of U.S. spacesuit knowledge capture have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, and aspects of program management. Concurrently, U.S. spacesuit knowledge capture activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS) and the spacesuit community are now working together to provide a comprehensive closed-looped spacesuit knowledge capture system which includes

  8. Automatic multiresolution age-related macular degeneration detection from fundus images

    Science.gov (United States)

    Garnier, Mickaël.; Hurtut, Thomas; Ben Tahar, Houssem; Cheriet, Farida

    2014-03-01

    Age-related Macular Degeneration (AMD) is a leading cause of legal blindness. As the disease progresses, visual loss occurs rapidly, so early diagnosis is required for timely treatment. Automatic, fast and robust screening of this widespread disease should allow an early detection. Most of the automatic diagnosis methods in the literature are based on a complex segmentation of the drusen, targeting a specific symptom of the disease. In this paper, we present a preliminary study for AMD detection from color fundus photographs using a multiresolution texture analysis. We analyze the texture at several scales by using a wavelet decomposition in order to identify all the relevant texture patterns. Textural information is captured using both the sign and magnitude components of the completed model of Local Binary Patterns. An image is finally described with the textural pattern distributions of the wavelet coefficient images obtained at each level of decomposition. We use a Linear Discriminant Analysis for feature dimension reduction, to avoid the curse of dimensionality, and for image classification. Experiments were conducted on a dataset containing 45 images (23 healthy and 22 diseased) of variable quality and captured by different cameras. Our method achieved a recognition rate of 93.3%, with a specificity of 95.5% and a sensitivity of 91.3%. This approach shows promising results at low cost, in agreement with medical experts, as well as robustness to both image quality and fundus camera model.
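
    A minimal sketch of the kind of multiresolution texture description outlined above, assuming PyWavelets, scikit-image and scikit-learn are available; the wavelet family, LBP settings and classifier setup are assumptions, and the paper's completed (sign + magnitude) LBP model is simplified here to a standard uniform LBP histogram.

```python
import numpy as np
import pywt
from skimage.feature import local_binary_pattern
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def texture_descriptor(gray, levels=3, points=8, radius=1):
    """Describe a fundus image by LBP histograms of its wavelet subbands."""
    features = []
    coeffs = pywt.wavedec2(gray, "haar", level=levels)
    subbands = [coeffs[0]] + [band for detail in coeffs[1:] for band in detail]
    for band in subbands:
        lbp = local_binary_pattern(band, points, radius, method="uniform")
        hist, _ = np.histogram(lbp, bins=points + 2, range=(0, points + 2),
                               density=True)
        features.extend(hist)
    return np.asarray(features)

# Hypothetical training loop: X holds descriptors, y holds healthy/AMD labels.
X = np.stack([texture_descriptor(np.random.rand(128, 128)) for _ in range(20)])
y = np.array([0, 1] * 10)
clf = LinearDiscriminantAnalysis().fit(X, y)  # dimension reduction + classification
```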

  9. Automatic decomposition of a complex hologram based on the virtual diffraction plane framework

    International Nuclear Information System (INIS)

    Jiao, A S M; Tsang, P W M; Lam, Y K; Poon, T-C; Liu, J-P; Lee, C-C

    2014-01-01

    Holography is a technique for capturing the hologram of a three-dimensional scene. In many applications, it is often pertinent to retain specific items of interest in the hologram, rather than retaining the full information, which may cause distraction in the analytical process that follows. For a real optical image that is captured with a camera or scanner, this process can be realized by applying image segmentation algorithms to decompose an image into its constituent entities. However, because it is different from an optical image, classic image segmentation methods cannot be applied directly to a hologram, as each pixel in the hologram carries holistic, rather than local, information of the object scene. In this paper, we propose a method to perform automatic decomposition of a complex hologram based on a recently proposed technique called the virtual diffraction plane (VDP) framework. Briefly, a complex hologram is back-propagated to a hypothetical plane known as the VDP. Next, the image on the VDP is automatically decomposed, through the use of the segmentation on the magnitude of the VDP image, into multiple sub-VDP images, each representing the diffracted waves of an isolated entity in the scene. Finally, each sub-VDP image is reverted back to a hologram. As such, a complex hologram can be decomposed into a plurality of subholograms, each representing a discrete object in the scene. We have demonstrated the successful performance of our proposed method by decomposing a complex hologram that is captured through the optical scanning holography (OSH) technique. (papers)
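
    For illustration only, a minimal numerical sketch of the decomposition idea: back-propagate the complex hologram to a hypothetical plane, segment the magnitude there, and re-propagate each labelled region into its own sub-hologram. The angular-spectrum propagator, wavelength, pixel pitch, distance and the crude threshold segmentation are assumptions standing in for the OSH parameters and segmentation used in the paper.

```python
import numpy as np
from scipy import ndimage

def angular_spectrum(field, wavelength, pitch, distance):
    """Propagate a complex field by `distance` using the angular spectrum method."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kernel = np.exp(2j * np.pi * distance * np.sqrt(np.maximum(arg, 0.0)))
    return np.fft.ifft2(np.fft.fft2(field) * kernel)

def decompose_hologram(hologram, wavelength=633e-9, pitch=8e-6, distance=0.05):
    vdp = angular_spectrum(hologram, wavelength, pitch, -distance)   # back-propagate
    mag = np.abs(vdp)
    mask = mag > mag.mean() + 2 * mag.std()                          # crude segmentation
    labels, n = ndimage.label(mask)
    subholograms = []
    for k in range(1, n + 1):
        sub_vdp = np.where(labels == k, vdp, 0)                      # isolate one entity
        subholograms.append(angular_spectrum(sub_vdp, wavelength, pitch, distance))
    return subholograms

# Hypothetical complex hologram (random field standing in for captured data).
holo = np.random.randn(256, 256) + 1j * np.random.randn(256, 256)
parts = decompose_hologram(holo)
```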

  10. Data-driven management using quantitative metric and automatic auditing program (QMAP) improves consistency of radiation oncology processes.

    Science.gov (United States)

    Yu, Naichang; Xia, Ping; Mastroianni, Anthony; Kolar, Matthew D; Chao, Samuel T; Greskovich, John F; Suh, John H

    Process consistency in planning and delivery of radiation therapy is essential to maintain patient safety and treatment quality and efficiency. Ensuring the timely completion of each critical clinical task is one aspect of process consistency. The purpose of this work is to report our experience in implementing a quantitative metric and automatic auditing program (QMAP) with the goal of improving the timely completion of critical clinical tasks. Based on our clinical electronic medical records system, we developed a software program to automatically capture the completion timestamp of each critical clinical task while providing frequent alerts of potential delinquency. These alerts were directed to designated triage teams within a time window that offered an opportunity to mitigate the potential for late completion. Since July 2011, 18 metrics have been introduced into our clinical workflow. We compared the delinquency rates for 4 selected metrics before the implementation of each metric with the corresponding rates in 2016. A one-tailed Student t test was used for statistical analysis. Results: With an average of 150 daily patients on treatment at our main campus, the late treatment plan completion rate and the late weekly physics check rate were reduced from 18.2% and 8.9% in 2011 to 4.2% and 0.1% in 2016, respectively (P < .01). The late weekly on-treatment physician visit rate was reduced from 7.2% in 2012 to <1.6% in 2016. The yearly late cone beam computed tomography review rate was reduced from 1.6% in 2011 to <0.1% in 2016. QMAP is effective in reducing late completion of critical tasks, which can positively impact treatment quality and patient safety by reducing the potential for errors resulting from distractions, interruptions, and rushed completion of critical tasks. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
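
    A minimal sketch of the auditing idea described above: capture the completion timestamp of each critical task and alert a triage team while there is still time to act. The task names, alert window and notification mechanism are hypothetical, not details of QMAP itself.

```python
from datetime import datetime, timedelta

ALERT_WINDOW = timedelta(hours=24)  # hypothetical window before the deadline

def audit(tasks, now=None):
    """tasks: list of dicts with 'name', 'deadline' and optional 'completed_at'.
    Returns the tasks that should trigger an alert to the triage team."""
    now = now or datetime.now()
    alerts = []
    for task in tasks:
        if task.get("completed_at"):             # timestamp captured -> done
            continue
        if now >= task["deadline"] - ALERT_WINDOW:
            alerts.append(task["name"])          # potential delinquency
    return alerts

tasks = [
    {"name": "treatment plan completion", "deadline": datetime(2016, 7, 1, 8)},
    {"name": "weekly physics check", "deadline": datetime(2016, 7, 3, 8),
     "completed_at": datetime(2016, 6, 30, 15)},
]
print(audit(tasks, now=datetime(2016, 6, 30, 12)))
```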

  11. SU-F-T-94: Plan2pdf - a Software Tool for Automatic Plan Report for Philips Pinnacle TPS

    Energy Technology Data Exchange (ETDEWEB)

    Wu, C [Sutter Medical Foundation, Roseville, CA (United States)

    2016-06-15

    Purpose: To implement an automatic electronic PDF plan reporting tool for the Philips Pinnacle treatment planning system (TPS). Methods: We developed electronic treatment plan reporting software that enables fully automatic PDF reporting from the Pinnacle TPS to external EMR programs such as MOSAIQ. The tool is named “plan2pdf”. plan2pdf is implemented using Pinnacle scripts, Java and UNIX shell scripts, without any external program needed. plan2pdf supports full auto-mode and manual-mode reporting. In full auto-mode, with a single mouse click, plan2pdf generates a detailed Pinnacle plan report in PDF format, which includes a customizable cover page, the Pinnacle plan summary, orthogonal views through each plan POI and the maximum dose point, a DRR for each beam, serial transverse views captured throughout the dose grid at a user-specified interval, and the DVH and scorecard windows. The final PDF report is also automatically bookmarked for each section above for convenient plan review. The final PDF report can either be saved in a user-specified folder on Pinnacle, or it can be automatically exported to an EMR import folder via a user-configured FTP service. In manual capture mode, plan2pdf allows users to capture any Pinnacle plan by full screen, individual window, or a rectangular ROI drawn on screen. Furthermore, to avoid possible mix-up of patients' plans during auto-mode reporting, a user conflict check feature is included in plan2pdf: it prompts the user to wait if another patient's plan is being exported by another user. Results: plan2pdf has been tested extensively and successfully at our institution, which consists of 5 centers, 15 dosimetrists and 10 physicists, running Pinnacle version 9.10 on Enterprise servers. Conclusion: plan2pdf provides a highly efficient, user-friendly and clinically proven platform for all Philips Pinnacle users to generate a detailed plan report in PDF format for external EMR systems.

  12. SU-F-T-94: Plan2pdf - a Software Tool for Automatic Plan Report for Philips Pinnacle TPS

    International Nuclear Information System (INIS)

    Wu, C

    2016-01-01

    Purpose: To implement an automatic electronic PDF plan reporting tool for the Philips Pinnacle treatment planning system (TPS). Methods: We developed electronic treatment plan reporting software that enables fully automatic PDF reporting from the Pinnacle TPS to external EMR programs such as MOSAIQ. The tool is named “plan2pdf”. plan2pdf is implemented using Pinnacle scripts, Java and UNIX shell scripts, without any external program needed. plan2pdf supports full auto-mode and manual-mode reporting. In full auto-mode, with a single mouse click, plan2pdf generates a detailed Pinnacle plan report in PDF format, which includes a customizable cover page, the Pinnacle plan summary, orthogonal views through each plan POI and the maximum dose point, a DRR for each beam, serial transverse views captured throughout the dose grid at a user-specified interval, and the DVH and scorecard windows. The final PDF report is also automatically bookmarked for each section above for convenient plan review. The final PDF report can either be saved in a user-specified folder on Pinnacle, or it can be automatically exported to an EMR import folder via a user-configured FTP service. In manual capture mode, plan2pdf allows users to capture any Pinnacle plan by full screen, individual window, or a rectangular ROI drawn on screen. Furthermore, to avoid possible mix-up of patients' plans during auto-mode reporting, a user conflict check feature is included in plan2pdf: it prompts the user to wait if another patient's plan is being exported by another user. Results: plan2pdf has been tested extensively and successfully at our institution, which consists of 5 centers, 15 dosimetrists and 10 physicists, running Pinnacle version 9.10 on Enterprise servers. Conclusion: plan2pdf provides a highly efficient, user-friendly and clinically proven platform for all Philips Pinnacle users to generate a detailed plan report in PDF format for external EMR systems.

  13. Data collection automation and total quality management: case studies in the health-service industry.

    Science.gov (United States)

    Smith, Alan D; Offodile, O Felix

    2008-01-01

    The limitations and the seemingly immeasurable, unquantifiable aspects of the healthcare service industry make it imperative that quality assurance programs include total quality management (TQM) and automatic identification and data capture (AIDC)-related technologies. Most of the standards used in TQM and AIDC require data in order to measure improvement and achieve standardization. A major difference between managing a service firm and managing a product-manufacturing firm is the difficulty of achieving consistently high quality. Examination of two different healthcare service providers in the Pittsburgh, Pennsylvania area offers different views as to the implementation and practice of total quality management techniques and AIDC integration. Since the healthcare service industry must take into account its high customization needs, there are positive steps to make the hospital structure itself more patient-friendly and quality-related, hence improving its health-marketing strategies to the general public.

  14. Data Model Management for Space Information Systems

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Ramirez, Paul; Mattmann, chris

    2006-01-01

    The Reference Architecture for Space Information Management (RASIM) suggests the separation of the data model from software components to promote the development of flexible information management systems. RASIM allows the data model to evolve independently from the software components and results in a robust implementation that remains viable as the domain changes. However, the development and management of data models within RASIM are difficult and time consuming tasks involving the choice of a notation, the capture of the model, its validation for consistency, and the export of the model for implementation. Current limitations to this approach include the lack of ability to capture comprehensive domain knowledge, the loss of significant modeling information during implementation, the lack of model visualization and documentation capabilities, and exports being limited to one or two schema types. The advent of the Semantic Web and its demand for sophisticated data models has addressed this situation by providing a new level of data model management in the form of ontology tools. In this paper we describe the use of a representative ontology tool to capture and manage a data model for a space information system. The resulting ontology is implementation independent. Novel on-line visualization and documentation capabilities are available automatically, and the ability to export to various schemas can be added through tool plug-ins. In addition, the ingestion of data instances into the ontology allows validation of the ontology and results in a domain knowledge base. Semantic browsers are easily configured for the knowledge base. For example the export of the knowledge base to RDF/XML and RDFS/XML and the use of open source metadata browsers provide ready-made user interfaces that support both text- and facet-based search. This paper will present the Planetary Data System (PDS) data model as a use case and describe the import of the data model into an ontology tool

  15. A mobile and asynchronous electronic data capture system for epidemiologic studies.

    Science.gov (United States)

    Meyer, Jens; Fredrich, Daniel; Piegsa, Jens; Habes, Mohamad; van den Berg, Neeltje; Hoffmann, Wolfgang

    2013-06-01

    A Central Data Management (CDM) system based on electronic data capture (EDC) software and study-specific databases is an essential component for the assessment and management of large data volumes in epidemiologic studies. Conventional CDM systems using web applications for data capture depend on permanent access to a network. However, in many study settings permanent network access cannot be guaranteed, e.g. when participants/patients are visited in their homes. In such cases a different concept for data capture is needed. The EDC software used must be able to capture data as a stand-alone instance and to synchronize the captured data with the server at a later point in time. This article describes the design of the mobile information capture (MInCa) system, an EDC software package that meets these requirements. In particular, we focus on client and server design, data synchronization, and data privacy as well as data security measures. The MInCa software has already proven its efficiency in epidemiologic studies, revealing strengths and weaknesses of both the concept and its practical application, which are addressed in this article. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  16. Capturing the Sun: A Roadmap for Navigating Data-Access Challenges and Auto-Populating Solar Home Sales Listings

    Energy Technology Data Exchange (ETDEWEB)

    Stukel, Laura [Elevate Energy, Chicago, IL (United States); Hoen, Ben [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Adomatis, Sandra [Adomatis Appraisal Services, Punta Gorda, FL (United States); Foley, Craig [Sustainable Real Estate Consulting Services, Somerville, MA (United States); Parsons, Laura [Center for Sustainable Energy, San Diego, CA (United States); James, Mark [Vermont Law School, South Royalton, VT (United States). Inst. for Energy and Environment; Mastor, Roxana-Andreea [Vermont Law School, South Royalton, VT (United States). Inst. for Energy and Environment; Wedewer, Lindsey [Colorado Energy Office, Denver, CO (United States)

    2017-04-13

    Capturing the Sun: A Roadmap for Navigating Data-Access Challenges and Auto-Populating Solar Home Sales Listings supports a vision of solar photovoltaic (PV) advocates and real estate advocates evolving together to make information about solar homes more accessible to home buyers and sellers and to simplify the process when these homes are resold. The Roadmap is based on a concept in the real estate industry known as automatic population of fields. Auto-population (also called auto-pop in the industry) is the technology that allows data aggregated by an outside industry to be matched automatically with home sale listings in a multiple listing service (MLS).

  17. Development of on line automatic separation device for apple and sleeve

    Science.gov (United States)

    Xin, Dengke; Ning, Duo; Wang, Kangle; Han, Yuhang

    2018-04-01

    An automatic separation device for fruit sleeves is designed with an STM32F407 single-chip microcomputer as the control core. The design consists of hardware and software. The hardware includes a mechanical tooth separator and a three-degree-of-freedom manipulator, as well as an industrial control computer, an image data acquisition card, an end effector and other components. The software system is based on the Visual C++ development environment; it uses image processing and machine vision to locate and recognize the fruit sleeve, and drives the manipulator to capture the foam sleeve, transfer it, and place it at the designated position. Tests show that the automatic fruit-sleeve separation device responds quickly and has a high separation success rate; it can separate the apple from its plastic foam sleeve and lays the foundation for further study and application on enterprise production lines.

  18. Electrofishing capture probability of smallmouth bass in streams

    Science.gov (United States)

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
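
    A minimal sketch of the abundance-adjustment idea described above, assuming scikit-learn is available: fit a logistic model for capture probability from fish length and depth, then divide the catch by the predicted capture probability. The predictor names and values are hypothetical, not the model selected in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical mark-recapture training data: [fish length (mm), thalweg depth (m)]
X = np.array([[120, 0.3], [180, 0.4], [250, 0.6], [310, 0.5],
              [150, 0.7], [280, 0.3], [200, 0.5], [340, 0.4]])
captured = np.array([0, 0, 1, 1, 0, 1, 1, 1])    # recaptured on a later pass?

model = LogisticRegression().fit(X, captured)

def adjusted_abundance(n_sampled, length_mm, depth_m):
    """Abundance estimate = individuals sampled / predicted capture probability."""
    p = model.predict_proba([[length_mm, depth_m]])[0, 1]
    return n_sampled / p

# e.g. 12 smallmouth bass of ~250 mm sampled in a reach with 0.5 m mean depth
print(adjusted_abundance(12, 250, 0.5))
```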

  19. On the Relationship Between Automatic Attitudes and Self-Reported Sexual Assault in Men

    Science.gov (United States)

    Widman, Laura; Olson, Michael

    2013-01-01

    Research and theory suggest rape supportive attitudes are important predictors of sexual assault; yet, to date, rape supportive attitudes have been assessed exclusively through self-report measures that are methodologically and theoretically limited. To address these limitations, the objectives of the current project were to: (1) develop a novel implicit rape attitude assessment that captures automatic attitudes about rape and does not rely on self-reports, and (2) examine the association between automatic rape attitudes and sexual assault perpetration. We predicted that automatic rape attitudes would be a significant unique predictor of sexual assault even when self-reported rape attitudes (i.e., rape myth acceptance and hostility toward women) were controlled. We tested the generalizability of this prediction in two independent samples: a sample of undergraduate college men (n = 75, M age = 19.3 years) and a sample of men from the community (n = 50, M age = 35.9 years). We found the novel implicit rape attitude assessment was significantly associated with the frequency of sexual assault perpetration in both samples and contributed unique variance in explaining sexual assault beyond rape myth acceptance and hostility toward women. We discuss the ways in which future research on automatic rape attitudes may significantly advance measurement and theory aimed at understanding and preventing sexual assault. PMID:22618119

  20. Automatic, time-interval traffic counts for recreation area management planning

    Science.gov (United States)

    D. L. Erickson; C. J. Liu; H. K. Cordell

    1980-01-01

    Automatic, time-interval recorders were used to count directional vehicular traffic on a multiple entry/exit road network in the Red River Gorge Geological Area, Daniel Boone National Forest. Hourly counts of entering and exiting traffic differed according to recorder location, but an aggregated distribution showed a delayed peak in exiting traffic thought to be...

  1. Energy management systems in buildings

    Energy Technology Data Exchange (ETDEWEB)

    Lush, D. M.

    1979-07-01

    An investigation is made of the range of possibilities available from three types of systems (automatic control devices, building envelope, and the occupants) in buildings. The following subjects are discussed: general (buildings, design and personnel); new buildings (envelope, designers, energy and load calculations, plant design, general design parameters); existing buildings (conservation measures, general energy management, air conditioned buildings, industrial buildings); man and motivation (general, energy management and documentation, maintenance, motivation); automatic energy management systems (thermostatic controls, optimized plant start up, air conditioned and industrial buildings, building automatic systems). (MCW)

  2. Development of a digital guidance and control law for steep approach automatic landings using modern control techniques

    Science.gov (United States)

    Halyo, N.

    1979-01-01

    The development of a digital automatic control law for a small jet transport to perform a steep final approach in automatic landings is reported, along with the development of a steady-state Kalman filter used to provide smooth estimates to the control law. The control law performs the functions of localizer and glideslope capture, localizer and glideslope track, decrab, and flare. The control law uses microwave landing system position data and aircraft body-mounted accelerometer, attitude and attitude rate information. The results obtained from a digital simulation of the aircraft dynamics, wind conditions, and sensor noises using the control law and filter developed are described.
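
    A minimal sketch of a steady-state Kalman filter of the kind mentioned above, assuming a simple constant-velocity model for one position channel; the model matrices, noise levels and fixed gain are assumptions for illustration, not the aircraft filter design from the report.

```python
import numpy as np

dt = 0.05                                # sample period (s), hypothetical
F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity state transition
H = np.array([[1.0, 0.0]])               # position is the only measurement
Q = np.diag([1e-4, 1e-3])                # process noise covariance (assumed)
R = np.array([[0.5]])                    # measurement noise covariance (assumed)

# Iterate the Riccati recursion to obtain the steady-state gain K.
P = np.eye(2)
for _ in range(500):
    P_pred = F @ P @ F.T + Q
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    P = (np.eye(2) - K @ H) @ P_pred

def filter_step(x, z):
    """One steady-state Kalman update: predict, then correct with fixed gain K."""
    x_pred = F @ x
    return x_pred + K @ (z - H @ x_pred)

x = np.zeros(2)
for z in [0.1, 0.18, 0.33, 0.41]:        # noisy position-style measurements
    x = filter_step(x, np.array([z]))
print(x)                                 # smoothed position and velocity estimate
```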

  3. Automatic programmable air ozonizer

    International Nuclear Information System (INIS)

    Gubarev, S.P.; Klosovsky, A.V.; Opaleva, G.P.; Taran, V.S.; Zolototrubova, M.I.

    2015-01-01

    In this paper we describe a compact, economical, easily managed automatic air ozonizer developed at the Institute of Plasma Physics of the NSC KIPT. It is designed for sanitation and disinfection of premises and for cleaning the air of foreign odors. A distinctive feature of the device is the generation of a given ozone concentration, approximately 0.7 of the maximum allowable concentration (MAC), and the automatic maintenance of that level, which allows people to remain inside the treated premises during operation. A microprocessor controller was developed to control the operation of the ozonizer
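
    For illustration, a minimal sketch of maintaining a concentration setpoint near 0.7 MAC with a simple on/off (hysteresis) control loop; the sensor stand-in, room model, thresholds and MAC value are all hypothetical, since the abstract does not describe the actual device logic.

```python
import random

MAC = 0.16             # hypothetical maximum allowable ozone concentration, mg/m^3
SETPOINT = 0.7 * MAC   # target level described in the abstract
HYSTERESIS = 0.05 * MAC

concentration = 0.0    # simulated room concentration, mg/m^3

def read_concentration():
    """Stand-in for the ozone sensor; adds a little measurement noise."""
    return concentration + random.gauss(0, 0.002)

def step_room(generator_on):
    """Very rough room model: generation when on, first-order decay always."""
    global concentration
    concentration += (0.01 if generator_on else 0.0) - 0.05 * concentration

def control_loop(cycles=50):
    on = False
    for _ in range(cycles):
        c = read_concentration()
        if c < SETPOINT - HYSTERESIS:      # too low: switch the generator on
            on = True
        elif c > SETPOINT + HYSTERESIS:    # too high: switch it off
            on = False
        step_room(on)
    return concentration

print(round(control_loop(), 3))            # settles near the 0.7*MAC setpoint
```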

  4. PcapDB: Search Optimized Packet Capture, Version 0.1.0.0

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-04

    PcapDB is a packet capture system designed to optimize the captured data for fast search in the typical (network incident response) use case. The technology involved in this software has been submitted via the IDEAS system and has been filed as a provisional patent. It includes the following primary components: capture: The capture component utilizes existing capture libraries to retrieve packets from network interfaces. Once retrieved, the packets are passed to additional threads for sorting into flows and indexing. The sorted flows and indexes are passed to other threads so that they can be written to disk. These components are written in the C programming language. search: The search components provide a means to find relevant flows and the associated packets. A search query is parsed and represented as a search tree. Various search commands, written in C, are then used to resolve this tree into a set of search results. The tree generation and search execution management components are written in Python. interface: The PcapDB web interface is written in Python on the Django framework. It provides a series of pages, APIs, and asynchronous tasks that allow the user to manage the capture system, perform searches, and retrieve results. Web page components are written in HTML, CSS and Javascript.

  5. A welfare study into capture fisheries in cirata reservoir: a bio-economic model

    Science.gov (United States)

    Anna, Z.; Hindayani, P.

    2018-04-01

    Capture fishery in inland waters such as reservoirs can be a source of food security and even of the economy and public welfare of the surrounding community. This research was conducted on the Cirata reservoir fishery in West Java to see how far a reservoir capture fishery can contribute economically in the form of resource rents. The method used is the Copes bioeconomic model, which analyzes the demand and supply functions to calculate the optimization of stakeholders' welfare under various management regimes. The results showed that managing the capture fishery under a Maximum Economic Yield (MEY) regime gave the most efficient result, where fewer inputs produce maximum profit. Under MEY management, the producer surplus obtained is IDR 2,610,203,099 per quarter and the consumer surplus is IDR 273,885,400 per quarter. Furthermore, the results showed that a sustainable MEY management regime yields a government rent/surplus of IDR 217,891,345 per quarter, with an average fish price of IDR 13,929 per kg. Under open access, the producer surplus falls to IDR 0. Thus, the implementation of an MEY-based instrument policy becomes a necessity for the Cirata reservoir capture fishery.
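
    For illustration only, the sketch below finds the maximum-economic-yield effort level in a standard Gordon-Schaefer surplus-production model; this is a simpler textbook formulation than the Copes demand-and-supply model used in the paper, and all parameter values are hypothetical.

```python
import numpy as np

# Hypothetical Gordon-Schaefer parameters (illustrative only)
r, K, q = 0.8, 5000.0, 0.0004      # intrinsic growth, carrying capacity, catchability
price, cost = 14000.0, 900.0       # price per unit harvest, cost per unit effort

def sustainable_yield(E):
    """Equilibrium harvest as a function of fishing effort E."""
    return q * K * E * (1.0 - q * E / r)

def rent(E):
    """Resource rent (profit) at effort E."""
    return price * sustainable_yield(E) - cost * E

efforts = np.linspace(0.0, r / q, 2001)
E_mey = efforts[np.argmax(rent(efforts))]               # effort maximizing rent
positive = efforts[1:]
E_open = positive[np.argmin(np.abs(rent(positive)))]    # rent dissipated to ~0

print(f"MEY effort {E_mey:.0f} (rent {rent(E_mey):,.0f}); open-access effort {E_open:.0f} (rent ~0)")
```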

  6. FY 1984 annual report on the research and development of automatic sewing systems. System management/control techniques; 1984 nendo jido hosei system no kenkyu kaihatsu seika hokokusho. System kanri seigyo gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1985-03-01

    The Automatic Sewing System Technique Research Association was commissioned by the Agency of Industrial Science and Technology for the 'research and development of automatic sewing systems'. The program covers R and D of the elementary techniques for total systems and for sewing preparation/processing, sewing/assembling, cloth handling, and system management/control. This report describes the results of the R and D efforts on the system management/control techniques. The FY 1984 efforts are directed at basic designs for optimizing process configurations, load balances, and computer control of sewing/assembling devices, transfer devices and the like, based on production schedules that reflect demand forecasts by apparel makers, in order to reduce the lead time from charging cloth into an automatic sewing plant to delivery of the products by at least 50% from the current level through improved processing efficiency and speed. The development plan for the examination/failure diagnosis method, the next R and D theme, mainly covers determining where product quality standards should be set, examination of individual steps, failure diagnosis functions of sewing machines, classification of failure levels, and prediction of failures. Prospects of establishing an automatic system for visual recognition were obtained. (NEDO)

  7. MASTR-MS: a web-based collaborative laboratory information management system (LIMS) for metabolomics.

    Science.gov (United States)

    Hunter, Adam; Dayalan, Saravanan; De Souza, David; Power, Brad; Lorrimar, Rodney; Szabo, Tamas; Nguyen, Thu; O'Callaghan, Sean; Hack, Jeremy; Pyke, James; Nahid, Amsha; Barrero, Roberto; Roessner, Ute; Likic, Vladimir; Tull, Dedreia; Bacic, Antony; McConville, Malcolm; Bellgard, Matthew

    2017-01-01

    An increasing number of research laboratories and core analytical facilities around the world are developing high throughput metabolomic analytical and data processing pipelines that are capable of handling hundreds to thousands of individual samples per year, often over multiple projects, collaborations and sample types. At present, there are no Laboratory Information Management Systems (LIMS) that are specifically tailored for metabolomics laboratories that are capable of tracking samples and associated metadata from the beginning to the end of an experiment, including data processing and archiving, and which are also suitable for use in large institutional core facilities or multi-laboratory consortia as well as single laboratory environments. Here we present MASTR-MS, a downloadable and installable LIMS solution that can be deployed either within a single laboratory or used to link workflows across a multisite network. It comprises a Node Management System that can be used to link and manage projects across one or multiple collaborating laboratories; a User Management System which defines different user groups and privileges of users; a Quote Management System where client quotes are managed; a Project Management System in which metadata is stored and all aspects of project management, including experimental setup, sample tracking and instrument analysis, are defined, and a Data Management System that allows the automatic capture and storage of raw and processed data from the analytical instruments to the LIMS. MASTR-MS is a comprehensive LIMS solution specifically designed for metabolomics. It captures the entire lifecycle of a sample starting from project and experiment design to sample analysis, data capture and storage. It acts as an electronic notebook, facilitating project management within a single laboratory or a multi-node collaborative environment. This software is being developed in close consultation with members of the metabolomics research

  8. Automatic Generation of 3D Caricatures Based on Artistic Deformation Styles.

    Science.gov (United States)

    Clarke, Lyndsey; Chen, Min; Mora, Benjamin

    2011-06-01

    Caricatures are a form of humorous visual art, usually created by skilled artists for amusement and entertainment. In this paper, we present a novel approach for the automatic generation of digital caricatures from facial photographs, which captures artistic deformation styles from hand-drawn caricatures. We introduce a pseudo stress-strain model to encode the parameters of an artistic deformation style using "virtual" physical and material properties. We have also developed a software system for performing the caricaturistic deformation in 3D, which eliminates the undesirable artifacts of 2D caricaturization. We employed a Multilevel Free-Form Deformation (MFFD) technique both to optimize a 3D head model reconstructed from an input facial photograph and to control the caricaturistic deformation. Our results demonstrate the effectiveness and usability of the proposed approach, which allows ordinary users to apply the captured and stored deformation styles to a variety of facial photographs.

  9. Concurrent engineering design and management knowledge capture

    Science.gov (United States)

    1990-01-01

    The topics are presented in viewgraph form and include the following: real-time management, personnel management, project management, conceptual design and decision making; the SITRF design problem; and the electronic-design notebook.

  10. Report on achievements of research and development of an automatic sewing system in fiscal 1986. System management and control technology; 1986 nendo jido hosei system no kenkyu kaihatsu seika hokokusho. System kanri seigyo gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1987-03-01

    The Automatic Sewing System Technology Research Association was commissioned by the Agency of Industrial Science and Technology to carry out the 'development of an automatic sewing system'. The association is performing the research and development by dividing the subject into such component technologies as a total system and sewing preparation and processing technology, a sewing and assembling technology, a cloth handling technology, and a system management and control technology. This paper reports the achievements of the research and development on the system management and control technology. Plans were established for the following items: dimensional inspection of sewn parts using image processing technology, real-time processing of the matching work interlocked with a striped-pattern cloth handling device, and a processing positioning algorithm to instruct sewing lines and needle dropping points based on the contour lines of pockets and collars. The effects thereof were verified. Printing seven-segment numerals on cloth using fluorescent whitener, together with an automatic reading test of the printed information, verified that the technology can be applied to tracking parts in a processing line as ID information. Research was also carried out on the unification of the microcomputers controlling such automatic machines as a multi-functional sewing machine, a three-dimensional sewing machine, and a cloth handling and positioning device, as well as on the interface with the computer for function-sharing management and control. (NEDO)

  12. A cloud-based system for automatic glaucoma screening.

    Science.gov (United States)

    Fengshou Yin; Damon Wing Kee Wong; Ying Quan; Ai Ping Yow; Ngan Meng Tan; Gopalakrishnan, Kavitha; Beng Hai Lee; Yanwu Xu; Zhuo Zhang; Jun Cheng; Jiang Liu

    2015-08-01

    In recent years, there has been increasing interest in the use of automatic computer-based systems for the detection of eye diseases, including glaucoma. However, these systems are usually standalone software with basic functions only, limiting their use on a large scale. In this paper, we introduce an online cloud-based system for automatic glaucoma screening through the use of medical image-based pattern classification technologies. It is designed in a hybrid cloud pattern to offer both accessibility and enhanced security. Raw data, including the patient's medical condition and fundus image, and the resulting medical reports are collected and distributed through the public cloud tier. In the private cloud tier, automatic analysis and assessment of colour retinal fundus images are performed. The ubiquitous, anywhere-access nature of the system through the cloud platform facilitates a more efficient and cost-effective means of glaucoma screening, allowing the disease to be detected earlier and enabling timely intervention and disease management.

  13. Packet Capture Solutions: PcapDB Benchmark for High-Bandwidth Capture, Storage, and Searching

    Energy Technology Data Exchange (ETDEWEB)

    Steinfadt, Shannon Irene [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ferrell, Paul Steven [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-21

    PcapDB stands alone when looking at the overall field of competitors, from the cost-effective COTS hardware to the efficient utilization of disk space that enables a longer packet history. It is a scalable, 100GbE-enabled system that indexes every packet and indexes flow data without complicated load-balancing requirements. The Transport Layer search and indexing approach led to patent-pending flow indexing technology, providing a specialized database system specifically optimized around providing fast flow searches. While there is a plethora of options for network packet capture, very few can effectively manage capture rates of more than 10 Gb/s, distributed capture and querying, and a responsive user interface. By far the primary competitor in the marketplace is Endace and DeepSee; in addition to meeting the technical requirements we set out in this document, they provide technical support and a fully 'appliance like' system. In terms of cost, however, our experience has been that the yearly maintenance charges alone outstrip the entire hardware cost of solutions like PcapDB. Investment in cyber security research and development is a large part of what has enabled us to build the base of knowledgeable workers needed to defend government resources in the rapidly evolving cyber security landscape. We believe projects like Bro, WireCap, and Farm do more than just fill temporary gaps in our capabilities. They allow us to build the firm foundation needed to tackle the next generation of cyber challenges. PcapDB was built with ambitions beyond solving packet capture for a single lab site: it aims to provide a robust, scalable packet capture solution to the DOE complex and beyond.

  14. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. [comp.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equation, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
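
    As a small illustration of the chain-rule propagation that automatic differentiation performs (neither symbolic nor numeric differentiation), here is a minimal forward-mode implementation using dual numbers; it is a generic textbook sketch, not tied to any particular tool in the bibliography.

```python
from dataclasses import dataclass
import math

@dataclass
class Dual:
    val: float   # function value
    dot: float   # derivative value, propagated by the chain rule

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x: Dual) -> Dual:
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def f(x):
    return x * x + 3.0 * sin(x)      # f(x) = x^2 + 3 sin x

x = Dual(2.0, 1.0)                   # seed derivative dx/dx = 1
y = f(x)
print(y.val, y.dot)                  # f(2) and f'(2) = 2*2 + 3*cos(2)
```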

  15. Automatic Speech Recognition from Neural Signals: A Focused Review

    Directory of Open Access Journals (Sweden)

    Christian Herff

    2016-09-01

    Full Text Available Speech interfaces have become widely accepted and are nowadays integrated in various real-life applications and devices. They have become a part of our daily life. However, speech interfaces presume the ability to produce intelligible speech, which might be impossible in loud environments, when speaking would disturb bystanders, or for people who cannot produce speech (i.e., patients suffering from locked-in syndrome). For these reasons it would be highly desirable not to speak but to simply envision oneself saying words or sentences. Interfaces based on imagined speech would enable fast and natural communication without the need for audible speech and would give a voice to otherwise mute people. This focused review analyzes the potential of different brain imaging techniques to recognize speech from neural signals by applying Automatic Speech Recognition technology. We argue that modalities based on metabolic processes, such as functional Near Infrared Spectroscopy and functional Magnetic Resonance Imaging, are less suited for Automatic Speech Recognition from neural signals due to their low temporal resolution, but are very useful for the investigation of the underlying neural mechanisms involved in speech processes. In contrast, electrophysiologic activity is fast enough to capture speech processes and is therefore better suited for ASR. Our experimental results indicate the potential of these signals for speech recognition from neural data, with a focus on invasively measured brain activity (electrocorticography). As a first example of Automatic Speech Recognition techniques applied to neural signals, we discuss the Brain-to-text system.

  16. Capture and exploration of sample quality data to inform and improve the management of a screening collection.

    Science.gov (United States)

    Charles, Isabel; Sinclair, Ian; Addison, Daniel H

    2014-04-01

    A new approach to the storage, processing, and interrogation of the quality data for screening samples has improved analytical throughput and confidence and enhanced the opportunities for learning from the accumulating records. The approach has entailed the design, development, and implementation of a database-oriented system, capturing information from the liquid chromatography-mass spectrometry capabilities used for assessing the integrity of samples in AstraZeneca's screening collection. A Web application has been developed to enable the visualization and interactive annotation of the analytical data, monitor the current sample queue, and report the throughput rate. Sample purity and identity are certified automatically on the chromatographic peaks of interest if predetermined thresholds are reached on key parameters. Using information extracted in parallel from the compound registration and container inventory databases, the chromatographic and spectroscopic profiles for each vessel are linked to the sample structures and storage histories. A search engine facilitates the direct comparison of results for multiple vessels of the same or similar compounds, for single vessels analyzed at different time points, or for vessels related by their origin or process flow. Access to this network of information has provided a deeper understanding of the multiple factors contributing to sample quality assurance.

  17. An enhanced model for automatically extracting topic phrase from ...

    African Journals Online (AJOL)

    The key benefit foreseen from this automatic document classification is not only related to search engines, but also to many other fields such as document organization, text filtering and semantic index management. Key words: Keyphrase extraction, machine learning, search engine snippet, document classification, topic tracking ...

  18. An automatic fault management model for distribution networks

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M; Haenninen, S [VTT Energy, Espoo (Finland); Seppaenen, M [North-Carelian Power Co (Finland); Antila, E; Markkila, E [ABB Transmit Oy (Finland)

    1998-08-01

    An automatic computer model, called the FI/FL-model, for fault location, fault isolation and supply restoration is presented. The model works as an integrated part of the substation SCADA, the AM/FM/GIS system and the medium voltage distribution network automation systems. In the model, three different techniques are used for fault location. First, by comparing the measured fault current to the computed one, an estimate for the fault distance is obtained. This information is then combined, in order to find the actual fault point, with the data obtained from the fault indicators in the line branching points. As a third technique, in the absence of better fault location data, statistical information of line section fault frequencies can also be used. For combining the different fault location information, fuzzy logic is used. As a result, the probability weights for the fault being located in different line sections are obtained. Once the faulty section is identified, it is automatically isolated by remote control of line switches. Then the supply is restored to the remaining parts of the network. If needed, reserve connections from other adjacent feeders can also be used. During the restoration process, the technical constraints of the network are checked. Among these are the load carrying capacity of line sections, voltage drop and the settings of relay protection. If there are several possible network topologies, the model selects the technically best alternative. The FI/FL-model has been in trial use at two substations of the North-Carelian Power Company since November 1996. This chapter lists the practical experiences during the test use period. Also the benefits of this kind of automation are assessed and future developments are outlined
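
    A minimal sketch of combining the three kinds of fault-location evidence into per-section probability weights; the membership scores, section names and simple normalised weighting are assumptions standing in for the fuzzy-logic scheme summarised above.

```python
def combine_evidence(sections):
    """sections: {name: (distance_score, indicator_score, statistical_score)}
    Each score in [0, 1] expresses how strongly that evidence points to the
    section. Returns normalised probability weights per line section."""
    raw = {name: d * 0.5 + i * 0.3 + s * 0.2          # assumed evidence weights
           for name, (d, i, s) in sections.items()}
    total = sum(raw.values()) or 1.0
    return {name: value / total for name, value in raw.items()}

# Hypothetical feeder with three line sections
sections = {
    "S1": (0.2, 0.1, 0.3),   # close to substation, indicators not triggered
    "S2": (0.8, 0.9, 0.4),   # distance estimate and indicators both point here
    "S3": (0.4, 0.1, 0.6),
}
weights = combine_evidence(sections)
faulty = max(weights, key=weights.get)
print(weights, "-> isolate", faulty)
```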

  19. Comparison of acute and chronic traumatic brain injury using semi-automatic multimodal segmentation of MR volumes.

    Science.gov (United States)

    Irimia, Andrei; Chambers, Micah C; Alger, Jeffry R; Filippou, Maria; Prastawa, Marcel W; Wang, Bo; Hovda, David A; Gerig, Guido; Toga, Arthur W; Kikinis, Ron; Vespa, Paul M; Van Horn, John D

    2011-11-01

    Although neuroimaging is essential for prompt and proper management of traumatic brain injury (TBI), there is a regrettable and acute lack of robust methods for the visualization and assessment of TBI pathophysiology, especially for the purpose of improving clinical outcome metrics. Until now, the application of automatic segmentation algorithms to TBI in a clinical setting has remained an elusive goal because existing methods have, for the most part, been insufficiently robust to faithfully capture TBI-related changes in brain anatomy. This article introduces and illustrates the combined use of multimodal TBI segmentation and time point comparison using 3D Slicer, a widely used software environment whose TBI data processing solutions are openly available. For three representative TBI cases, semi-automatic tissue classification and 3D model generation are performed to enable intra-patient time point comparison of TBI using multimodal volumetrics and clinical atrophy measures. Identification and quantitative assessment of extra- and intra-cortical bleeding, lesions, edema, and diffuse axonal injury are demonstrated. The proposed tools allow cross-correlation of multimodal metrics from structural imaging (e.g., structural volume, atrophy measurements) with clinical outcome variables and other potential factors predictive of recovery. In addition, the workflows described are suitable for TBI clinical practice and patient monitoring, particularly for assessing damage extent and for the measurement of neuroanatomical change over time. With knowledge of general location, extent, and degree of change, such metrics can be associated with clinical measures and subsequently used to suggest viable treatment options.

  20. AUTOMATIC CONTROL SYSTEM OF WINTER AUTOMOBILE-ROAD MAINTENANCE

    Directory of Open Access Journals (Sweden)

    I. I. Leonovich

    2008-01-01

    Full Text Available In order to ensure a rational usage of the financial and material resources directed at winter road maintenance in the Republic of Belarus, an automatic control system for winter maintenance is under development and introduction. The main purpose of the system is to obtain and use meteorological information on the state of the road network, which makes it possible to take the necessary organizational and technological decisions to ensure safety and continuity of traffic during winter. The system is also intended to ensure constant control over the state of the roadway surface and the expenditure of anti-icing materials at all levels of management.

  1. Automatic Calculation of Hydrostatic Pressure Gradient in Patients with Head Injury: A Pilot Study.

    Science.gov (United States)

    Moss, Laura; Shaw, Martin; Piper, Ian; Arvind, D K; Hawthorne, Christopher

    2016-01-01

    The non-surgical management of patients with traumatic brain injury centres on the treatment and prevention of secondary insults, such as low cerebral perfusion pressure (CPP). Most clinical pressure monitoring systems measure pressure relative to atmospheric pressure. If a patient is managed with their head tilted up relative to their arterial pressure transducer, then a hydrostatic pressure gradient (HPG) can act against arterial pressure and cause significant errors in the calculated CPP. To correct for the HPG, the arterial pressure transducer should be placed level with the intracranial pressure transducer. However, this is not always achieved. In this chapter, we describe a pilot study investigating the application of speckled computing (or "specks") for the automatic monitoring of the patient's head tilt and the subsequent automatic calculation of the HPG. In future applications this will allow us to automatically correct CPP to take any HPG into account.
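
    A minimal numerical sketch of the correction being described (not the study's speck-based system): the hydrostatic offset is the blood density times g times the vertical height difference implied by the head tilt, converted to mmHg and subtracted before computing CPP. The tilt angle, transducer distance and pressure values are hypothetical.

      import math

      MMHG_PER_PASCAL = 1.0 / 133.322
      BLOOD_DENSITY = 1060.0   # kg/m^3, typical value for blood
      G = 9.81                 # m/s^2

      def hydrostatic_gradient_mmhg(tilt_deg, transducer_to_head_m):
          """Pressure offset from the vertical height difference between the
          arterial transducer and the intracranial transducer."""
          vertical_offset = transducer_to_head_m * math.sin(math.radians(tilt_deg))
          return BLOOD_DENSITY * G * vertical_offset * MMHG_PER_PASCAL

      def corrected_cpp(map_mmhg, icp_mmhg, tilt_deg, transducer_to_head_m):
          hpg = hydrostatic_gradient_mmhg(tilt_deg, transducer_to_head_m)
          return (map_mmhg - hpg) - icp_mmhg

      # Example: 30 degrees of head-up tilt, 30 cm between transducer and head level.
      print(corrected_cpp(map_mmhg=90.0, icp_mmhg=15.0, tilt_deg=30.0,
                          transducer_to_head_m=0.30))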

  2. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    Science.gov (United States)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality that are related to the moisture content of the aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that unaccounted-for moisture in the aggregates adversely affects the homogeneity of the concrete mixture and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity in the technological process of mixing the components are proposed, with the task of producing the concrete mixture performed by an automatic control system for the kneading-and-mixing machinery with operational automatic control of homogeneity. The theoretical underpinnings of the homogeneity control are presented; they relate to a change in the frequency of the vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system for regulating the supply of water is determined depending on the change in concrete mixture homogeneity during the continuous mixing of the components. The following technical means for establishing automatic control have been chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To identify the quality indicator of automatic control, the system is described by a structural flowchart with transfer functions that determine the operation of the ACS in the transient dynamic mode.

  3. Automatic luminous reflections detector using global threshold with increased luminosity contrast in images

    Science.gov (United States)

    Silva, Ricardo Petri; Naozuka, Gustavo Taiji; Mastelini, Saulo Martiello; Felinto, Alan Salvany

    2018-01-01

    The incidence of luminous reflections (LR) in captured images can interfere with the color of the affected regions. These regions tend to oversaturate, becoming whitish and, consequently, losing the original color information of the scene. Decision processes that employ images acquired from digital cameras can be impaired by LR incidence. Such applications include real-time video surgeries and facial and ocular recognition. This work proposes an algorithm called contrast enhancement of potential LR regions, a preprocessing step that increases the contrast of potential LR regions in order to improve the performance of automatic LR detectors. In addition, three automatic detectors were compared with and without the employment of our preprocessing method. The first one is a technique already consolidated in the literature, the Chang-Tseng threshold. We propose two automatic detectors called adapted histogram peak and global threshold. We employed four performance metrics to evaluate the detectors, namely accuracy, precision, exactitude, and root mean square error. The exactitude metric was developed in this work; to support it, a manually defined reference model was created. The global threshold detector combined with our preprocessing method presented the best results, with an average exactitude rate of 82.47%.
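
    A minimal sketch of the "enhance contrast, then apply a global threshold" idea (not the published algorithm; the gamma value and threshold are assumptions, and OpenCV and NumPy are used for convenience):

      import cv2
      import numpy as np

      def detect_reflections(gray, gamma=2.0, threshold=230):
          """Return a binary mask of potential luminous-reflection pixels."""
          norm = gray.astype(np.float32) / 255.0
          # A gamma > 1 darkens mid tones more than highlights, increasing the
          # contrast of near-saturated (potentially reflective) regions.
          enhanced = (np.power(norm, gamma) * 255.0).astype(np.uint8)
          _, mask = cv2.threshold(enhanced, threshold, 255, cv2.THRESH_BINARY)
          return mask

      img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file name
      if img is not None:
          cv2.imwrite("reflections_mask.png", detect_reflections(img))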

  4. Type-assisted automatic garbage collection for lock-free data structures

    OpenAIRE

    Yang, Albert Mingkun; Wrigstad, Tobias

    2017-01-01

    We introduce Isolde, an automatic garbage collection scheme designed specifically for managing memory in lock-free data structures, such as stacks, lists, maps and queues. Isolde exists as a plug-in memory manager, designed to sit on top of another memory manager and use its allocator and reclaimer (if one exists). Isolde treats a lock-free data structure as a logical heap, isolated from the rest of the program. This allows garbage collection outside of Isolde to take place without affecting th...

  5. Automatic Welding System of Aluminum Pipe by Monitoring Backside Image of Molten Pool Using Vision Sensor

    Science.gov (United States)

    Baskoro, Ario Sunar; Kabutomori, Masashi; Suga, Yasuo

    An automatic welding system using Tungsten Inert Gas (TIG) welding with a vision sensor was constructed for the welding of aluminum pipe. This research studies the intelligent welding of aluminum alloy pipe 6063S-T5 held in a fixed position, with a moving welding torch and an AC welding machine. The monitoring system consists of a vision sensor using a charge-coupled device (CCD) camera to monitor the backside image of the molten pool. The captured image was processed by an image processing algorithm to recognize the edge of the molten pool. A neural network model for welding speed control was constructed to perform the process automatically. The experimental results confirm the effectiveness of the control system, with good detection of the molten pool and sound welds.

  6. An effective attentional set for a specific colour does not prevent capture by infrequently presented motion distractors.

    Science.gov (United States)

    Retell, James D; Becker, Stefanie I; Remington, Roger W

    2016-01-01

    An organism's survival depends on the ability to rapidly orient attention to unanticipated events in the world. Yet, the conditions needed to elicit such involuntary capture remain in doubt. Especially puzzling are spatial cueing experiments, which have consistently shown that involuntary shifts of attention to highly salient distractors are not determined by stimulus properties, but instead are contingent on attentional control settings induced by task demands. Do we always need to be set for an event to be captured by it, or is there a class of events that draw attention involuntarily even when unconnected to task goals? Recent results suggest that a task-irrelevant event will capture attention on first presentation, suggesting that salient stimuli that violate contextual expectations might automatically capture attention. Here, we investigated the role of contextual expectation by examining whether an irrelevant motion cue that was presented only rarely (∼3-6% of trials) would capture attention when observers had an active set for a specific target colour. The motion cue had no effect when presented frequently, but when rare produced a pattern of interference consistent with attentional capture. The critical dependence on the frequency with which the irrelevant motion singleton was presented is consistent with early theories of involuntary orienting to novel stimuli. We suggest that attention will be captured by salient stimuli that violate expectations, whereas top-down goals appear to modulate capture by stimuli that broadly conform to contextual expectations.

  7. Content-aware automatic cropping for consumer photos

    Science.gov (United States)

    Tang, Hao; Tretter, Daniel; Lin, Qian

    2013-03-01

    Consumer photos are typically authored once, but need to be retargeted for reuse in various situations. These include printing a photo on different size paper, changing the size and aspect ratio of an embedded photo to accommodate the dynamic content layout of web pages or documents, adapting a large photo for browsing on small displays such as mobile phone screens, and improving the aesthetic quality of a photo that was badly composed at the capture time. In this paper, we propose a novel, effective, and comprehensive content-aware automatic cropping (hereafter referred to as "autocrop") method for consumer photos to achieve the above purposes. Our autocrop method combines the state-of-the-art context-aware saliency detection algorithm, which aims to infer the likely intent of the photographer, and the "branch-and-bound" efficient subwindow search optimization technique, which seeks to locate the globally optimal cropping rectangle in a fast manner. Unlike most current autocrop methods, which can only crop a photo into an arbitrary rectangle, our autocrop method can automatically crop a photo into either a rectangle of arbitrary dimensions or a rectangle of the desired aspect ratio specified by the user. The aggressiveness of the cropping operation may be either automatically determined by the method or manually indicated by the user with ease. In addition, our autocrop method is extended to support the cropping of a photo into non-rectangular shapes such as polygons of any number of sides. It may also be potentially extended to return multiple cropping suggestions, which will enable the creation of new photos to enrich the original photo collections. Our experimental results show that the proposed autocrop method in this paper can generate high-quality crops for consumer photos of various types.
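
    A simplified sketch of saliency-driven cropping in the spirit described above: candidate rectangles of a requested aspect ratio are scored with an integral image of a saliency map and the best-scoring one is kept. The real method uses a context-aware saliency detector and a branch-and-bound subwindow search; here the saliency map is assumed given and the search is exhaustive.

      import numpy as np

      def best_crop(saliency, aspect_ratio, scale=0.8, stride=8):
          """Return (x, y, w, h) of the crop with the highest total saliency."""
          h_img, w_img = saliency.shape
          # Integral image: integral[i, j] = sum of saliency[:i, :j].
          integral = np.pad(saliency, ((1, 0), (1, 0))).cumsum(0).cumsum(1)

          w = int(min(w_img, h_img * aspect_ratio) * scale)
          h = int(w / aspect_ratio)

          best, best_rect = -1.0, None
          for y in range(0, h_img - h + 1, stride):
              for x in range(0, w_img - w + 1, stride):
                  s = (integral[y + h, x + w] - integral[y, x + w]
                       - integral[y + h, x] + integral[y, x])
                  if s > best:
                      best, best_rect = s, (x, y, w, h)
          return best_rect

      saliency = np.random.rand(480, 640)          # stand-in for a saliency map
      print(best_crop(saliency, aspect_ratio=4/3))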

  8. A study and implementation of algorithm for automatic ECT result comparison

    International Nuclear Information System (INIS)

    Jang, You Hyun; Nam, Min Woo; Kim, In Chul; Joo, Kyung Mun; Kim, Jong Seog

    2012-01-01

    An automatic ECT result comparison algorithm was developed and implemented in a computer language to remove human error from the manual comparison of large amounts of data. The structures of the two ECT programs (Eddy net and ECT IDS), each of which has its own unique file structure, were analyzed in order to open the files and load the data into PC memory. The comparison algorithm was defined graphically for easy conversion into a PC programming language. The Automatic Result Program was written in the C language, which is suitable for future code management, has an object-oriented program structure and allows fast development. The Automatic Result Program provides an MS Excel export function, useful for additional analysis with external software, and an intuitive, user-friendly result visualization function with color mapping that supports efficient analysis.
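
    A language-agnostic sketch of the comparison-and-export idea (the actual program is written in C and parses the proprietary Eddy net / ECT IDS file formats, which are not reproduced here; the record layout, tolerance and tube IDs are hypothetical, and CSV stands in for the MS Excel export):

      import csv

      def compare_results(result_a, result_b, tolerance=0.05):
          """Return one row per tube ID describing whether the two analyses agree."""
          rows = []
          for tube_id in sorted(set(result_a) | set(result_b)):
              a, b = result_a.get(tube_id), result_b.get(tube_id)
              if a is None or b is None:
                  status = "MISSING"
              elif abs(a - b) <= tolerance:
                  status = "MATCH"
              else:
                  status = "MISMATCH"
              rows.append({"tube": tube_id, "a": a, "b": b, "status": status})
          return rows

      rows = compare_results({"R01C01": 1.20, "R01C02": 0.85},
                             {"R01C01": 1.22, "R01C03": 0.40})
      with open("comparison.csv", "w", newline="") as f:
          writer = csv.DictWriter(f, fieldnames=["tube", "a", "b", "status"])
          writer.writeheader()
          writer.writerows(rows)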

  9. A study and implementation of algorithm for automatic ECT result comparison

    Energy Technology Data Exchange (ETDEWEB)

    Jang, You Hyun; Nam, Min Woo; Kim, In Chul; Joo, Kyung Mun; Kim, Jong Seog [Central Research Institute, Daejeon (Korea, Republic of)

    2012-10-15

    An automatic ECT result comparison algorithm was developed and implemented in a computer language to remove human error from the manual comparison of large amounts of data. The structures of the two ECT programs (Eddy net and ECT IDS), each of which has its own unique file structure, were analyzed in order to open the files and load the data into PC memory. The comparison algorithm was defined graphically for easy conversion into a PC programming language. The Automatic Result Program was written in the C language, which is suitable for future code management, has an object-oriented program structure and allows fast development. The Automatic Result Program provides an MS Excel export function, useful for additional analysis with external software, and an intuitive, user-friendly result visualization function with color mapping that supports efficient analysis.

  10. General considerations for neutron capture therapy at a reactor facility

    International Nuclear Information System (INIS)

    Binney, S.E.

    2001-01-01

    In addition to neutron beam intensity and quality, there are also a number of other significant criteria related to a nuclear reactor that contribute to a successful neutron capture therapy (NCT) facility. These criteria are classified into four main categories: Nuclear design factors, facility management and operations factors, facility resources, and non-technical factors. Important factors to consider are given for each of these categories. In addition to an adequate neutron beam intensity and quality, key requirements for a successful neutron capture therapy facility include necessary finances to construct or convert a facility for NCT, a capable medical staff to perform the NCT, and the administrative support for the facility. The absence of any one of these four factors seriously jeopardizes the overall probability of success of the facility. Thus nuclear reactor facility management considering becoming involved in neutron capture therapy, should it be proven clinically successful, should take all these factors into consideration. (author)

  11. Stormwater Tank Performance: Design and Management Criteria for Capture Tanks Using a Continuous Simulation and a Semi-Probabilistic Analytical Approach

    Directory of Open Access Journals (Sweden)

    Flavio De Martino

    2013-10-01

    Full Text Available Stormwater tank performance depends significantly on management practices. This paper proposes a procedure to assess tank efficiency in terms of volume and pollutant concentration using four different capture tank management protocols. The comparison of the efficiency results reveals that, as expected, a combined bypass and stormwater tank system achieves better results than a tank alone. The management practices tested for the tank-only systems provide notably different efficiency results. The practice of emptying the tank immediately after the end of the event exhibits significant levels of efficiency and operational advantages. All other configurations exhibit either significant operational problems or very low performance. The continuous simulation and the semi-probabilistic approach are compared for the best tank management practice. The semi-probabilistic approach is based on a Weibull probabilistic model of the main characteristics of the rainfall process. Following this approach, efficiency indexes were established. The comparison with continuous simulations shows the reliability of the probabilistic approach, even though the latter is certainly very site-sensitive.
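
    A hedged sketch of the semi-probabilistic idea: if event runoff volumes are modelled as Weibull-distributed and the tank is emptied immediately after each event, the volumetric efficiency of a tank of capacity S is approximately E[min(V, S)] / E[V]. The shape and scale parameters are placeholders, and a Monte Carlo estimate stands in for the paper's analytical efficiency indexes.

      import numpy as np

      rng = np.random.default_rng(0)

      def volumetric_efficiency(capacity_m3, shape, scale, n_events=100_000):
          # Event runoff volumes drawn from a Weibull distribution (hypothetical parameters).
          volumes = scale * rng.weibull(shape, size=n_events)
          captured = np.minimum(volumes, capacity_m3)   # tank empty before each event
          return captured.sum() / volumes.sum()

      for s in (10, 25, 50, 100):
          print(s, round(volumetric_efficiency(s, shape=0.8, scale=30.0), 3))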

  12. Integrating an Automatic Judge into an Open Source LMS

    Science.gov (United States)

    Georgouli, Katerina; Guerreiro, Pedro

    2011-01-01

    This paper presents the successful integration of the evaluation engine of Mooshak into the open source learning management system Claroline. Mooshak is an open source online automatic judge that has been used for international and national programming competitions. Although it was originally designed for programming competitions, Mooshak has also…

  13. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, a progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  14. Model of automatic fuel management for the Atucha II nuclear central with the PUMA IV code

    International Nuclear Information System (INIS)

    Marconi G, J.F.; Tarazaga, A.E.; Romero, L.D.

    2007-01-01

    Atucha II is a heavy-water, natural-uranium power plant. For this reason, and because of the low excess reactivity of this type of reactor, continuous on-power fuel management is necessary (in the case of Atucha II, approximately every 0.7 days). To keep these plants in operation and achieve good fuel economy, different fuel management schemes, defined by the zones and paths along which fuel elements are moved inside the core, are tested; the great majority of these schemes must be tested over long periods in order to verify the behaviour of the plant and the discharge burnup of the fuel elements. For this work it is very helpful to have a program that applies the criteria to be followed at each refuelling, using the paths and zones of the scheme under test, and thereby obtains as results the compliance with the regulations over time and the average discharge burnup of the fuel elements, the latter being fundamental data for the plant operator. Otherwise, a physicist with experience in fuel management would have to test each of the possible schemes, even those that can quickly be discarded because they do not comply with the regulatory standards or yield too low an average discharge burnup. It is therefore of fundamental help that the different schemes are tested with an automatic model, so that in the end the physicist analyses only the most important cases. The model in question not only allows different types of paths and zones of fuel management to be programmed, but also provides the possibility of disabling some of the criteria. (Author)

  15. Automatic crack detection and classification method for subway tunnel safety monitoring.

    Science.gov (United States)

    Zhang, Wenyu; Zhang, Zhenjiang; Qi, Dapeng; Liu, Yun

    2014-10-16

    Cracks are an important indicator reflecting the safety status of infrastructures. This paper presents an automatic crack detection and classification methodology for subway tunnel safety monitoring. With the application of high-speed complementary metal-oxide-semiconductor (CMOS) industrial cameras, the tunnel surface can be captured and stored in digital images. In the next step, the local dark regions with potential crack defects are segmented from the original gray-scale images by utilizing morphological image processing techniques and thresholding operations. In the feature extraction process, we present a distance histogram based shape descriptor that effectively describes the spatial shape difference between cracks and other irrelevant objects. Along with other features, the classification results successfully remove over 90% of misidentified objects. Also, compared with the original gray-scale images, over 90% of the crack length is preserved in the final output binary images. The proposed approach was tested on safety monitoring for Beijing Subway Line 1. The experimental results revealed the rules for parameter settings and also proved that the proposed approach is effective and efficient for automatic crack detection and classification.
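
    A sketch of a distance-histogram shape descriptor in the spirit described above (the bin count and normalization are illustrative choices, not the authors' exact definition): elongated crack-like regions spread their centroid-to-pixel distances over many bins, while compact blob-like objects do not.

      import numpy as np

      def distance_histogram(mask, bins=16):
          """mask: 2-D boolean array, True on the segmented object's pixels."""
          ys, xs = np.nonzero(mask)
          cy, cx = ys.mean(), xs.mean()
          d = np.hypot(ys - cy, xs - cx)
          # Histogram of centroid-to-pixel distances, normalized to the maximum distance.
          hist, _ = np.histogram(d / (d.max() + 1e-9), bins=bins, range=(0.0, 1.0))
          return hist / hist.sum()

      # A thin 1x60 "crack" versus an 8x8 "blob" of similar area.
      crack = np.zeros((64, 64), bool); crack[32, 2:62] = True
      blob = np.zeros((64, 64), bool); blob[28:36, 28:36] = True
      print(distance_histogram(crack))
      print(distance_histogram(blob))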

  16. Automatic Crack Detection and Classification Method for Subway Tunnel Safety Monitoring

    Directory of Open Access Journals (Sweden)

    Wenyu Zhang

    2014-10-01

    Full Text Available Cracks are an important indicator reflecting the safety status of infrastructures. This paper presents an automatic crack detection and classification methodology for subway tunnel safety monitoring. With the application of high-speed complementary metal-oxide-semiconductor (CMOS) industrial cameras, the tunnel surface can be captured and stored in digital images. In the next step, the local dark regions with potential crack defects are segmented from the original gray-scale images by utilizing morphological image processing techniques and thresholding operations. In the feature extraction process, we present a distance histogram based shape descriptor that effectively describes the spatial shape difference between cracks and other irrelevant objects. Along with other features, the classification results successfully remove over 90% of misidentified objects. Also, compared with the original gray-scale images, over 90% of the crack length is preserved in the final output binary images. The proposed approach was tested on safety monitoring for Beijing Subway Line 1. The experimental results revealed the rules for parameter settings and also proved that the proposed approach is effective and efficient for automatic crack detection and classification.

  17. An automatized frequency analysis for vine plot detection and delineation in remote sensing

    OpenAIRE

    Delenne , Carole; Rabatel , G.; Deshayes , M.

    2008-01-01

    The availability of an automatic tool for vine plot detection, delineation, and characterization would be very useful for management purposes. An automatic and recursive process using frequency analysis (with Fourier transform and Gabor filters) has been developed to meet this need. This results in the determination of vine plot boundary and accurate estimation of interrow width and row orientation. To foster large-scale applications, tests and validation have been carried out on standard ver...
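
    A hedged 1-D sketch of the frequency-analysis idea (the published method works in 2-D with Fourier transforms and Gabor filters): the dominant spatial frequency of an intensity profile taken perpendicular to the rows gives an estimate of the inter-row width. The pixel size and synthetic profile below are assumptions.

      import numpy as np

      def interrow_width(profile, pixel_size_m):
          """profile: 1-D intensity profile sampled perpendicular to the vine rows."""
          spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
          freqs = np.fft.rfftfreq(profile.size, d=pixel_size_m)
          dominant = freqs[1:][np.argmax(spectrum[1:])]   # skip the DC component
          return 1.0 / dominant                            # metres between rows

      # Synthetic profile: rows every 2.5 m sampled at 0.5 m/pixel.
      x = np.arange(200) * 0.5
      profile = 1.0 + np.cos(2 * np.pi * x / 2.5)
      print(interrow_width(profile, pixel_size_m=0.5))     # ~2.5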

  18. Modeling and Capturing Users’ Actions in CSCL Systems for Collaboration Analysis Purposes

    Directory of Open Access Journals (Sweden)

    Manuel Ortega

    2009-03-01

    Full Text Available A significant number of CSCL (Computer-Supported Collaborative Learning) environments support the learning of groups of students enabling their collaboration in solving problems. These collaborative environments usually need additional computational support to allow the automatic processing of both the actions carried out by the students and the end solution with the aim of studying the learning process and the validity of the solution proposed to the problem. This process, known as Collaboration and Interaction Analysis, is typically carried out in three phases: observation, abstraction and intervention. In this paper, we propose a methodological approach for the design of mechanisms for the observation phase. This approach provides a set of procedures enabling developers to design observation systems in CSCL environments that capture and model all the information required for comprehensive analyses of the collaboration process and the resulting solution to the problem. This methodological approach is put into practice by means of its use in the design of an observation system in the SPACE-DESIGN (SPecification and Automatic Construction of collaborative Environments of DESIGN) collaborative environment.

  19. Automatic location of short circuit faults

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [VTT Energy, Espoo (Finland); Hakola, T.; Antila, E. [ABB Power Oy, Helsinki (Finland); Seppaenen, M. [North-Carelian Power Company (Finland)

    1996-12-31

    In this presentation, the automatic location of short circuit faults on medium voltage distribution lines, based on the integration of computer systems of medium voltage distribution network automation is discussed. First the distribution data management systems and their interface with the substation telecontrol, or SCADA systems, is studied. Then the integration of substation telecontrol system and computerised relay protection is discussed. Finally, the implementation of the fault location system is presented and the practical experience with the system is discussed

  20. Automatic location of short circuit faults

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M [VTT Energy, Espoo (Finland); Hakola, T; Antila, E [ABB Power Oy (Finland); Seppaenen, M [North-Carelian Power Company (Finland)

    1998-08-01

    In this chapter, the automatic location of short circuit faults on medium voltage distribution lines, based on the integration of computer systems of medium voltage distribution network automation is discussed. First the distribution data management systems and their interface with the substation telecontrol, or SCADA systems, is studied. Then the integration of substation telecontrol system and computerized relay protection is discussed. Finally, the implementation of the fault location system is presented and the practical experience with the system is discussed

  1. Automatic location of short circuit faults

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M [VTT Energy, Espoo (Finland); Hakola, T; Antila, E [ABB Power Oy, Helsinki (Finland); Seppaenen, M [North-Carelian Power Company (Finland)

    1997-12-31

    In this presentation, the automatic location of short circuit faults on medium voltage distribution lines, based on the integration of computer systems of medium voltage distribution network automation is discussed. First the distribution data management systems and their interface with the substation telecontrol, or SCADA systems, is studied. Then the integration of substation telecontrol system and computerised relay protection is discussed. Finally, the implementation of the fault location system is presented and the practical experience with the system is discussed

  2. Automatic Parallelization Tool: Classification of Program Code for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Mustafa Basthikodi

    2016-04-01

    Full Text Available Performance growth of single-core processors came to a halt in the past decade, but was re-enabled by the introduction of parallelism in processors. Multicore frameworks, along with Graphical Processing Units, have broadly empowered parallelism. Several compilers have been updated to address the emerging challenges of synchronization and threading. Appropriate program and algorithm classifications will greatly benefit software engineers in finding opportunities for effective parallelization. In the present work we investigated current species for the classification of algorithms; related work on classification is discussed along with a comparison of the issues that challenge classification. A set of algorithms was chosen whose structures match different issues and perform the given tasks. We tested these algorithms using existing automatic species extraction tools along with the Bones compiler. We added functionalities to the existing tool, providing a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and incremental statements, user-defined types, constants and mathematical functions. With this, we can retain significant data that is not captured by the original species of algorithms. We implemented new theories into the tool, enabling automatic characterization of program code.

  3. CO2 Capture from Dilute Gases as a Component of Modern Global Carbon Management

    KAUST Repository

    Jones, Christopher W.

    2011-01-01

    The growing atmospheric CO2 concentration and its impact on climate have motivated widespread research and development aimed at slowing or stemming anthropogenic carbon emissions. Technologies for carbon capture and sequestration (CCS) employing mass separating agents that extract and purify CO2 from flue gas emanating from large point sources such as fossil fuel-fired electricity-generating power plants are under development. Recent advances in solvents, adsorbents, and membranes for postcombustion CO2 capture are described here. Specifically, room-temperature ionic liquids, supported amine materials, mixed matrix and facilitated transport membranes, and metal-organic framework materials are highlighted. In addition, the concept of extracting CO2 directly from ambient air (air capture) as a means of reducing the global atmospheric CO2 concentration is reviewed. For both conventional CCS from large point sources and air capture, critical research needs are identified and discussed. © Copyright 2011 by Annual Reviews. All rights reserved.

  4. CO2 Capture from Dilute Gases as a Component of Modern Global Carbon Management

    KAUST Repository

    Jones, Christopher W.

    2011-07-15

    The growing atmospheric CO2 concentration and its impact on climate have motivated widespread research and development aimed at slowing or stemming anthropogenic carbon emissions. Technologies for carbon capture and sequestration (CCS) employing mass separating agents that extract and purify CO2 from flue gas emanating from large point sources such as fossil fuel-fired electricity-generating power plants are under development. Recent advances in solvents, adsorbents, and membranes for postcombustion CO2 capture are described here. Specifically, room-temperature ionic liquids, supported amine materials, mixed matrix and facilitated transport membranes, and metal-organic framework materials are highlighted. In addition, the concept of extracting CO2 directly from ambient air (air capture) as a means of reducing the global atmospheric CO2 concentration is reviewed. For both conventional CCS from large point sources and air capture, critical research needs are identified and discussed. © Copyright 2011 by Annual Reviews. All rights reserved.

  5. Evaluation of XHVRB for Capturing Explosive Shock Desensitization

    Science.gov (United States)

    Tuttle, Leah; Schmitt, Robert; Kittell, Dave; Harstad, Eric

    2017-06-01

    Explosive shock desensitization phenomena have been recognized for some time. It has been demonstrated that pressure-based reactive flow models do not adequately capture the basic nature of the explosive behavior. Historically, replacing the local pressure with a shock-captured pressure has dramatically improved the numerical modeling approaches. Models based upon shock pressure or functions of entropy have recently been developed. A pseudo-entropy based formulation using the History Variable Reactive Burn model, as proposed by Starkenberg, was implemented into the Eulerian shock physics code CTH. Improvements in the shock capturing algorithm were made. The model is demonstrated to reproduce single shock behavior consistent with published pop plot data. It is also demonstrated to capture a desensitization effect based on available literature data, and to qualitatively capture dead zones from desensitization in 2D corner turning experiments. This model shows promise for use in modeling and simulation problems that are relevant to the desensitization phenomena. Issues are identified with the current implementation and future work is proposed for improving and expanding model capabilities. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DOE's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  6. Finding weak points automatically

    International Nuclear Information System (INIS)

    Archinger, P.; Wassenberg, M.

    1999-01-01

    Operators of nuclear power stations have to carry out material tests on selected components at regular intervals. A fully automated test, which achieves clearly higher reproducibility compared with partially automated variants, would therefore provide a solution. In addition, the fully automated test reduces the radiation dose to the test personnel. (orig.) [de

  7. A Novel Cascade Classifier for Automatic Microcalcification Detection.

    Directory of Open Access Journals (Sweden)

    Seung Yeon Shin

    Full Text Available In this paper, we present a novel cascaded classification framework for automatic detection of individual and clusters of microcalcifications (μCs). Our framework comprises three classification stages: (i) a random forest (RF) classifier for simple features capturing the second-order local structure of individual μCs, where non-μC pixels in the target mammogram are efficiently eliminated; (ii) a more complex discriminative restricted Boltzmann machine (DRBM) classifier for μC candidates determined in the RF stage, which automatically learns the detailed morphology of μC appearances for improved discriminative power; and (iii) a detector to detect clusters of μCs from the individual μC detection results, using two different criteria. From the two-stage RF-DRBM classifier, we are able to distinguish μCs using explicitly computed features, as well as learn implicit features that are able to further discriminate between confusing cases. Experimental evaluation is conducted on the original Mammographic Image Analysis Society (MIAS) and mini-MIAS databases, as well as our own Seoul National University Bundang Hospital digital mammographic database. It is shown that the proposed method outperforms comparable methods in terms of receiver operating characteristic (ROC) and precision-recall curves for detection of individual μCs and the free-response receiver operating characteristic (FROC) curve for detection of clustered μCs.
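
    A structural sketch of a two-stage cascade in the spirit described above (stand-ins only, not the authors' models): a random forest on cheap features prunes easy negatives, and a second, more expensive classifier re-scores the surviving candidates. scikit-learn has no discriminative RBM, so an RBM-plus-logistic-regression pipeline stands in for the DRBM stage, and the features and labels below are random placeholders.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.neural_network import BernoulliRBM
      from sklearn.pipeline import Pipeline

      rng = np.random.default_rng(0)
      X_simple = rng.random((1000, 8))      # cheap per-pixel features (stage 1)
      X_patch = rng.random((1000, 81))      # raw 9x9 patches in [0, 1] (stage 2)
      y = rng.integers(0, 2, 1000)

      stage1 = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_simple, y)
      stage2 = Pipeline([("rbm", BernoulliRBM(n_components=32, random_state=0)),
                         ("clf", LogisticRegression(max_iter=1000))]).fit(X_patch, y)

      def cascade_predict(x_simple, x_patch, t1=0.2, t2=0.5):
          keep = stage1.predict_proba(x_simple)[:, 1] >= t1   # cheap rejection of easy negatives
          scores = np.zeros(len(x_simple))
          if keep.any():
              scores[keep] = stage2.predict_proba(x_patch[keep])[:, 1]
          return scores >= t2

      print(cascade_predict(X_simple[:10], X_patch[:10]))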

  8. Development of Portable Automatic Number Plate Recognition System on Android Mobile Phone

    Science.gov (United States)

    Mutholib, Abdul; Gunawan, Teddy S.; Chebil, Jalel; Kartiwi, Mira

    2013-12-01

    The Automatic Number Plate Recognition (ANPR) system plays a main role in various access control and security applications, such as tracking of stolen vehicles, traffic violations (speed traps) and parking management systems. In this paper, a portable ANPR implemented on an Android mobile phone is presented. The main challenges in a mobile application include higher coding efficiency, reduced computational complexity, and improved flexibility. Significant effort is being devoted to finding a suitable and adaptive algorithm for the implementation of ANPR on a mobile phone. An ANPR system for a mobile phone needs to be optimized due to the phone's limited CPU and memory resources, while exploiting its ability to geo-tag captured images using GPS coordinates and its ability to access an online database to store the vehicle information. In this paper, the design of the portable ANPR on an Android mobile phone is described as follows. First, a graphical user interface (GUI) for capturing images using the built-in camera was developed to acquire Malaysian vehicle plate numbers. Second, preprocessing of the raw image was done using contrast enhancement. Next, character segmentation using a fixed pitch and optical character recognition (OCR) using a neural network were utilized to extract texts and numbers. Both character segmentation and OCR used the Tesseract library from Google Inc. The proposed portable ANPR algorithm was implemented and simulated using the Android SDK on a computer. Based on the experimental results, the proposed system can effectively recognize license plate numbers at 90.86%. The required processing time to recognize a license plate is only 2 seconds on average. This result is considered good in comparison with results obtained from previous systems running on a desktop PC, which ranged from 91.59% to 98% recognition rate and from 0.284 to 1.5 seconds recognition time.
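
    A hedged desktop sketch of the described pipeline (the paper targets Android with the Tesseract library; the file name, page-segmentation mode and character whitelist below are assumptions):

      import cv2
      import pytesseract

      def read_plate(path):
          img = cv2.imread(path)
          if img is None:
              return ""
          gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
          # Contrast enhancement, as in the preprocessing step described above.
          gray = cv2.equalizeHist(gray)
          # Binarize before character recognition.
          _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
          config = "--psm 7 -c tessedit_char_whitelist=ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
          return pytesseract.image_to_string(binary, config=config).strip()

      print(read_plate("plate.jpg"))   # hypothetical file name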

  9. Automatic registration method for multisensor datasets adopted for dimensional measurements on cutting tools

    International Nuclear Information System (INIS)

    Shaw, L; Mehari, F; Weckenmann, A; Ettl, S; Häusler, G

    2013-01-01

    Multisensor systems with optical 3D sensors are frequently employed to capture complete surface information by measuring workpieces from different views. During coarse and fine registration the resulting datasets are afterward transformed into one common coordinate system. Automatic fine registration methods are well established in dimensional metrology, whereas there is a deficit in automatic coarse registration methods. The advantage of a fully automatic registration procedure is twofold: it enables a fast and contact-free alignment and further a flexible application to datasets of any kind of optical 3D sensor. In this paper, an algorithm adapted for a robust automatic coarse registration is presented. The method was originally developed for the field of object reconstruction or localization. It is based on a segmentation of planes in the datasets to calculate the transformation parameters. The rotation is defined by the normals of three corresponding segmented planes of two overlapping datasets, while the translation is calculated via the intersection point of the segmented planes. First results have shown that the translation is strongly shape dependent: 3D data of objects with non-orthogonal planar flanks cannot be registered with the current method. In the novel supplement for the algorithm, the translation is additionally calculated via the distance between centroids of corresponding segmented planes, which results in more than one option for the transformation. A newly introduced measure considering the distance between the datasets after coarse registration evaluates the best possible transformation. Results of the robust automatic registration method are presented on the example of datasets taken from a cutting tool with a fringe-projection system and a focus-variation system. The successful application in dimensional metrology is proven with evaluations of shape parameters based on the registered datasets of a calibrated workpiece. (paper)
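
    A minimal sketch of the coarse-registration computation described above (not the authors' implementation): the rotation is estimated from three pairs of corresponding segmented-plane normals with an SVD-based (Kabsch) fit, and the translation from the centroids of the corresponding planes, as in the supplemented method. The normals and centroids are made-up example values.

      import numpy as np

      def rotation_from_normals(normals_a, normals_b):
          """Least-squares rotation mapping the rows of normals_a onto normals_b."""
          H = normals_a.T @ normals_b
          U, _, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T))
          D = np.diag([1.0, 1.0, d])           # guard against reflections
          return Vt.T @ D @ U.T

      def register(normals_a, centroids_a, normals_b, centroids_b):
          R = rotation_from_normals(normals_a, normals_b)
          t = centroids_b.mean(axis=0) - R @ centroids_a.mean(axis=0)
          return R, t

      normals_a = np.eye(3)                                              # three plane normals, view A
      normals_b = np.array([[0, 1, 0], [-1, 0, 0], [0, 0, 1]], float)    # same planes, view B
      centroids_a = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0]], float)
      centroids_b = np.array([[1, 1, 0], [1, 2, 0], [0, 1, 0]], float)
      R, t = register(normals_a, centroids_a, normals_b, centroids_b)
      print(np.round(R, 3), np.round(t, 3))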

  10. Automatic delineation of geomorphological slope units with r.slopeunits v1.0 and their optimization for landslide susceptibility modeling

    Directory of Open Access Journals (Sweden)

    M. Alvioli

    2016-11-01

    Full Text Available Automatic subdivision of landscapes into terrain units remains a challenge. Slope units are terrain units bounded by drainage and divide lines, but their use in hydrological and geomorphological studies is limited because of the lack of reliable software for their automatic delineation. We present the r.slopeunits software for the automatic delineation of slope units, given a digital elevation model and a few input parameters. We further propose an approach for the selection of optimal parameters controlling the terrain subdivision for landslide susceptibility modeling. We tested the software and the optimization approach in central Italy, where terrain, landslide, and geo-environmental information was available. The software was capable of capturing the variability of the landscape and partitioning the study area into slope units suited for landslide susceptibility modeling and zonation. We expect r.slopeunits to be used in different physiographical settings for the production of reliable and reproducible landslide susceptibility zonations.

  11. Dynamics and control of robot for capturing objects in space

    Science.gov (United States)

    Huang, Panfeng

    After capturing the object, the space robot must complete the following two tasks: one is to berth the object, and the other is to re-orient the attitude of the whole robot system for communication and power supply. Therefore, I propose a method to accomplish these two tasks simultaneously using manipulator motion only. The ultimate goal of space services is to realize capture and manipulation autonomously. Therefore, I propose an effective approach based on learning human skill to track and capture objects automatically in space. Through human-teaching demonstration, the space robot is able to learn and abstract human tracking and capturing skill using an efficient neural-network learning architecture that combines flexible Cascade Neural Networks with Node Decoupled Extended Kalman Filtering (CNN-NDEKF). The simulation results attest that this approach is useful and feasible for tracking-trajectory planning and capture by the space robot. Finally, I propose a novel approach based on Genetic Algorithms (GAs) to optimize the approach trajectory of space robots in order to realize effective and stable operations. I complete minimum-torque path planning in order to save the limited energy available in space, and design a minimum-jerk trajectory for the stabilization of the space manipulator and its space base. These optimization algorithms are very important and useful for the application of space robots.

  12. ADJUST: An automatic EEG artifact detector based on the joint use of spatial and temporal features.

    Science.gov (United States)

    Mognon, Andrea; Jovicich, Jorge; Bruzzone, Lorenzo; Buiatti, Marco

    2011-02-01

    A successful method for removing artifacts from electroencephalogram (EEG) recordings is Independent Component Analysis (ICA), but its implementation remains largely user-dependent. Here, we propose a completely automatic algorithm (ADJUST) that identifies artifacted independent components by combining stereotyped artifact-specific spatial and temporal features. Features were optimized to capture blinks, eye movements, and generic discontinuities on a feature selection dataset. Validation on a totally different EEG dataset shows that (1) ADJUST's classification of independent components largely matches a manual one by experts (agreement on 95.2% of the data variance), and (2) Removal of the artifacted components detected by ADJUST leads to neat reconstruction of visual and auditory event-related potentials from heavily artifacted data. These results demonstrate that ADJUST provides a fast, efficient, and automatic way to use ICA for artifact removal. Copyright © 2010 Society for Psychophysiological Research.
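
    A toy sketch of the underlying idea of combining a spatial feature with a temporal feature and thresholding both (the real ADJUST features and thresholds differ per artifact class and are estimated from data; everything below is a placeholder):

      import numpy as np

      def blink_like(ic_topography, ic_timecourse, frontal_idx, spatial_thr=2.0, kurt_thr=3.0):
          """Flag an independent component whose weights concentrate frontally and
          whose time course shows sparse, spiky activity (high excess kurtosis)."""
          topo = np.asarray(ic_topography, float)
          frontal_ratio = np.abs(topo[frontal_idx]).mean() / (np.abs(topo).mean() + 1e-12)
          x = np.asarray(ic_timecourse, float)
          x = (x - x.mean()) / (x.std() + 1e-12)
          excess_kurtosis = (x ** 4).mean() - 3.0
          return frontal_ratio > spatial_thr and excess_kurtosis > kurt_thr

      rng = np.random.default_rng(1)
      topo = rng.normal(0, 0.1, 64); topo[:4] = 1.0        # strong weights on frontal channels
      ts = rng.normal(0, 1, 5000); ts[::500] += 8.0        # spiky "blink" deflections
      print(blink_like(topo, ts, frontal_idx=[0, 1, 2, 3]))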

  13. Microprocessor controlled system for automatic and semi-automatic syntheses of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Ruth, T.J.; Adam, M.J.; Morris, D.; Jivan, S.

    1986-01-01

    A computer-based system has been constructed to control the automatic synthesis of 2-deoxy-2-(18F)fluoro-D-glucose and is also being used in the development of an automatic synthesis of L-6-(18F)fluorodopa. (author)

  14. Automatic ultrasound technique to measure angle of progression during labor.

    Science.gov (United States)

    Conversano, F; Peccarisi, M; Pisani, P; Di Paola, M; De Marco, T; Franchini, R; Greco, A; D'Ambrogio, G; Casciaro, S

    2017-12-01

    To evaluate the accuracy and reliability of an automatic ultrasound technique for assessment of the angle of progression (AoP) during labor. Thirty-nine pregnant women in the second stage of labor, with the fetus in cephalic presentation, underwent conventional labor management with additional translabial sonographic examination. AoP was measured in a total of 95 acquisition sessions, both automatically by an innovative algorithm and manually by an experienced sonographer, who was blinded to the algorithm outcome. The results obtained from the manual measurement were used as the reference against which the performance of the algorithm was assessed. In order to overcome the common difficulties encountered when visualizing the pubic symphysis by sonography, the AoP was measured by taking the symphysis centroid, rather than its distal point, as the symphysis landmark, thereby assuring high measurement reliability and reproducibility while maintaining objectivity and accuracy in the evaluation of the progression of labor. There was a strong and statistically significant correlation between AoP values measured by the algorithm and the reference values (r = 0.99, P < 0.001). The high accuracy provided by the automatic method was also highlighted by the corresponding high value of the coefficient of determination (r² = 0.98) and the low residual errors (root mean square error = 2°27' (2.1%)). The global agreement between the two methods, assessed through Bland-Altman analysis, resulted in a negligible mean difference of 1°1' (limits of agreement, 4°29'). The proposed automatic algorithm is a reliable technique for measurement of the AoP. Its (relative) operator-independence has the potential to reduce human errors and speed up ultrasound acquisition time, which should facilitate management of women during labor. Copyright © 2017 ISUOG. Published by John Wiley & Sons Ltd.
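
    A minimal geometric sketch of the measurement itself (not the segmentation step): the AoP is taken here as the angle at the symphysis landmark between the symphysis long axis and the line towards the leading edge of the fetal skull; in the automatic method the landmark is the symphysis centroid rather than its distal point. The pixel coordinates are made up.

      import numpy as np

      def angle_of_progression(symphysis_axis_pt, symphysis_landmark, head_leading_edge):
          v_axis = np.asarray(symphysis_axis_pt, float) - np.asarray(symphysis_landmark, float)
          v_head = np.asarray(head_leading_edge, float) - np.asarray(symphysis_landmark, float)
          cosang = np.dot(v_axis, v_head) / (np.linalg.norm(v_axis) * np.linalg.norm(v_head))
          return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

      # Hypothetical image coordinates (pixels, y pointing down).
      print(angle_of_progression((120, 80), (140, 110), (220, 150)))   # angle in degrees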

  15. Fully automatic CNC machining production system

    Directory of Open Access Journals (Sweden)

    Lee Jeng-Dao

    2017-01-01

    Full Text Available Customized manufacturing is increasing year by year. Changing consumption habits have shortened product life cycles. Therefore, many countries view Industry 4.0 as a path to more efficient and more flexible automated production. Developing an automatic loading and unloading CNC machining system with vision inspection is the first step in this industrial upgrading. In this study, the CNC controller is adopted as the main controller commanding the robot, conveyor and other equipment. Moreover, machine vision systems are used to detect the position of material on the conveyor and the edge of the machining material. In addition, Open CNC and SCADA software are utilized to provide real-time monitoring, remote control, alarm e-mail notification and parameter collection. Furthermore, RFID has been added for employee classification and management. Machine handshaking has been successfully implemented to achieve automatic vision detection, edge-tracing measurement, machining and collection of system parameters for data analysis, accomplishing industrial automation system integration with real-time monitoring.

  16. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual...

  17. Outcomes important to burns patients during scar management and how they compare to the concepts captured in burn-specific patient reported outcome measures.

    Science.gov (United States)

    Jones, Laura L; Calvert, Melanie; Moiemen, Naiem; Deeks, Jonathan J; Bishop, Jonathan; Kinghorn, Philip; Mathers, Jonathan

    2017-12-01

    Pressure garment therapy (PGT) is an established treatment for the prevention and treatment of hypertrophic scarring; however, there is limited evidence for its effectiveness. Burn survivors often experience multiple issues many of which are not adequately captured in current PGT trial measures. To assess the effectiveness of PGT it is important to understand what outcomes matter to patients and to consider whether patient-reported outcome measures (PROMs) can be used to ascertain the effect of treatments on patients' health-related quality of life. This study aimed to (a) understand the priorities and perspectives of adult burns patients and the parents of burns patients who have experienced PGT via in-depth qualitative data, and (b) compare these with the concepts captured within burn-specific PROMs. We undertook 40 semi-structured interviews with adults and parents of paediatric and adolescent burns patients who had experienced PGT to explore their priorities and perspectives on scar management. Interviews were audio-recorded, transcribed and thematically analysed. The outcomes interpreted within the interview data were then mapped against the concepts captured within burn-specific PROMs currently in the literature. Eight core outcome domains were identified as important to adult patients and parents: (1) scar characteristics and appearance, (2) movement and function, (3) scar sensation, (4) psychological distress, adjustments and a sense of normality, (5) body image and confidence, (6) engagement in activities, (7) impact on relationships, and (8) treatment burden. The outcome domains presented reflect a complex holistic patient experience of scar management and treatments such as PGT. Some currently available PROMs do capture the concepts described here, although none assess psychological adjustments and attainment of a sense of normality following burn injury. The routine use of PROMs that represent patient experience and their relative contribution to trial

  18. Automatic 3d Building Model Generations with Airborne LiDAR Data

    Science.gov (United States)

    Yastikli, N.; Cetin, Z.

    2017-11-01

    LiDAR systems are becoming more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication and mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, a simple and quick approach for automatic 3D building model generation is needed for the many studies that include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is targeted. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification applies hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve the classification results using different test areas identified in the study area. The proposed approach has been tested in the study area in Zekeriyakoy, Istanbul, which has partly open areas, forest areas and many types of buildings, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results of this research on the study area verified that automatic 3D

  19. AUTOMATIC 3D BUILDING MODEL GENERATIONS WITH AIRBORNE LiDAR DATA

    Directory of Open Access Journals (Sweden)

    N. Yastikli

    2017-11-01

    Full Text Available LiDAR systems are becoming more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication and mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, a simple and quick approach for automatic 3D building model generation is needed for the many studies that include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is targeted. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification applies hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve the classification results using different test areas identified in the study area. The proposed approach has been tested in the study area in Zekeriyakoy, Istanbul, which has partly open areas, forest areas and many types of buildings, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results of this research on the study area verified

  20. [Work accidents and automatic circuit reclosers in the electricity sector: beyond the immediate causes].

    Science.gov (United States)

    Silva, Alessandro Jose Nunes da; Almeida, Ildeberto Muniz de; Vilela, Rodolfo Andrade de Gouveia; Mendes, Renata Wey Berti; Hurtado, Sandra Lorena Beltran

    2018-05-10

    The Brazilian electricity sector has recorded high work-related mortality rates that have been associated with outsourcing, used to cut costs. In order to decrease the power outage time for consumers, the industry adopted the automatic circuit recloser as the technical solution. The device has hazardous implications for maintenance workers. The aim of this study was to analyze the origins and consequences of work accidents in power systems with automatic circuit reclosers, using the Accident Analysis and Prevention (AAP) model. The AAP model was used to investigate two work accidents, aiming to explore the events' organizational origins. Case 1 - when changing a de-energized secondary line, a worker received a shock from the energized primary cable (13.8 kV). The system reclosed three times, causing severe injury to the worker (amputation of a lower limb). Case 2 - a fatal work accident occurred during installation of a new crosshead on a partially insulated energized line. The tip of a metal cross arm section strap touched the energized secondary line and electrocuted the maintenance operator. The circuit breaker component of the automatic circuit recloser failed. The analyses revealed how business management logic can participate in the root causes of work accidents through failures in maintenance management, outsourced workforce management, and especially safety management in systems with reclosers. Decisions to adopt automation to guarantee power distribution should not overlook the risks to workers on overhead power lines or fail to acknowledge the importance of ensuring safe conditions.

  1. Automatic Management Systems for the Operation of the Cryogenic Test Facilities for LHC Series Superconducting Magnets

    CERN Document Server

    Tovar-Gonzalez, A; Herblin, L; Lamboy, J P; Vullierme, B

    2006-01-01

    Prior to their final preparation before installation in the tunnel, the ~1800 series superconducting magnets of the LHC machine shall be entirely tested at reception on modular test facilities. The operation 24 hours per day of the cryogenic test facilities is conducted in turn by 3-operator teams, assisted in real time by the use of the Test Bench Priorities Handling System, a process control application enforcing the optimum use of cryogenic utilities and of the "Tasks Tracking System", a web-based e-traveller application handling 12 parallel 38-task test sequences. This paper describes how such computer-based management systems can be used to optimize operation of concurrent test benches within technical boundary conditions given by the cryogenic capacity, and how they can be used to study the efficiency of the automatic steering of all individual cryogenic sub-systems. Finally, this paper presents the overall performance of the cryomagnet test station for the first complete year of operation at high produ...

  2. Decision support for mastitis on farms with an automatic milking system

    NARCIS (Netherlands)

    Steeneveld, W.

    2010-01-01

    For optimal mastitis management on farms with an automatic milking system (AMS), two individual cow decisions are important. First, there is a need for decision support on which mastitis alerts have the highest priority for visual checking for clinical mastitis (CM). In essence, all cows with

  3. Motor automaticity in Parkinson’s disease

    Science.gov (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

    Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor automaticity associated motor deficits in PD, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigations on the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of using measures of automaticity in early diagnosis of PD would be valuable. PMID:26102020

  4. AUTO-LAY: automatic layout generation for procedure flow diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Forzano, P; Castagna, P [Ansaldo SpA, Genoa (Italy)

    1996-12-31

    Nuclear Power Plant Procedures can be seen from essentially two viewpoints: the process and the information management. From the first point of view, it is important to supply the knowledge apt to solve problems connected with the control of the process; from the second one, the focus of attention is on the knowledge representation, its structure, elicitation and maintenance, and formal quality assurance. These two aspects of procedure representation can be considered and solved separately. In particular, methodological, formal and management issues require long and tedious activities that in most cases constitute a great barrier to procedure development and upgrade. To solve these problems, Ansaldo is developing DIAM, a wide integrated tool for procedure management to support procedure writing, updating, usage and documentation. One of the most challenging features of DIAM is AUTO-LAY, a CASE sub-tool that, in a completely automatic way, structures parts of or complete flow diagrams. This is a feature that is partially present in some other CASE products, which, however, do not allow complex graph handling or isomorphism between screen and paper representation. AUTO-LAY has the unique prerogative of drawing graphs of any complexity, sectioning them into pages, and automatically composing a document. This has been recognized in the literature as the most important second-generation CASE improvement. (author). 5 refs., 9 figs.

  5. AUTO-LAY: automatic layout generation for procedure flow diagrams

    International Nuclear Information System (INIS)

    Forzano, P.; Castagna, P.

    1995-01-01

    Nuclear Power Plant Procedures can be seen from essentially two viewpoints: the process and the information management. From the first point of view, it is important to supply the knowledge apt to solve problems connected with the control of the process; from the second one, the focus of attention is on the knowledge representation, its structure, elicitation and maintenance, and formal quality assurance. These two aspects of procedure representation can be considered and solved separately. In particular, methodological, formal and management issues require long and tedious activities that in most cases constitute a great barrier to procedure development and upgrade. To solve these problems, Ansaldo is developing DIAM, a wide integrated tool for procedure management to support procedure writing, updating, usage and documentation. One of the most challenging features of DIAM is AUTO-LAY, a CASE sub-tool that, in a completely automatic way, structures parts of or complete flow diagrams. This is a feature that is partially present in some other CASE products, which, however, do not allow complex graph handling or isomorphism between screen and paper representation. AUTO-LAY has the unique prerogative of drawing graphs of any complexity, sectioning them into pages, and automatically composing a document. This has been recognized in the literature as the most important second-generation CASE improvement. (author). 5 refs., 9 figs

  6. Optional carbon capture

    Energy Technology Data Exchange (ETDEWEB)

    Alderson, T.; Scott, S.; Griffiths, J. [Jacobs Engineering, London (United Kingdom)

    2007-07-01

    In the case of IGCC power plants, carbon capture can be carried out before combustion. The carbon monoxide in the syngas is catalytically shifted to carbon dioxide and then captured in a standard gas absorption system. However, the insertion of a shift converter into an existing IGCC plant with no shift would mean a near total rebuild of the gasification waste heat recovery, gas treatment system and HRSG, with only the gasifier and gas turbine retaining most of their original features. To reduce the extent, cost and time taken for the revamping, the original plant could incorporate the shift from the outset; the plant would then be operated without capture and converted to capture mode of operation when commercially appropriate. This paper examines this concept of placing a shift converter into an IGCC plant before capture is required, and operating the same plant first without and then later with CO{sub 2} capture in a European context. The advantages and disadvantages of this 'capture ready' option are discussed. 6 refs., 2 figs., 4 tabs.

  7. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

    Kim, Jae Hee; Eom, Heung Seop; Lee, Jae Cheol; Choi, Yoo Raek; Moon, Soon Seung

    2002-02-01

    Using recent technologies in mobile robotics and computer science, we developed an automatic inspection system for weld lines of the reactor vessel. The ultrasonic inspection of the reactor pressure vessel is currently performed by commercialized robot manipulators. However, since the conventional fixed-type robot manipulator is very large, heavy and expensive, it requires a long inspection time and is hard to handle and maintain. In order to resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. According to our conceptual design, we developed the reactor inspection system including an underwater inspection robot, a laser position control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on the reactor vessel mockup, and on a real reactor being readied for Ulchine nuclear power plant unit 6 at Dusan Heavy Industry in Korea. After this project, we plan to commercialize our inspection system. Using this system, we can expect a large reduction of the inspection time, performance enhancement, automatic management of inspection history, etc. From the economic point of view, we can also expect import substitution of more than 4 million dollars. The established essential technologies for intelligent control and automation are expected to be synthetically applied to the automation of similar systems in nuclear power plants

  8. Gravitational capture

    International Nuclear Information System (INIS)

    Bondi, H.

    1979-01-01

    In spite of the strength of gravitational forces between celestial bodies, gravitational capture is not a simple concept. The principles of conservation of linear momentum and of conservation of angular momentum always impose severe constraints, while conservation of energy and the vital distinction between dissipative and non-dissipative systems allow one to rule out capture in a wide variety of cases. In complex systems, especially those without dissipation, long dwell time is a more significant concept than permanent capture. (author)

  9. Automatic Control Systems (ACS for Generation and Sale of Electric Power Under Conditions of Industry-Sector Liberalization

    Directory of Open Access Journals (Sweden)

    Yu. S. Petrusha

    2013-01-01

    Full Text Available Possible risks pertaining to the transition of the electric-power industry to market relations have been considered in the paper. The paper presents an integrated ACS for generation and sale of electric power as an improvement of the methodology for organizational and technical management. The given system is based on integration of the operating Automatic Dispatch Control System (ADCS) and the developing Automatic Electricity Meter Reading System (AEMRS). The paper proposes to form an inter-branch sector of ACS PLC (Automatic Control System for Prolongation of Life Cycle) users, which is oriented towards supporting the development strategy.

  10. [Discussion on developing a data management plan and its key factors in clinical study based on electronic data capture system].

    Science.gov (United States)

    Li, Qing-na; Huang, Xiu-ling; Gao, Rui; Lu, Fang

    2012-08-01

    Data management has significant impact on the quality control of clinical studies. Every clinical study should have a data management plan to provide overall work instructions and ensure that all of these tasks are completed according to the Good Clinical Data Management Practice (GCDMP). Meanwhile, the data management plan (DMP) is an auditable document requested by regulatory inspectors and must be written in a manner that is realistic and of high quality. The significance of DMP, the minimum standards and the best practices provided by GCDMP, the main contents of DMP based on electronic data capture (EDC) and some key factors of DMP influencing the quality of clinical study were elaborated in this paper. Specifically, DMP generally consists of 15 parts, namely, the approval page, the protocol summary, role and training, timelines, database design, creation, maintenance and security, data entry, data validation, quality control and quality assurance, the management of external data, serious adverse event data reconciliation, coding, database lock, data management reports, the communication plan and the abbreviated terms. Among them, the following three parts are regarded as the key factors: designing a standardized database of the clinical study, entering data in time and cleansing data efficiently. In the last part of this article, the authors also analyzed the problems in clinical research of traditional Chinese medicine using the EDC system and put forward some suggestions for improvement.

  11. Capture ready study

    Energy Technology Data Exchange (ETDEWEB)

    Minchener, A.

    2007-07-15

    There are a large number of ways in which the capture of carbon as carbon dioxide (CO{sub 2}) can be integrated into fossil fuel power stations, most being applicable for both gas and coal feedstocks. To add to the choice of technology is the question of whether an existing plant should be retrofitted for capture, or whether it is more attractive to build totally new. This miscellany of choices adds considerably to the commercial risk of investing in a large power station. An intermediate stage between the non-capture and full capture state would be advantageous in helping to determine the best way forward and hence reduce those risks. In recent years the term 'carbon capture ready' or 'capture ready' has been coined to describe such an intermediate stage plant and is now widely used. However, a detailed and all-encompassing definition of this term has never been published. All fossil fuel consuming plants produce a carbon dioxide gas byproduct. There is a possibility of scrubbing it with an appropriate CO{sub 2} solvent. Hence it could be said that every fossil fuel plant is in a condition for removal of its CO{sub 2} effluent and is therefore already in a 'capture ready' state. Evidently, the practical reality of solvent scrubbing could cost more than the rewards offered by schemes such as the ETS (European Trading Scheme). In that case, it can be said that although the possibility exists of capturing CO{sub 2}, it is not a commercially viable option and therefore the plant could not be described as ready for CO{sub 2} capture. The boundary between a capture ready and a non-capture ready condition using this definition cannot be determined in an objective and therefore universally acceptable way, and criteria must be found which are less onerous and less potentially contentious to assess. 16 refs., 2 annexes.

  12. THE ACCURACY OF AUTOMATIC PHOTOGRAMMETRIC TECHNIQUES ON ULTRA-LIGHT UAV IMAGERY

    Directory of Open Access Journals (Sweden)

    O. Küng

    2012-09-01

    Full Text Available This paper presents an affordable, fully automated and accurate mapping solution based on ultra-light UAV imagery. Several datasets are analysed and their accuracy is estimated. We show that the accuracy highly depends on the ground resolution (flying height) of the input imagery. When chosen appropriately, this mapping solution can compete with traditional mapping solutions that capture fewer high-resolution images from airplanes and that rely on highly accurate orientation and positioning sensors on board. Due to the careful integration with recent computer vision techniques, the post processing is robust and fully automatic and can deal with inaccurate position and orientation information, which is typically problematic with traditional techniques.

  13. Automatic Photoelectric Telescope Service

    International Nuclear Information System (INIS)

    Genet, R.M.; Boyd, L.J.; Kissell, K.E.; Crawford, D.L.; Hall, D.S.; BDM Corp., McLean, VA; Kitt Peak National Observatory, Tucson, AZ; Dyer Observatory, Nashville, TN)

    1987-01-01

    Automatic observatories have the potential of gathering sizable amounts of high-quality astronomical data at low cost. The Automatic Photoelectric Telescope Service (APT Service) has realized this potential and is routinely making photometric observations of a large number of variable stars. However, without observers to provide on-site monitoring, it was necessary to incorporate special quality checks into the operation of the APT Service at its multiple automatic telescope installation on Mount Hopkins. 18 references

  14. Neuroinformatics Software Applications Supporting Electronic Data Capture, Management, and Sharing for the Neuroimaging Community.

    Science.gov (United States)

    Nichols, B Nolan; Pohl, Kilian M

    2015-09-01

    Accelerating insight into the relation between brain and behavior entails conducting small and large-scale research endeavors that lead to reproducible results. Consensus is emerging between funding agencies, publishers, and the research community that data sharing is a fundamental requirement to ensure all such endeavors foster data reuse and fuel reproducible discoveries. Funding agency and publisher mandates to share data are bolstered by a growing number of data sharing efforts that demonstrate how information technologies can enable meaningful data reuse. Neuroinformatics evaluates scientific needs and develops solutions to facilitate the use of data across the cognitive and neurosciences. For example, electronic data capture and management tools designed to facilitate human neurocognitive research can decrease the setup time of studies, improve quality control, and streamline the process of harmonizing, curating, and sharing data across data repositories. In this article we outline the advantages and disadvantages of adopting software applications that support these features by reviewing the tools available and then presenting two contrasting neuroimaging study scenarios in the context of conducting a cross-sectional and a multisite longitudinal study.

  15. Automatic imitation: A meta-analysis.

    Science.gov (United States)

    Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel

    2018-05-01

    Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of g_z = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  16. The sequentially discounting autoregressive (SDAR) method for on-line automatic seismic event detecting on long term observation

    Science.gov (United States)

    Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.

    2017-12-01

    In recent years, more and more Carbon Capture and Storage (CCS) studies have focused on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic messages (magnitudes, phases, distributions et al.) is proposed for injection control. The primary task for ATLS is seismic event detection in a long-term sustained time series record. Because the time-varying signal-to-noise ratio (SNR) of a long-term record and the uneven energy distribution of seismic event waveforms increase the difficulty of automatic seismic detection, in this work an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied. This algorithm, called sequentially discounting AR learning (SDAR), can identify effective seismic events in the time series through Change Point Detection (CPD) on the seismic record. In this method, an anomalous signal (seismic event) is treated as a change point in the time series (seismic record): the statistical model of the signal in the neighborhood of the event point changes because of the seismic event occurrence. This means that SDAR aims to find the statistical irregularities of the record through CPD. SDAR has 3 advantages. 1. Anti-noise ability. SDAR does not use waveform attributes (such as amplitude, energy, polarization) for signal detection; therefore, it is an appropriate technique for low-SNR data. 2. Real-time estimation. When new data appear in the record, the probability distribution models can be automatically updated by SDAR for on-line processing. 3. Discounting property. SDAR introduces a discounting parameter to decrease the influence of present statistics on future data. This makes SDAR a robust algorithm for non-stationary signal processing. With these 3 advantages, the SDAR method can handle the non-stationary time-varying long
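
    The record above describes SDAR only at a high level. Purely as an illustration (not the authors' implementation), the following Python sketch shows discounted online learning of a simple AR(1) model, with a change-point score given by the negative log-likelihood of each new sample; the discount factor and the scoring choice are assumptions.

    import numpy as np

    def sdar_scores(x, r=0.05):
        """Simplified sequentially discounting AR(1) learning.

        For each sample, the mean, lag-1 covariance and noise variance are
        updated with exponential discounting (factor r); the score is the
        negative log-likelihood of the sample under the current model, so
        high scores mark candidate change points (seismic events).
        """
        mu, var, a, cov = 0.0, 1.0, 0.0, 1.0
        prev = 0.0
        scores = np.zeros(len(x))
        for t, xt in enumerate(x):
            pred = mu + a * (prev - mu)                  # one-step AR(1) prediction
            err = xt - pred
            scores[t] = 0.5 * (np.log(2 * np.pi * var) + err ** 2 / var)
            mu = (1 - r) * mu + r * xt                   # discounted mean
            cov = (1 - r) * cov + r * (xt - mu) * (prev - mu)
            c0 = (1 - r) * var + r * (prev - mu) ** 2
            a = cov / c0 if c0 > 1e-12 else 0.0          # AR(1) coefficient
            var = (1 - r) * var + r * err ** 2           # discounted noise variance
            prev = xt
        return scores

    # usage: flag samples whose score exceeds a chosen threshold
    # trace = np.loadtxt("seismic_trace.txt"); events = sdar_scores(trace) > 8.0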

  17. Design of a computerized application for the quality control of film automatic processors

    International Nuclear Information System (INIS)

    Merillas del Castillo, A.; Guibelalde del Castillo, E.; Fernandez Soto, J.M.; Vano carruana, E.

    1997-01-01

    A description is presented of freeware software for quality control of film automatic processors in diagnostic radiology, developed by the authors (CC-PRO ver 1.0). The application has been developed for use with automatic scanning densitometers of the X-Rite 380 type, and also supports manual data acquisition. By means of standard sensitometric techniques and the developed software, trend analysis of the sensitometric variables and film processor diagnostics can be carried out, with an important productivity improvement, easy management and test consistency. (Author) 6 refs

  18. An intelligent system for real time automatic defect inspection on specular coated surfaces

    Science.gov (United States)

    Li, Jinhua; Parker, Johné M.; Hou, Zhen

    2005-07-01

    Product visual inspection is still performed manually or semi-automatically in most industries, from simple ceramic tile grading to complex automotive body panel paint defect and surface quality inspection. Moreover, specular surfaces present an additional challenge to conventional vision systems due to specular reflections, which may mask the true location of objects and lead to incorrect measurements. Some sophisticated visual inspection methods have been developed in recent years. Unfortunately, most of them are computationally intensive. Systems built on those methods are either inapplicable or very costly for real-time inspection. In this paper, we describe an integrated low-cost intelligent system developed to automatically capture, extract, and segment defects on specular surfaces with uniform color coatings. The system inspects and locates regular surface defects with lateral dimensions as small as a millimeter. The proposed system is implemented on a group of smart cameras using their on-board processing ability to achieve real-time inspection. The experimental results on real test panels demonstrate the effectiveness and robustness of the proposed system.

  19. Automatic generation of statistical pose and shape models for articulated joints.

    Science.gov (United States)

    Xin Chen; Graham, Jim; Hutchinson, Charles; Muir, Lindsay

    2014-02-01

    Statistical analysis of motion patterns of body joints is potentially useful for detecting and quantifying pathologies. However, building a statistical motion model across different subjects remains a challenging task, especially for a complex joint like the wrist. We present a novel framework for simultaneous registration and segmentation of multiple 3-D (CT or MR) volumes of different subjects at various articulated positions. The framework starts with a pose model generated from 3-D volumes captured at different articulated positions of a single subject (template). This initial pose model is used to register the template volume to image volumes from new subjects. During this process, the Grow-Cut algorithm is used in an iterative refinement of the segmentation of the bone along with the pose parameters. As each new subject is registered and segmented, the pose model is updated, improving the accuracy of successive registrations. We applied the algorithm to CT images of the wrist from 25 subjects, each at five different wrist positions and demonstrated that it performed robustly and accurately. More importantly, the resulting segmentations allowed a statistical pose model of the carpal bones to be generated automatically without interaction. The evaluation results show that our proposed framework achieved accurate registration with an average mean target registration error of 0.34 ±0.27 mm. The automatic segmentation results also show high consistency with the ground truth obtained semi-automatically. Furthermore, we demonstrated the capability of the resulting statistical pose and shape models by using them to generate a measurement tool for scaphoid-lunate dissociation diagnosis, which achieved 90% sensitivity and specificity.

  20. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre; Fluhr, Christian.

    1975-06-01

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet team from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of the automatic building of thesauri and automatic classification are examined [fr

  1. Key Management in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Ismail Mansour

    2015-09-01

    Full Text Available Wireless sensor networks are a challenging field of research when it comes to security issues. Using low cost sensor nodes with limited resources makes it difficult for cryptographic algorithms to function without impacting energy consumption and latency. In this paper, we focus on key management issues in multi-hop wireless sensor networks. These networks are easy to attack due to the open nature of the wireless medium. Intruders could try to penetrate the network, capture nodes or take control over particular nodes. In this context, it is important to revoke and renew keys that might be learned by malicious nodes. We propose several secure protocols for key revocation and key renewal based on symmetric encryption and elliptic curve cryptography. All protocols are secure, but have different security levels. Each proposed protocol is formally proven and analyzed using Scyther, an automatic verification tool for cryptographic protocols. For efficiency comparison sake, we implemented all protocols on real testbeds using TelosB motes and discussed their performances.
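
    The abstract above does not reproduce the protocols themselves. Purely as an illustration of the kind of symmetric key renewal it discusses, here is a minimal Python sketch in which the base station and a sensor node derive the same fresh session key from a pre-shared master key and a broadcast nonce; the message layout and helper names are hypothetical and are not the paper's protocols.

    import hashlib, hmac, os

    def renew_key(master_key: bytes, node_id: bytes, nonce: bytes) -> bytes:
        """Derive a fresh session key from the shared master key.

        Both sides compute the same value, so only the nonce travels over
        the air; a session key learned by a captured node does not reveal
        the next one as long as the master key stays secret.
        """
        return hmac.new(master_key, b"renew" + node_id + nonce, hashlib.sha256).digest()

    master = os.urandom(32)            # pre-shared at deployment time (assumption)
    nonce = os.urandom(16)             # sent to the node in the renewal message

    key_at_base = renew_key(master, b"node-07", nonce)
    key_at_node = renew_key(master, b"node-07", nonce)   # node side, after receiving the nonce
    assert key_at_base == key_at_node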

  2. Improvement The Transmission Efficiency For Wireless Packet Communication Systems Using Automatic Control for power And Time Slot Width Of Slotted Non persistent ISMA Protocol

    Directory of Open Access Journals (Sweden)

    Saad M. Hardan

    2013-05-01

    Full Text Available In packet communication systems which use a protocol, the protocol should allocate the channels such that the transmission channel is used efficiently. Efficiency is usually measured in terms of channel throughput and the average transmission delay. The Slotted Nonpersistent ISMA protocol is one of the random access protocols used in packet communication systems. In this research, a Slotted Nonpersistent ISMA protocol model with automatic control of power and time slot is proposed. The suggested algorithm enables the base station (access point) to control the protocol time slot length and the transmission power dynamically, in order to control the normalized propagation delay d and to maintain all uplink signals within the captured power threshold (capture ratio), thereby controlling the throughput and the average transmission delay of the communication system automatically. The computer simulation results confirm the effectiveness of the proposed algorithm in increasing the throughput and decreasing the average transmission delay by acceptable ratios.
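
    As a rough illustration of the control idea described above (not the paper's algorithm), the sketch below widens the slot from the worst-case propagation delay and scales down transmitters whose received power would exceed an assumed capture ratio relative to the weakest uplink signal; the guard factor and capture ratio values are placeholders.

    def adjust_slot_and_power(prop_delays_s, rx_powers_w, packet_time_s,
                              capture_ratio=4.0, guard_factor=1.1):
        """Return a new slot width and per-node power scaling factors.

        - slot width: packet time plus the worst-case propagation delay,
          with a small guard margin, bounding the normalized delay d;
        - power control: nodes whose received power exceeds capture_ratio
          times the weakest signal are scaled down so every uplink signal
          stays within the capture threshold at the access point.
        """
        slot_width = guard_factor * (packet_time_s + max(prop_delays_s))
        p_min = min(rx_powers_w)
        scale = [min(1.0, capture_ratio * p_min / p) for p in rx_powers_w]
        return slot_width, scale

    slot, power_scale = adjust_slot_and_power(
        prop_delays_s=[2e-6, 5e-6, 9e-6],      # per-node propagation delays (s)
        rx_powers_w=[1.0e-9, 4.0e-9, 2.0e-9],  # received uplink powers (W)
        packet_time_s=1e-3)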

  3. Radiative electron capture

    International Nuclear Information System (INIS)

    Biggerstaff, J.A.; Appleton, B.R.; Datz, S.; Moak, C.D.; Neelavathi, V.N.; Noggle, T.S.; Ritchie, R.H.; VerBeek, H.

    1975-01-01

    Some data are presented for radiative electron capture by fast-moving ions. The radiative electron capture spectrum is shown for O8+ in Ag, along with the energy dependence of the capture cross-section. A discrepancy between earlier data, theoretical prediction, and the present data is pointed out. (3 figs) (U.S.)

  4. Lecture Capture Technology and Student Performance in an Operations Management Course

    Science.gov (United States)

    Sloan, Thomas W.; Lewis, David A.

    2014-01-01

    Lecture capture technologies (LCT) such as Echo360, Mediasite, and Tegrity have become very popular in recent years. Many studies have shown that students favor the use of such technology, but relatively little research has studied the impact of LCT on learning. This article examines two research questions: (1) whether the use of LCT actually…

  5. Automatic identification of otologic drilling faults: a preliminary report.

    Science.gov (United States)

    Shen, Peng; Feng, Guodong; Cao, Tianyang; Gao, Zhiqiang; Li, Xisheng

    2009-09-01

    A preliminary study was carried out to identify parameters to characterize drilling faults when using an otologic drill under various operating conditions. An otologic drill was modified by the addition of four sensors. Under consistent conditions, the drill was used to simulate three important types of drilling faults and the captured data were analysed to extract characteristic signals. A multisensor information fusion system was designed to fuse the signals and automatically identify the faults. When identifying drilling faults, there was a high degree of repeatability and regularity, with an average recognition rate of >70%. This study shows that the variables measured change in a fashion that allows the identification of particular drilling faults, and that it is feasible to use these data to provide rapid feedback for a control system. Further experiments are being undertaken to implement such a system.

  6. Convective Heat Transfer Coefficients of Automatic Transmission Fluid Jets with Implications for Electric Machine Thermal Management: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Bennion, Kevin; Moreno, Gilberto

    2015-09-29

    Thermal management for electric machines (motors/ generators) is important as the automotive industry continues to transition to more electrically dominant vehicle propulsion systems. Cooling of the electric machine(s) in some electric vehicle traction drive applications is accomplished by impinging automatic transmission fluid (ATF) jets onto the machine's copper windings. In this study, we provide the results of experiments characterizing the thermal performance of ATF jets on surfaces representative of windings, using Ford's Mercon LV ATF. Experiments were carried out at various ATF temperatures and jet velocities to quantify the influence of these parameters on heat transfer coefficients. Fluid temperatures were varied from 50 degrees C to 90 degrees C to encompass potential operating temperatures within an automotive transaxle environment. The jet nozzle velocities were varied from 0.5 to 10 m/s. The experimental ATF heat transfer coefficient results provided in this report are a useful resource for understanding factors that influence the performance of ATF-based cooling systems for electric machines.

  7. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples

  8. SU-E-T-253: Open-Source Automatic Software for Quantifying Biological Assays of Radiation Effects

    International Nuclear Information System (INIS)

    Detappe, A; Korideck, H; Makrigiorgos, G; Berbeco, R

    2014-01-01

    Purpose: Clonogenic cell survival is a common assay for quantifying the effect of drugs and radiation. Manual counting of surviving colonies can take 30–90 seconds per plate, a major limitation for large studies. Currently available automatic counting tools are not easily modified for radiation oncology research. Our goal is to provide an open-source toolkit for precise, accurate and fast analysis of biological assays in radiation oncology. Methods: As an example analysis, we used HeLa cells incubated with gadolinium nanoparticles prior to irradiation. After treatment, the cells are grown for 14 days to allow for colony formation. To analyze the colony growth, we capture images of each dish for archiving and automatic computer-based analysis. A Fujifilm X20 camera is placed at the top of a box setup, 20 cm above the sample, which is backlit by a LED lamp placed at the bottom of the box. We use a Gaussian filter (width = 1.3 mm) and color threshold (19–255). The minimum size for a colony to be counted is 1 mm. For this example, 20 dishes with a large range of colonies were analyzed. Each dish was counted 3 times manually by 3 different users and then compared to our counter. Results: Automatic counting of cell colonies takes an average of 7 seconds, enabling the analysis process to be accelerated 4–12 times. The average precision of the automatic counter was 1.7%. The Student t-test demonstrated the non-significant differences between the two counting methods (p = 0.64). The ICC demonstrated the reliability of each method with ICC > 0.999 (automatic) and ICC = 0.95 (manual). Conclusion: We developed an open-source automatic toolkit for the analysis of biological assays in radiation oncology and demonstrated the accuracy, precision and effort savings for clonogenic cell survival quantification. This toolkit is currently being used in two laboratories for routine experimental analysis and will be made freely available on our departmental website
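
    The processing chain described above (Gaussian smoothing, a fixed threshold, and a minimum colony size) maps directly onto standard image-processing libraries. The following is a minimal scikit-image sketch, not the authors' toolkit; the pixel calibration and threshold are placeholder values.

    import numpy as np
    from skimage import color, filters, io, measure

    def count_colonies(image_path, mm_per_pixel=0.05, sigma_mm=1.3,
                       min_colony_mm=1.0, threshold=19 / 255):
        """Count stained colonies in a photograph of a backlit dish (sketch)."""
        gray = color.rgb2gray(io.imread(image_path))
        smoothed = filters.gaussian(gray, sigma=sigma_mm / mm_per_pixel)
        mask = smoothed < threshold                    # dark colonies on a bright dish
        labels = measure.label(mask)
        min_area_px = (min_colony_mm / mm_per_pixel) ** 2
        return sum(1 for region in measure.regionprops(labels)
                   if region.area >= min_area_px)

    # n_colonies = count_colonies("dish_07.png")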

  9. Automatic categorization of diverse experimental information in the bioscience literature.

    Science.gov (United States)

    Fang, Ruihua; Schindelman, Gary; Van Auken, Kimberly; Fernandes, Jolene; Chen, Wen; Wang, Xiaodong; Davis, Paul; Tuli, Mary Ann; Marygold, Steven J; Millburn, Gillian; Matthews, Beverley; Zhang, Haiyan; Brown, Nick; Gelbart, William M; Sternberg, Paul W

    2012-01-26

    Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify from all published literature the papers that contain results for a specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and is thus usually time-consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in use in production for automatic categorization of 10 different experimental data types in the biocuration process at WormBase for the past two years and it is in the process of being adopted in the biocuration process at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community, thereby greatly reducing the time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature such as C. elegans and D. melanogaster to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in the curation work flow at WormBase for
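
    As a minimal illustration of the kind of SVM document classifier described above (not the WormBase production pipeline), the scikit-learn sketch below learns from a handful of labelled abstracts whether a paper contains a given data type; the example texts and labels are invented.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # abstracts of already-curated papers; label 1 = contains the data type of interest
    texts = ["RNAi knockdown of daf-2 alters lifespan ...",
             "We describe a new strain collection ...",
             "Expression pattern of unc-54 in body wall muscle ...",
             "A review of nematode ecology ..."]
    labels = [1, 0, 1, 0]

    classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
    classifier.fit(texts, labels)

    # flag uncurated papers that likely contain the data type
    new_papers = ["Quantitative expression analysis of let-7 during development ..."]
    print(classifier.predict(new_papers))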

  10. An Automatic Decision-Making Mechanism for Virtual Machine Live Migration in Private Clouds

    Directory of Open Access Journals (Sweden)

    Ming-Tsung Kao

    2014-01-01

    Full Text Available Due to the increasing number of computer hosts deployed in an enterprise, automatic management of electronic applications is inevitable. To provide diverse services, there will be increases in procurement, maintenance, and electricity costs. Virtualization technology is getting popular in cloud computing environments, as it enables the efficient use of computing resources and reduces the operating cost. In this paper, we present an automatic mechanism to consolidate virtual servers and shut down the idle physical machines during the off-peak hours, while activating more machines at peak times. Through the monitoring of system resources, heavy system loads can be evenly distributed over physical machines to achieve load balancing. By integrating the feature of load balancing with virtual machine live migration, we successfully developed an automatic private cloud management system. Experimental results demonstrate that, during the off-peak hours, we can save power consumption of about 69 W by consolidating the idle virtual servers. The load-balancing implementation has shown that two machines with 80% and 40% CPU loads can be uniformly balanced to 60% each. Furthermore, through the use of preallocated virtual machine images, the proposed mechanism can be easily applied to a large number of physical machines.
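
    A toy sketch of the kind of decision rule described above: pack virtual machines onto as few hosts as possible (so idle hosts can be powered off during off-peak hours) while keeping every host below a target utilisation. The greedy placement and the thresholds are illustrative assumptions, not the paper's mechanism.

    def plan_placement(vm_loads, host_capacity=100.0, target_util=0.6):
        """Greedily assign VM CPU loads to hosts.

        Hosts are filled up to target_util * host_capacity; hosts that end
        up empty can be shut down (consolidation), and new hosts are only
        activated when the existing ones would exceed the target.
        """
        hosts = []                                   # each host: list of (vm, load)
        for vm, load in sorted(vm_loads.items(), key=lambda kv: -kv[1]):
            for host in hosts:
                if sum(l for _, l in host) + load <= target_util * host_capacity:
                    host.append((vm, load))
                    break
            else:
                hosts.append([(vm, load)])           # power on another physical machine
        return hosts

    for i, host in enumerate(plan_placement({"web1": 40, "web2": 80, "db": 35, "batch": 10})):
        print(f"host{i}: {host}")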

  11. Automatization of hydrodynamic modelling in a Floreon+ system

    Science.gov (United States)

    Ronovsky, Ales; Kuchar, Stepan; Podhoranyi, Michal; Vojtek, David

    2017-07-01

    The paper describes fully automatized hydrodynamic modelling as a part of the Floreon+ system. The main purpose of hydrodynamic modelling in disaster management is to provide an accurate overview of the hydrological situation in a given river catchment. Automation of the process as a web service could provide immediate data based on extreme weather conditions, such as heavy rainfall, without the intervention of an expert. Such a service can be used by non-scientific users such as fire-fighter operators or representatives of a military service organizing evacuation during floods or river dam breaks. The paper describes the whole process, beginning with the definition of the schematization necessary for the hydrodynamic model, the gathering of the necessary data and its processing for a simulation, the model itself, and the post-processing of the results and their visualization on a web service. The process is demonstrated on real data collected during the 2010 floods in the Moravian-Silesian region.

  12. Movement Behaviour of Traditionally Managed Cattle in the Eastern Province of Zambia Captured Using Two-Dimensional Motion Sensors.

    Science.gov (United States)

    Lubaba, Caesar H; Hidano, Arata; Welburn, Susan C; Revie, Crawford W; Eisler, Mark C

    2015-01-01

    Two-dimensional motion sensors use electronic accelerometers to record the lying, standing and walking activity of cattle. Movement behaviour data collected automatically using these sensors over prolonged periods of time could be of use to stakeholders making management and disease control decisions in rural sub-Saharan Africa leading to potential improvements in animal health and production. Motion sensors were used in this study with the aim of monitoring and quantifying the movement behaviour of traditionally managed Angoni cattle in Petauke District in the Eastern Province of Zambia. This study was designed to assess whether motion sensors were suitable for use on traditionally managed cattle in two veterinary camps in Petauke District in the Eastern Province of Zambia. In each veterinary camp, twenty cattle were selected for study. Each animal had a motion sensor placed on its hind leg to continuously measure and record its movement behaviour over a two week period. Analysing the sensor data using principal components analysis (PCA) revealed that the majority of variability in behaviour among studied cattle could be attributed to their behaviour at night and in the morning. The behaviour at night was markedly different between veterinary camps; while differences in the morning appeared to reflect varying behaviour across all animals. The study results validate the use of such motion sensors in the chosen setting and highlight the importance of appropriate data summarisation techniques to adequately describe and compare animal movement behaviours if association to other factors, such as location, breed or health status are to be assessed.

  13. Automatized distribution systems in IBERDROLA. Sistemas de automatizacion de distribucion en Iberdrola

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez Madariaga, J.A.

    1994-01-01

    This article presents the automated distribution systems at IBERDROLA. These systems make it possible to improve energy demand management. The optimized distribution system is applied by the industrial sector and by small users. Iberdrola has developed a project in order to offer telemanagement to energy users.

  14. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  15. Capturing Thoughts, Capturing Minds?

    DEFF Research Database (Denmark)

    Nielsen, Janni

    2004-01-01

    Think Aloud is cost-effective, promises access to the user's mind, and is a widely applied usability technique. But 'keep talking' is difficult; besides, the multimodal interface is visual, not verbal. Eye-tracking seems to get around the verbalisation problem. It captures the visual focus of attention...

  16. Automatic physical inference with information maximizing neural networks

    Science.gov (United States)

    Charnock, Tom; Lavaux, Guilhem; Wandelt, Benjamin D.

    2018-04-01

    Compressing large data sets to a manageable number of summaries that are informative about the underlying parameters vastly simplifies both frequentist and Bayesian inference. When only simulations are available, these summaries are typically chosen heuristically, so they may inadvertently miss important information. We introduce a simulation-based machine learning technique that trains artificial neural networks to find nonlinear functionals of data that maximize Fisher information: information maximizing neural networks (IMNNs). In test cases where the posterior can be derived exactly, likelihood-free inference based on automatically derived IMNN summaries produces nearly exact posteriors, showing that these summaries are good approximations to sufficient statistics. In a series of numerical examples of increasing complexity and astrophysical relevance we show that IMNNs are robustly capable of automatically finding optimal, nonlinear summaries of the data even in cases where linear compression fails: inferring the variance of Gaussian signal in the presence of noise, inferring cosmological parameters from mock simulations of the Lyman-α forest in quasar spectra, and inferring frequency-domain parameters from LISA-like detections of gravitational waveforms. In this final case, the IMNN summary outperforms linear data compression by avoiding the introduction of spurious likelihood maxima. We anticipate that the automatic physical inference method described in this paper will be essential to obtain both accurate and precise cosmological parameter estimates from complex and large astronomical data sets, including those from LSST and Euclid.

  17. Exogenous (automatic) attention to emotional stimuli: a review.

    Science.gov (United States)

    Carretié, Luis

    2014-12-01

    Current knowledge on the architecture of exogenous attention (also called automatic, bottom-up, or stimulus-driven attention, among other terms) has been mainly obtained from studies employing neutral, anodyne stimuli. Since, from an evolutionary perspective, exogenous attention can be understood as an adaptive tool for rapidly detecting salient events, reorienting processing resources to them, and enhancing processing mechanisms, emotional events (which are, by definition, salient for the individual) would seem crucial to a comprehensive understanding of this process. This review, focusing on the visual modality, describes 55 experiments in which both emotional and neutral irrelevant distractors are presented at the same time as ongoing task targets. Qualitative and, when possible, meta-analytic descriptions of results are provided. The most conspicuous result is that, as confirmed by behavioral and/or neural indices, emotional distractors capture exogenous attention to a significantly greater extent than do neutral distractors. The modulatory effects of the nature of distractors capturing attention, of the ongoing task characteristics, and of individual differences, previously proposed as mediating factors, are also described. Additionally, studies reviewed here provide temporal and spatial information-partially absent in traditional cognitive models-on the neural basis of preattention/evaluation, reorienting, and sensory amplification, the main subprocesses involved in exogenous attention. A model integrating these different levels of information is proposed. The present review, which reveals that there are several key issues for which experimental data are surprisingly scarce, confirms the relevance of including emotional distractors in studies on exogenous attention.

  18. Vehicle-to-Grid Automatic Load Sharing with Driver Preference in Micro-Grids

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yubo; Nazaripouya, Hamidreza; Chu, Chi-Cheng; Gadh, Rajit; Pota, Hemanshu R.

    2014-10-15

    Integration of Electric Vehicles (EVs) with the power grid not only brings new challenges for load management, but also opportunities for distributed storage and generation. This paper comprehensively models and analyzes distributed Vehicle-to-Grid (V2G) for automatic load sharing with driver preference. In a micro-grid with limited communications, V2G EVs need to decide load sharing based on their own power and voltage profiles. A droop-based controller taking into account driver preference is proposed in this paper to address the distributed control of EVs. Simulations are designed for three fundamental V2G automatic load sharing scenarios that include all system dynamics of such applications. Simulation results demonstrate that active power sharing is achieved proportionally among V2G EVs with consideration of driver preference. In addition, the results also verify the system stability and reactive power sharing analysis in the system modelling, which sheds light on large-scale V2G automatic load sharing in more complicated cases.
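
    As a minimal sketch in the spirit of the droop controller described above (gains, sign conventions and the preference weighting are assumptions, not the authors' design), each EV computes its active-power setpoint locally from the measured bus voltage:

    def v2g_droop_power(v_local, v_nominal=230.0, droop_gain=2000.0,
                        preference=1.0, p_max=7000.0):
        """Active power setpoint in watts for one EV (positive = discharge to grid).

        - droop term: discharge more when the local voltage sags below nominal;
        - preference in [0, 1]: a driver who needs a full battery soon contributes less;
        - the setpoint is clipped to the charger rating p_max.
        """
        p = droop_gain * (v_nominal - v_local) * preference
        return max(-p_max, min(p_max, p))

    # three EVs on the same feeder share the load in proportion to driver preference
    for pref in (1.0, 0.5, 0.2):
        print(v2g_droop_power(v_local=228.5, preference=pref))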

  19. Application of semi-active RFID power meter in automatic verification pipeline and intelligent storage system

    Science.gov (United States)

    Chen, Xiangqun; Huang, Rui; Shen, Liman; chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-03-01

    In this paper, the semi-active RFID watt-hour meter is applied to automatic test lines and intelligent warehouse management. Spanning the transmission system, test system, auxiliary system and monitoring system, it realizes scheduling, binding, control and data exchange for watt-hour meters, among other functions, enabling more accurate positioning, more efficient management, rapid data updates and at-a-glance access to all information. This effectively improves the quality, efficiency and automation of verification, and realizes more efficient data management and warehouse management.

  20. [Interpersonal attention management inventory: a new instrument to capture different self- and external perception skills].

    Science.gov (United States)

    Blaser, Klaus; Zlabinger, Milena; Hinterberger, Thilo

    2014-01-01

    The Interpersonal Attention Management Inventory (IAMI) represents a new instrument to capture self- and external perception skills. The underlying theoretical model assumes 3 mental locations of attention (the intrapersonal space, the extrapersonal space, and the external intrapersonal space of the other). The IAMI was studied regarding its factor structure; it was shortened, and statistical values as well as first reference values were calculated based on a larger sample (n = 1089). By factor analysis, the superordinate scales could be widely validated. The shortened version with 31 items and 3 superordinate scales shows a high reliability of the global value (Cronbach's α = 0.81) and, regarding the convergent validity, a modest correlation (r = 0.41) of the global value and mindfulness, measured with the Freiburg Mindfulness Inventory (FMI). Further validation studies are invited so that the IAMI can be used as an instrument for (course) diagnosis in the therapy of psychiatric disorders as well as for research in social neuroscience, e.g., in investigations on mindfulness, compassion, empathy, theory of mind, and self-boundaries.

  1. Feasibility of Using Low-Cost Motion Capture for Automated Screening of Shoulder Motion Limitation after Breast Cancer Surgery.

    Directory of Open Access Journals (Sweden)

    Valeriya Gritsenko

    Full Text Available To determine if a low-cost, automated motion analysis system using Microsoft Kinect could accurately measure shoulder motion and detect motion impairments in women following breast cancer surgery. Descriptive study of motion measured via 2 methods. Academic cancer center oncology clinic. 20 women (mean age = 60 yrs) were assessed for active and passive shoulder motions during a routine post-operative clinic visit (mean = 18 days after surgery) following mastectomy (n = 4) or lumpectomy (n = 16) for breast cancer. Participants performed 3 repetitions of active and passive shoulder motions on the side of the breast surgery. Arm motion was recorded using motion capture by Kinect for Windows sensor and on video. Goniometric values were determined from video recordings, while motion capture data were transformed to joint angles using 2 methods (body angle and projection angle). Correlation of motion capture with goniometry and detection of motion limitation. Active shoulder motion measured with low-cost motion capture agreed well with goniometry (r = 0.70-0.80), while passive shoulder motion measurements did not correlate well. Using motion capture, it was possible to reliably identify participants whose range of shoulder motion was reduced by 40% or more. Low-cost, automated motion analysis may be acceptable to screen for moderate to severe motion impairments in active shoulder motion. Automatic detection of motion limitation may allow quick screening to be performed in an oncologist's office and trigger timely referrals for rehabilitation.

  2. Feasibility of Using Low-Cost Motion Capture for Automated Screening of Shoulder Motion Limitation after Breast Cancer Surgery.

    Science.gov (United States)

    Gritsenko, Valeriya; Dailey, Eric; Kyle, Nicholas; Taylor, Matt; Whittacre, Sean; Swisher, Anne K

    2015-01-01

    To determine if a low-cost, automated motion analysis system using Microsoft Kinect could accurately measure shoulder motion and detect motion impairments in women following breast cancer surgery. Descriptive study of motion measured via 2 methods. Academic cancer center oncology clinic. 20 women (mean age = 60 yrs) were assessed for active and passive shoulder motions during a routine post-operative clinic visit (mean = 18 days after surgery) following mastectomy (n = 4) or lumpectomy (n = 16) for breast cancer. Participants performed 3 repetitions of active and passive shoulder motions on the side of the breast surgery. Arm motion was recorded using motion capture by Kinect for Windows sensor and on video. Goniometric values were determined from video recordings, while motion capture data were transformed to joint angles using 2 methods (body angle and projection angle). Correlation of motion capture with goniometry and detection of motion limitation. Active shoulder motion measured with low-cost motion capture agreed well with goniometry (r = 0.70-0.80), while passive shoulder motion measurements did not correlate well. Using motion capture, it was possible to reliably identify participants whose range of shoulder motion was reduced by 40% or more. Low-cost, automated motion analysis may be acceptable to screen for moderate to severe motion impairments in active shoulder motion. Automatic detection of motion limitation may allow quick screening to be performed in an oncologist's office and trigger timely referrals for rehabilitation.
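
    The Kinect sensor reports 3-D joint positions rather than joint angles, so some transformation such as the "body angle" method mentioned above is needed. Purely as an illustration (the joint names and the choice of trunk axis are assumptions, not the study's transformation), shoulder elevation can be computed from four joints as follows:

    import numpy as np

    def shoulder_elevation_deg(shoulder, elbow, spine_base, spine_shoulder):
        """Angle between the upper-arm vector and the downward trunk axis (degrees).

        0 deg = arm hanging along the trunk, 90 deg = arm horizontal,
        180 deg = arm fully elevated overhead.
        """
        arm = np.asarray(elbow) - np.asarray(shoulder)
        trunk_down = np.asarray(spine_base) - np.asarray(spine_shoulder)
        cos_angle = np.dot(arm, trunk_down) / (np.linalg.norm(arm) * np.linalg.norm(trunk_down))
        return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

    # one frame of Kinect joint positions (metres, camera coordinates)
    print(shoulder_elevation_deg(shoulder=[0.20, 0.40, 2.0], elbow=[0.45, 0.38, 2.0],
                                 spine_base=[0.00, -0.30, 2.0], spine_shoulder=[0.00, 0.35, 2.0]))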

  3. Metadata capture in an electronic notebook: How to make it as simple as possible?

    Directory of Open Access Journals (Sweden)

    Menzel, Julia

    2015-09-01

    Full Text Available In the last few years electronic laboratory notebooks (ELNs) have become popular. ELNs offer the great possibility to capture metadata automatically. Due to the high documentation effort, metadata documentation is neglected in science. To close the gap between good data documentation and the high documentation effort for scientists, a first user-friendly solution to capture metadata in an easy way was developed. At first, different protocols for the Western Blot were collected within the Collaborative Research Center 1002 and analyzed. Together with existing metadata standards identified in a literature search, a first version of the metadata scheme was developed. Secondly, the metadata scheme was customized for future users, including the implementation of default values for automated metadata documentation. Twelve protocols for the Western Blot were used to construct one standard protocol with ten different experimental steps. Three already existing metadata standards were used as models to construct the first version of the metadata scheme, consisting of 133 data fields in ten experimental steps. Through a revision with future users, the final metadata scheme was shortened to 90 items in three experimental steps. Using individualized default values, 51.1% of the metadata can be captured with preset values in the ELN. This lowers the data documentation effort. At the same time, researchers could benefit by providing standardized metadata for data sharing and re-use.
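
    As an illustration of what one step of such a metadata scheme with pre-filled defaults might look like (the field names and default values below are invented for illustration and are not the published scheme), consider:

    # one experimental step of a hypothetical Western Blot metadata scheme;
    # fields with a default can be captured automatically by the ELN,
    # the remaining fields must be filled in by the experimenter
    blotting_step = {
        "step": "transfer",
        "fields": [
            {"name": "membrane_type",      "default": "nitrocellulose"},
            {"name": "transfer_voltage_V", "default": 100},
            {"name": "transfer_time_min",  "default": 60},
            {"name": "buffer",             "default": "Towbin"},
            {"name": "operator",           "default": None},    # entered manually
            {"name": "date",               "default": "auto"},  # stamped by the ELN
        ],
    }

    prefilled = sum(1 for field in blotting_step["fields"] if field["default"] is not None)
    print(f"{prefilled}/{len(blotting_step['fields'])} fields captured automatically")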

  4. Automatic extraction of drug indications from FDA drug labels.

    Science.gov (United States)

    Khare, Ritu; Wei, Chih-Hsuan; Lu, Zhiyong

    2014-01-01

    Extracting computable indications, i.e. drug-disease treatment relationships, from narrative drug resources is the key for building a gold standard drug indication repository. The two steps to the extraction problem are disease named-entity recognition (NER) to identify disease mentions from a free-text description and disease classification to distinguish indications from other disease mentions in the description. While there exist many tools for disease NER, disease classification is mostly achieved through human annotations. For example, we recently resorted to human annotations to prepare a corpus, LabeledIn, capturing structured indications from the drug labels submitted to FDA by pharmaceutical companies. In this study, we present an automatic end-to-end framework to extract structured and normalized indications from FDA drug labels. In addition to automatic disease NER, a key component of our framework is a machine learning method that is trained on the LabeledIn corpus to classify the NER-computed disease mentions as "indication vs. non-indication." Through experiments with 500 drug labels, our end-to-end system delivered 86.3% F1-measure in drug indication extraction, with 17% improvement over baseline. Further analysis shows that the indication classifier delivers a performance comparable to human experts and that the remaining errors are mostly due to disease NER (more than 50%). Given its performance, we conclude that our end-to-end approach has the potential to significantly reduce human annotation costs.

  5. Development of a cerebral circulation model for the automatic control of brain physiology.

    Science.gov (United States)

    Utsuki, T

    2015-01-01

    In various clinical guidelines of brain injury, intracranial pressure (ICP), cerebral blood flow (CBF) and brain temperature (BT) are essential targets for precise management for brain resuscitation. In addition, the integrated automatic control of BT, ICP, and CBF is required for improving therapeutic effects and reducing medical costs and staff burden. Thus, a new model of cerebral circulation was developed in this study for integrative automatic control. With this model, the CBF and cerebral perfusion pressure of a normal adult male were regionally calculated according to cerebrovascular structure, blood viscosity, blood distribution, CBF autoregulation, and ICP. The analysis results were consistent with physiological knowledge already obtained with conventional studies. Therefore, the developed model is potentially available for the integrative control of the physiological state of the brain as a reference model of an automatic control system, or as a controlled object in various control simulations.

  6. Fully automatic measurements of axial vertebral rotation for assessment of spinal deformity in idiopathic scoliosis

    International Nuclear Information System (INIS)

    Forsberg, Daniel; Andersson, Mats; Knutsson, Hans; Lundström, Claes; Vavruch, Ludvig; Tropp, Hans

    2013-01-01

    Reliable measurements of spinal deformities in idiopathic scoliosis are vital, since they are used for assessing the degree of scoliosis, deciding upon treatment and monitoring the progression of the disease. However, commonly used two-dimensional methods (e.g. the Cobb angle) do not fully capture the three-dimensional deformity at hand in scoliosis, of which axial vertebral rotation (AVR) is considered to be of great importance. There are manual methods for measuring the AVR, but they are often time-consuming and associated with high intra- and inter-observer variability. In this paper, we present a fully automatic method for estimating the AVR in images from computed tomography. The proposed method is evaluated on four scoliotic patients with 17 vertebrae each and compared with manual measurements performed by three observers using the standard method by Aaro–Dahlborn. The comparison shows that the difference in measured AVR between automatic and manual measurements is on the same level as the inter-observer difference. This is further supported by a high intraclass correlation coefficient (0.971–0.979), obtained when comparing the automatic measurements with the manual measurements of each observer. Hence, the provided results and the computational performance, requiring only approximately 10 to 15 s for processing an entire volume, demonstrate the potential clinical value of the proposed method. (paper)
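
    The agreement statistic named here, the intraclass correlation coefficient, can be computed in several forms; the sketch below implements a one-way ICC(1,1) on a vertebrae-by-observers matrix and is illustrative only, since the exact ICC variant used in the paper is not specified in this abstract.

        # One-way ICC(1,1) from an n_subjects x k_raters matrix of measurements.
        # Illustrative only; the paper may use a different ICC variant.
        import numpy as np

        def icc_oneway(x):
            x = np.asarray(x, dtype=float)
            n, k = x.shape
            grand = x.mean()
            ms_between = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_within = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
            return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

        ratings = np.array([[12.1, 12.4], [5.0, 4.6], [20.3, 19.8], [8.7, 9.1]])  # AVR in degrees
        print(round(icc_oneway(ratings), 3))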

  7. An Automatic Prediction of Epileptic Seizures Using Cloud Computing and Wireless Sensor Networks.

    Science.gov (United States)

    Sareen, Sanjay; Sood, Sandeep K; Gupta, Sunil Kumar

    2016-11-01

    Epilepsy is one of the most common neurological disorders, characterized by the spontaneous and unforeseeable occurrence of seizures. Automatic prediction of seizures can protect patients from accidents and save lives. In this article, we propose a mobile-based framework that automatically predicts seizures using the information contained in electroencephalography (EEG) signals. Wireless sensor technology is used to capture the EEG signals of patients. Cloud-based services are used to collect and analyze the EEG data from the patient's mobile phone. The features from the EEG signal are extracted using the fast Walsh-Hadamard transform (FWHT). Higher-order spectral analysis (HOSA) is applied to the FWHT coefficients in order to select the feature set relevant to the normal, preictal and ictal states of seizure. We subsequently exploit the selected features as input to a k-means classifier to detect epileptic seizure states in a reasonable time. The performance of the proposed model is tested on the Amazon EC2 cloud and compared in terms of execution time and accuracy. The findings show that with the selected HOS-based features, we were able to achieve a classification accuracy of 94.6%.
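
    A minimal sketch of two of the building blocks named here, a fast Walsh-Hadamard transform and k-means clustering of the resulting coefficients; the synthetic epochs are placeholders, and the higher-order spectral feature selection and the cloud deployment are not shown.

        # Minimal FWHT (length must be a power of two) plus k-means clustering of the coefficients.
        # Feature selection via higher-order spectra and the cloud pipeline are not shown.
        import numpy as np
        from sklearn.cluster import KMeans

        def fwht(signal):
            a = np.array(signal, dtype=float)
            h = 1
            while h < len(a):
                for i in range(0, len(a), h * 2):
                    for j in range(i, i + h):
                        a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
                h *= 2
            return a

        rng = np.random.default_rng(0)
        epochs = rng.normal(size=(30, 64))                              # placeholder EEG epochs
        features = np.abs(np.array([fwht(e) for e in epochs]))[:, :16]  # keep a few coefficients
        states = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
        print(states)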

  8. Carbon captured from the air

    International Nuclear Information System (INIS)

    Keith, D.

    2008-01-01

    This article presented an innovative way to achieve the efficient capture of atmospheric carbon. A team of scientists from the University of Calgary's Institute for Sustainable Energy, Environment and Economy have shown that it is possible to reduce carbon dioxide (CO2) using a simple machine that can capture the trace amount of CO2 present in ambient air at any place on the planet. The thermodynamics of capturing the small concentrations of CO2 from the air is only slightly more difficult than capturing much larger concentrations of CO2 from power plants. The research is significant because it offers a way to capture CO2 emissions from transportation sources such as vehicles and airplanes, which represent more than half of the greenhouse gases emitted on Earth. The energy efficient and cost effective air capture technology could complement other approaches for reducing emissions from the transportation sector, such as biofuels and electric vehicles. Air capture differs from carbon capture and storage (CCS) technology used at coal-fired power plants where CO2 is captured and pipelined for permanent storage underground. Air capture can capture the CO2 that is present in ambient air and store it wherever it is cheapest. The team at the University of Calgary showed that CO2 could be captured directly from the air with less than 100 kWhrs of electricity per tonne of CO2. A custom-built tower was able to capture the equivalent of 20 tonnes per year of CO2 on a single square meter of scrubbing material. The team devised a way to use a chemical process from the pulp and paper industry to cut the energy cost of air capture in half. Although the technology is only in its early stage, it appears that CO2 could be captured from the air with an energy demand comparable to that needed for CO2 capture from conventional power plants, but costs will be higher. The simple, reliable and scalable technology offers an opportunity to build a commercial-scale plant. 1 fig

  9. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  10. Study on the management of the Boohung X-Dol 90 developer and fixing solution for automatic X-ray film processor

    International Nuclear Information System (INIS)

    Hyan, Yong Sil; Kim, Heung Tae; Kwon, Dal Gwan; Choi, Myung Joon; Cheung, Hwan

    1986-01-01

    Recently, demand for automatic X-ray film processors has been increasing at university hospitals, general hospitals and private clinics, but various troubles caused by incorrect control have been found. The authors studied the function and activity of automatic X-ray film processors for two weeks on a Kodak RPX-OMAT processor, a Sakura GX3000 processor and a Doosan parka 2000 processor, and the results obtained were as follows: 1. Automatic X-ray film processors have the advantage of rapid X-ray film processing, but incorrect handling of the developing and fixing agents brought about a great change in the contrast and optical density of the X-ray film images. 2. About 300 X-ray films could be processed with the same developing and fixing solution, without exchanging any other solutions, in each automatic X-ray film processor

  11. Capture by colour: evidence for dimension-specific singleton capture.

    Science.gov (United States)

    Harris, Anthony M; Becker, Stefanie I; Remington, Roger W

    2015-10-01

    Previous work on attentional capture has shown the attentional system to be quite flexible in the stimulus properties it can be set to respond to. Several different attentional "modes" have been identified. Feature search mode allows attention to be set for specific features of a target (e.g., red). Singleton detection mode sets attention to respond to any discrepant item ("singleton") in the display. Relational search sets attention for the relative properties of the target in relation to the distractors (e.g., redder, larger). Recently, a new attentional mode was proposed that sets attention to respond to any singleton within a particular feature dimension (e.g., colour; Folk & Anderson, 2010). We tested this proposal against the predictions of previously established attentional modes. In a spatial cueing paradigm, participants searched for a colour target that was randomly either red or green. The nature of the attentional control setting was probed by presenting an irrelevant singleton cue prior to the target display and assessing whether it attracted attention. In all experiments, the cues were red, green, blue, or a white stimulus rapidly rotated (motion cue). The results of three experiments support the existence of a "colour singleton set," finding that all colour cues captured attention strongly, while motion cues captured attention only weakly or not at all. Notably, we also found that capture by motion cues in search for colour targets was moderated by their frequency; rare motion cues captured attention (weakly), while frequent motion cues did not.

  12. Materials For Gas Capture, Methods Of Making Materials For Gas Capture, And Methods Of Capturing Gas

    KAUST Repository

    Polshettiwar, Vivek

    2013-06-20

    In accordance with the purpose(s) of the present disclosure, as embodied and broadly described herein, embodiments of the present disclosure, in one aspect, relate to materials that can be used for gas (e.g., CO2) capture, methods of making materials, methods of capturing gas (e.g., CO2), and the like.

  13. An Automatic Monitoring System for High-Frequency Measuring and Real-Time Management of Cyanobacterial Blooms in Urban Water Bodies

    Directory of Open Access Journals (Sweden)

    Viet Tran Khac

    2018-01-01

    Full Text Available Urban lakes mitigate negative impacts on the hydrological cycle and improve the quality of life in cities. Worldwide, concern for the protection and management of urban water bodies is increasing. Since the physical-chemical and biological conditions of a small aquatic ecosystem can vary rapidly over time, traditional low-frequency measurement approaches (weekly or monthly sampling) limit the knowledge gained and the transfer of research outcomes to management decision-making. In this context, this paper presents an automatic monitoring system, including a full-scale experimental site and a data transfer platform, for high-frequency observations (every 5 min) in a small and shallow urban lake (Lake Champs-sur-Marne, Paris, France, 10.3 ha). Lake stratification and mixing periods can be clearly observed, and these periods are compared with the dynamic patterns of chlorophyll-a, phycocyanin, dissolved oxygen and pH. The results indicate that phytoplankton growth corresponds with dissolved oxygen cycles. However, thermal stratification cannot fully explain the dynamic patterns of the different physical-chemical and ecological variables. Cyanobacteria were one of the dominating groups of the phytoplankton blooms during the lake stratification period (8 August–29 September 2016). During the cooling mixed period (29 September–19 October 2016), the high concentration of chlorophyll-a was mainly caused by other phytoplankton species, such as diatoms. Perspectives for applying this observation system to real-time management of water bodies and lakes are discussed.

  14. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his...... honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers...... by members of the IFIP Working Group 2.1 of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers...

  15. Development of advanced automatic operation system for nuclear ship. 1. Perfect automatic normal operation

    International Nuclear Information System (INIS)

    Nakazawa, Toshio; Yabuuti, Noriaki; Takahashi, Hiroki; Shimazaki, Junya

    1999-02-01

    Development of operation support systems, such as automatic operating systems and anomaly diagnosis systems for the nuclear reactor, is very important in a practical nuclear ship because of the limited number of operators and the severe conditions under which receiving support from others in the case of an accident is very difficult. The goal of development of the operation support systems is to realize a fully automatic control system covering the whole sequence of normal operation from reactor start-up to shutdown. The automatic control system for normal operation has been developed based on operating experience with the first Japanese nuclear ship 'Mutsu'. The automation technique was verified against 'Mutsu' plant data from manual operation. Fully automatic control of start-up and shutdown operations was achieved by setting the desired operating values and the limiting values of parameter fluctuation, and by preparing operation programs for the principal equipment such as the main coolant pump and the heaters. This report presents the automatic operation system developed for reactor start-up and shutdown and the verification of the system using the Nuclear Ship Engineering Simulator System. (author)

  16. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 23.1329 Section 23...: Installation § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can— (1) Be quickly and positively...

  17. 46 CFR 52.01-10 - Automatic controls.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Automatic controls. 52.01-10 Section 52.01-10 Shipping... Requirements § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  18. Interatomic Coulombic electron capture

    International Nuclear Information System (INIS)

    Gokhberg, K.; Cederbaum, L. S.

    2010-01-01

    In a previous publication [K. Gokhberg and L. S. Cederbaum, J. Phys. B 42, 231001 (2009)] we presented the interatomic Coulombic electron capture process--an efficient electron capture mechanism by atoms and ions in the presence of an environment. In the present work we derive and discuss the mechanism in detail. We demonstrate thereby that this mechanism belongs to a family of interatomic electron capture processes driven by electron correlation. In these processes the excess energy released in the capture event is transferred to the environment and used to ionize (or to excite) it. This family includes the processes where the capture is into the lowest or into an excited unoccupied orbital of an atom or ion and proceeds in step with the ionization (or excitation) of the environment, as well as the process where an intermediate autoionizing excited resonance state is formed in the capturing center which subsequently deexcites to a stable state transferring its excess energy to the environment. Detailed derivation of the asymptotic cross sections of these processes is presented. The derived expressions make clear that the environment assisted capture processes can be important for many systems. Illustrative examples are presented for a number of model systems for which the data needed to construct the various capture cross sections are available in the literature.

  19. Perspective of the applications of automatic identification technologies in the Serbian Army

    Directory of Open Access Journals (Sweden)

    Velibor V. Jovanović

    2012-07-01

    Full Text Available Without modern information systems, supply-chain management is almost impossible. Automatic identification technologies provide automated data processing, which improves operating conditions and supports decision making. Automatic identification media, notably BARCODE and RFID technology, carry labels with high-quality data and an adequate description of materiel, providing crucial visibility of inventory levels throughout the supply chain. With these media and the use of an adequate information system, the Ministry of Defense of the Republic of Serbia will be able to establish a system of codification and, in accordance with the NATO codification system, successfully implement unique codification, classification and assignment of stock numbers for all tools, components and spare parts, allowing their unequivocal identification. In the long run, this will help end users perform everyday tasks without compromising the integrity of materiel and security data. It will also give command structures reliable information for decision making to ensure optimal management. Products and services that pass the codification procedure will have the opportunity to be offered on the largest market for armament and military equipment. This paper gives a comparative analysis of two automatic identification technologies - BARCODE, the most common one, and RFID, the most advanced one - with an emphasis on the advantages and disadvantages of their use in tracking inventory through the supply chain. Their possible application in the Serbian Army is discussed in general terms.

  20. Automatic welding and cladding in heavy fabrication

    International Nuclear Information System (INIS)

    Altamer, A. de

    1980-01-01

    A description is given of the automatic welding processes used by an Italian fabricator of pressure vessels for petrochemical and nuclear plant. The automatic submerged arc welding, submerged arc strip cladding, pulsed TIG, hot wire TIG and MIG welding processes have proved satisfactory in terms of process reliability, metal deposition rate, and cost effectiveness for low alloy and carbon steels. An example shows sequences required during automatic butt welding, including heat treatments. Factors which govern satisfactory automatic welding include automatic anti-drift rotator device, electrode guidance and bead programming system, the capability of single and dual head operation, flux recovery and slag removal systems, operator environment and controls, maintaining continuity of welding and automatic reverse side grinding. Automatic welding is used for: joining vessel sections; joining tubes to tubeplate; cladding of vessel rings and tubes, dished ends and extruded nozzles; nozzle to shell and butt welds, including narrow gap welding. (author)

  1. Carbon captured from the air

    Energy Technology Data Exchange (ETDEWEB)

    Keith, D. [Calgary Univ., AB (Canada)

    2008-10-15

    This article presented an innovative way to achieve the efficient capture of atmospheric carbon. A team of scientists from the University of Calgary's Institute for Sustainable Energy, Environment and Economy have shown that it is possible to reduce carbon dioxide (CO{sub 2}) using a simple machine that can capture the trace amount of CO{sub 2} present in ambient air at any place on the planet. The thermodynamics of capturing the small concentrations of CO{sub 2} from the air is only slightly more difficult than capturing much larger concentrations of CO{sub 2} from power plants. The research is significant because it offers a way to capture CO{sub 2} emissions from transportation sources such as vehicles and airplanes, which represent more than half of the greenhouse gases emitted on Earth. The energy efficient and cost effective air capture technology could complement other approaches for reducing emissions from the transportation sector, such as biofuels and electric vehicles. Air capture differs from carbon capture and storage (CCS) technology used at coal-fired power plants where CO{sub 2} is captured and pipelined for permanent storage underground. Air capture can capture the CO{sub 2} that is present in ambient air and store it wherever it is cheapest. The team at the University of Calgary showed that CO{sub 2} could be captured directly from the air with less than 100 kWhrs of electricity per tonne of CO{sub 2}. A custom-built tower was able to capture the equivalent of 20 tonnes per year of CO{sub 2} on a single square meter of scrubbing material. The team devised a way to use a chemical process from the pulp and paper industry to cut the energy cost of air capture in half. Although the technology is only in its early stage, it appears that CO{sub 2} could be captured from the air with an energy demand comparable to that needed for CO{sub 2} capture from conventional power plants, but costs will be higher. The simple, reliable and scalable technology

  2. Elite Capture and Corruption in two Villages in Bengkulu Province, Sumatra.

    Science.gov (United States)

    Lucas, Anton

    This paper examines leadership, elite capture and corruption in two villages in Sumatra. It compares implementation and outcomes of several conservation and development projects in the context of democratization and decentralization reforms introduced in Indonesia since 1998. In examining aspects of elite control and elite capture, this paper focuses on the activities of local elites, particularly village officials, who use their positions to monopolize planning and management of projects that were explicitly intended to incorporate participatory and accountability features. While elites' use of authority and influence to benefit personally from their roles clearly reflects elite capture, there are nonetheless members of elite groups in these case studies who use their control of projects to broad community benefit. In both villages there is considerable friction between villagers and elites as well as among members of the local elite themselves over control of local resources. Differences in the structure of these cross-cutting internal relationships and of ties between local authorities and outside government and non-government agents largely explain the differences in degree of elite capture and its outcomes in the two cases.

  3. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are automatically found. The method is illustrated by simple examples. Source code in FORTRAN is provided
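
    A minimal forward-mode sketch using dual numbers, which conveys the idea of automatic differentiation for first derivatives in one variable; it is illustrative only and is not the FORTRAN source code referred to in the report.

        # Forward-mode automatic differentiation with dual numbers (value, derivative).
        # Only first derivatives in one variable are shown; the report's code is more general.
        import math

        class Dual:
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der
            def __add__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val + o.val, self.der + o.der)
            __radd__ = __add__
            def __mul__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
            __rmul__ = __mul__

        def sin(x):
            return Dual(math.sin(x.val), math.cos(x.val) * x.der)

        x = Dual(1.5, 1.0)                # seed derivative dx/dx = 1
        y = sin(x * x) + 3 * x            # f(x) = sin(x^2) + 3x
        print(y.val, y.der)               # derivative equals 2x*cos(x^2) + 3, with no truncation error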

  4. 'H-Bahn' - Dortmund demonstration system. Automatic vehicle protection system

    Energy Technology Data Exchange (ETDEWEB)

    Rosenkranz

    1984-01-01

    The automatic vehicle protection system of the H-Bahn at the University of Dortmund is responsible for the fail-safe operation of the automatic vehicles. Its functions are protection of vehicle operation and protection of passengers boarding and leaving the vehicles. These functions are managed decentrally by two fail-safe controllers. Besides the well-known relay techniques of railway fail-safe systems, electronics based on fail-safe URTL microcontrollers are applied. These are controlled by software stored in EPROMs. A glass-fibre link provides safe data exchange between the two fail-safe controllers. The experts' favourable reports on 'train protection and safety during passenger processing' were completed in March 1984; thus, transportation of passengers could start in April 1984.

  5. Improvement and automatization of a proportional alpha-beta counting system - FAG

    International Nuclear Information System (INIS)

    German, U.; Levinson, S.; Pelled, O.; Shemesh, Y.; Assido, H.

    1997-01-01

    An alpha and beta counting system - FAG*, for planchette samples - is operated at the Health Physics department's laboratory of the NRCN. The original operation mode of the system was based on manual tasks handled by the FHT1 100 electronics. An option for basic computer keyboard operation was also available. A computer with an appropriate I/O card was connected to the system, and a new operating program was developed which enables fully automatic control of the various components. The program includes activity calculations and statistical checks as well as data management. A bar-code laser system for sample number reading was integrated into the alpha-beta automatic counting system. Sample identification by means of an attached bar-code label enables unmistakable and reliable attribution of results to the counted sample. (authors)

  6. Automatic categorization of diverse experimental information in the bioscience literature

    Directory of Open Access Journals (Sweden)

    Fang Ruihua

    2012-01-01

    Full Text Available Abstract Background Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify from all published literature the papers that contain results for a specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and thus is usually time-consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in use in production for automatic categorization of 10 different experimental data types in the biocuration process at WormBase for the past two years and it is in the process of being adopted in the biocuration process at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community, thereby greatly reducing the time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature such as C. elegans and D. melanogaster to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. Results We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in
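
    A minimal sketch of an SVM-based paper classifier of the kind described here, using TF-IDF features; the toy abstracts and labels are placeholders, not the WormBase, FlyBase or SGD training corpora.

        # Illustrative SVM classifier for flagging papers that contain a given data type.
        # The toy abstracts and labels stand in for the real curation training corpora.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.svm import LinearSVC
        from sklearn.pipeline import make_pipeline

        abstracts = [
            "we report the expression pattern of gene X in embryonic tissue",
            "a review of signalling pathways in development",
            "in situ hybridization revealed expression in neurons",
            "historical overview of the model organism community",
        ]
        has_expression_data = [1, 0, 1, 0]

        model = make_pipeline(TfidfVectorizer(stop_words="english"), LinearSVC())
        model.fit(abstracts, has_expression_data)
        print(model.predict(["RNA expression was detected in the gonad by in situ hybridization"]))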

  7. Automatic categorization of diverse experimental information in the bioscience literature

    Science.gov (United States)

    2012-01-01

    Background Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify from all published literature the papers that contain results for a specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and thus, is usually time consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in use in production for automatic categorization of 10 different experimental datatypes in the biocuration process at WormBase for the past two years and it is in the process of being adopted in the biocuration process at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community and thereby greatly reducing time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature such as C. elegans and D. melanogaster to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. Results We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in the curation work flow at

  8. Using a Geographic Information System to Assess Site Suitability for Managed Aquifer Recharge using Stormwater Capture

    Science.gov (United States)

    Teo, E. K.; Harmon, R. E.; Beganskas, S.; Young, K. S.; Fisher, A. T.; Weir, W. B.; Lozano, S.

    2015-12-01

    We are completing a regional analysis of Santa Cruz and northern Monterey Counties, CA, to assess the conditions amenable to managed aquifer recharge using stormwater runoff. Communities and water supply agencies across CA are struggling to mitigate the ongoing drought and to develop secure and sustainable water supplies to support long-term municipal, agricultural, environmental and other needs. Enhanced storage of groundwater is an important part of this effort in many basins. This work is especially timely because of the recently enacted "Sustainable Groundwater Management Act" (SGMA), which requires the development of groundwater sustainability agencies and implementation of basin management plans in coming decades. Our analysis focuses specifically on the distributed collection of stormwater runoff, a water source that has typically been treated as a nuisance or waste, from drainages having an area on the order of 40-160 hectares. The first part of this project is a geographic information system (GIS) analysis using surface and subsurface data sets. Developing complete and accurate datasets across the study region required considerable effort to locate, assemble, co-register, patch, and reconcile information from many sources and scales. We have complete spatial coverage for surface data, but subsurface data is more limited in lateral extent. Sites that are most suitable for distributed stormwater capture supporting MAR have high soil infiltration capacity, are well-connected to an underlying aquifer with good transmissive and storage properties, and have space to receive MAR. Additional considerations include method of infiltration, slope, and land use and access. Based on initial consideration of surface data and slope, 7% of the complete study region appears to be "suitable or highly suitable" for MAR (in the top third of the rating system), but there is considerable spatial heterogeneity based on the distribution of shallow soils and bedrock geology.
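
    A minimal sketch of the kind of weighted raster overlay commonly used in such GIS suitability analyses; the criteria, normalization, weights and top-third threshold below are illustrative assumptions, not the project's actual rating system.

        # Hypothetical weighted-overlay suitability score on co-registered raster layers (numpy arrays).
        # Criteria, normalization and weights are illustrative, not the project's rating system.
        import numpy as np

        rng = np.random.default_rng(1)
        infiltration = rng.uniform(0, 1, (100, 100))    # normalized soil infiltration capacity
        transmissivity = rng.uniform(0, 1, (100, 100))  # normalized aquifer transmissive/storage properties
        slope_deg = rng.uniform(0, 30, (100, 100))

        slope_score = np.clip(1 - slope_deg / 15, 0, 1)                   # flatter is better
        suitability = 0.4 * infiltration + 0.4 * transmissivity + 0.2 * slope_score
        highly_suitable = suitability > np.quantile(suitability, 2 / 3)   # top third of the rating
        print(highly_suitable.mean())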

  9. Effects on automatic attention due to exposure to pictures of emotional faces while performing Chinese word judgment tasks.

    Science.gov (United States)

    Junhong, Huang; Renlai, Zhou; Senqi, Hu

    2013-01-01

    Two experiments were conducted to investigate the automatic processing of emotional facial expressions while performing low- or high-demand cognitive tasks under unattended conditions. In Experiment 1, 35 subjects performed low (judging the structure of Chinese words) and high (judging the tone of Chinese words) cognitive load tasks while exposed to unattended pictures of fearful, neutral, or happy faces. The results revealed that the reaction time was slower and the performance accuracy was higher while performing the low cognitive load task than while performing the high cognitive load task. Exposure to fearful faces resulted in significantly longer reaction times and lower accuracy than exposure to neutral faces on the low cognitive load task. In Experiment 2, 26 subjects performed the same word judgment tasks and their brain event-related potentials (ERPs) were measured for a period of 800 ms after the onset of the task stimulus. The amplitudes of the early ERP component around 176 ms (P2) elicited by unattended fearful faces over frontal-central-parietal recording sites were significantly larger than those elicited by unattended neutral faces while performing the word structure judgment task. Together, the findings of the two experiments indicated that unattended fearful faces captured significantly more attention resources than unattended neutral faces on a low cognitive load task, but not on a high cognitive load task. It was concluded that fearful faces could automatically capture attention if residual attention resources were available under the unattended condition.

  10. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available - Automation has brought many revolutions to existing technologies; one development among them is the solar-powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative answer to the energy crisis, and using it in an automatic system reduces the manpower required. The researchers believe an automatic shrimp feeding system may help solve the problems of manual feeding operations. The project study aimed to design and develop a solar-powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer that can be set to intervals preferred by the user and runs as a continuous process. A magnetic contactor acts as a switch connected to the 10-hour timer, controlling the activation or termination of the electrical loads; the system is powered by a solar panel outputting electrical power, with a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the system were proven functional and operated within the desired output. It was recommended that the timer be tested before use to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  11. vMMN for schematic faces: automatic detection of change in emotional expression

    Directory of Open Access Journals (Sweden)

    Kairi eKreegipuu

    2013-10-01

    Full Text Available Our brain is able to automatically detect changes in sensory stimulation, including in vision. A large variety of feature changes in stimulation elicit a deviance-reflecting ERP component known as the mismatch negativity (MMN). The present study has three main goals: (1) to register the vMMN using a rapidly presented stream of schematic faces (neutral, happy, angry; adapted from Öhman et al., 2001); (2) to compare vMMNs elicited by angry and happy schematic faces in two different paradigms, a traditional oddball design with frequent standard and rare target and deviant stimuli (12.5% each) and a version of an optimal multi-feature paradigm with several deviant stimuli (altogether 37.5% of the stimulus block); and (3) to compare vMMNs with subjective ratings of valence, arousal and attention capture for happy and angry schematic faces, i.e., to estimate the effect of the affective value of stimuli on their automatic detection. Eleven observers (19-32 years, 6 women) took part in both experiments, an oddball and an optimum paradigm. Stimuli were rapidly presented schematic faces and an object with face features that served as the target stimulus to be detected by a button press. Results show that a vMMN-type response at posterior sites was elicited equally in both experiments. Post-experimental reports confirmed that the angry face attracted more automatic attention than the happy face, but the difference did not emerge directly at the ERP level. Thus, when studying change detection in facial expressions, we encourage the use of the optimum (multi-feature) design in order to save time and other experimental resources.
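
    For illustration, the (v)MMN is conventionally quantified as a deviant-minus-standard difference wave averaged over trials; the sketch below uses synthetic epochs and is not the analysis pipeline of this study.

        # Deviant-minus-standard difference wave, the usual way a (v)MMN is quantified.
        # The synthetic ERP epochs (trials x time samples) are placeholders.
        import numpy as np

        rng = np.random.default_rng(4)
        t = np.arange(-0.1, 0.5, 0.002)                      # seconds relative to stimulus onset
        standard_trials = rng.normal(0, 1, (120, t.size))
        deviant_trials = rng.normal(0, 1, (120, t.size))
        deviant_trials[:, (t > 0.15) & (t < 0.30)] -= 1.5    # simulated posterior negativity

        difference_wave = deviant_trials.mean(axis=0) - standard_trials.mean(axis=0)
        print(t[np.argmin(difference_wave)], difference_wave.min())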

  12. Comparison of manual and semi-automatic measuring techniques in MSCT scans of patients with lymphoma: a multicentre study

    Energy Technology Data Exchange (ETDEWEB)

    Hoeink, A.J.; Wessling, J.; Schuelke, C.; Kohlhase, N.; Wassenaar, L.; Heindel, W.; Buerke, B. [University Hospital Muenster, Department of Clinical Radiology, Muenster (Germany); Koch, R. [University of Muenster, Institute of Biostatistics and Clinical Research (IBKF), Muenster (Germany); Mesters, R.M. [University Hospital Muenster, Department of Haematology and Oncology, Muenster (Germany); D' Anastasi, M.; Graser, A.; Karpitschka, M. [University Hospital Muenchen (LMU), Institute of Clinical Radiology, Muenchen (Germany); Fabel, M.; Wulff, A. [University Hospital Kiel, Department of Clinical Radiology, Kiel (Germany); Pinto dos Santos, D. [University Hospital Mainz, Department of Diagnostic and Interventional Radiology, Mainz (Germany); Kiessling, A. [University Hospital Marburg, Department of Diagnostic and Interventional Radiology, Marburg (Germany); Dicken, V.; Bornemann, L. [Institute of Medical Imaging Computing, Fraunhofer MeVis, Bremen (Germany)

    2014-11-15

    Multicentre evaluation of the precision of semi-automatic 2D/3D measurements in comparison to manual, linear measurements of lymph nodes with regard to their inter-observer variability in multi-slice CT (MSCT) of patients with lymphoma. MSCT data of 63 patients were interpreted before and after chemotherapy by one/two radiologists in five university hospitals. In 307 lymph nodes, the short (SAD)/long (LAD) axis diameter and the WHO area were determined manually and semi-automatically. Volume was calculated solely semi-automatically. To determine the precision of the individual parameters, a mean was calculated for every lymph node/parameter. The deviation of the measured parameters from this mean was evaluated separately. Statistical analysis entailed intraclass correlation coefficients (ICC) and Kruskal-Wallis tests. Median relative deviations of the semi-automatic parameters were smaller than the deviations of the manually assessed parameters, e.g. semi-automatic SAD 5.3% vs. manual 6.5%. Median variations among the different study sites were smaller if the measurement was conducted semi-automatically, e.g. manual LAD 5.7/4.2% vs. semi-automatic 3.4/3.4%. Semi-automatic volumetry was superior to the other parameters (2.8%). Semi-automatic determination of different lymph node parameters is (compared to manually assessed parameters) associated with slightly greater precision and marginally lower inter-observer variability. These results are of importance with regard to the increasing mobility of patients among different medical centres and in relation to the quality management of multicentre trials. (orig.)

  13. On the question of the necessity of implementation of automatic control systems in timber industry

    Science.gov (United States)

    Khasanov, E. R.; Zelenkov, P. V.; Petrosyan, M. O.; Murygin, A. V.; Laptenor, V. D.

    2016-04-01

    The paper considers the necessity of implementing automatic control systems at the level of forest farm management and the timber industry. The main areas of activity that are currently subject to automation are revealed, and the objectives that can be solved by the implementation of APCS are identified.

  14. Microsoft excel's automatic data processing and diagram drawing of RIA internal quality control parameters

    International Nuclear Information System (INIS)

    Zeng Pingfan; Liu Guoqiang

    2006-01-01

    We carried out automatic data processing and diagram drawing for various parameters of RIA's internal quality control (IQC) by the use of Microsoft Excel (ME). Using the AVERAGE and STDEV functions of ME, we obtained x-bar, s and CV%. With PEARSON, we obtained the serum quality control coefficients (r). By inputting the original data into the diagram's self-definition item, the diagram was drawn automatically. By the use of logical judgement, we obtained the quality control judgement results, with the status, timing and data of the various quality control parameters. Over the past four years, the ME data processing and diagram drawing, as well as the quality control judgement, have been shown to be accurate, convenient and correct. The method was quick and easy to manage, and automatic computer processing of RIA's IQC was realized. Conclusion: the method is applicable to all types of RIA IQC. (authors)
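
    The same quality-control quantities can be computed outside the spreadsheet; the sketch below reproduces the calculations behind AVERAGE, STDEV, CV% and PEARSON with numpy, using placeholder measurements.

        # The quantities behind the worksheet functions AVERAGE, STDEV, CV% and PEARSON, in numpy.
        import numpy as np

        qc = np.array([10.2, 9.8, 10.5, 10.1, 9.9, 10.4])      # placeholder control measurements
        ref = np.array([10.0, 9.9, 10.6, 10.0, 10.0, 10.3])    # placeholder reference values

        x_bar = qc.mean()
        s = qc.std(ddof=1)             # sample standard deviation, as STDEV computes
        cv_percent = 100 * s / x_bar
        r = np.corrcoef(qc, ref)[0, 1]
        print(round(x_bar, 3), round(s, 3), round(cv_percent, 2), round(r, 3))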

  15. Automatic Detection of Storm Damages Using High-Altitude Photogrammetric Imaging

    Science.gov (United States)

    Litkey, P.; Nurminen, K.; Honkavaara, E.

    2013-05-01

    The risks of storms that cause damage in forests are increasing due to climate change. Quickly detecting fallen trees, assessing the amount of fallen trees and efficiently collecting them are of great importance for economic and environmental reasons. Visually detecting and delineating storm damage is a laborious and error-prone process; thus, it is important to develop cost-efficient and highly automated methods. The objective of our research project is to investigate and develop a reliable and efficient method for automatic storm damage detection, based on airborne imagery collected after a storm. The requirements for the method are before-storm and after-storm surface models. A difference surface is calculated from the two DSMs, and the locations where significant changes have appeared are automatically detected. In our previous research we used a four-year-old airborne laser scanning surface model as the before-storm surface. The after-storm DSM was derived from the photogrammetric images using the Next Generation Automatic Terrain Extraction (NGATE) algorithm of the Socet Set software. We obtained 100% accuracy in the detection of major storm damage. In this investigation we will further evaluate the sensitivity of the storm-damage detection process. We will investigate the potential of national airborne photography, which is collected in the leaf-off season, to automatically produce a before-storm DSM using image matching. We will also compare the impact of the terrain extraction algorithm on the results. Our results will also promote the potential of national open source data sets in the management of natural disasters.
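
    A minimal sketch of the core step described here, differencing a before-storm and an after-storm DSM and flagging cells with a large height drop; the synthetic surfaces and the 5 m threshold are illustrative assumptions, not the project's actual parameters.

        # Difference-surface change detection on two co-registered DSMs (numpy arrays, metres).
        # The 5 m drop threshold and the synthetic data are illustrative only.
        import numpy as np

        rng = np.random.default_rng(2)
        dsm_before = 20 + rng.normal(0, 0.3, (200, 200))   # canopy surface height, before the storm
        dsm_after = dsm_before.copy()
        dsm_after[50:80, 60:120] -= 18                     # simulated patch of fallen trees

        height_drop = dsm_before - dsm_after
        damage_mask = height_drop > 5.0                    # significant canopy loss
        print(damage_mask.sum(), "cells flagged as storm damage")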

  16. The 59 meter dash - automatic rapid meter reading in Ronneby

    Energy Technology Data Exchange (ETDEWEB)

    Ottosson, Hans [Enersearch (Sweden); Selander, Lars [Linkoeping Univ. (Sweden); Bergstroem, Ulrika [Sydkraft (Sweden)

    1999-02-01

    As a result of deregulation of the telecommunications and energy markets, the utilities in Sweden see opportunities to use power lines for additional profitable applications such as transmission of data; the technology is called Power Line Telecommunications (PLT). The potential advantages are said to be 'massive'. The potential applications include remote security, automatic meter reading, load management and 'smart' home automation. A small scale feasibility study has been carried out in Ronneby in Sweden where it was shown that load management and efficiency improvements can reduce the costs of supplying the town with heat and electricity by about 3%. The Ronneby trial is described in detail. Since Scandinavia makes use of weather-dependent renewables for much of its power generation, load management is an attractive potential application of PLT. (UK)

  17. Capturing Talent: Generation Y and European Labor Markets

    OpenAIRE

    GAYLE ALLARD; Cristina Simón; RAQUEL MARTIN

    2007-01-01

    This study explores the challenge of capturing talent from both the political and the management level in Western Europe. It begins by identifying the special characteristics of Generation Y: those born since 1980 and recently joining national labor forces. It then evaluates the rigidity of labor markets in the European countries, dividing them into most and least regulated and exploring some of the labor-market characteristics that accompany those extremes. Finally, it identifies the employm...

  18. Preliminary carbon dioxide capture technical and economic feasibility study evaluation of carbon dioxide capture from existing fired plants by hybrid sorption using solid sorbents

    Energy Technology Data Exchange (ETDEWEB)

    Benson, Steven; Envergex, Srivats; Browers, Bruce; Thumbi, Charles

    2013-01-01

    Barr Engineering Co. was retained by the Institute for Energy Studies (IES) at University of North Dakota (UND) to conduct a technical and economic feasibility analysis of an innovative hybrid sorbent technology (CACHYS™) for carbon dioxide (CO2) capture and separation from coal combustion–derived flue gas. The project team for this effort consists of the University of North Dakota, Envergex LLC, Barr Engineering Co., and Solex Thermal Science, along with industrial support from Allete, BNI Coal, SaskPower, and the North Dakota Lignite Energy Council. An initial economic and feasibility study of the CACHYS™ concept, including definition of the process, development of process flow diagrams (PFDs), material and energy balances, equipment selection, sizing and costing, and estimation of overall capital and operating costs, is performed by Barr with information provided by UND and Envergex. The technology—Capture from Existing Coal-Fired Plants by Hybrid Sorption Using Solid Sorbents Capture (CACHYS™)—is a novel solid sorbent technology based on the following ideas: reduction of energy for sorbent regeneration, utilization of novel process chemistry, contactor conditions that minimize sorbent-CO2 heat of reaction and promote fast CO2 capture, and a low-cost method of heat management. The technology’s other key component is the use of a low-cost sorbent.

  19. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations behind, and the different algorithms for, automatic document summarization (ADS), and presents a recent state of the art. The book shows the main problems of ADS, the difficulties involved and the solutions provided by the community. It presents recent advances in ADS as well as current applications and trends. The approaches covered are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of automatic document summarization are not recent. Powerful algorithms have been developed
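
    As an illustration of the statistical family of approaches the book surveys, here is a tiny frequency-based extractive summarizer; it is illustrative only and is not one of the book's algorithms.

        # Tiny frequency-based extractive summarizer: score sentences by summed word frequency.
        import re
        from collections import Counter

        def summarize(text, n_sentences=1):
            sentences = re.split(r"(?<=[.!?])\s+", text.strip())
            freq = Counter(re.findall(r"[a-z']+", text.lower()))
            scored = sorted(sentences,
                            key=lambda s: sum(freq[w] for w in re.findall(r"[a-z']+", s.lower())),
                            reverse=True)
            return " ".join(scored[:n_sentences])

        doc = ("Automatic document summarization condenses a text into a shorter version. "
               "Approaches can be statistical, linguistic or symbolic. "
               "Statistical approaches often score sentences by word frequency.")
        print(summarize(doc))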

  20. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings...... on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...

  1. Automatic three-dimensional quantitative analysis for evaluation of facial movement.

    Science.gov (United States)

    Hontanilla, B; Aubá, C

    2008-01-01

    The aim of this study is to present a new 3D capture system for facial movements called FACIAL CLIMA. It is an automatic optical motion system that involves placing special reflecting dots on the subject's face and video recording the subject with three infrared-light cameras while several facial movements, such as smiling, mouth puckering, eye closure and forehead elevation, are performed. Images from the cameras are automatically processed with a software program that generates customised information such as 3D data on velocities and areas. The study has been performed in 20 healthy volunteers. The accuracy of the measurement process and the intrarater and interrater reliabilities have been evaluated. Comparison of a known distance and angle with those obtained by FACIAL CLIMA shows that this system is accurate to within 0.13 mm and 0.41 degrees. In conclusion, the accuracy of the FACIAL CLIMA system for evaluation of facial movements is demonstrated, as is its high intrarater and interrater reliability. It has advantages with respect to other systems that have been developed for evaluation of facial movements, such as a short calibration time, a short measuring time and ease of use, and it provides not only distances but also velocities and areas. Thus the FACIAL CLIMA system could be considered an adequate tool to assess the outcome of facial paralysis reanimation surgery. In this way, patients with facial paralysis could be compared between surgical centres so that the effectiveness of facial reanimation operations could be evaluated.
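
    A minimal sketch of how distances and velocities can be derived from tracked 3D marker positions of the kind such a system records; the array shapes, units and the 100 Hz frame rate are hypothetical and do not describe FACIAL CLIMA itself.

        # Distance between two facial markers and its velocity from tracked 3D positions.
        # Shapes (frames x 3), millimetre units and the 100 Hz frame rate are hypothetical.
        import numpy as np

        fps = 100.0
        rng = np.random.default_rng(3)
        mouth_corner_left = np.cumsum(rng.normal(0, 0.05, (200, 3)), axis=0)
        mouth_corner_right = np.cumsum(rng.normal(0, 0.05, (200, 3)), axis=0) + [40.0, 0.0, 0.0]

        distance_mm = np.linalg.norm(mouth_corner_left - mouth_corner_right, axis=1)
        velocity_mm_s = np.gradient(distance_mm) * fps
        print(distance_mm.max(), np.abs(velocity_mm_s).max())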

  2. Sensible organizations: technology and methodology for automatically measuring organizational behavior.

    Science.gov (United States)

    Olguin Olguin, Daniel; Waber, Benjamin N; Kim, Taemie; Mohan, Akshay; Ara, Koji; Pentland, Alex

    2009-02-01

    We present the design, implementation, and deployment of a wearable computing platform for measuring and analyzing human behavior in organizational settings. We propose the use of wearable electronic badges capable of automatically measuring the amount of face-to-face interaction, conversational time, physical proximity to other people, and physical activity levels in order to capture individual and collective patterns of behavior. Our goal is to be able to understand how patterns of behavior shape individuals and organizations. By using on-body sensors in large groups of people for extended periods of time in naturalistic settings, we have been able to identify, measure, and quantify social interactions, group behavior, and organizational dynamics. We deployed this wearable computing platform in a group of 22 employees working in a real organization over a period of one month. Using these automatic measurements, we were able to predict employees' self-assessments of job satisfaction and their own perceptions of group interaction quality by combining data collected with our platform and e-mail communication data. In particular, the total amount of communication was predictive of both of these assessments, and betweenness in the social network exhibited a high negative correlation with group interaction satisfaction. We also found that physical proximity and e-mail exchange had a negative correlation of r = -0.55 (p < 0.01), which has far-reaching implications for past and future research on social networks.
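
    A minimal sketch of the network measure mentioned here, betweenness centrality computed from a face-to-face interaction graph and correlated with a satisfaction score; the interaction counts and survey values are placeholders, not the study's data.

        # Betweenness centrality from a face-to-face interaction graph, correlated with satisfaction.
        # The interaction edges and satisfaction scores are placeholders.
        import networkx as nx
        import numpy as np

        interactions = [("a", "b", 12), ("a", "c", 5), ("b", "c", 7),
                        ("c", "d", 9), ("d", "e", 3), ("a", "e", 2)]
        g = nx.Graph()
        g.add_weighted_edges_from(interactions)

        betweenness = nx.betweenness_centrality(g)
        satisfaction = {"a": 3.1, "b": 4.0, "c": 2.5, "d": 3.6, "e": 4.2}   # placeholder survey scores
        employees = sorted(betweenness)
        r = np.corrcoef([betweenness[e] for e in employees],
                        [satisfaction[e] for e in employees])[0, 1]
        print(betweenness, round(r, 2))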

  3. Managing Above the Graft: How Management Needs its Fertile ...

    African Journals Online (AJOL)

    Managing Above the Graft: How Management Needs its Fertile Wounds from which ... both captured the imagination of the employees and benefited the core business of a ... The decision was fortuitous, given that the leadership development ...

  4. 30 CFR 77.314 - Automatic temperature control instruments.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic temperature control instruments. 77... UNDERGROUND COAL MINES Thermal Dryers § 77.314 Automatic temperature control instruments. (a) Automatic temperature control instruments for thermal dryer system shall be of the recording type. (b) Automatic...

  5. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic engineering, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, the electrometer, linearization of systems, the state space, state-space analysis of electrical systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, the procedure for drawing a root locus, frequency response, and the design of control systems.

  6. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    International Nuclear Information System (INIS)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing fully automatic plant operation after reactor scram events. The goal of automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to control the necessary equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing the results obtained using the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship 'Mutsu'. As a result, it was shown that this system had sufficient performance to bring the reactor to a hot stand-by condition quickly and safely. (author)

  7. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    Energy Technology Data Exchange (ETDEWEB)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing fully automatic plant operation after reactor scram events. The goal of automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to control the necessary equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing the results obtained using the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship `Mutsu`. As a result, it was shown that this system had sufficient performance to bring the reactor to a hot stand-by condition quickly and safely. (author)

  8. Automatic exchange unit for control rod drive device

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

    Purpose: To enable automatic re-operation and continuation, without an external power-interruption remedy device, when the interrupted power source is recovered during automatic positioning operation. Constitution: In an automatic exchange unit for a control rod drive device of the type controlled by setting to zero the deviation between the positioning target position and the present position of the device, the position data of the drive device and the positioning target value of the device are automatically read, and an operation-inhibit interlock is applied to the control system until the data reading is completed and the automatic operation start or restart conditions are sequentially confirmed. After the confirmation, the interlock is released to start the automatic operation or re-operation. Accordingly, the automatic operation can be safely restarted and continued. (Yoshihara, H.)

  9. The Impact of Lecture Capture on Student Performance in Business Courses

    Science.gov (United States)

    Terry, Neil; Macy, Anne; Clark, Robin; Sanders, Gary

    2015-01-01

    This paper examines the effect of the e-learning technology of lecture capture on the performance of undergraduate business students in business law, economics, finance, and management courses. The sample consists of 890 student observations at a midsized regional institution located in the Southwestern region of the United States. The dependent…

  10. Failure of the extended contingent attentional capture account in multimodal settings

    Directory of Open Access Journals (Sweden)

    Rob H.J. Van der Lubbe

    2006-01-01

Full Text Available Sudden changes in our environment, like sound bursts or light flashes, are thought to automatically attract our attention, thereby affecting responses to subsequent targets, although an alternative view (the contingent attentional capture account) holds that stimuli only capture our attention when they match target features. In the current study, we examined whether an extended version of the latter view can explain exogenous cuing effects on speed and accuracy of performance to targets (uncued-cued) in multimodal settings, in which auditory and visual stimuli co-occur. To this end, we determined whether observed effects of visual and auditory cues, which were always intermixed, depend on top-down settings in "pure" blocks, in which only one target modality occurred, as compared to "mixed" blocks, in which targets were either visual or auditory. Results revealed that unimodal and crossmodal cuing effects depend on top-down settings. However, our findings were not in accordance with predictions derived from the extended contingent attentional capture account. Specifically, visual cues showed comparable effects for visual targets in pure and mixed blocks, but also a comparable effect for auditory targets in pure blocks, and most surprisingly, an opposite effect in mixed blocks. The latter result suggests that visual stimuli may distract attention from the auditory modality when the modality of the forthcoming target is unknown. The results additionally revealed that the Simon effect, the influence of correspondence or not between stimulus and response side, is modulated by exogenous cues in unimodal settings, but not in crossmodal settings. These findings accord with the view that attention plays an important role in the Simon effect, and additionally question the directness of links between maps of visual and auditory space.

  11. A modular (almost) automatic set-up for elastic multi-tenants cloud (micro)infrastructures

    Science.gov (United States)

    Amoroso, A.; Astorino, F.; Bagnasco, S.; Balashov, N. A.; Bianchi, F.; Destefanis, M.; Lusso, S.; Maggiora, M.; Pellegrino, J.; Yan, L.; Yan, T.; Zhang, X.; Zhao, X.

    2017-10-01

An auto-installing tool on a USB drive can allow for a quick and easy automatic deployment of OpenNebula-based cloud infrastructures remotely managed by a central VMDIRAC instance. A single team, in the main site of an HEP Collaboration or elsewhere, can manage and run a relatively large network of federated (micro-)cloud infrastructures, making a highly dynamic and elastic use of computing resources. Exploiting such an approach can lead to modular systems of cloud-bursting infrastructures addressing complex real-life scenarios.

  12. Improve wildlife species tracking—Implementing an enhanced global positioning system data management system for California condors

    Science.gov (United States)

    Waltermire, Robert G.; Emmerich, Christopher U.; Mendenhall, Laura C.; Bohrer, Gil; Weinzierl, Rolf P.; McGann, Andrew J.; Lineback, Pat K.; Kern, Tim J.; Douglas, David C.

    2016-05-03

    U.S. Fish and Wildlife Service (USFWS) staff in the Pacific Southwest Region and at the Hopper Mountain National Wildlife Refuge Complex requested technical assistance to improve their global positioning system (GPS) data acquisition, management, and archive in support of the California Condor Recovery Program. The USFWS deployed and maintained GPS units on individual Gymnogyps californianus (California condor) in support of long-term research and daily operational monitoring and management of California condors. The U.S. Geological Survey (USGS) obtained funding through the Science Support Program to provide coordination among project participants, provide GPS Global System for Mobile Communication (GSM) transmitters for testing, and compare GSM/GPS with existing Argos satellite GPS technology. The USFWS staff worked with private companies to design, develop, and fit condors with GSM/GPS transmitters. The Movebank organization, an online database of animal tracking data, coordinated with each of these companies to automatically stream their GPS data into Movebank servers and coordinated with USFWS to improve Movebank software for managing transmitter data, including proofing/error checking of incoming GPS data. The USGS arranged to pull raw GPS data from Movebank into the USGS California Condor Management and Analysis Portal (CCMAP) (https://my.usgs.gov/ccmap) for production and dissemination of a daily map of condor movements including various automated alerts. Further, the USGS developed an automatic archiving system for pulling raw and proofed Movebank data into USGS ScienceBase to comply with the Federal Information Security Management Act of 2002. This improved data management system requires minimal manual intervention resulting in more efficient data flow from GPS data capture to archive status. As a result of the project’s success, Pinnacles National Park and the Ventana Wildlife Society California condor programs became partners and adopted the same

  13. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

This book describes methods of position determination and their characteristics, control methods for position determination and design considerations, sensor selection for position detectors, position determination in digital control systems, the application of clutch brakes in high-frequency position determination, automation techniques for position determination, position determination by electromagnetic clutch and brake, air cylinder, cam and solenoid, and stop-position control of automatic guided vehicles, stacker cranes and automatic transfer control.

  14. A neurocomputational model of automatic sequence production.

    Science.gov (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory

    2015-07-01

    Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the BG in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.

  15. Simple technique to treat pupillary capture after transscleral fixation of intraocular lens.

    Science.gov (United States)

    Jürgens, Ignasi; Rey, Amanda

    2015-01-01

    We describe a simple surgical technique to manage pupillary capture after previous transscleral fixation of an intraocular lens. Neither author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  16. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    Science.gov (United States)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been used as an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images, due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application dependent, while manual tracing methods are extremely time consuming and have a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to counterbalance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach to improve the boundary searching ability of the live wire method, reducing the necessary user interaction while maintaining segmentation performance. Based on the results of segmentation of 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy, i.e. < 80%, when auto-enhancement is applied for live-wire segmentation.

  17. Disruption management in a two-period three-tier electronics supply chain

    Directory of Open Access Journals (Sweden)

    Johannes Danusantoso

    2016-12-01

Full Text Available We study strategies to manage demand disruptions in a three-tier electronics supply chain consisting of an Electronics Manufacturing Services provider, an Original Equipment Manufacturer (OEM), and a Retailer. We model price sensitivity of consumer demand with the two functions commonly used for this purpose, linear and exponential, and introduce disruptions in the demand function. We assume each supply chain member faces an increasing marginal unit cost function. Our decentralized supply chain setting is governed by a wholesale price contract. The OEM possesses greater bargaining power and therefore is the Stackelberg leader. A penalty cost incurred by the Retailer is introduced to capture the cost of deviation from the original plan. We find exact analytical solutions for the effectiveness of managing the disruption when the consumer demand function is linear, and we provide numerical examples as an illustration when the consumer demand function is either linear or exponential. We show that the original production quantity exhibits some robustness under disruptions in both centralized and decentralized supply chains, while the original optimal pricing does not. We show that supply chain managers should not automatically react to an individual disruption; in certain cases it is best to leave the production plan unchanged.
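
To make the modeling ingredients concrete, the toy Python sketch below (hypothetical numbers, not the paper's model or solution) illustrates a linear price-sensitive demand curve, a disruption that shifts it, and a penalty for deviating from the originally planned quantity.

```python
# Toy illustration (hypothetical parameters): linear price-sensitive demand with a
# disruption, and a penalty for deviating from the originally planned quantity.
a, b = 100.0, 2.0          # baseline demand intercept and price sensitivity
delta = 15.0               # demand disruption (shift of the intercept)
unit_cost = 10.0           # retailer's unit acquisition cost
penalty = 3.0              # cost per unit of deviation from the original plan
q_plan = 30.0              # originally planned quantity

def profit(q):
    price = (a + delta - q) / b                  # inverse linear demand after disruption
    return q * (price - unit_cost) - penalty * abs(q - q_plan)

# Simple grid search for the best post-disruption quantity.
best_q = max((q / 10 for q in range(0, 1001)), key=profit)
print(round(best_q, 1), round(profit(best_q), 2))
```

With a small disruption and a nonzero penalty, the best quantity stays close to the original plan, which is the kind of robustness the abstract refers to.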

  18. The ‘Continuing Misfortune’ of Automatism in Early Surrealism

    Directory of Open Access Journals (Sweden)

    Tessel M. Bauduin

    2015-09-01

    Full Text Available In the 1924 Manifesto of Surrealism surrealist leader André Breton (1896-1966 defined Surrealism as ‘psychic automatism in its pure state,’ positioning ‘psychic automatism’ as both a concept and a technique. This definition followed upon an intense period of experimentation with various forms of automatism among the proto-surrealist group; predominantly automatic writing, but also induced dream states. This article explores how surrealist ‘psychic automatism’ functioned as a mechanism for communication, or the expression of thought as directly as possible through the unconscious, in the first two decades of Surrealism. It touches upon automatic writing, hysteria as an automatic bodily performance of the unconscious, dreaming and the experimentation with induced dream states, and automatic drawing and other visual arts-techniques that could be executed more or less automatically as well. For all that the surrealists reinvented automatism for their own poetic, artistic and revolutionary aims, the automatic techniques were primarily drawn from contemporary Spiritualism, psychical research and experimentation with mediums, and the article teases out the connections to mediumistic automatism. It is demonstrated how the surrealists effectively and successfully divested automatism of all things spiritual. It furthermore becomes clear that despite various mishaps, automatism in many forms was a very successful creative technique within Surrealism.

  19. Automated attendance management and alert system | Rahim ...

    African Journals Online (AJOL)

    Automated attendance management and alert system. ... Journal of Fundamental and Applied Sciences ... AAMAS provides various functions, from managing and recording students' attendance record, to sending automatic alerts to students ...

  20. Density Functional Theory Calculations of Activation Energies for Carrier Capture by Defects in Semiconductors

    Science.gov (United States)

    Modine, N. A.; Wright, A. F.; Lee, S. R.

The rate of defect-induced carrier recombination is determined by both defect levels and carrier capture cross-sections. Density functional theory (DFT) has been widely and successfully used to predict defect levels, but only recently has work begun to focus on using DFT to determine carrier capture cross-sections. Lang and Henry developed the theory of carrier capture by multiphonon emission in the 1970s and showed that carrier-capture cross-sections differ between defects primarily due to differences in their carrier capture activation energies. We present an approach to using DFT to calculate carrier capture activation energies that does not depend on an assumed configuration coordinate and that fully accounts for anharmonic effects, which can substantially modify carrier activation energies. We demonstrate our approach for intrinsic defects in GaAs and GaN and discuss how our results depend on the choice of exchange-correlation functional and the treatment of spin polarization. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.
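
For reference, in the Lang–Henry multiphonon-emission picture cited above the capture cross-section is thermally activated; a schematic statement of the relation (not the paper's specific formulation) is:

```latex
% Schematic Lang--Henry form: thermally activated capture cross-section,
% with E_a the capture activation energy and <v_th> the carrier thermal velocity.
\[
\sigma(T) \;=\; \sigma_{\infty}\,\exp\!\left(-\frac{E_{a}}{k_{B}T}\right),
\qquad
c_{n,p}(T) \;=\; \sigma(T)\,\langle v_{\mathrm{th}}\rangle
\]
```

Here c is the capture coefficient; differences in E_a between defects are what chiefly distinguish their capture rates.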

  1. Image simulation for automatic license plate recognition

    Science.gov (United States)

    Bala, Raja; Zhao, Yonghui; Burry, Aaron; Kozitsky, Vladimir; Fillion, Claude; Saunders, Craig; Rodríguez-Serrano, José

    2012-01-01

    Automatic license plate recognition (ALPR) is an important capability for traffic surveillance applications, including toll monitoring and detection of different types of traffic violations. ALPR is a multi-stage process comprising plate localization, character segmentation, optical character recognition (OCR), and identification of originating jurisdiction (i.e. state or province). Training of an ALPR system for a new jurisdiction typically involves gathering vast amounts of license plate images and associated ground truth data, followed by iterative tuning and optimization of the ALPR algorithms. The substantial time and effort required to train and optimize the ALPR system can result in excessive operational cost and overhead. In this paper we propose a framework to create an artificial set of license plate images for accelerated training and optimization of ALPR algorithms. The framework comprises two steps: the synthesis of license plate images according to the design and layout for a jurisdiction of interest; and the modeling of imaging transformations and distortions typically encountered in the image capture process. Distortion parameters are estimated by measurements of real plate images. The simulation methodology is successfully demonstrated for training of OCR.
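
The distortion-modeling step lends itself to a small illustration. The sketch below is a minimal, hypothetical example (not the authors' pipeline) assuming OpenCV and NumPy; the perspective, blur, and noise parameters are placeholders rather than values estimated from real plate images.

```python
import cv2
import numpy as np

def distort_plate(plate, blur_sigma=1.5, noise_sigma=8.0, tilt_px=12):
    """Apply illustrative capture distortions to a synthetic plate image:
    a mild perspective tilt, optical blur, and additive sensor noise."""
    h, w = plate.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32([[tilt_px, 0], [w, tilt_px], [w - tilt_px, h], [0, h - tilt_px]])
    M = cv2.getPerspectiveTransform(src, dst)          # simulated camera viewpoint
    out = cv2.warpPerspective(plate, M, (w, h))
    out = cv2.GaussianBlur(out, (0, 0), blur_sigma)    # simulated optical blur
    noise = np.random.normal(0.0, noise_sigma, out.shape)
    return np.clip(out.astype(np.float32) + noise, 0, 255).astype(np.uint8)

# Usage: start from a clean synthetic plate rendering (here just a blank canvas).
synthetic = np.full((60, 260), 255, dtype=np.uint8)
degraded = distort_plate(synthetic)
```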

  2. Automatic Counting of Large Mammals from Very High Resolution Panchromatic Satellite Imagery

    NARCIS (Netherlands)

    Xue, Yifei; Wang, Tiejun; Skidmore, Andrew K.

    2017-01-01

    Estimating animal populations by direct counting is an essential component of wildlife conservation and management. However, conventional approaches (i.e., ground survey and aerial survey) have intrinsic constraints. Advances in image data capture and processing provide new opportunities for using

  3. Interactive Business Development, Capturing Business Knowledge and Practice: A Case Study

    Science.gov (United States)

    McKelvie, Gregor; Dotsika, Fefie; Patrick, Keith

    2007-01-01

    Purpose: The purpose of this paper is to follow the planning and development of MapaWiki, a Knowledge Management System for Mapa, an independent research company that specialises in competitor benchmarking. Starting with the standard requirements to capture, store and share information and knowledge, a system was sought that would allow growth and…

  4. Small Particles Intact Capture Experiment (SPICE)

    Science.gov (United States)

    Nishioka, Ken-Ji; Carle, G. C.; Bunch, T. E.; Mendez, David J.; Ryder, J. T.

    1994-01-01

    The Small Particles Intact Capture Experiment (SPICE) will develop technologies and engineering techniques necessary to capture nearly intact, uncontaminated cosmic and interplanetary dust particles (IDP's). Successful capture of such particles will benefit the exobiology and planetary science communities by providing particulate samples that may have survived unaltered since the formation of the solar system. Characterization of these particles may contribute fundamental data to our knowledge of how these particles could have formed into our planet Earth and, perhaps, contributed to the beginnings of life. The term 'uncontaminated' means that captured cosmic and IDP particles are free of organic contamination from the capture process and the term 'nearly intact capture' means that their chemical and elemental components are not materially altered during capture. The key to capturing cosmic and IDP particles that are organic-contamination free and nearly intact is the capture medium. Initial screening of capture media included organic foams, multiple thin foil layers, and aerogel (a silica gel); but, with the exception of aerogel, the requirements of no contamination or nearly intact capture were not met. To ensure no contamination of particles in the capture process, high-purity aerogel was chosen. High-purity aerogel results in high clarity (visual clearness), a useful quality in detection and recovery of embedded captured particles from the aerogel. P. Tsou at the Jet Propulsion Laboratory (JPL) originally described the use of aerogel for this purpose and reported laboratory test results. He has flown aerogel as a 'GAS-can Lid' payload on STS-47 and is evaluating the results. The Timeband Capture Cell Experiment (TICCE), a Eureca 1 experiment, is also flying aerogel and is scheduled for recovery in late April.

  5. Programmable automatic alpha--beta air sample counter

    International Nuclear Information System (INIS)

    Howell, W.P.

    1978-01-01

A programmable automatic alpha-beta air sample counter was developed for routine sample counting by operational health physics personnel. The system is composed of an automatic sample changer utilizing a large silicon diode detector, an electronic counting system with energy analysis capability, an automatic data acquisition controller, an interface module, and a teletypewriter with paper tape punch and paper tape reader. The system is operated through the teletypewriter keyboard and the paper tape reader, which are used to instruct the automatic data acquisition controller. Paper tape programs are provided for background counting, χ² test, and sample counting. Output data are printed by the teletypewriter on standard continuous roll or multifold paper. Data are automatically corrected for background and counter efficiency
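
A minimal sketch of the background and efficiency correction applied to such counts (illustrative only, not the instrument's paper-tape program) might look like:

```python
import math

def net_activity(gross_counts, count_time_s, background_rate_cps, efficiency):
    """Background- and efficiency-corrected activity estimate; the 1-sigma
    uncertainty here reflects gross-count Poisson statistics only."""
    gross_rate = gross_counts / count_time_s
    net_rate = gross_rate - background_rate_cps
    activity_bq = net_rate / efficiency           # decays per second
    sigma_rate = math.sqrt(gross_counts) / count_time_s
    return activity_bq, sigma_rate / efficiency   # value and 1-sigma uncertainty

print(net_activity(gross_counts=1250, count_time_s=600,
                   background_rate_cps=0.4, efficiency=0.25))
```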

  6. Laser capture microdissection: Arcturus(XT) infrared capture and UV cutting methods.

    Science.gov (United States)

    Gallagher, Rosa I; Blakely, Steven R; Liotta, Lance A; Espina, Virginia

    2012-01-01

    Laser capture microdissection (LCM) is a technique that allows the precise procurement of enriched cell populations from a heterogeneous tissue under direct microscopic visualization. LCM can be used to harvest the cells of interest directly or can be used to isolate specific cells by ablating the unwanted cells, resulting in histologically enriched cell populations. The fundamental components of laser microdissection technology are (a) visualization of the cells of interest via microscopy, (b) transfer of laser energy to a thermolabile polymer with either the formation of a polymer-cell composite (capture method) or transfer of laser energy via an ultraviolet laser to photovolatize a region of tissue (cutting method), and (c) removal of cells of interest from the heterogeneous tissue section. Laser energy supplied by LCM instruments can be infrared (810 nm) or ultraviolet (355 nm). Infrared lasers melt thermolabile polymers for cell capture, whereas ultraviolet lasers ablate cells for either removal of unwanted cells or excision of a defined area of cells. LCM technology is applicable to an array of applications including mass spectrometry, DNA genotyping and loss-of-heterozygosity analysis, RNA transcript profiling, cDNA library generation, proteomics discovery, and signal kinase pathway profiling. This chapter describes the unique features of the Arcturus(XT) laser capture microdissection instrument, which incorporates both infrared capture and ultraviolet cutting technology in one instrument, using a proteomic downstream assay as a model.

  7. Robot-assisted automatic ultrasound calibration.

    Science.gov (United States)

    Aalamifar, Fereshteh; Cheng, Alexis; Kim, Younsu; Hu, Xiao; Zhang, Haichong K; Guo, Xiaoyu; Boctor, Emad M

    2016-10-01

    Ultrasound (US) calibration is the process of determining the unknown transformation from a coordinate frame such as the robot's tooltip to the US image frame and is a necessary task for any robotic or tracked US system. US calibration requires submillimeter-range accuracy for most applications, but it is a time-consuming and repetitive task. We provide a new framework for automatic US calibration with robot assistance and without the need for temporal calibration. US calibration based on active echo (AE) phantom was previously proposed, and its superiority over conventional cross-wire phantom-based calibration was shown. In this work, we use AE to guide the robotic arm motion through the process of data collection; we combine the capability of the AE point to localize itself in the frame of the US image with the automatic motion of the robotic arm to provide a framework for calibrating the arm to the US image automatically. We demonstrated the efficacy of the automated method compared to the manual method through experiments. To highlight the necessity of frequent ultrasound calibration, it is demonstrated that the calibration precision changed from 1.67 to 3.20 mm if the data collection is not repeated after a dismounting/mounting of the probe holder. In a large data set experiment, similar reconstruction precision of automatic and manual data collection was observed, while the time was reduced by 58 %. In addition, we compared ten automatic calibrations with ten manual ones, each performed in 15 min, and showed that all the automatic ones could converge in the case of setting the initial matrix as identity, while this was not achieved by manual data sets. Given the same initial matrix, the repeatability of the automatic was [0.46, 0.34, 0.80, 0.47] versus [0.42, 0.51, 0.98, 1.15] mm in the manual case for the US image four corners. The submillimeter accuracy requirement of US calibration makes frequent data collections unavoidable. We proposed an automated
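
Underlying any such calibration is a least-squares estimate of a rigid transform between corresponding point sets. The sketch below shows the standard SVD-based (Kabsch) solution as a generic stand-in, not the active-echo algorithm itself.

```python
import numpy as np

def rigid_transform(A, B):
    """Least-squares rotation R and translation t with B ~ R @ A + t,
    for corresponding 3xN point sets (standard SVD/Kabsch solution)."""
    cA, cB = A.mean(axis=1, keepdims=True), B.mean(axis=1, keepdims=True)
    H = (A - cA) @ (B - cB).T
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cB - R @ cA
    return R, t

# Tiny demo with a known rotation and translation; in calibration, A would hold
# points in the robot tooltip frame and B the same points in the US image frame.
A = np.array([[0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
B = R_true @ A + np.array([[1.0], [2.0], [3.0]])
R, t = rigid_transform(A, B)
print(np.allclose(R, R_true), t.ravel())
```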

  8. The problem of latent attentional capture: Easy visual search conceals capture by task-irrelevant abrupt onsets.

    Science.gov (United States)

    Gaspelin, Nicholas; Ruthruff, Eric; Lien, Mei-Ching

    2016-08-01

    Researchers are sharply divided regarding whether irrelevant abrupt onsets capture spatial attention. Numerous studies report that they do and a roughly equal number report that they do not. This puzzle has inspired numerous attempts at reconciliation, none gaining general acceptance. The authors propose that abrupt onsets routinely capture attention, but the size of observed capture effects depends critically on how long attention dwells on distractor items which, in turn, depends critically on search difficulty. In a series of spatial cuing experiments, the authors show that irrelevant abrupt onsets produce robust capture effects when visual search is difficult, but not when search is easy. Critically, this effect occurs even when search difficulty varies randomly across trials, preventing any strategic adjustments of the attentional set that could modulate probability of capture by the onset cue. The authors argue that easy visual search provides an insensitive test for stimulus-driven capture by abrupt onsets: even though onsets truly capture attention, the effects of capture can be latent. This observation helps to explain previous failures to find capture by onsets, nearly all of which used an easy visual search. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. Study on keV-neutron capture cross sections and capture γ-ray spectra of 117,119Sn

    International Nuclear Information System (INIS)

    Nishiyama, J.; Igashira, M.; Ohsaki, T.; Kim, G.N.; Chung, W.C.; Ro, T.I.

    2006-01-01

The capture cross sections and capture γ-ray spectra of 117,119Sn were measured in the incident neutron energy region from 10 to 100 keV and at 570 keV, using a 1.5-ns pulsed neutron source based on the 7Li(p,n)7Be reaction and a large anti-Compton NaI(Tl) γ-ray spectrometer. A pulse-height weighting technique was applied to the observed capture γ-ray pulse-height spectra to derive capture yields. The capture cross sections of 117,119Sn were obtained with an error of about 5% by using the standard capture cross sections of 197Au. The present cross sections were compared with previous experimental data and the evaluated values in JENDL-3.3 and ENDF/B-VI. The capture γ-ray spectra of 117,119Sn were derived by unfolding the observed capture γ-ray pulse-height spectra. Calculations of the capture cross sections and capture γ-ray spectra of 117,119Sn were performed with the EMPIRE-II code. The calculated results were compared with the present experimental ones. (author)
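
Schematically, a relative measurement of this kind relates the unknown cross section to the gold standard through the ratio of pulse-height-weighted capture yields; in a thin-sample approximation (ignoring self-shielding and multiple-scattering corrections, which real analyses must include):

```latex
% Schematic relative measurement against the 197Au standard, with Y the
% weighted capture yield and N the areal atom density of each sample.
\[
\sigma_{\mathrm{Sn}}(E_n) \;\approx\;
  \frac{Y_{\mathrm{Sn}}(E_n)}{Y_{\mathrm{Au}}(E_n)}\,
  \frac{N_{\mathrm{Au}}}{N_{\mathrm{Sn}}}\,
  \sigma_{\mathrm{Au}}(E_n)
\]
```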

  10. A training approach to improve stepping automaticity while dual-tasking in Parkinson's disease

    Science.gov (United States)

    Chomiak, Taylor; Watts, Alexander; Meyer, Nicole; Pereira, Fernando V.; Hu, Bin

    2017-01-01

    Abstract Background: Deficits in motor movement automaticity in Parkinson's disease (PD), especially during multitasking, are early and consistent hallmarks of cognitive function decline, which increases fall risk and reduces quality of life. This study aimed to test the feasibility and potential efficacy of a wearable sensor-enabled technological platform designed for an in-home music-contingent stepping-in-place (SIP) training program to improve step automaticity during dual-tasking (DT). Methods: This was a 4-week prospective intervention pilot study. The intervention uses a sensor system and algorithm that runs off the iPod Touch which calculates step height (SH) in real-time. These measurements were then used to trigger auditory (treatment group, music; control group, radio podcast) playback in real-time through wireless headphones upon maintenance of repeated large amplitude stepping. With small steps or shuffling, auditory playback stops, thus allowing participants to use anticipatory motor control to regain positive feedback. Eleven participants were recruited from an ongoing trial (Trial Number: ISRCTN06023392). Fear of falling (FES-I), general cognitive functioning (MoCA), self-reported freezing of gait (FOG-Q), and DT step automaticity were evaluated. Results: While we found no significant effect of training on FES-I, MoCA, or FOG-Q, we did observe a significant group (music vs podcast) by training interaction in DT step automaticity (Ptraining to increase motor automaticity for people living with PD. The training approach described here can be implemented at home to meet the growing demand for self-management of symptoms by patients. PMID:28151878

  11. The Moral Capture of "Being Good"

    DEFF Research Database (Denmark)

    Kjærgaard, Annemette Leonhardt; Morsing, Mette

    layer of institutional control for identity work that emerges beyond managerial influence, as employees as well as managers are morally inclined to comply with the corporate CSR promise of “being good”. Importantly, our findings show that members comply with the CSR message in four ways that include...... devotion but also suppression of overt forms of critique and resistance. We refer to these four compliance modes as the “moral capture of CSR”. We discuss the implications of compliance to CSR as a form of control of identity work, as we propose that CSR images “captivate” member identity in discursive...... closure and impede future development of CSR....

  12. Sensitivity analysis and design optimization through automatic differentiation

    International Nuclear Information System (INIS)

    Hovland, Paul D; Norris, Boyana; Strout, Michelle Mills; Bhowmick, Sanjukta; Utke, Jean

    2005-01-01

    Automatic differentiation is a technique for transforming a program or subprogram that computes a function, including arbitrarily complex simulation codes, into one that computes the derivatives of that function. We describe the implementation and application of automatic differentiation tools. We highlight recent advances in the combinatorial algorithms and compiler technology that underlie successful implementation of automatic differentiation tools. We discuss applications of automatic differentiation in design optimization and sensitivity analysis. We also describe ongoing research in the design of language-independent source transformation infrastructures for automatic differentiation algorithms
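
A minimal way to see what such a transformation does is forward-mode automatic differentiation with dual numbers; the sketch below is a generic illustration, not one of the tools discussed in the paper.

```python
class Dual:
    """Minimal forward-mode automatic differentiation with dual numbers:
    each value carries its derivative through arithmetic operations."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__


def f(x):                      # any code built from + and * is differentiated "for free"
    return 3 * x * x + 2 * x + 1

x = Dual(2.0, 1.0)             # seed dx/dx = 1
print(f(x).val, f(x).dot)      # value f(2) = 17 and derivative f'(2) = 14
```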

  13. Automatic Hidden-Web Table Interpretation by Sibling Page Comparison

    Science.gov (United States)

    Tao, Cui; Embley, David W.

The longstanding problem of automatic table interpretation still eludes us. Its solution would not only be an aid to table processing applications such as large volume table conversion, but would also be an aid in solving related problems such as information extraction and semi-structured data management. In this paper, we offer a conceptual modeling solution for the common special case in which so-called sibling pages are available. The sibling pages we consider are pages on the hidden web, commonly generated from underlying databases. We compare them to identify and connect nonvarying components (category labels) and varying components (data values). We tested our solution using more than 2,000 tables in source pages from three different domains—car advertisements, molecular biology, and geopolitical information. Experimental results show that the system can successfully identify sibling tables, generate structure patterns, interpret tables using the generated patterns, and automatically adjust the structure patterns, if necessary, as it processes a sequence of hidden-web pages. For these activities, the system was able to achieve an overall F-measure of 94.5%.
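
The core idea of sibling-page comparison can be sketched in a few lines: cells that stay constant across sibling pages behave as category labels, while cells that vary carry data values. The toy example below (hypothetical data, with table detection and cell alignment assumed already done) only illustrates that distinction.

```python
def classify_cells(sibling_tables):
    """Toy sibling-page comparison: cells identical across all sibling tables are
    treated as category labels; cells that differ are treated as data values."""
    labels, values = [], []
    for cells in zip(*sibling_tables):          # align cells position by position
        if len(set(cells)) == 1:
            labels.append(cells[0])             # nonvarying -> category label
        else:
            values.append(list(cells))          # varying -> data values
    return labels, values

page_a = ["Make", "Honda", "Year", "2004", "Price", "$7,900"]
page_b = ["Make", "Toyota", "Year", "2001", "Price", "$5,300"]
print(classify_cells([page_a, page_b]))
```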

  14. Automatically sweeping dual-channel boxcar integrator

    International Nuclear Information System (INIS)

    Keefe, D.J.; Patterson, D.R.

    1978-01-01

    An automatically sweeping dual-channel boxcar integrator has been developed to automate the search for a signal that repeatedly follows a trigger pulse by a constant or slowly varying time delay when that signal is completely hidden in random electrical noise and dc-offset drifts. The automatically sweeping dual-channel boxcar integrator improves the signal-to-noise ratio and eliminates dc-drift errors in the same way that a conventional dual-channel boxcar integrator does, but, in addition, automatically locates the hidden signal. When the signal is found, its time delay is displayed with 100-ns resolution, and its peak value is automatically measured and displayed. This relieves the operator of the tedious, time-consuming, and error-prone search for the signal whenever the time delay changes. The automatically sweeping boxcar integrator can also be used as a conventional dual-channel boxcar integrator. In either mode, it can repeatedly integrate a signal up to 990 times and thus make accurate measurements of the signal pulse height in the presence of random noise, dc offsets, and unsynchronized interfering signals
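
The principle behind the repeated integration is simple signal averaging: uncorrelated noise falls roughly as the square root of the number of sweeps, while the trigger-synchronized signal does not. A minimal NumPy sketch (illustrative only, not the instrument's electronics):

```python
import numpy as np

def boxcar_average(traces, dt_ns=100):
    """Average N trigger-aligned repetitions and report the delay of the peak
    of the averaged trace (in ns, given the sample spacing dt_ns)."""
    avg = np.mean(traces, axis=0)
    return avg, int(np.argmax(avg)) * dt_ns

# Toy data: a small pulse at a fixed delay, buried in noise on every sweep.
rng = np.random.default_rng(0)
t = np.arange(500)
signal = 0.2 * np.exp(-0.5 * ((t - 230) / 5.0) ** 2)
traces = np.array([signal + rng.normal(0, 1.0, t.size) for _ in range(990)])
avg, delay_ns = boxcar_average(traces)
print(delay_ns)   # recovers the hidden delay (about sample 230, i.e. 23000 ns here)
```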

  15. What Automaticity Deficit? Activation of Lexical Information by Readers with Dyslexia in a Rapid Automatized Naming Stroop-Switch Task

    Science.gov (United States)

    Jones, Manon W.; Snowling, Margaret J.; Moll, Kristina

    2016-01-01

    Reading fluency is often predicted by rapid automatized naming (RAN) speed, which as the name implies, measures the automaticity with which familiar stimuli (e.g., letters) can be retrieved and named. Readers with dyslexia are considered to have less "automatized" access to lexical information, reflected in longer RAN times compared with…

  16. A Wireless Sensor Network-Based Ubiquitous Paprika Growth Management System

    Science.gov (United States)

    Hwang, Jeonghwan; Shin, Changsun; Yoe, Hyun

    2010-01-01

    Wireless Sensor Network (WSN) technology can facilitate advances in productivity, safety and human quality of life through its applications in various industries. In particular, the application of WSN technology to the agricultural area, which is labor-intensive compared to other industries, and in addition is typically lacking in IT technology applications, adds value and can increase the agricultural productivity. This study attempts to establish a ubiquitous agricultural environment and improve the productivity of farms that grow paprika by suggesting a ‘Ubiquitous Paprika Greenhouse Management System’ using WSN technology. The proposed system can collect and monitor information related to the growth environment of crops outside and inside paprika greenhouses by installing WSN sensors and monitoring images captured by CCTV cameras. In addition, the system provides a paprika greenhouse environment control facility for manual and automatic control from a distance, improves the convenience and productivity of users, and facilitates an optimized environment to grow paprika based on the growth environment data acquired by operating the system. PMID:22163543

  17. 14 CFR 29.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 29.1329 Section 29... pilot system. (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and positively...

  18. 14 CFR 27.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 27.1329 Section 27... pilot system. (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and positively...

  19. Automatic generation of groundwater model hydrostratigraphy from AEM resistivity and boreholes

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; Christiansen, A. V.

    2014-01-01

    Regional hydrological models are important tools in water resources management. Model prediction uncertainty is primarily due to structural (geological) non-uniqueness which makes sampling of the structural model space necessary to estimate prediction uncertainties. Geological structures and hete...... and discharge observations. The method was applied to field data collected at a Danish field site. Our results show that a competitive hydrological model can be constructed from the AEM dataset using the automatic procedure outlined above....

  20. Enhancing the Value of Sensor-based Observations by Capturing the Knowledge of How An Observation Came to Be

    Science.gov (United States)

    Fredericks, J.; Rueda-Velasquez, C. A.

    2016-12-01

As we move from keeping data on our disks to sharing it with the world, often in real-time, we are obligated to also tell an unknown user about how our observations were made. Data that are shared must not only have ownership metadata, unit descriptions and content formatting information. The provider must also share information that is needed to assess the data as it relates to potential re-use. A user must be able to assess the limitations and capabilities of the sensor, as it is configured, to understand its value. For example, when an instrument is configured, it typically affects the data accuracy and operational limits of the sensor. An operator may sacrifice data accuracy to achieve a broader operational range and vice versa. If you are looking at newly discovered data, it is important to be able to find all of the information that relates to assessing the data quality for your particular application. Traditionally, metadata are captured by data managers who usually do not know how the data are collected. By the time data are distributed, this knowledge is often gone, buried within notebooks or hidden in documents that are not machine-harvestable and often not human-readable. In a recently funded NSF EarthCube Integrative Activity called X-DOMES (Cross-Domain Observational Metadata in EnviroSensing), mechanisms are underway to enable the capture of sensor and deployment metadata by sensor manufacturers and field operators. The support has enabled the development of a community ontology repository (COR) within the Earth Science Information Partnership (ESIP) community, fostering easy creation of resolvable terms for the broader community. This tool enables non-experts to easily develop W3C standards-based content, promoting the implementation of Semantic Web technologies for enhanced discovery of content and interoperability in workflows. The X-DOMES project is also developing a SensorML Viewer/Editor to provide an easy interface for sensor manufacturers and

  1. Automatic differentiation algorithms in model analysis

    NARCIS (Netherlands)

    Huiskes, M.J.

    2002-01-01

    Title: Automatic differentiation algorithms in model analysis
    Author: M.J. Huiskes
    Date: 19 March, 2002

    In this thesis automatic differentiation algorithms and derivative-based methods

  2. Using Probe Vehicle Data for Automatic Extraction of Road Traffic Parameters

    Directory of Open Access Journals (Sweden)

    Roman Popescu Maria Alexandra

    2016-12-01

Full Text Available Through this paper the author aims to study and find solutions for automatic detection of traffic light positions and for automatic calculation of the waiting time at traffic lights. The first objective serves mainly the road transportation field, because it removes the need for collaboration with local authorities to establish a national network of traffic lights. The second objective is important not only for companies providing navigation solutions, but especially for authorities, institutions and companies operating road traffic management systems. Real-time dynamic determination of traffic queue length and of waiting time at traffic lights allows the creation of dynamic, intelligent and flexible systems, adapted to actual traffic conditions rather than to generic, theoretical models. Thus, cities can approach the Smart City concept by boosting, streamlining and greening road transport, as promoted in Europe through the Horizon 2020 Smart Cities, Urban Mobility initiative.
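
One simple way to approximate the second objective from probe traces is to treat contiguous low-speed runs as stops and take their durations as waiting times; the sketch below is only an illustration of that idea with made-up data, not the author's method.

```python
import numpy as np

def stop_durations(timestamps_s, speeds_mps, stop_speed=0.5):
    """Estimate waiting times from a probe-vehicle trace: each contiguous run
    with speed below a threshold is counted as one stop (illustrative only)."""
    stopped = speeds_mps < stop_speed
    durations, start = [], None
    for i, s in enumerate(stopped):
        if s and start is None:
            start = timestamps_s[i]
        elif not s and start is not None:
            durations.append(timestamps_s[i] - start)
            start = None
    if start is not None:
        durations.append(timestamps_s[-1] - start)
    return durations

ts = np.arange(0, 120, 1.0)                        # 1 Hz probe samples
v = np.where((ts >= 30) & (ts < 75), 0.0, 12.0)    # one 45-s stop at a red light
print(stop_durations(ts, v))                       # -> one stop of about 45 s
```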

  3. Marker-Free Human Motion Capture

    DEFF Research Database (Denmark)

    Grest, Daniel

    Human Motion Capture is a widely used technique to obtain motion data for animation of virtual characters. Commercial optical motion capture systems are marker-based. This book is about marker-free motion capture and its possibilities to acquire motion from a single viewing direction. The focus...

  4. Monitoring compliance with transfusion guidelines in hospital departments by electronic data capture

    DEFF Research Database (Denmark)

    Norgaard, Astrid; De Lichtenberg, Trine Honnens; Nielsen, Jens

    2014-01-01

    -transfusion haemoglobin levels at the departmental level. In a tertiary care hospital, no such data were produced before this study. Our aim was to establish a Patient Blood Management database based on electronic data capture in order to monitor compliance with transfusion guidelines at departmental and hospital levels...

  5. [An automatic system for anatomophysiological correlation in three planes simultaneously during functional neurosurgery].

    Science.gov (United States)

    Teijeiro, E J; Macías, R J; Morales, J M; Guerra, E; López, G; Alvarez, L M; Fernández, F; Maragoto, C; Seijo, F; Alvarez, E

The Neurosurgical Deep Recording System (NDRS) using a personal computer takes the place of complex electronic equipment for recording and processing deep cerebral electrical activity, as a guide in stereotaxic functional neurosurgery. It also permits increased possibilities of presenting information in direct graphic form with automatic management and sufficient flexibility to implement different analyses. This paper describes the possibilities of automatic simultaneous graphic representation in three almost orthogonal planes, available with the new 5.1 version of NDRS, so as to facilitate the analysis of anatomophysiological correlation in the localization of deep structures of the brain during minimal access surgery. This new version can automatically show the spatial behaviour of signals registered throughout the path of the electrode inside the brain, superimposed simultaneously on sagittal, coronal and axial sections of an anatomical atlas of the brain, after adjusting the scale automatically according to the dimensions of the brain of each individual patient. This may also be shown in a three-dimensional representation of the different planes themselves intersecting. The NDRS system has been successfully used in Spain and Cuba in over 300 functional neurosurgery operations. The new version further facilitates analysis of spatial anatomophysiological correlation for the localization of brain structures. This system has contributed to increased precision and safety in selecting surgical targets in the control of Parkinson's disease and other disorders of movement.

  6. Design of patient rooms and automatic radioiodine-131 waste water management system for a thyroid cancer treatment ward: 'Suandok Model'.

    Science.gov (United States)

    Vilasdechanon, N; Ua-Apisitwong, S; Chatnampet, K; Ekmahachai, M; Vilasdechanon, J

    2014-09-01

    The great benefit of (131)I radionuclide treatment for differentiated thyroid cancer (DTC) was acknowledged by the long survival rate. The main requirements for (131)I therapy in hospital were treatment facilities and a radiation safety plan that assured radiation protection and safety to patient, hospital worker, public, and environment. To introduce the concepts and methods of radiation safety design for a patient's room in a (131)I treatment ward and a system of radioactive waste water management in hospital. The design was based on principles of external and internal radiation protection for unsealed source and radioactive waste management. Planning for treatment facilities was concluded from clinical evidence, physical and physiological information for (131)I, radiation safety criteria, hospital resources and budget. The three phases of the working process were: construction, software development, and radiation safety assessment. The (131)I treatment facility and automatic radioactive waste water management system was completely implemented in 2009. The radiation waste water management system known as the 'Suandok Model' was highly recommended by the national regulator to hospitals who desire to provide (131)I treatment for thyroid cancer. In 2011, the Nuclear Medicine Division, Chiang Mai University was rewarded by the national authority for a very good radiation practice in development of safe working conditions and environment. The Suandok Model was a facility design that fulfilled requirements for the safe use of high radiation (131)I doses for thyroid cancer treatment in hospital. The facility presented in this study may not be suitable for all hospitals but the design concepts could be applied according to an individual hospital context and resources. People who use or gain benefit from radiation applications have to emphasise the responsibility to control and monitor radiation effects on individuals, communities and the environment.

  7. Stimulus-driven capture and contingent capture

    NARCIS (Netherlands)

    Theeuwes, J.; Olivers, C.N.L.; Belopolsky, A.V.

    2010-01-01

    Whether or not certain physical events can capture attention has been one of the most debated issues in the study of attention. This discussion is concerned with how goal-directed and stimulus-driven processes interact in perception and cognition. On one extreme of the spectrum is the idea that

  8. Using a Motion Capture System for Spatial Localization of EEG Electrodes.

    Directory of Open Access Journals (Sweden)

    Pedro eReis

    2015-04-01

Full Text Available Electroencephalography (EEG) is often used in source analysis studies, in which the locations of cortex regions responsible for a signal are determined. For this to be possible, accurate positions of the electrodes at the scalp surface must be determined, otherwise errors in the source estimation will occur. Today, several methods for acquiring these positions exist, but they are often not satisfactorily accurate or take a long time to perform. Therefore, in this paper we describe a method capable of determining the positions accurately and quickly. This method uses an infrared light motion capture system (IR-MOCAP) with 8 cameras arranged around a human participant. It acquires 3D coordinates of each electrode and automatically labels them. Each electrode has a small reflector on top of it, thus allowing its detection by the cameras. We tested the accuracy of the presented method by acquiring the electrode positions on a rigid sphere model and comparing these with measurements from computer tomography (CT). The average Euclidean distance between the sphere model CT measurements and the presented method was 1.23 mm with an average standard deviation of 0.51 mm. We also tested the method with a human participant. The measurement was quickly performed and all positions were captured. These results show that, with this method, it is possible to acquire electrode positions with minimal error and little time effort for the study participants and investigators.
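
The accuracy figures quoted above amount to per-electrode Euclidean distances between the motion-capture positions and the CT reference; a minimal NumPy sketch (with made-up coordinates) is:

```python
import numpy as np

def position_errors(measured, reference):
    """Per-electrode Euclidean distances between motion-capture and CT positions,
    plus their mean and standard deviation (all coordinates in mm)."""
    d = np.linalg.norm(np.asarray(measured) - np.asarray(reference), axis=1)
    return d, d.mean(), d.std(ddof=1)

mocap = [[10.2, 0.8, 55.1], [35.9, -4.1, 60.3], [-22.4, 12.0, 48.7]]
ct    = [[11.0, 1.2, 54.5], [35.0, -3.5, 61.0], [-23.0, 11.4, 49.9]]
print(position_errors(mocap, ct)[1:])   # mean and std of the per-electrode errors
```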

  9. Prey capture behaviour evoked by simple visual stimuli in larval zebrafish

    Directory of Open Access Journals (Sweden)

    Isaac Henry Bianco

    2011-12-01

Full Text Available Understanding how the nervous system recognises salient stimuli in the environment and selects and executes the appropriate behavioural responses is a fundamental question in systems neuroscience. To facilitate the neuroethological study of visually-guided behaviour in larval zebrafish, we developed virtual reality assays in which precisely controlled visual cues can be presented to larvae whilst their behaviour is automatically monitored using machine-vision algorithms. Freely swimming larvae responded to moving stimuli in a size-dependent manner: they directed multiple low amplitude orienting turns (∼20°) towards small moving spots (1°) but reacted to larger spots (10°) with high-amplitude aversive turns (∼60°). The tracking of small spots led us to examine how larvae respond to prey during hunting routines. By analysing movie sequences of larvae hunting paramecia, we discovered that all prey capture routines commence with eye convergence and larvae maintain their eyes in a highly converged position for the duration of the prey-tracking and capture swim phases. We adapted our virtual reality assay to deliver artificial visual cues to partially restrained larvae and found that small moving spots evoked convergent eye movements and J-turns of the tail, which are defining features of natural hunting. We propose that eye convergence represents the engagement of a predatory mode of behaviour in larval fish and serves to increase the region of binocular visual space to enable stereoscopic targeting of prey.

  10. Building the Knowledge Base to Support the Automatic Animation Generation of Chinese Traditional Architecture

    Science.gov (United States)

    Wei, Gongjin; Bai, Weijing; Yin, Meifang; Zhang, Songmao

    We present a practice of applying the Semantic Web technologies in the domain of Chinese traditional architecture. A knowledge base consisting of one ontology and four rule bases is built to support the automatic generation of animations that demonstrate the construction of various Chinese timber structures based on the user's input. Different Semantic Web formalisms are used, e.g., OWL DL, SWRL and Jess, to capture the domain knowledge, including the wooden components needed for a given building, construction sequence, and the 3D size and position of every piece of wood. Our experience in exploiting the current Semantic Web technologies in real-world application systems indicates their prominent advantages (such as the reasoning facilities and modeling tools) as well as the limitations (such as low efficiency).

  11. TFTR data management system

    International Nuclear Information System (INIS)

    Randerson, L.; Chu, J.; Ludescher, C.; Malsbury, J.; Stark, W.

    1986-01-01

Developments in the tokamak fusion test reactor (TFTR) data management system supporting data acquisition and off-line physics data reduction are described. Data from monitor points, timing channels, transient recorder channels, and other devices are acquired and stored for use by on-line tasks. Files are transferred off-line automatically. A configuration utility determines the data acquired and the files transferred. An event system driven by file arrival activates off-line reduction processes. A post-run process transfers files not shipped during runs. Files are archived to tape and are retrievable by digraph and shot number. Automatic skimming based on most recent access, file type, shot numbers, and user-set protection maintains the files required for post-run data reduction
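
The skimming policy described above can be pictured as a simple rule over file metadata; the sketch below is a toy approximation with hypothetical fields and thresholds, not the TFTR utility.

```python
import time

def skim_candidates(files, keep_types=("raw",), protect_days=7, now=None):
    """Toy skimming rule in the spirit described above: keep user-protected files,
    recently accessed files, and protected file types; everything else becomes a
    candidate for removal from disk (it remains retrievable from archive tape)."""
    now = now or time.time()
    candidates = []
    for f in files:   # f: dict with name, type, last_access (epoch s), protected
        recently_used = (now - f["last_access"]) < protect_days * 86400
        if f["protected"] or recently_used or f["type"] in keep_types:
            continue
        candidates.append(f["name"])
    return candidates

files = [
    {"name": "shot_90123.raw", "type": "raw", "last_access": 0, "protected": False},
    {"name": "shot_90001.red", "type": "reduced", "last_access": 0, "protected": False},
]
print(skim_candidates(files))   # -> ['shot_90001.red']
```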

  12. Automatic intelligent cruise control

    OpenAIRE

    Stanton, NA; Young, MS

    2006-01-01

This paper reports a study on the evaluation of automatic intelligent cruise control (AICC) from a psychological perspective. It was anticipated that AICC would have an effect upon the psychology of driving—namely, make the driver feel like they have less control, reduce the level of trust in the vehicle, and make drivers less situationally aware, but might also reduce workload and make driving less stressful. Drivers were asked to drive in a driving simulator under manual and automatic inte...

  13. Semi-automatic parking slot marking recognition for intelligent parking assist systems

    Directory of Open Access Journals (Sweden)

    Ho Gi Jung

    2014-01-01

    Full Text Available This paper proposes a semi-automatic parking slot marking-based target position designation method for parking assist systems in cases where the parking slot markings are of a rectangular type, and its efficient implementation for real-time operation. After the driver observes a rearview image captured by a rearward camera installed at the rear of the vehicle through a touchscreen-based human machine interface, a target parking position is designated by touching the inside of a parking slot. To ensure the proposed method operates in real-time in an embedded environment, access of the bird's-eye view image is made efficient: image-wise batch transformation is replaced with pixel-wise instantaneous transformation. The proposed method showed a 95.5% recognition rate in 378 test cases with 63 test images. Additionally, experiments confirmed that the pixel-wise instantaneous transformation reduced execution time by 92%.
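
The pixel-wise strategy amounts to mapping only the touched pixel through a rear-camera-to-ground homography rather than warping the entire frame; the sketch below uses a placeholder homography matrix purely for illustration.

```python
import numpy as np

def to_birds_eye(u, v, H):
    """Map a single touched rear-view pixel (u, v) through homography H instead
    of warping the whole frame -- the pixel-wise transformation mentioned above."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w

# H is a hypothetical rear-camera-to-ground-plane homography (placeholder values);
# in practice it would come from the camera's extrinsic/intrinsic calibration.
H = np.array([[0.02,   0.001,  -5.0],
              [0.0005, 0.03,   -8.0],
              [0.0,    0.0008,  1.0]])
print(to_birds_eye(320, 420, H))
```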

  14. Space Shuttle Guidance, Navigation, and Rendezvous Knowledge Capture Reports. Revision 1

    Science.gov (United States)

    Goodman, John L.

    2011-01-01

    This document is a catalog and readers guide to lessons learned, experience, and technical history reports, as well as compilation volumes prepared by United Space Alliance personnel for the NASA/Johnson Space Center (JSC) Flight Dynamics Division.1 It is intended to make it easier for future generations of engineers to locate knowledge capture documentation from the Shuttle Program. The first chapter covers observations on documentation quality and research challenges encountered during the Space Shuttle and Orion programs. The second chapter covers the knowledge capture approach used to create many of the reports covered in this document. These chapters are intended to provide future flight programs with insight that could be used to formulate knowledge capture and management strategies. The following chapters contain descriptions of each knowledge capture report. The majority of the reports concern the Space Shuttle. Three are included that were written in support of the Orion Program. Most of the reports were written from the years 2001 to 2011. Lessons learned reports concern primarily the shuttle Global Positioning System (GPS) upgrade and the knowledge capture process. Experience reports on navigation and rendezvous provide examples of how challenges were overcome and how best practices were identified and applied. Some reports are of a more technical history nature covering navigation and rendezvous. They provide an overview of mission activities and the evolution of operations concepts and trajectory design. The lessons learned, experience, and history reports would be considered secondary sources by historians and archivists.

  15. Resource capture by single leaves

    Energy Technology Data Exchange (ETDEWEB)

    Long, S.P.

    1992-05-01

Leaves show a variety of strategies for maximizing CO{sub 2} and light capture. These are more meaningfully explained if they are considered in the context of maximizing capture relative to the utilization of water, nutrients and carbohydrate reserves. There is considerable variation between crops in their efficiency of CO{sub 2} and light capture at the leaf level. Understanding of these mechanisms indicates some ways in which the efficiency of resource capture could be improved; however, capture at the leaf level cannot be meaningfully considered without simultaneous understanding of implications at the canopy level. 36 refs., 5 figs., 1 tab.

  16. Visual content highlighting via automatic extraction of embedded captions on MPEG compressed video

    Science.gov (United States)

    Yeo, Boon-Lock; Liu, Bede

    1996-03-01

    Embedded captions in TV programs such as news broadcasts, documentaries and coverage of sports events provide important information on the underlying events. In digital video libraries, such captions represent a highly condensed form of key information on the contents of the video. In this paper we propose a scheme to automatically detect the presence of captions embedded in video frames. The proposed method operates on reduced image sequences which are efficiently reconstructed from compressed MPEG video and thus does not require full frame decompression. The detection, extraction and analysis of embedded captions help to capture the highlights of visual contents in video documents for better organization of video, to present succinctly the important messages embedded in the images, and to facilitate browsing, searching and retrieval of relevant clips.

  17. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that produced by nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m3/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m3 of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers

  18. Automatic plasma control in magnetic traps

    International Nuclear Information System (INIS)

    Samojlenko, Y.; Chuyanov, V.

    1984-01-01

    Hot plasma is essentially in thermodynamic non-steady state. Automatic plasma control basically means monitoring deviations from steady state and producing a suitable magnetic or electric field which brings the plasma back to its original state. Briefly described are two systems of automatic plasma control: control with a magnetic field using a negative impedance circuit, and control using an electric field. It appears that systems of automatic plasma stabilization will be an indispensable component of the fusion reactor and its possibilities will in many ways determine the reactor economy. (Ha)

  19. The Development of Automatic Sequences for the RF and Cryogenic Systems at the Spallation Neutron Source

    International Nuclear Information System (INIS)

    Gurd, Pamela; Casagrande, Fabio; Mccarthy, Michael; Strong, William; Ganni, Venkatarao

    2005-01-01

    Automatic sequences both ease the task of operating a complex machine and ensure procedural consistency. At the Spallation Neutron Source project (SNS), a set of automatic sequences has been developed to perform the start-up and shut-down of the high power RF systems. Similarly, sequences have been developed to perform backfill, pump down, automatic valve control and energy management in the cryogenic system. The sequences run on Linux soft input-output controllers (IOCs), which are similar to ordinary EPICS (Experimental Physics and Industrial Control System) IOCs in terms of data sharing with other EPICS processes, but which share a Linux processor with other such processes. Each sequence waits for a command from an operator console and starts the corresponding set of instructions, allowing operators to follow the sequences either from an overview screen or from detail screens. We describe each system and our operational experience with it.
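
    As a loose illustration of how such an operator-triggered start-up sequence can be structured, the sketch below uses the pyepics channel-access client; every PV name, setpoint and tolerance is hypothetical, and a production sequence would normally run inside an EPICS soft IOC or sequencer rather than as a standalone script.

```python
# Hypothetical RF start-up sequence sketched with pyepics. PV names are invented.
import time
from epics import caget, caput

RAMP_STEPS = [(10.0, "RF:FWD_PWR"), (50.0, "RF:FWD_PWR"), (100.0, "RF:FWD_PWR")]

def wait_for(pv, target, tol, timeout=30.0):
    """Poll a readback PV until it reaches the target value or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        value = caget(pv)
        if value is not None and abs(value - target) <= tol:
            return True
        time.sleep(0.5)
    return False

def rf_startup_sequence():
    if caget("RF:OPERATOR_CMD") != 1:          # wait for the operator's "start" command
        return
    caput("RF:HV_ENABLE", 1)                   # enable the high-voltage supply
    if not wait_for("RF:HV_READY", 1, tol=0.1):
        caput("RF:SEQ_STATUS", "HV fault")
        return
    for setpoint, pv in RAMP_STEPS:            # ramp forward power in steps
        caput(pv, setpoint)
        if not wait_for(pv + ":RBV", setpoint, tol=2.0):
            caput("RF:SEQ_STATUS", "Ramp stalled at %.0f kW" % setpoint)
            return
    caput("RF:SEQ_STATUS", "RF on")
```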

  20. Automatic Enhancement of the Reference Set for Multi-Criteria Sorting in The Frame of Theseus Method

    Directory of Open Access Journals (Sweden)

    Fernandez Eduardo

    2014-05-01

    Full Text Available Some recent works have established the importance of handling abundant reference information in multi-criteria sorting problems. More valid information allows a better characterization of the agent's assignment policy, which can lead to improved decision support. However, sometimes information for enhancing the reference set may not be available, or may be too expensive. This paper explores an automatic mode of enhancing the reference set in the framework of the THESEUS multi-criteria sorting method. Some performance measures are defined in order to test the results of the enhancement. Several theoretical arguments and practical experiments are provided here, supporting a basic advantage of the automatic enhancement: a reduction of the vagueness measure that improves the THESEUS accuracy, without additional effort from the decision agent. The experiments suggest that the errors coming from inadequate automatic assignments can be kept at a manageable level.

  1. Measurement of keV-neutron capture cross sections and capture gamma-ray spectra of Er isotopes

    International Nuclear Information System (INIS)

    Harun-Ar-Rashid, A.K.M.; Igashira, Masayuki; Ohsaki, Toshiro

    2000-01-01

    Neutron capture cross sections and capture γ-ray spectra of 166,167,168Er were measured in the energy region of 10 to 550 keV. The measurements were performed with a pulsed 7Li(p,n)7Be neutron source and a large anti-Compton NaI(Tl) γ-ray spectrometer. A pulse-height weighting technique and the standard capture cross sections of gold were used to derive the capture cross sections. The errors of the derived cross sections were about 5%. The present results were compared with other measurements and evaluations. The observed capture γ-ray pulse-height spectra were unfolded to obtain the corresponding γ-ray spectra. An anomalous shoulder was observed around 3 MeV in each of the capture γ-ray spectra. (author)
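
    The pulse-height weighting technique mentioned above relies on a weighting function W(E) chosen so that the weighted detector response to a γ ray of energy E is proportional to E, which makes the weighted sum of the pulse-height spectrum proportional to the capture yield. The snippet below only illustrates that arithmetic; the polynomial coefficients, bin energies and total energy are placeholders, not values from this measurement.

```python
# Illustrative weighted-sum step of the pulse-height weighting technique.
import numpy as np

def weighted_capture_yield(pulse_height_mev, counts, poly_coeffs, total_energy_mev):
    """Relative capture yield ~ sum_i W(E_i) * C(E_i) / (E_n + S_n)."""
    weights = np.polyval(poly_coeffs, pulse_height_mev)   # W(E) as a fitted polynomial
    return float(np.sum(weights * counts) / total_energy_mev)

# Synthetic example (all numbers are placeholders):
bins = np.linspace(0.5, 8.0, 16)                   # pulse-height bin centres [MeV]
spectrum = np.random.poisson(100, size=bins.size)  # observed counts per bin
coeffs = [0.02, 1.0, 0.0]                          # hypothetical quadratic W(E)
relative_yield = weighted_capture_yield(bins, spectrum, coeffs, total_energy_mev=7.8)
```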

  2. Word Processing in Dyslexics: An Automatic Decoding Deficit?

    Science.gov (United States)

    Yap, Regina; Van Der Leu, Aryan

    1993-01-01

    Compares dyslexic children with normal readers on measures of phonological decoding and automatic word processing. Finds that dyslexics have a deficit in automatic phonological decoding skills. Discusses results within the framework of the phonological deficit and the automatization deficit hypotheses. (RS)

  3. An architectural framework for developing intelligent applications for the carbon dioxide capture process

    Energy Technology Data Exchange (ETDEWEB)

    Luo, C.; Zhou, Q.; Chan, C.W. [Regina Univ., SK (Canada)

    2009-07-01

    This presentation reported on the development of automated application solutions for the carbon dioxide (CO{sub 2}) capture process. An architectural framework was presented for developing intelligent systems for the process system. The chemical absorption process consists of dozens of components. It therefore generates more than a hundred different types of data. Developing automated support for these tasks is desirable because the monitoring, analysis and diagnosis of the data is very complex. The proposed framework interacts with an implemented domain ontology for the CO{sub 2} capture process, which consists of information derived from senior operators of the CO{sub 2} pilot plant at the International Test Centre for Carbon Dioxide Capture at University of Regina. The well-defined library within the framework reduces development time and cost. The framework also has built-in web-based software components for data monitoring, management, and analysis. These components provide support for generating automated solutions for the CO{sub 2} capture process. An automated monitoring system was also developed based on the architectural framework.

  4. Client/server approach to image capturing

    Science.gov (United States)

    Tuijn, Chris; Stokes, Earle

    1998-01-01

    The diversity of the digital image capturing devices on the market today is quite astonishing and ranges from low-cost CCD scanners to digital cameras (for both action and stand-still scenes), mid-end CCD scanners for desktop publishing and pre-press applications and high-end CCD flatbed scanners and drum-scanners with photo multiplier technology. Each device and market segment has its own specific needs which explains the diversity of the associated scanner applications. What all those applications have in common is the need to communicate with a particular device to import the digital images; after the import, additional image processing might be needed as well as color management operations. Although the specific requirements for all of these applications might differ considerably, a number of image capturing and color management facilities as well as other services are needed which can be shared. In this paper, we propose a client/server architecture for scanning and image editing applications which can be used as a common component for all these applications. One of the principal components of the scan server is the input capturing module. The specification of the input jobs is based on a generic input device model. Through this model we make abstraction of the specific scanner parameters and define the scan job definitions by a number of absolute parameters. As a result, scan job definitions will be less dependent on a particular scanner and have a more universal meaning. In this context, we also elaborate on the interaction of the generic parameters and the color characterization (i.e., the ICC profile). Other topics that are covered are the scheduling and parallel processing capabilities of the server, the image processing facilities, the interaction with the ICC engine, the communication facilities (both in-memory and over the network) and the different client architectures (stand-alone applications, TWAIN servers, plug-ins, OLE or Apple-event driven

  5. Stress evaluation in hares (Lepus europaeus Pallas captured for traslocation

    Directory of Open Access Journals (Sweden)

    Antonio Lavazza

    2010-01-01

    SE resulted as follows: estimated non-stressed hares, glucose 234 ± 9.4 mg/dl, AST 112 ± 22.2 U/l, CPK 1334 ± 734 U/l; estimated stressed hares, glucose 128 ± 7 mg/dl, AST 164 ± 13 U/l, CPK 4658 ± 454 U/l. These three cheap and quickly analysable analytes can be useful to the game manager in detecting stressed and non-stressed hares, in order to improve the capturing techniques by evaluating the following relationship: (number of stressed hares + number of dead hares during the capture) / number of total captured hares.

  6. Automatic provisioning, deployment and orchestration for load-balancing THREDDS instances

    Science.gov (United States)

    Cofino, A. S.; Fernández-Tejería, S.; Kershaw, P.; Cimadevilla, E.; Petri, R.; Pryor, M.; Stephens, A.; Herrera, S.

    2017-12-01

    THREDDS is a widely used web server that provides different scientific communities with data access and discovery. Because THREDDS lacks horizontal scalability and automatic configuration management and deployment, the service often suffers downtime and time-consuming configuration tasks, especially under the intensive use that is common in scientific communities (e.g. climate). Instead of the typical installation and configuration of a single or multiple independent, manually configured THREDDS servers, this work presents automatic provisioning, deployment and orchestration of a cluster of THREDDS servers. The solution is based on Ansible playbooks, used to automatically control the deployment and configuration setup of the infrastructure and to manage the datasets available in THREDDS instances. The playbooks are based on modules (or roles) for different backend and frontend load-balancing setups and solutions. The frontend load-balancing system enables horizontal scalability by delegating requests to backend workers, consisting of a variable number of instances of the THREDDS server. This implementation allows different infrastructure and deployment scenarios to be configured, as more workers are easily added to the cluster by simply declaring them as Ansible variables and executing the playbooks; it also provides fault tolerance and better reliability, since if any of the workers fails another instance of the cluster can take over. In order to test the proposed solution, two real scenarios are analyzed in this contribution: the JASMIN Group Workspaces at CEDA and the User Data Gateway (UDG) at the Data Climate Service of the University of Cantabria. On the one hand, the proposed configuration has provided CEDA with a higher-level and more scalable Group Workspaces (GWS) service than the previous one based on Unix permissions, improving also the data discovery and data access experience. On the other hand, the UDG has improved its

  7. Multimodal Translation System Using Texture-Mapped Lip-Sync Images for Video Mail and Automatic Dubbing Applications

    Science.gov (United States)

    Morishima, Shigeo; Nakamura, Satoshi

    2004-12-01

    We introduce a multimodal English-to-Japanese and Japanese-to-English translation system that also translates the speaker's speech motion by synchronizing it to the translated speech. This system also introduces both a face synthesis technique that can generate any viseme lip shape and a face tracking technique that can estimate the original position and rotation of a speaker's face in an image sequence. To retain the speaker's facial expression, we substitute only the speech organ's image with the synthesized one, which is made by a 3D wire-frame model that is adaptable to any speaker. Our approach provides translated image synthesis with an extremely small database. The tracking motion of the face from a video image is performed by template matching. In this system, the translation and rotation of the face are detected by using a 3D personal face model whose texture is captured from a video frame. We also propose a method to customize the personal face model by using our GUI tool. By combining these techniques and the translated voice synthesis technique, an automatic multimodal translation can be achieved that is suitable for video mail or automatic dubbing systems into other languages.

  8. Integrated image data and medical record management for rare disease registries. A general framework and its instantiation to the German Calciphylaxis Registry.

    Science.gov (United States)

    Deserno, Thomas M; Haak, Daniel; Brandenburg, Vincent; Deserno, Verena; Classen, Christoph; Specht, Paula

    2014-12-01

    Especially for investigator-initiated research at universities and academic institutions, Internet-based rare disease registries (RDR) are required that integrate electronic data capture (EDC) with automatic image analysis or manual image annotation. We propose a modular framework merging alpha-numerical and binary data capture. In concordance with the Office of Rare Diseases Research recommendations, a requirement analysis was performed based on several RDR databases currently hosted at Uniklinik RWTH Aachen, Germany. With respect to the study management tool that is already successfully operating at the Clinical Trial Center Aachen, the Google Web Toolkit was chosen with Hibernate and Gilead connecting a MySQL database management system. Image and signal data integration and processing are supported by the Apache Commons FileUpload library and ImageJ-based Java code, respectively. As a proof of concept, the framework is instantiated to the German Calciphylaxis Registry. The framework is composed of five mandatory core modules: (1) Data Core, (2) EDC, (3) Access Control, (4) Audit Trail, and (5) Terminology, as well as six optional modules: (6) Binary Large Object (BLOB), (7) BLOB Analysis, (8) Standard Operation Procedure, (9) Communication, (10) Pseudonymization, and (11) Biorepository. Modules 1-7 are implemented in the German Calciphylaxis Registry. The proposed RDR framework is easily instantiated and directly integrates image management and analysis. As open source software, it may assist improved data collection and analysis of rare diseases in the near future.

  9. Attentional capture under high perceptual load.

    Science.gov (United States)

    Cosman, Joshua D; Vecera, Shaun P

    2010-12-01

    Attentional capture by abrupt onsets can be modulated by several factors, including the complexity, or perceptual load, of a scene. We have recently demonstrated that observers are less likely to be captured by abruptly appearing, task-irrelevant stimuli when they perform a search that is high, as opposed to low, in perceptual load (Cosman & Vecera, 2009), consistent with perceptual load theory. However, recent results indicate that onset frequency can influence stimulus-driven capture, with infrequent onsets capturing attention more often than did frequent onsets. Importantly, in our previous task, an abrupt onset was present on every trial, and consequently, attentional capture might have been affected by both onset frequency and perceptual load. In the present experiment, we examined whether onset frequency influences attentional capture under conditions of high perceptual load. When onsets were presented frequently, we replicated our earlier results; attentional capture by onsets was modulated under conditions of high perceptual load. Importantly, however, when onsets were presented infrequently, we observed robust capture effects. These results conflict with a strong form of load theory and, instead, suggest that exposure to the elements of a task (e.g., abrupt onsets) combines with high perceptual load to modulate attentional capture by task-irrelevant information.

  10. Deliberation versus automaticity in decision making: Which presentation format features facilitate automatic decision making?

    Directory of Open Access Journals (Sweden)

    Anke Soellner

    2013-05-01

    Full Text Available The idea of automatic decision making approximating normatively optimal decisions without necessitating much cognitive effort is intriguing. Whereas recent findings support the notion that such fast, automatic processes explain empirical data well, little is known about the conditions under which such processes are selected rather than more deliberate stepwise strategies. We investigate the role of the format of information presentation, focusing explicitly on the ease of information acquisition and its influence on information integration processes. In a probabilistic inference task, the standard matrix employed in prior research was contrasted with a newly created map presentation format and additional variations of both presentation formats. Across three experiments, a robust presentation format effect emerged: Automatic decision making was more prevalent in the matrix (with high information accessibility), whereas sequential decision strategies prevailed when the presentation format demanded more information acquisition effort. Further scrutiny of the effect showed that it is not driven by the presentation format as such, but rather by the extent of information search induced by a format. Thus, if information is accessible with minimal need for information search, information integration is likely to proceed in a perception-like, holistic manner. In turn, a moderate demand for information search decreases the likelihood of behavior consistent with the assumptions of automatic decision making.

  11. Clinical considerations for neutron capture therapy of brain tumors

    International Nuclear Information System (INIS)

    Madoc-Jones, H.; Wazer, D.E.; Zamenhof, R.G.; Harling, O.K.; Bernard, J.A. Jr.

    1990-01-01

    The radiotherapeutic management of primary brain tumors and metastatic melanoma in brain has had disappointing clinical results for many years. Although neutron capture therapy was tried in the US in the 1950s and 1960s, the results were not as hoped. However, with the newly developed capability to measure boron concentrations in blood and tissue both quickly and accurately, and with the advent of epithermal neutron beams obviating the need for scalp and skull reflection, it should now be possible to mount such a clinical trial of NCT again and avoid serious complications. As a prerequisite, it will be important to demonstrate the differential uptake of boron compound in brain tumor as compared with normal brain and its blood supply. If this can be done, then a trial of boron neutron capture therapy for brain tumors should be feasible. Because boronated phenylalanine has been demonstrated to be preferentially taken up by melanoma cells through the biosynthetic pathway for melanin, there is special interest in a trial of boron neutron capture therapy for metastatic melanoma in brain. Again, the use of an epithermal beam would make this a practical possibility. However, because any epithermal (or thermal) beam must contain a certain contaminating level of gamma rays, and because even a pure neutron beam causes gamma rays to be generated when it interacts with tissue, it is considered essential to deliver boron neutron capture therapy treatments with an epithermal beam in fractions in order to minimize the late effects of low-LET gamma rays in the normal tissue.

  12. Forensic Automatic Speaker Recognition Based on Likelihood Ratio Using Acoustic-phonetic Features Measured Automatically

    Directory of Open Access Journals (Sweden)

    Huapeng Wang

    2015-01-01

    Full Text Available Forensic speaker recognition is experiencing a remarkable paradigm shift in terms of the evaluation framework and presentation of voice evidence. This paper proposes a new method of forensic automatic speaker recognition using the likelihood ratio framework to quantify the strength of voice evidence. The proposed method uses a reference database to calculate the within- and between-speaker variability. Some acoustic-phonetic features are extracted automatically using the software VoiceSauce. The effectiveness of the approach was tested using two Mandarin databases: a mobile telephone database and a landline database. The experimental results indicate that these acoustic-phonetic features do have some discriminating potential and are worth pursuing for discrimination. The automatic acoustic-phonetic features have acceptable discriminative performance and can provide more reliable results in evidence analysis when fused with other kinds of voice features.
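
    A toy version of the likelihood-ratio computation is sketched below for a single acoustic-phonetic feature, modelling the suspect and the background (between-speaker) populations with simple Gaussians; real forensic systems use much richer within- and between-speaker models, so this only shows how the LR quantifies the strength of evidence.

```python
# Toy likelihood-ratio score for one acoustic-phonetic feature (illustrative only).
import numpy as np
from scipy.stats import norm

def likelihood_ratio(trace_values, suspect_values, background_values):
    """LR = p(trace | suspect model) / p(trace | background-population model)."""
    mu_s, sd_s = suspect_values.mean(), suspect_values.std(ddof=1)
    mu_b, sd_b = background_values.mean(), background_values.std(ddof=1)
    log_lr = (norm.logpdf(trace_values, mu_s, sd_s)
              - norm.logpdf(trace_values, mu_b, sd_b)).sum()
    return float(np.exp(log_lr))

# LR > 1 supports the same-speaker hypothesis; LR < 1 supports different speakers.
```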

  13. 46 CFR 63.25-1 - Small automatic auxiliary boilers.

    Science.gov (United States)

    2010-10-01

    46 CFR 63.25-1 (Automatic Auxiliary Boilers; Requirements for Specific Types of Automatic Auxiliary Boilers): Small automatic auxiliary boilers. Small automatic auxiliary boilers defined as having heat-input ratings of 400,000 Btu/hr...

  14. Simulation of mercury capture by sorbent injection using a simplified model.

    Science.gov (United States)

    Zhao, Bingtao; Zhang, Zhongxiao; Jin, Jing; Pan, Wei-Ping

    2009-10-30

    Mercury pollution from fossil fuel combustion or solid waste incineration is becoming a worldwide environmental concern. As an effective control technology, powdered sorbent injection (PSI) has been successfully used for mercury capture from flue gas, with the advantages of low cost and easy operation. In order to predict the mercury capture efficiency of PSI more conveniently, a simplified model, based on the theory of mass transfer, isothermal adsorption and mass balance, is developed in this paper. Comparisons between the theoretical results of this model and the experimental results of Meserole et al. [F.B. Meserole, R. Chang, T.R. Carrey, J. Machac, C.F.J. Richardson, Modeling mercury removal by sorbent injection, J. Air Waste Manage. Assoc. 49 (1999) 694-704] demonstrate that the simplified model provides good predictive accuracy. Moreover, the effects of key parameters, including the mass transfer coefficient, sorbent concentration, sorbent physical properties and sorbent adsorption capacity, on mercury adsorption efficiency are compared and evaluated. Finally, a sensitivity analysis of the impact factors indicates that the injected sorbent concentration plays the most important role in mercury capture efficiency.
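
    To make the flavour of such a simplified model concrete, the sketch below keeps only the gas-film mass-transfer term (first-order depletion of gas-phase mercury at the external surface of spherical sorbent particles) and omits the adsorption-isotherm and mass-balance coupling of the full model; all numbers are illustrative.

```python
# Mass-transfer-limited sketch of in-flight mercury capture by injected sorbent.
import math

def mercury_removal_efficiency(kg, sorbent_conc, particle_diam,
                               particle_density, residence_time):
    """
    kg               gas-film mass-transfer coefficient [m/s]
    sorbent_conc     injected sorbent concentration in the flue gas [g/m^3]
    particle_diam    mean sorbent particle diameter [m]
    particle_density sorbent particle density [g/m^3]
    residence_time   gas residence time between injection and collection [s]
    """
    # External particle surface area per unit gas volume for spherical particles.
    area_per_volume = 6.0 * sorbent_conc / (particle_density * particle_diam)
    # First-order depletion of gas-phase mercury: eta = 1 - exp(-kg * a * t).
    return 1.0 - math.exp(-kg * area_per_volume * residence_time)

# Example: 5 g/m^3 of 20-um sorbent (rho = 0.5e6 g/m^3), kg = 0.05 m/s, 2 s residence.
eta = mercury_removal_efficiency(0.05, 5.0, 20e-6, 0.5e6, 2.0)
```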

  15. 49 CFR 236.825 - System, automatic train control.

    Science.gov (United States)

    2010-10-01

    49 CFR 236.825 (Inspection, Maintenance, and Repair of Signal and Train Control Systems, Devices, and Appliances; Definitions): System, automatic train control. A system so arranged that its operation will automatically...

  16. Exploratory investigations of hypervelocity intact capture spectroscopy

    Science.gov (United States)

    Tsou, P.; Griffiths, D. J.

    1993-01-01

    The ability to capture hypervelocity projectiles intact opens a new technique available for hypervelocity research. A determination of the reactions taking place between the projectile and the capture medium during the process of intact capture is extremely important to an understanding of the intact capture phenomenon, to improving the capture technique, and to developing a theory describing the phenomenon. The intact capture of hypervelocity projectiles by underdense media generates spectra, characteristic of the material species of projectile and capture medium involved. Initial exploratory results into real-time characterization of hypervelocity intact capture techniques by spectroscopy include ultra-violet and visible spectra obtained by use of reflecting gratings, transmitting gratings, and prisms, and recorded by photographic and electronic means. Spectrometry proved to be a valuable real-time diagnostic tool for hypervelocity intact capture events, offering understanding of the interactions of the projectile and the capture medium during the initial period and providing information not obtainable by other characterizations. Preliminary results and analyses of spectra produced by the intact capture of hypervelocity aluminum spheres in polyethylene (PE), polystyrene (PS), and polyurethane (PU) foams are presented. Included are tentative emission species identifications, as well as gray body temperatures produced in the intact capture process.

  17. Capturing the multiple benefits associated with nature-based solutions: lessons from natural flood management project in the Cotswolds, UK

    Science.gov (United States)

    Short, Chrisopher; Clarke, Lucy; Uttley, Chris; Smith, Brian

    2017-04-01

    co-management and suggest how this type of framework is suitable for a range of nature-based solutions across Europe. However, the challenge remains of capturing the multiple-benefits that such projects offer as these are often missed through conventional approaches such as cost-benefit analysis and some reflections on this will also be presented along with a potential way forward.

  18. Applications Of A Low Cost System For Industrial Automatic Inspection

    Science.gov (United States)

    Krey, C.; Ayache, A.; Bruel, A.

    1987-05-01

    In an industrial environment, some repetitive tasks which do not need a high degree of understanding can be solved automatically owing to vision. Among the systems available on the market, most are rather expensive, with various capabilities. The described system is a modular system built with standard circuit boards. One of the advantages of this system is that its architecture can be redefined for each application by judiciously assembling the standard modules. The vision system has been used successfully to sort fruits according to their colour and diameter. The system can sort 8 fruits per second on each sorting line and manage simultaneously up to 16 lines. An application of sheep skin cutting has been implemented too. After chemical and mechanical treatments, the skins present many defects all around their contour that must be cut off. A movable camera follows and inspects the contour; the vision system determines where the cutting device must cut the skin. A third application has been implemented; it concerns automatic recording and reproduction of logotypes. A moving camera driven by the system picks up the points of the logotype contours. Before reproduction, programs can modify the logotype shapes, change the scale, and so on. For every application, the system uses the world's smallest CCD camera, developed in the laboratory. The small dimensions of the vision system and its low cost are major advantages for a wide use in industrial automatic inspection.

  19. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings ... differential equations, but in this thesis, we describe how to use the methods for enclosing iterates of discrete mappings, and then later use them for discretizing solutions of ordinary differential equations. The theory of automatic differentiation is introduced, and three methods for obtaining derivatives are described: the forward, the backward, and the Taylor expansion methods. The three methods have been implemented in the C++ program packages FADBAD/TADIFF. Some examples showing how to use the three methods are presented. A feature of FADBAD/TADIFF not present in other automatic differentiation packages...
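
    FADBAD/TADIFF themselves are C++ template packages; purely to illustrate the forward method referred to above, the following Python sketch propagates derivatives alongside values using dual numbers.

```python
# Forward-mode automatic differentiation with dual numbers (illustrative only).
import math
from dataclasses import dataclass

@dataclass
class Dual:
    val: float   # function value
    der: float   # derivative carried alongside the value

    def __add__(self, other):
        return Dual(self.val + other.val, self.der + other.der)

    def __mul__(self, other):
        # Product rule applied automatically.
        return Dual(self.val * other.val, self.der * other.val + self.val * other.der)

def sin(x):
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# d/dx [x * sin(x)] at x = 1.2: seed the input derivative with 1.
x = Dual(1.2, 1.0)
y = x * sin(x)
print(y.val, y.der)    # derivative equals sin(1.2) + 1.2*cos(1.2)
```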

  20. U.S. Spacesuit Knowledge Capture Status and Initiatives

    Science.gov (United States)

    Chullen, Cinda; Woods, Ron; Jairala, Juniper; Bitterly, Rose; McMann, Joe; Lewis, Cathleen

    2012-01-01

    The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration via publication of reports, conference presentations, specialized seminars, and classes instructed by veterans in the field. Recently, the effort has become more concentrated and formalized: a new avenue of spacesuit knowledge capture has been added to the archives in which current and retired specialists in the field are videotaped, either presenting technical material specifically for education and preservation of knowledge or being interviewed to archive their significance to NASA's history. Now with video archiving, all these avenues of learning are brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, aspects of program management, and personal interviews. These archives of actual spacesuit legacy now reflect its rich history and will provide a wealth of knowledge which will greatly enhance the chances for the success of future and more ambitious spacesuit system programs. In this paper, NASA's formal spacesuit knowledge capture efforts will be reviewed and a status will be provided to reveal initiatives and accomplishments since the inception of the more formal U.S. spacesuit knowledge program. A detailed itemization of the actual archives will be addressed along with topics that are now available to the general NASA community and the public. Additionally, the latest developments in the archival relationship with the Smithsonian will be discussed.

  1. What Determines State Capture in Poland?

    Directory of Open Access Journals (Sweden)

    Stanisław Alwasiak

    2013-12-01

    Full Text Available Purpose: This study examines the determinants of ex-ante state capture in Poland. Methodology: In order to establish the determinants of ex-ante state capture, a logistic regression is estimated. Findings: The study shows that in Poland the majority of legal acts were passed with the aim of satisfying the interests of particular groups. Furthermore, the regression analysis shows that the likelihood of state capture increases during periods of higher economic growth and local elections. The likelihood of state capture, however, declines during presidential elections. We attribute these results to the different interests of political parties in the periods of local and presidential elections. Finally, we find that state capture increased over the years in Poland. Additionally, we show that EU accession did not prevent state capture in Poland. In contrast, the financial crisis of 2007 resulted in a wake-up effect and the likelihood of state capture declined in Poland. Research limitations: In the study we employ proxies for state capture, yet we assume that corruption is a widespread phenomenon in Poland. However, due to its nature, corruption is very difficult to assess and measure. Originality: The study uses a unique dataset on ex-ante state capture identified in the legal acts passed in the period 1990–2011 in Poland.

  2. Capturing, using, and managing quality assurance knowledge for shuttle post-MECO flight design

    Science.gov (United States)

    Peters, H. L.; Fussell, L. R.; Goodwin, M. A.; Schultz, Roger D.

    1991-01-01

    Ascent initialization values used by the Shuttle's onboard computer for nominal and abort mission scenarios are verified by a six degrees of freedom computer simulation. The procedure that the Ascent Post Main Engine Cutoff (Post-MECO) group uses to perform quality assurance (QA) of the simulation is time consuming. Also, the QA data, checklists and associated rationale, though known by the group members, are not sufficiently documented, hindering transfer of knowledge and problem resolution. A new QA procedure which retains the current high level of integrity while reducing the time required to perform QA is needed to support the increasing Shuttle flight rate. Documenting the knowledge is also needed to increase its availability for training and problem resolution. To meet these needs, a knowledge capture process, embedded into the group activities, was initiated to verify the existing QA checks, define new ones, and document all rationale. The resulting checks were automated in a conventional software program to achieve the desired standardization, integrity, and time reduction. A prototype electronic knowledge base was developed with Macintosh's HyperCard to serve as a knowledge capture tool and data storage.

  3. Automaticity of walking: functional significance, mechanisms, measurement and rehabilitation strategies

    Directory of Open Access Journals (Sweden)

    David J Clark

    2015-05-01

    Full Text Available Automaticity is a hallmark feature of walking in adults who are healthy and well-functioning. In the context of walking, ‘automaticity’ refers to the ability of the nervous system to successfully control typical steady state walking with minimal use of attention-demanding executive control resources. Converging lines of evidence indicate that walking deficits and disorders are characterized in part by a shift in the locomotor control strategy from healthy automaticity to compensatory executive control. This is potentially detrimental to walking performance, as an executive control strategy is not optimized for locomotor control. Furthermore, it places excessive demands on a limited pool of executive reserves. The result is compromised ability to perform basic and complex walking tasks and heightened risk for adverse mobility outcomes including falls. Strategies for rehabilitation of automaticity are not well defined, which is due to both a lack of systematic research into the causes of impaired automaticity and to a lack of robust neurophysiological assessments by which to gauge automaticity. These gaps in knowledge are concerning given the serious functional implications of compromised automaticity. Therefore, the objective of this article is to advance the science of automaticity of walking by consolidating evidence and identifying gaps in knowledge regarding: (a) functional significance of automaticity; (b) neurophysiology of automaticity; (c) measurement of automaticity; (d) mechanistic factors that compromise automaticity; and (e) strategies for rehabilitation of automaticity.

  4. The Use of Automatic Indexing for Authority Control.

    Science.gov (United States)

    Dillon, Martin; And Others

    1981-01-01

    Uses an experimental system for authority control on a collection of bibliographic records to demonstrate the resemblance between thesaurus-based automatic indexing and automatic authority control. Details of the automatic indexing system are given, results discussed, and the benefits of the resemblance examined. Included are a rules appendix and…

  5. 30 CFR 77.1401 - Automatic controls and brakes.

    Science.gov (United States)

    2010-07-01

    30 CFR 77.1401 (Personnel Hoisting): Automatic controls and brakes. Hoists and elevators shall be equipped with overspeed, overwind, and automatic stop controls and with brakes capable of stopping the elevator...

  6. 30 CFR 57.19006 - Automatic hoist braking devices.

    Science.gov (United States)

    2010-07-01

    30 CFR 57.19006 (Mine Safety and Health Administration, Department of Labor; Personnel Hoisting, Hoists): Automatic hoist braking devices. Automatic hoists shall be provided with devices...

  7. 30 CFR 56.19006 - Automatic hoist braking devices.

    Science.gov (United States)

    2010-07-01

    30 CFR 56.19006 (Mine Safety and Health Administration, Department of Labor; Personnel Hoisting, Hoists): Automatic hoist braking devices. Automatic hoists shall be provided with devices...

  8. Development of the automatic control rod operation system for JOYO. Verification of automatic control rod operation guide system

    International Nuclear Information System (INIS)

    Terakado, Tsuguo; Suzuki, Shinya; Kawai, Masashi; Aoki, Hiroshi; Ohkubo, Toshiyuki

    1999-10-01

    The automatic control rod operation system was developed to control the JOYO reactor power automatically in all operation modes (critical approach, cooling system heat-up, power ascent, power descent); development began in 1989. Prior to applying the system, verification tests of the automatic control rod operation guide system were conducted during the 32nd duty cycle of JOYO, from Dec. 1997 to Feb. 1998. The automatic control rod operation guide system consists of the control rod operation guide function and the plant operation guide function. The control rod operation guide function provides information on control rod movement and position, while the plant operation guide function provides guidance for plant operations corresponding to reactor power changes (power ascent or power descent). Control rod insertion and withdrawal are predicted by fuzzy algorithms. (J.P.N.)

  9. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  10. Study of errors in population estimation by the method of successive captures (DeLury, 2 captures) and by capture-recapture (Petersen)

    Directory of Open Access Journals (Sweden)

    LAURENT M.

    1978-01-01

    Full Text Available Estimation of natural populations by capture-recapture and by successive captures is often subject to error because, in many cases, the fundamental assumption of equal capture probability for all individuals in time and space is not met. For the fish populations considered here, captures were made by electrofishing. The order of magnitude of the systematic errors made in estimating the populations could be quantified as a function of the particular biotic and abiotic conditions of the different environments surveyed.
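
    For reference, the two classical estimators named in the title take the following form under the assumption of equal capture probability for all individuals (the very assumption whose violation the study quantifies); the numbers in the example are invented.

```python
# Classical closed-population estimators (illustrative values only).

def petersen_estimate(marked_first, caught_second, recaptured):
    """Capture-recapture (Petersen): N = M * C / R."""
    return marked_first * caught_second / recaptured

def two_pass_removal_estimate(catch1, catch2):
    """Two successive removals (DeLury / two-pass): N = C1^2 / (C1 - C2)."""
    if catch1 <= catch2:
        raise ValueError("second-pass catch must be smaller than the first-pass catch")
    return catch1 ** 2 / (catch1 - catch2)

# Example: 120 fish marked, 150 caught on the second visit, 30 of them marked.
n_petersen = petersen_estimate(120, 150, 30)       # 600
# Example: 200 fish removed on pass 1, 80 on pass 2.
n_removal = two_pass_removal_estimate(200, 80)     # about 333
```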

  11. Automatic CDR Estimation for Early Glaucoma Diagnosis

    Science.gov (United States)

    Sarmiento, A.; Sanchez-Morillo, D.; Jiménez, S.; Alemany, P.

    2017-01-01

    Glaucoma is a degenerative disease that constitutes the second leading cause of blindness in developed countries. Although it cannot be cured, its progression can be prevented through early diagnosis. In this paper, we propose a new algorithm for automatic glaucoma diagnosis based on retinal colour images. We focus on capturing the inherent colour changes of optic disc (OD) and cup borders by computing several colour derivatives in CIE L∗a∗b∗ colour space with the CIE94 colour distance. In addition, we consider spatial information, retaining these colour derivatives and the original CIE L∗a∗b∗ values of the pixel and adding other characteristics such as its distance to the OD centre. The proposed strategy is robust due to a simple structure that needs neither initial segmentation nor removal of the vascular tree or detection of vessel bends. The method has been extensively validated with two datasets (one public and one private), each comprising 60 images of highly variable appearance. The achieved class-wise averaged accuracies of 95.02% and 81.19% demonstrate that this automated approach could support physicians in the diagnosis of glaucoma in its early stage, and therefore it could be seen as an opportunity for developing low-cost solutions for mass screening programs. PMID:29279773
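
    A rough sketch of one of the colour-derivative features described above is given below, computing the CIE94 colour difference between neighbouring pixels in CIE L*a*b* space with scikit-image; the scales, windows and the remaining features (distance to the OD centre, etc.) are omitted, and the file name is illustrative.

```python
# Per-pixel CIE94 colour-derivative map as a rough feature sketch.
import numpy as np
from skimage import io, color

def cie94_horizontal_derivative(rgb_image):
    """CIE94 distance between each pixel and its right-hand neighbour."""
    lab = color.rgb2lab(rgb_image)
    left, right = lab[:, :-1, :], lab[:, 1:, :]
    return color.deltaE_ciede94(left, right)

# Usage (path is illustrative):
# fundus = io.imread("retina.png")[:, :, :3]
# derivative_map = cie94_horizontal_derivative(fundus)
```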

  12. Identification of mycobacterium tuberculosis in sputum smear slide using automatic scanning microscope

    Science.gov (United States)

    Rulaningtyas, Riries; Suksmono, Andriyan B.; Mengko, Tati L. R.; Saptawati, Putri

    2015-04-01

    Sputum smear observation has an important role in tuberculosis (TB) disease diagnosis, because it needs accurate identification to avoid diagnostic errors. In developing countries, sputum smear slides are commonly examined with a conventional light microscope on Ziehl-Neelsen stained tissue, and the microscope is inexpensive to maintain. Clinicians screen sputum smear slides manually, which is time consuming and requires extensive training to detect the presence of TB bacilli (mycobacterium tuberculosis) accurately, especially for negative slides and slides with small numbers of TB bacilli. To help clinicians, we propose an automatic scanning microscope with automatic identification of TB bacilli. The designed system drives the field movement of the light microscope with a stepper motor controlled by a microcontroller. Every sputum smear field was captured by a camera. Several image processing techniques were then applied to the sputum smear images. A color threshold on the hue channel in HSV color space was used for background subtraction. The Sobel edge detection algorithm was used for TB bacilli image segmentation. Shape-based features were extracted for bacilli analysis, and a neural network then classified each object as a TB bacillus or not. The results indicated that the identification worked well and detected TB bacilli accurately in sputum smear slides with normal staining, but performed poorly on over-stained and under-stained tissue slides. Overall, the designed system can make sputum smear observation easier for clinicians.
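
    A condensed sketch of the image-processing stages described above (hue-channel thresholding in HSV space, contour extraction and simple shape features) is shown below with OpenCV; the hue range and size limits are placeholders rather than the authors' values, and the final neural-network classification step is left out.

```python
# Candidate TB-bacilli shape features from one sputum smear field (illustrative).
import cv2
import numpy as np

def candidate_bacilli_features(bgr_field):
    hsv = cv2.cvtColor(bgr_field, cv2.COLOR_BGR2HSV)
    # Ziehl-Neelsen stained bacilli appear red/magenta; threshold on hue and saturation.
    mask = cv2.inRange(hsv, (130, 60, 60), (180, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    features = []
    for c in contours:
        area = cv2.contourArea(c)
        if not 20 < area < 400:            # discard debris and large clumps
            continue
        perimeter = cv2.arcLength(c, True)
        x, y, w, h = cv2.boundingRect(c)
        features.append({
            "area": area,
            "perimeter": perimeter,
            "elongation": max(w, h) / max(1, min(w, h)),   # bacilli are rod-shaped
            "compactness": 4 * np.pi * area / (perimeter ** 2 + 1e-6),
        })
    return features   # these shape features would feed the classifier stage
```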

  13. Sampling the stream landscape: Improving the applicability of an ecoregion-level capture probability model for stream fishes

    Science.gov (United States)

    Mollenhauer, Robert; Mouser, Joshua B.; Brewer, Shannon K.

    2018-01-01

    Temporal and spatial variability in streams result in heterogeneous gear capture probability (i.e., the proportion of available individuals identified) that confounds interpretation of data used to monitor fish abundance. We modeled tow-barge electrofishing capture probability at multiple spatial scales for nine Ozark Highland stream fishes. In addition to fish size, we identified seven reach-scale environmental characteristics associated with variable capture probability: stream discharge, water depth, conductivity, water clarity, emergent vegetation, wetted width–depth ratio, and proportion of riffle habitat. The magnitude of the relationship between capture probability and both discharge and depth varied among stream fishes. We also identified lithological characteristics among stream segments as a coarse-scale source of variable capture probability. The resulting capture probability model can be used to adjust catch data and derive reach-scale absolute abundance estimates across a wide range of sampling conditions with similar effort as used in more traditional fisheries surveys (i.e., catch per unit effort). Adjusting catch data based on variable capture probability improves the comparability of data sets, thus promoting both well-informed conservation and management decisions and advances in stream-fish ecology.
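
    The practical use of such a model is to convert raw catch into an absolute abundance estimate, N ≈ C / p, where p is the capture probability predicted from reach covariates on the logit scale; the covariates and coefficients in the sketch below are hypothetical, not the fitted values from this study.

```python
# Adjusting catch by a modeled capture probability (hypothetical coefficients).
import math

def predicted_capture_probability(coefs, covariates):
    """Inverse-logit of a linear predictor built from reach-scale covariates."""
    eta = coefs["intercept"] + sum(coefs[k] * covariates[k] for k in covariates)
    return 1.0 / (1.0 + math.exp(-eta))

def adjusted_abundance(catch, p_hat):
    return catch / p_hat

example_coefs = {"intercept": -0.2, "discharge": -0.8, "depth": -0.5, "clarity": 0.6}
example_reach = {"discharge": 0.4, "depth": 0.3, "clarity": 0.7}   # standardized values
p_hat = predicted_capture_probability(example_coefs, example_reach)
n_hat = adjusted_abundance(42, p_hat)   # reach-scale absolute abundance estimate
```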

  14. Adiabatic capture and debunching

    International Nuclear Information System (INIS)

    Ng, K.Y.

    2012-01-01

    In the study of beam preparation for the g-2 experiment, adiabatic debunching and adiabatic capture are revisited. The voltage programs for these adiabatic processes are derived and their properties discussed. Comparison is made with another form of adiabatic capture program. The muon g-2 experiment at Fermilab calls for intense proton bunches for the creation of muons. A booster batch of 84 bunches is injected into the Recycler Ring, where it is debunched and captured into 4 intense bunches with the 2.5-MHz rf. The experiment requires short bunches with total width less than 100 ns. The transport line from the Recycler to the muon-production target has a low momentum aperture of ∼ ±22 MeV. Thus each of the 4 intense proton bunches is required to have an emittance less than ∼ 3.46 eVs. The incoming booster bunches have total emittance ∼ 8.4 eVs, or each one an emittance of ∼ 0.1 eVs. However, there is always an emittance increase when the 84 booster bunches are debunched. There will be an even larger emittance increase during adiabatic capture into the buckets of the 2.5-MHz rf. In addition, the incoming booster bunches may have emittances larger than 0.1 eVs. In this article, we concentrate on the analysis of the adiabatic capture process with the intention of preserving the beam emittance as much as possible. At this moment, a beam preparation experiment is being performed at the Main Injector. Since the Main Injector and the Recycler Ring have roughly the same lattice properties, in our discussions we refer to adiabatic capture in the Main Injector instead.
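
    For orientation, one commonly quoted iso-adiabatic voltage program, which raises the rf voltage from V_i to V_f while keeping the adiabaticity parameter roughly constant, is sketched below; it is a generic textbook form rather than necessarily the program derived in this report, and the numbers are illustrative.

```python
# Generic iso-adiabatic capture voltage ramp (illustrative values).
import numpy as np

def iso_adiabatic_voltage(t, t_total, v_initial, v_final):
    """V(t) = V_i / (1 - (1 - sqrt(V_i/V_f)) * t/T)^2, so V(0) = V_i and V(T) = V_f."""
    x = t / t_total
    return v_initial / (1.0 - (1.0 - np.sqrt(v_initial / v_final)) * x) ** 2

# Example: ramp from 3 kV to 80 kV over 90 ms.
t = np.linspace(0.0, 0.09, 200)
v = iso_adiabatic_voltage(t, 0.09, 3e3, 80e3)
```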

  15. Precision about the automatic emotional brain.

    Science.gov (United States)

    Vuilleumier, Patrik

    2015-01-01

    The question of automaticity in emotion processing has been debated under different perspectives in recent years. Satisfying answers to this issue will require a better definition of automaticity in terms of relevant behavioral phenomena, ecological conditions of occurrence, and a more precise mechanistic account of the underlying neural circuits.

  16. Automatic control of nuclear power plants

    International Nuclear Information System (INIS)

    Jover, P.

    1976-01-01

    The fundamental concepts in automatic control are surveyed, and the purpose of the automatic control of pressurized water reactors is given. The response characteristics for the main components are then studied and block diagrams are given for the main control loops (turbine, steam generator, and nuclear reactors).

  17. Password Management Systems

    OpenAIRE

    Fiala, Lukáš

    2010-01-01

    The goal is to describe and compare password management utilities and applications for individuals or small teams. Examples of such applications are KeePass, Password Safe, CodeWallet Pro and eWallet. On-line password managers like LastPass or RoboForm are another group of applications. Support for synchronization, sharing passwords in a team or protection against keyloggers (when filling in automatically) may also be included in the criteria.

  18. Design of patient rooms and automatic radioiodine-131 waste water management system for a thyroid cancer treatment ward: ‘Suandok model’

    International Nuclear Information System (INIS)

    Vilasdechanon, N; Ua-apisitwong, S; Chatnampet, K; Ekmahachai, M; Vilasdechanon, J

    2014-01-01

    The great benefit of 131I radionuclide treatment for differentiated thyroid cancer (DTC) is reflected in the long survival rate. The main requirements for 131I therapy in hospital are treatment facilities and a radiation safety plan that assures radiation protection and safety for patients, hospital workers, the public, and the environment. Objective: To introduce the concepts and methods of radiation safety design for patient rooms in a 131I treatment ward and a system of radioactive waste water management in hospital. Methods: The design was based on principles of external and internal radiation protection for unsealed sources and on radioactive waste management. Planning for the treatment facilities drew on clinical evidence, physical and physiological information for 131I, radiation safety criteria, and hospital resources and budget. The three phases of the working process were construction, software development, and radiation safety assessment. Results: The 131I treatment facility and automatic radioactive waste water management system were fully implemented in 2009. The radioactive waste water management system, known as the 'Suandok Model', was highly recommended by the national regulator to hospitals that wish to provide 131I treatment for thyroid cancer. In 2011, the Nuclear Medicine Division, Chiang Mai University was recognized by the national authority for very good radiation practice in the development of safe working conditions and environment. Conclusion: The Suandok Model is a facility design that fulfils the requirements for the safe use of high 131I doses for thyroid cancer treatment in hospital. The facility presented in this study may not be suitable for all hospitals, but the design concepts can be applied according to an individual hospital's context and resources. People who use or gain benefit from radiation applications have to emphasise the responsibility to control and monitor radiation effects on individuals, communities

  19. The Generic Data Capture Facility

    Science.gov (United States)

    Connell, Edward B.; Barnes, William P.; Stallings, William H.

    1987-01-01

    The Generic Data Capture Facility, which can provide data capture support for a variety of different types of spacecraft while enabling operations costs to be carefully controlled, is discussed. The data capture functions, data protection, isolation of users from data acquisition problems, data reconstruction, and quality and accounting are addressed. The TDM and packet data formats utilized by the system are described, and the development of generic facilities is considered.

  20. Negative meson capture in hydrogen

    International Nuclear Information System (INIS)

    Baird, T.J.

    1977-01-01

    The processes of deexcitation and capture of negative mesons and hadrons in atomic hydrogen are investigated. Only slow collisions, in which the projectile-atom relative velocity is less than one atomic unit, are considered, and the motion of the incident particle is treated classically. For each classical trajectory the probability of ionizing the hydrogen atom is determined, together with the energy spectrum of the emitted electron. Ionization probabilities are calculated using the time-dependent formulation of the perturbed stationary state method. Exact two-center electronic wave functions are used for both bound and continuum states. The total ionization cross section and electron energy spectrum have been calculated for negative muons, kaons and antiprotons at incident relative velocities between 0.04 and 1.0 atomic units. The electron energy spectrum has a sharp peak for electron kinetic energies on the order of 10^-3 Rydbergs. The ionization process thus favors the emission of very slow electrons. The cross section for ionization with capture of the incident particle was calculated for relative kinetic energies greater than 1.0 Rydberg. Since ionization was found to occur with the emission of electrons of nearly zero kinetic energy, the fraction of ionizing collisions which result in capture decreases very rapidly with projectile kinetic energy. The energy distributions of slowed-down muons and hadrons were also computed. These distributions were used together with the capture cross section to determine the distribution of kinetic energies at which capture takes place. It was found that most captures occur for kinetic energies slightly less than 1.0 Rydberg, with relatively little capture at thermal energies. The captured particles therefore tend to go into very large and loosely bound orbits with binding energies less than 0.1 Rydbergs.

  1. Interactivity in automatic control: foundations and experiences

    OpenAIRE

    Dormido Bencomo, Sebastián; Guzmán Sánchez, José Luis; Costa Castelló, Ramon; Berenguel, M

    2012-01-01

    The first part of this paper presents the concepts of interactivity and visualization and their essential role in learning the fundamentals and techniques of automatic control. More than 10 years of the authors' experience in the development and design of interactive tools dedicated to the study of automatic control concepts is also presented. The second part of the paper summarizes the main features of the “Automatic Control with Interactive Tools” text that has been recently published by Pea...

  2. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

    with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be e#ciently implemented......Inheritance allows a class to be specialized and its attributes refined, but implementation specialization can only take place by overriding with manually implemented methods. Automatic program specialization can generate a specialized, effcient implementation. However, specialization of programs...

  3. CAPTURED India Country Evaluation

    NARCIS (Netherlands)

    O'Donoghue, R.; Brouwers, J.H.A.M.

    2012-01-01

    This report provides the findings of the India Country Evaluation and is produced as part of the overall CAPTURED End Evaluation. After five years of support by the CAPTURED project the End Evaluation has assessed that results are commendable. I-AIM was able to design an approach in which health

  4. Carbon Capture and Storage

    NARCIS (Netherlands)

    Benson, S.M.; Bennaceur, K.; Cook, P.; Davison, J.; Coninck, H. de; Farhat, K.; Ramirez, C.A.; Simbeck, D.; Surles, T.; Verma, P.; Wright, I.

    2012-01-01

    Emissions of carbon dioxide, the most important long-lived anthropogenic greenhouse gas, can be reduced by Carbon Capture and Storage (CCS). CCS involves the integration of four elements: CO 2 capture, compression of the CO2 from a gas to a liquid or a denser gas, transportation of pressurized CO 2

  5. Giant resonance effects in radiative capture

    International Nuclear Information System (INIS)

    Snover, K.A.

    1979-01-01

    The technique of capture reaction studies of giant resonance properties is described, and a number of examples are given. Most of the recent work of interest has been in proton capture, in part because of the great utility (and availability) of polarized beams; most of the discussion concerns this reaction. Alpha capture, which has been a useful tool for exploring isoscalar E2 strength, and neutron capture are, however, also treated. 46 references, 14 figures

  6. Learning-based automatic detection of severe coronary stenoses in CT angiographies

    Science.gov (United States)

    Melki, Imen; Cardon, Cyril; Gogin, Nicolas; Talbot, Hugues; Najman, Laurent

    2014-03-01

    3D cardiac computed tomography angiography (CCTA) is becoming a standard routine for non-invasive heart disease diagnosis. Thanks to its high negative predictive value, CCTA is increasingly used to decide whether or not the patient should be considered for invasive angiography. However, an accurate assessment of cardiac lesions using this modality is still a time-consuming task and needs a high degree of clinical expertise. Thus, providing an automatic tool to assist clinicians during the diagnosis task is highly desirable. In this work, we propose a fully automatic approach for accurate severe cardiac stenosis detection. Our algorithm uses Random Forest classification to detect stenotic areas. First, the classifier is trained on 18 cardiac CT exams with a CTA reference standard. Then, the classification result is used to detect severe stenoses (with a narrowing degree higher than 50%) in a database of 30 cardiac CT exams. Features that best capture the different stenosis configurations are extracted along the vessel centerlines at different scales. To ensure robustness against changes in vessel direction and scale, we extract features inside cylindrical patterns with variable directions and radii. Thus, we make sure that the ROIs contain only the vessel walls. The algorithm is evaluated using the Rotterdam Coronary Artery Stenoses Detection and Quantification Evaluation Framework. The evaluation is performed using reference standard quantifications obtained from quantitative coronary angiography (QCA) and consensus reading of CTA. The obtained results show that we can reliably detect severe stenoses with a sensitivity of 64%.
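
    As a rough illustration of the kind of pipeline described above (a Random Forest applied to features sampled along vessel centerlines), the sketch below trains and applies such a classifier; the feature layout, labels and probability threshold are invented placeholders, not the authors' CCTA implementation.

```python
# Hedged sketch of a Random-Forest-based stenosis detector: features are assumed to be
# intensity statistics sampled in cylinders of several radii along each vessel centerline.
# The data here are synthetic placeholders, not the paper's CCTA features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def cylinder_features(n_points, n_scales=3):
    """Placeholder for per-centerline-point features extracted at several cylinder radii."""
    return rng.normal(size=(n_points, 4 * n_scales))  # e.g. mean/std/min/max intensity per scale

# Training: centerline points labelled 1 if they lie in a >50% stenosis, else 0 (synthetic labels).
X_train = cylinder_features(2000)
y_train = (rng.random(2000) < 0.1).astype(int)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Detection on a new exam: flag centerline points whose predicted probability exceeds a
# threshold; consecutive flagged points would then be merged into candidate stenotic segments.
X_new = cylinder_features(500)
prob = clf.predict_proba(X_new)[:, 1]
flagged = np.flatnonzero(prob > 0.5)
print(f"{flagged.size} candidate stenotic centerline points")
```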

  7. Automatic NAA. Saturation activities

    International Nuclear Information System (INIS)

    Westphal, G.P.; Grass, F.; Kuhnert, M.

    2008-01-01

    A system for Automatic NAA is based on a list of specific saturation activities determined for one irradiation position at a given neutron flux and a single detector geometry. Originally compiled from measurements of standard reference materials, the list may also be extended by the calculation of saturation activities from k0 and Q0 factors, and f and α values of the irradiation position. A systematic improvement of the SRM approach is currently being performed by pseudo-cyclic activation analysis, to reduce counting errors. From these measurements, the list of saturation activities is recalculated in an automatic procedure. (author)

  8. Motion Capturing Emotions

    OpenAIRE

    Wood Karen; Cisneros Rosemary E.; Whatley Sarah

    2017-01-01

    The paper explores the activities conducted as part of WhoLoDancE: Whole Body Interaction Learning for Dance Education which is an EU-funded Horizon 2020 project. In particular, we discuss the motion capture sessions that took place at Motek, Amsterdam as well as the dancers’ experience of being captured and watching themselves or others as varying visual representations through the HoloLens. HoloLens is Microsoft’s first holographic computer that you wear as you would a pair of glasses. The ...

  9. Development project of an automatic sampling system for part time unmanned pipeline terminals

    Energy Technology Data Exchange (ETDEWEB)

    Moreira, Guilherme O.; De Almeida, Marcio M. G.; Ramos, Ricardo R. [Petrobras (Brazil)]; Potten, Gary [Cameron Measurement Systems (United States)]

    2010-07-01

    The Sao Paulo - Brasilia Pipeline (OSBRA) is a highly automated pipeline using a SCADA system which operates from a control room. A new quality management system standard was established for transportation and storage operations. The products had to be sampled on an automatic basis. This paper reports the development of an automatic sampling system (ASS) in accordance with the new quality control standard. The prototype was developed to be implemented through a human-machine interface (HMI) from the control room SCADA screens. A technical cooperation agreement (TCA) was drawn up for development of this new ASS product. The TCA was a joint cooperation between the Holding, the Operator and the cooperators. The prototype will be on-field tested at Senador Canedo tank farm to SPEC requirements. The current performance of the ASS establishes reasonable expectations for further successful development.

  10. Capture and Geological Storage of CO2

    International Nuclear Information System (INIS)

    Kerr, T.; Brockett, S.; Hegan, L.; Barbucci, P.; Tullius, K.; Scott, J.; Otter, N.; Cook, P.; Hill, G.; Dino, R.; Aimard, N.; Giese, R.; Christensen, N.P.; Munier, G.; Paelinck, Ph.; Rayna, L.; Stromberg, L.; Birat, J.P.; Audigane, P.; Loizzo, M.; Arts, R.; Fabriol, H.; Radgen, P.; Hartwell, J.; Wartmann, S.; Drosin, E.; Willnow, K.; Moisan, F.

    2009-01-01

    To build on the growing success of the first two international symposia on emission reduction and CO 2 capture and geological storage, held in Paris in 2005 and again in 2007, IFP, ADEME and BRGM organised a third event on the same topic on 5-6 November 2009. This time, the focus was on the urgency of industrial deployment. Indeed, the IPCC 4th assessment report indicates that the world must achieve a 50 to 85% reduction in CO 2 emissions by 2050 compared to 2000, in order to limit the global temperature increase to around 2 deg. C. Moreover, IPCC stresses that a 'business as usual' scenario could lead to a temperature increase of between 4 deg. C and 7 deg. C across the planet. The symposium was organized in 4 sessions: Session I - Regulatory framework and strategies for enabling CCS deployment: - CCS: international status of political, regulatory and financing issues (Tom Kerr, IEA); - EC regulatory framework (Scott Brockett, European Commission, DG ENV); - Canada's investments towards implementation of CCS in Canada (Larry Hegan, Office of Energy Research and Development - Government of Canada); - A power company perspective (Pietro Barbucci, ENEL); - EC CCS demonstration network (Kai Tullius, European Commission, DG TREN); - Strategies and policies for accelerating global CCS deployment (Jesse Scott, E3G); - The global CCS Institute, a major initiative to facilitate the rapid deployment of CCS (Nick Otter, GCCSI); Session II - From pilot to demonstration projects: - Otway project, Australia (David Hilditch, CO2 CRC); - US regional partnerships (Gerald Hill, Southeast Regional Carbon Sequestration Partnership - SECARB); - CCS activities in Brazil (Rodolfo Dino, Petrobras); - Lessons learnt from Ketzin CO2Sink project in Germany (Ruediger Giese, GFZ); - CO 2 storage - from laboratory to reality (Niels-Peter Christensen, Vattenfall); - Valuation and storage of CO 2 : A global project for carbon management in South-East France (Gilles Munier, Geogreen); Session III

  11. Automatic measurement for solid state track detectors

    International Nuclear Information System (INIS)

    Ogura, Koichi

    1982-01-01

    Since the tracks in solid state track detectors are measured with a microscope, observers are forced to do hard work that consumes time and labour. This leads to poor statistical accuracy or to personal error. Therefore, much research has aimed at simplifying and automating track measurement. There are two categories in automating the measurement: simple counting of the number of tracks, and cases that require knowledge of geometrical elements such as the size of tracks or their coordinates as well as the number of tracks. The former is called automatic counting and the latter automatic analysis. The general method to evaluate the number of tracks in automatic counting is the estimation of the total number of tracks in the total detector area or in a field of view of a microscope. It is suitable for counting when the track density is high. Methods to count tracks one by one include spark counting and the scanning microdensitometer. Automatic analysis includes video image analysis, in which high quality images obtained with a high resolution video camera are processed with a micro-computer, and the tracks are automatically recognized and measured by feature extraction. This method is described in detail. Among the many kinds of automatic measurements reported so far, the most frequently used are ''spark counting'' and ''video image analysis''. (Wakatsuki, Y.)

  12. Trigger for the next Industrial Revolution Automatic Operations Management

    OpenAIRE

    Marathe , Laxman ,

    2016-01-01

    International audience; A technology that manages operations at micro level with full future visibility will be the next big change. Production cost will be greatly reduced as cost of coordination is eliminated. Instead of selling products, manufacturers may hire their facilities to several customers on a time-shared basis allowing them to manage what, how much and when to produce what they desire by only paying for the actual time resources were used. Laxman C. Marathe Researcher (Factory Ph...

  13. Automatization of welding

    International Nuclear Information System (INIS)

    Iwabuchi, Masashi; Tomita, Jinji; Nishihara, Katsunori.

    1978-01-01

    Automatization of welding is one of the effective measures for securing a high degree of quality in nuclear power equipment, as well as for coping with the working environment at the plant site. As the latest of the automatic welders practically used for welding nuclear power apparatus in the factories of Toshiba and IHI, those for pipes and lining tanks are described here. The pipe welder performs buttering welding on the inside of the pipe end as the so-called IGSCC countermeasure, and the succeeding butt welding, through the same controller. The lining tank welder is able to perform simultaneous welding of two parallel weld lines on a large thin-plate lining tank. Both types of welders are demonstrating excellent performance in the shops as well as at the plant site. (author)

  14. Exploratory field trial of motorcycle autonomous emergency braking (MAEB): Considerations on the acceptability of unexpected automatic decelerations.

    Science.gov (United States)

    Savino, Giovanni; Pierini, Marco; Thompson, Jason; Fitzharris, Michael; Lenné, Michael G

    2016-11-16

    Autonomous emergency braking (AEB) acts to slow down a vehicle when an unavoidable impending collision is detected. In addition to documented benefits when applied to passenger cars, AEB has also shown potential when applied to motorcycles (MAEB). However, the feasibility of MAEB as practically applied to motorcycles in the real world is not well understood. In this study we performed a field trial involving 16 riders on a test motorcycle subjected to automatic decelerations, thus simulating MAEB activation. The tests were conducted along a rectilinear path at a nominal speed of 40 km/h and with a mean deceleration of 0.15 g (15% of full braking) deployed at random times. Riders were also exposed to one final undeclared brake activation with the aim of providing genuinely unexpected automatic braking events. Participants were consistently able to manage automatic decelerations of the vehicle with minor to moderate effort. Results of undeclared activations were consistent with those of standard runs. This study demonstrated the feasibility of a moderate automatic deceleration in a scenario of a motorcycle travelling along a straight path, supporting the notion that the application of AEB on motorcycles is practicable. Furthermore, the proposed field trial can be used as a reference for future regulation or consumer tests in order to address the safety and acceptability of unexpected automatic decelerations on a motorcycle.

  15. A survey of the Carbon Capture

    International Nuclear Information System (INIS)

    Jokrllova, J.; Cik, G.; Takacova, A.; Smolinska, M.

    2014-01-01

    The concentration of carbon dioxide, one of the most important representatives of greenhouse gases in the atmosphere, continues to rise. Fossil fuels burned in thermal power plants currently represent 80% of total energy production around the world and are the largest point sources of CO 2 , accounting for approximately 40% of total CO 2 emissions. There are several options for reducing CO 2 emissions: reducing demand, improving production efficiency, and carbon capture and storage (CCS). Capture and storage of carbon dioxide is generally a three-step process: (1) capture and compression of combustion products, (2) transport (mostly by pipeline) and (3) utilization (e.g. production of urea, the beverage industry, production of dry ice, etc.). Technologies for CO 2 capture used in power plants burning fossil fuels can be divided into four groups, each of which requires a completely different approach to CO 2 capture.

  16. Using Modern Technologies to Capture and Share Indigenous Astronomical Knowledge

    Science.gov (United States)

    Nakata, Martin; Hamacher, Duane W.; Warren, John; Byrne, Alex; Pagnucco, Maurice; Harley, Ross; Venugopal, Srikumar; Thorpe, Kirsten; Neville, Richard; Bolt, Reuben

    2014-06-01

    Indigenous Knowledge is important for Indigenous communities across the globe and for the advancement of our general scientific knowledge. In particular, Indigenous astronomical knowledge integrates many aspects of Indigenous Knowledge, including seasonal calendars, navigation, food economics, law, ceremony, and social structure. Capturing, managing, and disseminating this knowledge in the digital environment poses a number of challenges, which we aim to address using a collaborative project emerging between experts in the higher education, library, archive and industry sectors. Using Microsoft's WorldWide Telescope and Rich Interactive Narratives technologies, we propose to develop software, media design, and archival management solutions to allow Indigenous communities to share their astronomical knowledge with the world on their terms and in a culturally sensitive manner.

  17. Continuum capture in the three-body problem

    International Nuclear Information System (INIS)

    Sellin, I.A.

    1980-01-01

    The three-body problem, especially the problem of electron capture to the continuum in heavy particle collisions is reviewed. Major topics covered include: second born-induced asymmetry in electron capture to the continuum; historical context, links to other tests of atomic scattering theory; experiments characterizing the velocity distribution of ECC electrons; other atomic physics tests of high velocity Born expansions; atom capture; capture by positrons; and pion capture to the continuum

  18. Knowledge management: another management fad?

    Directory of Open Access Journals (Sweden)

    Leonard J. Ponzi

    2002-01-01

    Full Text Available Knowledge management is the subject of a growing body of literature. While capturing the interest of practitioners and scholars in the mid-1990s, knowledge management remains a broadly defined concept with faddish characteristics. Based on annual counts of articles retrieved from Science Citation Index, Social Science Citation Index, and ABI Inform referring to three previously recognized management fads, this paper introduces empirical evidence suggesting that a typical management movement generally reveals itself as a fad in approximately five years. In applying this approach and assumption to the case of knowledge management, the findings suggest that knowledge management is at least living longer than typical fads and perhaps is in the process of establishing itself as a new aspect of management. To further the understanding of knowledge management's development, its interdisciplinary activity and breadth are reported and briefly discussed.

  19. Speed and automaticity of word recognition - inseparable twins?

    DEFF Research Database (Denmark)

    Poulsen, Mads; Asmussen, Vibeke; Elbro, Carsten

    'Speed and automaticity' of word recognition is a standard collocation. However, it is not clear whether speed and automaticity (i.e., effortlessness) make independent contributions to reading comprehension. In theory, both speed and automaticity may save cognitive resources for comprehension...... processes. Hence, the aim of the present study was to assess the unique contributions of word recognition speed and automaticity to reading comprehension while controlling for decoding speed and accuracy. Method: 139 Grade 5 students completed tests of reading comprehension and computer-based tests of speed...... of decoding and word recognition together with a test of effortlessness (automaticity) of word recognition. Effortlessness was measured in a dual task in which participants were presented with a word enclosed in an unrelated figure. The task was to read the word and decide whether the figure was a triangle...

  20. Real-Time Human Detection for Aerial Captured Video Sequences via Deep Models

    Directory of Open Access Journals (Sweden)

    Nouar AlDahoul

    2018-01-01

    Full Text Available Human detection in videos plays an important role in various real-life applications. Most traditional approaches depend on utilizing handcrafted features, which are problem-dependent and optimal for specific tasks. Moreover, they are highly susceptible to dynamical events such as illumination changes, camera jitter, and variations in object sizes. On the other hand, feature learning approaches are cheaper and easier because highly abstract and discriminative features can be produced automatically without the need for expert knowledge. In this paper, we utilize automatic feature learning methods which combine optical flow and three different deep models (i.e., a supervised convolutional neural network (S-CNN), a pretrained CNN feature extractor, and a hierarchical extreme learning machine (H-ELM)) for human detection in videos captured using a nonstatic camera on an aerial platform with varying altitudes. The models are trained and tested on the publicly available and highly challenging UCF-ARG aerial dataset. The comparison between these models in terms of training and testing accuracy and learning speed is analyzed. The performance evaluation considers five human actions (digging, waving, throwing, walking, and running). Experimental results demonstrated that the proposed methods are successful for the human detection task. The pretrained CNN produces an average accuracy of 98.09%. S-CNN produces an average accuracy of 95.6% with soft-max and 91.7% with Support Vector Machines (SVM). H-ELM has an average accuracy of 95.9%. Using a normal Central Processing Unit (CPU), H-ELM's training time takes 445 seconds. Learning in S-CNN takes 770 seconds with a high-performance Graphical Processing Unit (GPU).
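
    As a minimal, hedged sketch of the "pretrained CNN feature extractor plus SVM" variant mentioned above (assuming a generic ImageNet-pretrained torchvision backbone; the UCF-ARG preprocessing, optical-flow input and exact architectures are not reproduced), frame-level classification could look like this:

```python
# Hedged sketch: frame-level human/no-human classification with a pretrained CNN used as a
# fixed feature extractor and an SVM on top, loosely following the approach described above.
# Dataset loading is omitted; the tensors below are synthetic stand-ins for normalized crops.
import torch
import torch.nn as nn
import torchvision.models as models
from sklearn.svm import SVC

backbone = models.resnet18(weights="IMAGENET1K_V1")
backbone.fc = nn.Identity()            # drop the classification head, keep 512-d features
backbone.eval()

@torch.no_grad()
def extract_features(frames):          # frames: (N, 3, 224, 224)
    return backbone(frames).numpy()

train_frames, test_frames = torch.randn(64, 3, 224, 224), torch.randn(16, 3, 224, 224)
train_labels = torch.randint(0, 2, (64,)).numpy()   # 1 = human present (synthetic labels)

svm = SVC(kernel="linear").fit(extract_features(train_frames), train_labels)
print(svm.predict(extract_features(test_frames)))
```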

  1. Facilitate generation connections on Orkney by automatic distribution network management

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This report summarises the results of a study assessing the capability and limitations of the Orkney Network under a variety of conditions (demand, generation connections, network configuration, and reactive compensation). A conceptual active management scheme (AMS) suitable for the conditions on Orkney is developed and evaluated. Details are given of a proposed framework for the design and evaluation of future active management schemes, logic control sequences for managed generation units, and a proposed evaluation method for the active management scheme. Implications of introducing the proposed AMS are examined, and the commercial aspects of an AMS and system security are considered. The existing Orkney network is described, and an overview of the SHEPDL (Scottish Hydro Electric Power Distribution Ltd.) SCADA system is presented with a discussion of AMS identification, selection, and development.
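
    The "logic control sequences for managed generation units" referred to above are essentially curtailment rules. The sketch below illustrates one plausible form of such a rule; the unit names, limits and trimming margin are invented for illustration and do not reflect the actual Orkney AMS logic.

```python
# Hedged sketch of an active-management curtailment rule: when the measured flow on a
# constrained circuit exceeds its limit, non-firm generators are trimmed in a fixed priority
# order until the constraint is relieved. All names and numbers are illustrative assumptions.
def curtail(generation_mw, measured_flow_mw, limit_mw, priority, margin=0.5):
    setpoints = dict(generation_mw)
    excess = measured_flow_mw - limit_mw
    for unit in priority:                              # last-in, first-off style priority list
        if excess <= 0:
            break
        trim = min(setpoints[unit], excess + margin)   # trim with a small security margin
        setpoints[unit] -= trim
        excess -= trim
    return setpoints

units = {"wind_A": 4.0, "wind_B": 3.0, "wind_C": 2.0}
print(curtail(units, measured_flow_mw=21.0, limit_mw=18.0, priority=["wind_C", "wind_B", "wind_A"]))
# -> wind_C fully trimmed, wind_B partially trimmed, wind_A untouched
```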

  2. AGR fuel management using PANTHER

    International Nuclear Information System (INIS)

    Haddock, S.A.; Parks, G.T.

    1995-01-01

    This paper describes recent improvements in the AGR fuel management methodology implemented within PANTHER and the use of the code both for stand-alone calculations and within an automatic optimisation procedure. (author)

  3. Capture of irregular satellites at Jupiter

    International Nuclear Information System (INIS)

    Nesvorný, David; Vokrouhlický, David; Deienno, Rogerio

    2014-01-01

    The irregular satellites of outer planets are thought to have been captured from heliocentric orbits. The exact nature of the capture process, however, remains uncertain. We examine the possibility that irregular satellites were captured from the planetesimal disk during the early solar system instability when encounters between the outer planets occurred. Nesvorný et al. already showed that the irregular satellites of Saturn, Uranus, and Neptune were plausibly captured during planetary encounters. Here we find that the current instability models present favorable conditions for capture of irregular satellites at Jupiter as well, mainly because Jupiter undergoes a phase of close encounters with an ice giant. We show that the orbital distribution of bodies captured during planetary encounters provides a good match to the observed distribution of irregular satellites at Jupiter. The capture efficiency for each particle in the original transplanetary disk is found to be (1.3-3.6) × 10^-8. This is roughly enough to explain the observed population of jovian irregular moons. We also confirm Nesvorný et al.'s results for the irregular satellites of Saturn, Uranus, and Neptune.

  4. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan; Calo, Victor M.

    2010-01-01

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques

  5. Towards automatic exchange of information

    OpenAIRE

    Oberson, Xavier

    2015-01-01

    This article describes the various steps that led towards automatic exchange of information as the global standard and the issues that remain to be solved. First, the various competing models of information exchange, such as Double Tax Treaties (DTT), TIEAs, FATCA or EU Directives, are described with a view to showing how they interact with each other. Second, the so-called Rubik Strategy is summarized and compared with automatic exchange of information (AEOI). The third part then describes ...

  6. CO2 Capture and Reuse

    International Nuclear Information System (INIS)

    Thambimuthu, K.; Gupta, M.; Davison, J.

    2003-01-01

    CO2 capture and storage including its utilization or reuse presents an opportunity to achieve deep reductions in greenhouse gas emissions from fossil energy use. The development and deployment of this option could significantly assist in meeting a future goal of achieving stabilization of the presently rising atmospheric concentration of greenhouse gases. CO2 capture from process streams is an established concept that has achieved industrial practice. Examples of current applications include the use of primarily solvent-based capture technologies for the recovery of pure CO2 streams for chemical synthesis, for utilization as a food additive, for use as a miscible agent in enhanced oil recovery operations and removal of CO2 as an undesired contaminant from gaseous process streams for the production of fuel gases such as hydrogen and methane. In these applications, the technologies deployed for CO2 capture have focused on gas separation from high purity, high pressure streams and in reducing (or oxygen deficient) environments, where the energy penalties and cost for capture are moderately low. However, application of the same capture technologies for large scale abatement of greenhouse gas emissions from fossil fuel use poses significant challenges in achieving (at comparably low energy penalty and cost) gas separation in large volume, dilute concentration and/or low pressure flue gas streams. This paper will focus on a review of existing commercial methods of CO2 capture and the technology stretch, process integration and energy system pathways needed for their large scale deployment in fossil fueled processes. The assessment of potential capture technologies for the latter purpose will also be based on published literature data that are both 'transparent' and 'systematic' in their evaluation of the overall cost and energy penalties of CO2 capture. In view of the fact that many of the existing commercial processes for CO2 capture have seen applications in

  7. Automatically processed alpha-track radon monitor

    International Nuclear Information System (INIS)

    Langner, G.H. Jr.

    1993-01-01

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided

  8. Comparing urban and wildland bear densities with a DNA-based capture-mark-recapture approach

    OpenAIRE

    Fusaro, Jonathan L.; Conner, Mary M.; Conover, Michael R.; Taylor, Timothy J.; Kenyon, Marc W., Jr.; Sherman, Jamie R.; Ernest, Holly B.

    2017-01-01

    California’s black bear (Ursus americanus) population has tripled over the last 3 decades, causing an increased incidence of human–bear conflicts, many of which now occur in urban areas. Consequently, it is imperative that bear managers have the ability to monitor population parameters in both wildland and urban environments to help manage bears. Capture-mark-recapture (CMR) methods using uniquely typed genetic samples (DNA) collected via hair-snares have been widely used to monitor bears in ...

  9. Confusion noise from LISA capture sources

    International Nuclear Information System (INIS)

    Barack, Leor; Cutler, Curt

    2004-01-01

    Captures of compact objects (COs) by massive black holes (MBHs) in galactic nuclei will be an important source for LISA, the proposed space-based gravitational wave (GW) detector. However, a large fraction of captures will not be individually resolvable - either because they are too distant, have unfavorable orientation, or have too many years to go before final plunge - and so will constitute a source of 'confusion noise', obscuring other types of sources. In this paper we estimate the shape and overall magnitude of the GW background energy spectrum generated by CO captures. This energy spectrum immediately translates to a spectral density S_h^capt(f) for the amplitude of capture-generated GWs registered by LISA. The overall magnitude of S_h^capt(f) is linear in the CO capture rates, which are rather uncertain; therefore we present results for a plausible range of rates. S_h^capt(f) includes the contributions from both resolvable and unresolvable captures, and thus represents an upper limit on the confusion noise level. We then estimate what fraction of S_h^capt(f) is due to unresolvable sources and hence constitutes confusion noise. We find that almost all of the contribution to S_h^capt(f) coming from white dwarf and neutron star captures, and at least ∼30% of the contribution from black hole captures, is from sources that cannot be individually resolved. Nevertheless, we show that the impact of capture confusion noise on the total LISA noise curve ranges from insignificant to modest, depending on the rates. Capture rates at the high end of estimated ranges would raise LISA's overall (effective) noise level [f S_h^eff(f)]^(1/2) by at most a factor ∼2 in the frequency range 1-10 mHz, where LISA is most sensitive. While this slightly elevated noise level would somewhat decrease LISA's sensitivity to other classes of sources, we argue that, overall, this would be a pleasant problem for LISA to have: It would also imply that detection rates for CO captures

  10. Automatic Lamp and Fan Control Based on Microcontroller

    Science.gov (United States)

    Widyaningrum, V. T.; Pramudita, Y. D.

    2018-01-01

    In general, automation can be described as a process following pre-determined sequential steps with little or no human exertion. Automation is provided through the use of various sensors suitable for observing the production processes, actuators, and different techniques and devices. In this research, the automation system developed consists of an automatic lamp and an automatic fan in a smart home. Both of these systems are controlled by an Arduino Mega 2560 microcontroller. The microcontroller is used to obtain values of physical conditions through the sensors connected to it. The automatic lamp system uses an LDR (Light Dependent Resistor) sensor to detect light, while the automatic fan system uses a DHT11 sensor to detect temperature. In the tests performed, the lamp and fan worked properly. The lamp turns on automatically when the light begins to darken, and also turns off automatically when the light becomes bright again. In addition, it can also be concluded that the readings of an LDR sensor placed outside the room differ from those of an LDR sensor placed inside the room. This is because the light intensity received by the LDR sensor in the room is blocked by the wall of the house or by other objects. The fan turns on automatically when the temperature is greater than 25°C, and the fan speed can also be adjusted. The fan turns off automatically when the temperature is less than or equal to 25°C.
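
    The control logic described above is simple threshold logic. The sketch below simulates the same idea in Python; the thresholds, sensor reads and proportional fan speed are illustrative assumptions, and real firmware for the Arduino Mega 2560 would be written in C/C++ against the actual LDR and DHT11 pins.

```python
# Hedged sketch simulating the lamp/fan threshold logic described above.
# Sensor readings are faked; on the real Arduino these would come from the LDR and DHT11.
import random
import time

LIGHT_THRESHOLD = 300       # illustrative LDR reading below which the lamp turns on
TEMP_THRESHOLD_C = 25.0     # fan turns on above 25 degrees C, as in the record

def read_ldr():             # placeholder for analogRead() on the LDR pin
    return random.randint(0, 1023)

def read_temperature_c():   # placeholder for a DHT11 temperature read
    return random.uniform(20.0, 32.0)

def fan_speed(temp_c):      # simple proportional speed above the threshold (0-255 PWM)
    return 0 if temp_c <= TEMP_THRESHOLD_C else min(255, int((temp_c - TEMP_THRESHOLD_C) * 40))

for _ in range(5):          # control loop (would be loop() on the microcontroller)
    light, temp = read_ldr(), read_temperature_c()
    lamp_on = light < LIGHT_THRESHOLD
    print(f"light={light:4d} lamp={'ON' if lamp_on else 'off'}  "
          f"temp={temp:4.1f}C fan_pwm={fan_speed(temp)}")
    time.sleep(0.1)
```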

  11. Carrier introduction to moire pattern for automatic fringe-order distinguishing

    International Nuclear Information System (INIS)

    Fang, J.; Laermann, K.H.

    1992-01-01

    This paper presents an automatic procedure for pseudo-colour encoding of moire fringe orders. A carrier consisting of parallel fringes is introduced before the specimen deforms. The carrier pattern is captured by a camera and then stored in a computer as a standard image. The spacing of the carrier fringes is distorted by the strains on the specimen as it is loaded. Under certain conditions, the orders of the frequency-modulated carrier still vary monotonically so that they can be easily distinguished. Both the standard fringe carrier and the frequency-modulated fringe pattern are transformed into two digital images, in which every fringe is encoded by one of the pseudo-colour codes corresponding to the monotonic fringe orders. At each pixel, the difference between the colour sequences of the two images is calculated to obtain the fringe order of pure deformation. The moire pattern of the in-plane displacement is restored as a pseudo-colour image whose colour changes display the variation of the fringe orders. (orig.)

  12. Nuclear muon capture

    CERN Document Server

    Mukhopadhyay, N C

    1977-01-01

    Our present knowledge of the nuclear muon capture reactions is surveyed. Starting from the formation of the muonic atom, various phenomena, having a bearing on the nuclear capture, are reviewed. The nuclear reactions are then studied from two angles-to learn about the basic muon+nucleon weak interaction process, and to obtain new insights on the nuclear dynamics. Future experimental prospects with the newer generation muon 'factories' are critically examined. Possible modification of the muon+nucleon weak interaction in complex nuclei remains the most important open problem in this field. (380 refs).

  13. Automatic control system at the ''Loviisa'' NPP

    International Nuclear Information System (INIS)

    Kukhtevich, I.V.; Mal'tsev, B.K.; Sergievskaya, E.N.

    1980-01-01

    The automatic control system of the Loviisa-1 NPP (Finland) is described. According to the operating conditions of the Finnish power system, the Loviisa-1 NPP must operate in the mode of weekly and daily control of the loading schedule and participate in the current control of power system frequency and capacity. To meet these requirements the NPP is equipped with an all-regime system for automatic control that functions during reactor start-up and shut-down, in normal and transient regimes, and in emergency situations. The automatic control system includes a data subsystem, an automatic control subsystem, a discrete control subsystem (including remote control), a subsystem for reactor control and protection, an overall station protection system, and in-reactor control and dosimetry. The structures of the data-computer complex, discrete control subsystems, reactor control and protection systems, neutron flux control system, in-reactor control system, station protection system and fuel element tightness control system are briefly presented. Two years of NPP operating experience confirmed the advisability of the chosen degree of automation. The Loviisa-1 NPP operates successfully in the mode of weekly and daily control of the supervisor schedule and current control of frequency (short-term control).

  14. Management of radioactive waste gases from PET radiopharmaceutical synthesis using cost effective capture systems integrated with a cyclotron safety system.

    Science.gov (United States)

    Stimson, D H R; Pringle, A J; Maillet, D; King, A R; Nevin, S T; Venkatachalam, T K; Reutens, D C; Bhalla, R

    2016-09-01

    The emphasis on the reduction of gaseous radioactive effluent associated with PET radiochemistry laboratories has increased. Various radioactive gas capture strategies have been employed historically, including expensive automated compression systems. We have implemented a new cost-effective strategy employing gas capture bags with electronic feedback that are integrated with the cyclotron safety system. Our strategy is suitable for multiple automated 18F radiosynthesis modules and individual automated 11C radiosynthesis modules. We describe novel gas capture systems that minimize the risk of human error and are routinely used in our facility.

  15. Managing electronic records

    CERN Document Server

    McLeod, Julie

    2005-01-01

    For records management courses, this book covers the theory and practice of managing electronic records as business and information assets. It focuses on the strategies, systems and procedures necessary to ensure that electronic records are appropriately created, captured, organized and retained over time to meet business and legal requirements.

  16. Automatic delimitation of microwatershed using SRTM data of the NASA

    Directory of Open Access Journals (Sweden)

    Freddy Aníbal Jumbo Castillo

    2015-12-01

    Full Text Available The watershed, as the basic territorial unit for planning and management of water resources, requires proper delimitation of its catchment or drainage area. The lack of geographic information on the micro watersheds of the Casacay river hydrographic unit therefore needed to be resolved. To this end, the research aimed at the automatic delimitation of micro watersheds using Geographic Information Systems (GIS) techniques and Shuttle Radar Topography Mission (SRTM) data with 30 meters spatial resolution. The selected methodology was the Pfafstetter one, with which nine micro watersheds were obtained with their respective codification, allowing continuation of the watershed standardization adopted by Ecuador's Water Secretariat. With the research results the watersheds will be updated with more detailed information, promoting the execution of tasks or activities related to the integrated management of the hydrographic unit studied

  17. 21 CFR 211.68 - Automatic, mechanical, and electronic equipment.

    Science.gov (United States)

    2010-04-01

    ... SERVICES (CONTINUED) DRUGS: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR FINISHED PHARMACEUTICALS Equipment § 211.68 Automatic, mechanical, and electronic equipment. (a) Automatic, mechanical, or electronic...

  18. Automatic Clustering Using FSDE-Forced Strategy Differential Evolution

    Science.gov (United States)

    Yasid, A.

    2018-01-01

    Clustering analysis is important in data mining for unsupervised data, because no adequate prior knowledge is available. One of the important tasks is defining the number of clusters without user involvement, which is known as automatic clustering. This study aims at acquiring the cluster number automatically utilizing forced strategy differential evolution (AC-FSDE). Two mutation parameters, namely a constant parameter and a variable parameter, are employed to boost differential evolution performance. Four well-known benchmark datasets were used to evaluate the algorithm. Moreover, the result is compared with other state-of-the-art automatic clustering methods. The experimental results show that AC-FSDE is better than or competitive with other existing automatic clustering algorithms.
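
    The record does not give the FSDE details, but the general idea of letting differential evolution fix the number of clusters automatically can be sketched as below, where each candidate cluster count is scored by the silhouette index. This is a generic illustration using scipy and scikit-learn, not the authors' AC-FSDE algorithm.

```python
# Hedged sketch: choosing the number of clusters automatically by letting differential
# evolution search over candidate k values scored with the silhouette index. This is a
# generic illustration of DE-driven automatic clustering, not the AC-FSDE algorithm of the
# record (which also evolves mutation parameters and cluster activations).
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)   # toy data with 4 clusters

def neg_silhouette(params):
    k = int(round(params[0]))                                  # candidate number of clusters
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    return -silhouette_score(X, labels)                        # DE minimizes, so negate

result = differential_evolution(neg_silhouette, bounds=[(2, 10)], seed=0, maxiter=20)
print("estimated number of clusters:", int(round(result.x[0])))
```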

  19. Configuration management and automatic control of an augmentor wing aircraft with vectored thrust

    Science.gov (United States)

    Cicolani, L. S.; Sridhar, B.; Meyer, G.

    1979-01-01

    An advanced structure for automatic flight control logic for powered-lift aircraft operating in terminal areas is under investigation at Ames Research Center. This structure is based on acceleration control; acceleration commands are constructed as the sum of acceleration on the reference trajectory and a corrective feedback acceleration to regulate path tracking errors. The central element of the structure, termed a Trimmap, uses a model of the aircraft aerodynamic and engine forces to calculate the control settings required to generate the acceleration commands. This report describes the design criteria for the Trimmap and derives a Trimmap for Ames experimental augmentor wing jet STOL research aircraft.

  20. Evaluation of automatic face recognition for automatic border control on actual data recorded of travellers at Schiphol Airport

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan; Hendrikse, A.J.; Gerritsen, K.J.; Brömme, A.; Busch, C.

    2012-01-01

    Automatic border control at airports using automated facial recognition for checking the passport is becoming more and more common. A problem is that it is not clear how reliable these automatic gates are. Very few independent studies exist that assess the reliability of automated facial recognition

  1. Nucleic acid hybridization assays employing dA-tailed capture probes. II. Advanced multiple capture methods

    International Nuclear Information System (INIS)

    Hunsaker, W.R.; Badri, H.; Lombardo, M.; Collins, M.L.

    1989-01-01

    A fourth capture is added to the reversible target capture procedure. This results in an improved radioisotopic detection limit of 7.3 × 10^-21 mol of target. In addition, the standard triple capture method is converted into a nonradioactive format with a detection limit of under 1 amol of target. The principal advantage of nonradioactive detection is that the entire assay can be performed in about 1 h. Nucleic acids are released from cells in the presence of the capture probe, which contains a 3'-poly(dA) sequence, and the labeled probe, which contains a detectable nonradioactive moiety such as biotin. After a brief hybridization in solution, the target is captured on oligo(dT) magnetic particles. The target is further purified from sample impurities and excess labeled probe by recapture either once or twice more on fresh magnetic particles. The highly purified target is then concentrated to 200 nl by recapture onto a poly(dT) nitrocellulose filter and rapidly detected with streptavidin-alkaline phosphatase using bromochloroindolyl phosphate and nitroblue tetrazolium. Using this procedure, as little as 0.25 amol of a target plasmid has been detected nonradioactively in crude samples in just 1 h without prior purification of the DNA and RNA. Finally, a new procedure called background capture is introduced to complement the background-reducing power of RTC.

  2. Report on achievements of research and development of an automatic sewing system in fiscal 1985. System management and control technology; 1985 nendo jido hosei system no kenkyu kaihatsu seika hokokusho. System kanri seigyo gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1986-03-01

    This paper describes the system management and control technology, extracted from the achievement report for fiscal 1985 on developing an automatic sewing system. The comprehensive system management work verified algorithms for optimal process formation and load balancing based on the industry's latest market and production plan trends, and obtained good results. Furthermore, discussions and proposals were made on the structure of an interface between the optimal process control computer and the automatic device control microcomputer. The inspection and failure diagnosis work set a quality standard for intermediate products in the sewing lines, and verified that its automation is possible by using image processing technology for dimensional inspection and color identification. In the research on printing control information, calculations were performed on the information amount, printing locations, expression method and printing speed, narrowing the required printing information down to parts ID and location information. Thus, items of basic information for media, media printing and printing device design were put in order. The information recognition work identified locations in a sewing factory where image recognition could be applied and the problems involved, and performed evaluation experiments on specific pattern matching and processing position recognition, obtaining good prospects for the application of automation. (NEDO)

  3. Automatic allograft bone selection through band registration and its application to distal femur.

    Science.gov (United States)

    Zhang, Yu; Qiu, Lei; Li, Fengzan; Zhang, Qing; Zhang, Li; Niu, Xiaohui

    2017-09-01

    Clinical reports suggest that large bone defects could be effectively restored by allograft bone transplantation, where allograft bone selection plays an important role. In addition, there is a strong demand for developing automatic allograft bone selection methods, as they could greatly improve the management efficiency of large bone banks. Although several automatic methods have been presented to select the most suitable allograft bone from the massive allograft bone bank, these methods still suffer from inaccuracy. In this paper, we propose an effective allograft bone selection method without using the contralateral bones. Firstly, the allograft bone is globally aligned to the recipient bone by surface registration. Then, the global alignment is further refined through band registration. The band, defined as the recipient points within the lifted and lowered cutting planes, involves more of the local structure of the defect segment. Therefore, our method could achieve robust alignment and high registration accuracy of the allograft and recipient. Moreover, the existing contour method and surface method could be unified into one framework under our method by adjusting the lifting and lowering distances of the cutting planes. Finally, our method has been validated on a database of distal femurs. The experimental results indicate that our method outperforms the surface method and contour method.

  4. An introduction to automatic radioactive sample counters

    International Nuclear Information System (INIS)

    1980-01-01

    The subject is covered in chapters, entitled; the detection of radiation in sample counters; nucleonic equipment; liquid scintillation counting; basic features of automatic sample counters; statistics of counting; data analysis; purchase, installation, calibration and maintenance of automatic sample counters. (U.K.)

  5. Automatic Cobb Angle Determination From Radiographic Images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H. F.; Veldhuizen, Albert G.; van Ooijen, Peter M. A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.

    2013-01-01

    Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.

  6. Observed use of automatic seat belts in 1987 cars.

    Science.gov (United States)

    Williams, A F; Wells, J K; Lund, A K; Teed, N

    1989-10-01

    Usage of the automatic belt systems supplied by six large-volume automobile manufacturers to meet the federal requirements for automatic restraints was observed in suburban Washington, D.C., Chicago, Los Angeles, and Philadelphia. The different belt systems studied were: Ford and Toyota (motorized, nondetachable automatic shoulder belt), Nissan (motorized, detachable shoulder belt), VW and Chrysler (nonmotorized, detachable shoulder belt), and GM (nonmotorized detachable lap and shoulder belt). Use of automatic belts was significantly greater than manual belt use in otherwise comparable late-model cars for all manufacturers except Chrysler; in Chrysler cars, automatic belt use was significantly lower than manual belt use. The automatic shoulder belts provided by Ford, Nissan, Toyota, and VW increased use rates to about 90%. Because use rates were lower in Ford cars with manual belts, their increase was greater. GM cars had the smallest increase in use rates; however, lap belt use was highest in GM cars. The other manufacturers supply knee bolsters to supplement shoulder belt protection; all--except VW--also provide manual lap belts, which were used by about half of those who used the automatic shoulder belt. The results indicate that some manufacturers have been more successful than others in providing automatic belt systems that result in high use that, in turn, will mean fewer deaths and injuries in those cars.

  7. The Effectiveness of Classroom Capture Technology

    Science.gov (United States)

    Ford, Maire B.; Burns, Colleen E.; Mitch, Nathan; Gomez, Melissa M.

    2012-01-01

    The use of classroom capture systems (systems that capture audio and video footage of a lecture and attempt to replicate a classroom experience) is becoming increasingly popular at the university level. However, research on the effectiveness of classroom capture systems in the university classroom has been limited due to the recent development and…

  8. Contingent orienting or contingent capture: a size singleton matching the target-distractor size relation cannot capture attention.

    Science.gov (United States)

    Du, Feng; Yin, Yue; Qi, Yue; Zhang, Kan

    2014-08-01

    In the present study, we examined whether a peripheral size-singleton distractor that matches the target-distractor size relation can capture attention and disrupt central target identification. Three experiments consistently showed that a size singleton that matches the target-distractor size relation cannot capture attention when it appears outside of the attentional window, even though the same size singleton produces a cuing effect. In addition, a color singleton that matches the target color, instead of a size singleton that matches the target-distractor size relation, captures attention when it is outside of the attentional window. Thus, a size-relation-matched distractor is much weaker than a color-matched distractor in capturing attention and cannot capture attention when the distractor appears outside of the attentional window.

  9. Multimodal Translation System Using Texture-Mapped Lip-Sync Images for Video Mail and Automatic Dubbing Applications

    Directory of Open Access Journals (Sweden)

    Nakamura Satoshi

    2004-01-01

    Full Text Available We introduce a multimodal English-to-Japanese and Japanese-to-English translation system that also translates the speaker's speech motion by synchronizing it to the translated speech. This system also introduces both a face synthesis technique that can generate any viseme lip shape and a face tracking technique that can estimate the original position and rotation of a speaker's face in an image sequence. To retain the speaker's facial expression, we substitute only the speech organ's image with the synthesized one, which is made by a 3D wire-frame model that is adaptable to any speaker. Our approach provides translated image synthesis with an extremely small database. The tracking motion of the face from a video image is performed by template matching. In this system, the translation and rotation of the face are detected by using a 3D personal face model whose texture is captured from a video frame. We also propose a method to customize the personal face model by using our GUI tool. By combining these techniques and the translated voice synthesis technique, an automatic multimodal translation can be achieved that is suitable for video mail or automatic dubbing systems into other languages.

  10. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics on the automatic beta counting system, developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier, then is coupled to an amplifier. Amplifier output is discriminated and is the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data plus other information is punched out on a data card. The next sample to be counted is automatically selected. The beta counter uses the same electronics as the prior count did, the only difference being the sample identification number and sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card

  11. Capture cross sections on unstable nuclei

    Science.gov (United States)

    Tonchev, A. P.; Escher, J. E.; Scielzo, N.; Bedrossian, P.; Ilieva, R. S.; Humby, P.; Cooper, N.; Goddard, P. M.; Werner, V.; Tornow, W.; Rusev, G.; Kelley, J. H.; Pietralla, N.; Scheck, M.; Savran, D.; Löher, B.; Yates, S. W.; Crider, B. P.; Peters, E. E.; Tsoneva, N.; Goriely, S.

    2017-09-01

    Accurate neutron-capture cross sections on unstable nuclei near the line of beta stability are crucial for understanding the s-process nucleosynthesis. However, neutron-capture cross sections for short-lived radionuclides are difficult to measure due to the fact that the measurements require both highly radioactive samples and intense neutron sources. Essential ingredients for describing the γ decays following neutron capture are the γ-ray strength function and level densities. We will compare different indirect approaches for obtaining the most relevant observables that can constrain Hauser-Feshbach statistical-model calculations of capture cross sections. Specifically, we will consider photon scattering using monoenergetic and 100% linearly polarized photon beams. Challenges that exist on the path to obtaining neutron-capture cross sections for reactions on isotopes near and far from stability will be discussed.

  12. Capture cross sections on unstable nuclei

    Directory of Open Access Journals (Sweden)

    Tonchev A.P.

    2017-01-01

    Full Text Available Accurate neutron-capture cross sections on unstable nuclei near the line of beta stability are crucial for understanding the s-process nucleosynthesis. However, neutron-capture cross sections for short-lived radionuclides are difficult to measure due to the fact that the measurements require both highly radioactive samples and intense neutron sources. Essential ingredients for describing the γ decays following neutron capture are the γ-ray strength function and level densities. We will compare different indirect approaches for obtaining the most relevant observables that can constrain Hauser-Feshbach statistical-model calculations of capture cross sections. Specifically, we will consider photon scattering using monoenergetic and 100% linearly polarized photon beams. Challenges that exist on the path to obtaining neutron-capture cross sections for reactions on isotopes near and far from stability will be discussed.

  13. Automatic, semi-automatic and manual validation of urban drainage data.

    Science.gov (United States)

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

    Advances in sensor technology and the possibility of automated long distance data transmission have made continuous measurements the preferable way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect and mark the wrong data, remove them and replace them with interpolated data. In general, the first step in detecting the wrong (anomalous) data is called data quality assessment or data validation. Data validation consists of three parts: data preparation, validation score generation and score interpretation. This paper will present the overall framework for the data quality improvement system, suitable for automatic, semi-automatic or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, the score interpretation, needs to be further investigated on the developed system.
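
    As a hedged illustration of the "validation scores" step described above, the sketch below assigns each sample two simple scores, a physical range check and a rolling z-score; the thresholds, window length and synthetic series are assumptions for illustration, not the validators used on the Belgrade data.

```python
# Hedged sketch of per-sample "validation scores": a physical range check and a rolling
# z-score check, whose combination a later interpretation step would turn into accept/reject
# decisions. Thresholds, window length and the synthetic series are illustrative only.
import numpy as np

def validation_scores(values, lo=0.0, hi=5.0, window=20, z_max=4.0):
    values = np.asarray(values, dtype=float)
    range_ok = (values >= lo) & (values <= hi)            # score 1: physically plausible range
    z_ok = np.ones_like(values, dtype=bool)               # score 2: consistency with recent history
    for i in range(window, len(values)):
        w = values[i - window:i]
        sd = w.std()
        if sd > 0 and abs(values[i] - w.mean()) / sd > z_max:
            z_ok[i] = False
    return range_ok, z_ok

# Example: a water-level series with one spike and one impossible negative reading.
rng = np.random.default_rng(1)
series = np.concatenate([1.2 + rng.normal(0, 0.02, 50), [9.0, -0.3, 1.2]])
range_ok, z_ok = validation_scores(series)
print("suspect sample indices:", np.flatnonzero(~(range_ok & z_ok)))
```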

  15. Migration check tool: automatic plan verification following treatment management systems upgrade and database migration.

    Science.gov (United States)

    Hadley, Scott W; White, Dale; Chen, Xiaoping; Moran, Jean M; Keranen, Wayne M

    2013-11-04

    Software upgrades of the treatment management system (TMS) sometimes require that all data be migrated from one version of the database to another. It is necessary to verify that the data are correctly migrated to assure patient safety. It is impossible to verify by hand the thousands of parameters that go into each patient's radiation therapy treatment plan. Repeating pretreatment QA is costly, time-consuming, and may be inadequate in detecting errors that are introduced during the migration. In this work we investigate the use of an automatic Plan Comparison Tool to verify that plan data have been correctly migrated to a new version of a TMS database from an older version. We developed software to query and compare treatment plans between different versions of the TMS. The same plan in the two TMS versions is translated into an XML schema. A plan comparison module takes the two XML schemas as input and reports any differences in parameters between the two versions of the same plan by applying a schema mapping. A console application is used to query the database to obtain a list of active or in-preparation plans to be tested. It then runs in batch mode to compare all the plans, and a report of success or failure of the comparison is saved for review. This software tool was used as part of the software upgrade and database migration from Varian's Aria 8.9 to Aria 11 TMS. Parameters were compared for 358 treatment plans in 89 minutes. This direct comparison of all plan parameters in the migrated TMS against the previous TMS surpasses current QA methods that relied on repeating pretreatment QA measurements or labor-intensive and fallible hand comparisons.
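
    The authors' tool and schema mapping are not described in detail in the abstract. The sketch below only illustrates the general idea under simplifying assumptions: each plan's XML export is flattened into path/value pairs and the two versions are compared key by key; the XML layout and function names are hypothetical.

        import xml.etree.ElementTree as ET

        def plan_parameters(xml_text):
            """Flatten a plan's XML export into {element-path: text} pairs (hypothetical layout)."""
            params = {}
            def walk(node, path):
                key = f"{path}/{node.tag}"
                if node.text and node.text.strip():
                    params[key] = node.text.strip()
                for i, child in enumerate(node):
                    walk(child, f"{key}[{i}]")
            walk(ET.fromstring(xml_text), "")
            return params

        def compare_plans(old_xml, new_xml):
            """Report every parameter that differs, or exists in only one plan version."""
            old, new = plan_parameters(old_xml), plan_parameters(new_xml)
            return {k: (old.get(k), new.get(k))
                    for k in sorted(set(old) | set(new)) if old.get(k) != new.get(k)}

    A batch driver could then run compare_plans over every active or in-preparation plan and write the differences to a review report, mirroring the workflow described above.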

  16. Automatic Fault Recognition of Photovoltaic Modules Based on Statistical Analysis of Uav Thermography

    Science.gov (United States)

    Kim, D.; Youn, J.; Kim, C.

    2017-08-01

    As a malfunctioning PV (Photovoltaic) cell has a higher temperature than adjacent normal cells, we can detect it easily with a thermal infrared sensor. However, it is time-consuming to inspect large-scale PV power plants with a hand-held thermal infrared sensor. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm uses statistical analysis of the thermal intensity (surface temperature) characteristics of each PV module, taking the mean intensity and standard deviation of each panel as parameters for fault diagnosis. One of the characteristics of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels is not applicable in the fault detection algorithm. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules within each individual array automatically. The performance of the proposed algorithm was tested on three sample images; this verified a detection accuracy for defective panels of 97 % or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection compared to a global detection rule.
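
    A minimal sketch of such a local detection rule, assuming per-module mean intensities have already been extracted from the thermal image of one array; the sensitivity factor k and the example values are illustrative, not taken from the paper.

        import numpy as np

        def defective_modules(panel_means, k=2.0):
            """Flag modules whose mean thermal intensity stands out within a single array.

            panel_means: 1-D array with the mean intensity of each module in one PV array;
            k: sensitivity factor (assumed), so the threshold is array mean + k * array std.
            """
            mu, sigma = panel_means.mean(), panel_means.std()
            return np.flatnonzero(panel_means > mu + k * sigma)

        # example: the fourth module is noticeably hotter than the rest of its array
        print(defective_modules(np.array([31.2, 30.8, 31.0, 36.5, 30.9, 31.1])))

    Because the statistics are computed per array, the rule stays valid even when flight altitude shifts the absolute temperature level between images.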

  17. AUTOMATIC FAULT RECOGNITION OF PHOTOVOLTAIC MODULES BASED ON STATISTICAL ANALYSIS OF UAV THERMOGRAPHY

    Directory of Open Access Journals (Sweden)

    D. Kim

    2017-08-01

    Full Text Available As a malfunctioning PV (Photovoltaic) cell has a higher temperature than adjacent normal cells, we can detect it easily with a thermal infrared sensor. However, it is time-consuming to inspect large-scale PV power plants with a hand-held thermal infrared sensor. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm uses statistical analysis of the thermal intensity (surface temperature) characteristics of each PV module, taking the mean intensity and standard deviation of each panel as parameters for fault diagnosis. One of the characteristics of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels is not applicable in the fault detection algorithm. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules within each individual array automatically. The performance of the proposed algorithm was tested on three sample images; this verified a detection accuracy for defective panels of 97 % or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection compared to a global detection rule.

  18. Automatic Encoding and Language Detection in the GSDL

    Directory of Open Access Journals (Sweden)

    Otakar Pinkas

    2014-10-01

    Full Text Available Automatic detection of the encoding and language of a text is part of the Greenstone Digital Library Software (GSDL) for building and distributing digital collections. It is developed by the University of Waikato (New Zealand) in cooperation with UNESCO. Automatic encoding and language detection for Slavic languages is difficult and sometimes fails; the aim is to detect the cases of failure. The automatic detection in the GSDL is based on the n-grams method. The most frequent n-grams for Czech are presented. The whole process of automatic detection in the GSDL is described. The input documents to the test collections are plain texts encoded in ISO-8859-1, ISO-8859-2 and Windows-1250. We manually evaluated the quality of the automatic detection. The causes of errors include the predominance of an improper language model and incorrect switching to Windows-1250. We carried out further tests on documents that were more complex.
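
    GSDL's exact implementation is not given in the abstract; as an illustration of the n-grams method it refers to, the sketch below builds ranked character n-gram profiles and compares a document against stored language profiles with a rank-difference ("out of place") distance. Encoding detection would work analogously by decoding the bytes with each candidate encoding first.

        from collections import Counter

        def ngram_profile(text, n=3, top=300):
            """Ranked list of a text's most frequent character n-grams."""
            grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
            return [g for g, _ in grams.most_common(top)]

        def out_of_place(doc_profile, lang_profile):
            """Rank-difference distance between a document profile and a language profile."""
            ranks = {g: i for i, g in enumerate(lang_profile)}
            penalty = len(lang_profile)
            return sum(abs(ranks[g] - i) if g in ranks else penalty
                       for i, g in enumerate(doc_profile))

        def detect(text, language_profiles):
            """Pick the language whose stored profile is closest to the document's profile."""
            doc = ngram_profile(text)
            return min(language_profiles,
                       key=lambda lang: out_of_place(doc, language_profiles[lang]))

    The failure modes discussed above correspond to a dominant but wrong language profile winning this minimum, which is why manual evaluation of the detected cases remains necessary.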

  19. Automatic computation of cross sections in HEP. Status of GRACE system

    International Nuclear Information System (INIS)

    Yuasa, F.; Fujimoto, J.; Ishikawa, T.

    2000-01-01

    For the study of reactions in High Energy Physics (HEP), automatic computation systems have been developed and are widely used nowadays. GRACE is one such system and it has achieved much success in analyzing experimental data. Since the value of a cross section is obtained by calculating hundreds of Feynman diagrams, the large-scale calculation has to be managed, so that effective symbolic manipulation and the treatment of singularities in the numerical integration are required. The talk will describe the software design of the GRACE system and the computational techniques used in GRACE. (author)

  20. 12th Portuguese Conference on Automatic Control

    CERN Document Server

    Soares, Filomena; Moreira, António

    2017-01-01

    The biennial CONTROLO conferences are the main events promoted by the Portuguese Association for Automatic Control – APCA, national member organization of the International Federation of Automatic Control – IFAC. CONTROLO 2016 – the 12th Portuguese Conference on Automatic Control, Guimarães, Portugal, September 14th to 16th – was organized by Algoritmi, School of Engineering, University of Minho, in partnership with INESC TEC. The seventy-five papers published in this volume cover a wide range of topics. Thirty-one of them, of a more theoretical nature, are distributed among the first five parts: Control Theory; Optimal and Predictive Control; Fuzzy, Neural and Genetic Control; Modeling and Identification; Sensing and Estimation. The papers go from cutting-edge theoretical research to innovative control applications and show expressively how Automatic Control can be used to increase the well-being of people.

  1. A Wireless Sensor Network-Based Ubiquitous Paprika Growth Management System

    Directory of Open Access Journals (Sweden)

    Jeonghwan Hwang

    2010-12-01

    Full Text Available Wireless Sensor Network (WSN technology can facilitate advances in productivity, safety and human quality of life through its applications in various industries. In particular, the application of WSN technology to the agricultural area, which is labor-intensive compared to other industries, and in addition is typically lacking in IT technology applications, adds value and can increase the agricultural productivity. This study attempts to establish a ubiquitous agricultural environment and improve the productivity of farms that grow paprika by suggesting a ‘Ubiquitous Paprika Greenhouse Management System’ using WSN technology. The proposed system can collect and monitor information related to the growth environment of crops outside and inside paprika greenhouses by installing WSN sensors and monitoring images captured by CCTV cameras. In addition, the system provides a paprika greenhouse environment control facility for manual and automatic control from a distance, improves the convenience and productivity of users, and facilitates an optimized environment to grow paprika based on the growth environment data acquired by operating the system.

  2. Automatic Web-Based, Radio-Network System To Monitor And Control Equipment For Investigating Gas Flux At Water - Air Interfaces

    Science.gov (United States)

    Duc, N. T.; Silverstein, S.; Wik, M.; Beckman, P.; Crill, P. M.; Bastviken, D.; Varner, R. K.

    2015-12-01

    Aquatic ecosystems are major sources of greenhouse gases (GHG). Robust measurements of natural GHG emissions are vital for evaluating regional to global carbon budgets and for assessing climate feedbacks on natural emissions to improve climate models. Diffusive and ebullitive (bubble) transport are two major pathways of gas release from surface waters. To capture the high temporal variability of these fluxes in a well-defined footprint, we designed and built an inexpensive automatic device that combines an easily mobile diffusive flux chamber and a bubble counter in one unit. Besides automatically collecting gas samples for subsequent laboratory analyses, the device uses a low-cost CO2 sensor (SenseAir, Sweden) and a CH4 sensor (Figaro, Japan) to measure GHG fluxes. To measure the spatial variability of emissions, each device is equipped with an XBee module to form a local DigiMesh radio network for time synchronization and data readout at a server-controller station on the lakeshore. The software of this server-controller runs on a low-cost Raspberry Pi computer with a 3G connection for remote monitoring and control from anywhere in the world. In field studies in Abisko, Sweden in summer 2014 and 2015, the system produced measurements of GHG fluxes comparable to manual methods. In addition, the deployments have shown the advantage of a low-cost automatic network system for studying GHG fluxes on lakes in remote locations.
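
    The abstract does not specify how readings reach the shore-side server; purely as a hedged sketch, the snippet below shows a generic server-side logging loop that timestamps text lines arriving from a radio coordinator over a serial port. The port name, baud rate and message format are assumptions, not details of the actual system.

        import csv
        import time
        import serial  # pyserial

        def log_readings(port="/dev/ttyUSB0", outfile="ghg_log.csv", baud=9600):
            """Append time-stamped sensor lines arriving from the radio coordinator to a CSV file."""
            with serial.Serial(port, baud, timeout=2) as link, \
                 open(outfile, "a", newline="") as f:
                writer = csv.writer(f)
                while True:
                    line = link.readline().decode("ascii", errors="ignore").strip()
                    if line:  # e.g. "node03,co2=412,ch4=2.1" (hypothetical message format)
                        writer.writerow([time.time(), line])
                        f.flush()

    Such a loop could run continuously on the Raspberry Pi, with the resulting file exposed over the 3G link for remote monitoring.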

  3. Capture and playback synchronization in video conferencing

    Science.gov (United States)

    Shae, Zon-Yin; Chang, Pao-Chi; Chen, Mon-Song

    1995-03-01

    Packet-switching based video conferencing has emerged as one of the most important multimedia applications. Lip synchronization can be disrupted in the packet network as a result of the network properties: packet delay jitter at the capture end, network delay jitter, packet loss, packets arriving out of sequence, local clock mismatch, and video playback overlay with the graphic system. The synchronization problem becomes more demanding with the real-time and multiparty requirements of video conferencing applications. Some of the above-mentioned problems can be solved in more advanced network architectures such as ATM. This paper presents some solutions that can be useful at end-station terminals in the massively deployed packet-switching networks of today. The playback scheme in the end station consists of two units: a compression-domain buffer management unit and a pixel-domain buffer management unit. The pixel-domain buffer management unit is responsible for removing the annoying frame-shearing effect in the display. The compression-domain buffer management unit is responsible for parsing the incoming packets to identify the complete data blocks in the compressed data stream that can be decoded independently. The compression-domain buffer management unit is also responsible for concealing the effects of clock mismatch, packet loss, out-of-sequence arrival and network jitter, and for maintaining lip synchronization. This scheme can also be applied to the multiparty teleconferencing environment. Some of the schemes presented in this paper have been implemented in the Multiparty Multimedia Teleconferencing (MMT) system prototype at the IBM Watson Research Center.
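
    The compression-domain unit described above does more than reorder packets (it also parses for independently decodable blocks and conceals clock mismatch), so the following is only a narrow, assumption-laden sketch of the reordering part of such a playout buffer; the class and method names are illustrative.

        import heapq

        class PlayoutBuffer:
            """Reorder incoming packets by sequence number and release them to the decoder in order."""

            def __init__(self, start_seq=0):
                self._heap = []          # (sequence number, payload)
                self._next = start_seq   # next sequence number the decoder expects

            def push(self, seq, payload):
                heapq.heappush(self._heap, (seq, payload))

            def pop_ready(self):
                """Yield payloads in order, stopping at the first gap left by a late or lost packet."""
                while self._heap and self._heap[0][0] == self._next:
                    _, payload = heapq.heappop(self._heap)
                    self._next += 1
                    yield payload

    A gap that persists beyond the playout deadline would then trigger the concealment logic rather than stalling playback indefinitely.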

  4. Some experimental results for an automatic helium liquefier

    International Nuclear Information System (INIS)

    Watanabe, T.; Kudo, T.; Kuraoka, Y.; Sakura, K.; Tsuruga, H.; Watanabe, T.

    1984-01-01

    This chapter describes the testing of an automatic cooldown system. The liquefying machine examined is a CTi Model 1400. The automatic helium gas liquefying system is operated by using sequence control with a programmable controller. The automatic mode is carried out by operation of two compressors. The monitoring system consists of 41 remote sensors. Liquid level is measured by a superconducting level meter. The J-T valve and return valve, which require precise control, are operated by pulse motors. The advantages of the automatic cooldown system are reduced operator manpower; temperatures and pressures are changed smoothly, so that the flow chart of automation is simple; and the system makes continuous liquefier operation possible.

  5. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for the automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.
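
    The paper's own search method is not described in the abstract; as a minimal illustration of what "generating finite models of a set of equations" means, the sketch below brute-forces all binary operation tables on a small carrier set and keeps those satisfying the associativity equation. Real model generators prune this search space heavily; exhaustive enumeration is only feasible for very small sizes.

        from itertools import product

        def associative_tables(n):
            """Enumerate all binary operation tables on {0, ..., n-1} that satisfy associativity."""
            elems = range(n)
            models = []
            for flat in product(elems, repeat=n * n):
                op = [list(flat[i * n:(i + 1) * n]) for i in range(n)]
                if all(op[op[a][b]][c] == op[a][op[b][c]]
                       for a in elems for b in elems for c in elems):
                    models.append(op)
            return models

        print(len(associative_tables(2)))   # all 2-element models of the associativity equation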

  6. Grinding Parts For Automatic Welding

    Science.gov (United States)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  7. 46 CFR 171.118 - Automatic ventilators and side ports.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Automatic ventilators and side ports. 171.118 Section 171.118 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SUBDIVISION AND STABILITY... Bulkhead or Weather Deck § 171.118 Automatic ventilators and side ports. (a) An automatic ventilator must...

  8. 30 CFR 75.1404 - Automatic brakes; speed reduction gear.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic brakes; speed reduction gear. 75.1404... Automatic brakes; speed reduction gear. [Statutory Provisions] Each locomotive and haulage car used in an... permit automatic brakes, locomotives and haulage cars shall be subject to speed reduction gear, or other...

  9. Automatic Error Recovery in Robot Assembly Operations Using Reverse Execution

    DEFF Research Database (Denmark)

    Laursen, Johan Sund; Schultz, Ulrik Pagh; Ellekilde, Lars-Peter

    2015-01-01

    , in particular for small-batch productions. As an alternative, we propose a system for automatically handling certain classes of errors instead of preventing them. Specifically, we show that many operations can be automatically reversed. Errors can be handled through automatic reverse execution of the control program to a safe point, from which forward execution can be resumed. This paper describes the principles behind automatic reversal of robotic assembly operations, and experimentally demonstrates the use of a domain-specific language that supports automatic error handling through reverse execution. Our

  10. Usability and Acceptance of the Librarian Infobutton Tailoring Environment: An Open Access Online Knowledge Capture, Management, and Configuration Tool for OpenInfobutton.

    Science.gov (United States)

    Jing, Xia; Cimino, James J; Del Fiol, Guilherme

    2015-11-30

    The Librarian Infobutton Tailoring Environment (LITE) is a Web-based knowledge capture, management, and configuration tool with which users can build profiles used by OpenInfobutton, an open source infobutton manager, to provide electronic health record users with context-relevant links to online knowledge resources. We conducted a multipart evaluation study to explore users' attitudes and acceptance of LITE and to guide future development. The evaluation consisted of an initial online survey to all LITE users, followed by an observational study of a subset of users in which evaluators' sessions were recorded while they conducted assigned tasks. The observational study was followed by administration of a modified System Usability Scale (SUS) survey. Fourteen users responded to the survey and indicated good acceptance of LITE with feedback that was mostly positive. Six users participated in the observational study, demonstrating average task completion time of less than 6 minutes and an average SUS score of 72, which is considered good compared with other SUS scores. LITE can be used to fulfill its designated tasks quickly and successfully. Evaluators proposed suggestions for improvements in LITE functionality and user interface.

  11. Automatically measuring brain ventricular volume within PACS using artificial intelligence.

    Science.gov (United States)

    Yepes-Calderon, Fernando; Nelson, Marvin D; McComb, J Gordon

    2018-01-01

    The picture archiving and communications system (PACS) is currently the standard platform to manage medical images but lacks analytical capabilities. Staying within PACS, the authors have developed an automatic method to retrieve the medical data and access it at a voxel level, decrypted and uncompressed, which allows analytical capabilities while not perturbing the system's daily operation. Additionally, the strategy is secure and vendor independent. Cerebral ventricular volume is important for the diagnosis and treatment of many neurological disorders. A significant change in ventricular volume is readily recognized, but subtle changes, especially over longer periods of time, may be difficult to discern. Clinical imaging protocols and parameters are often varied, making it difficult to use a general solution with standard segmentation techniques. Presented is a segmentation strategy based on an algorithm that uses four features extracted from the medical images to create a statistical estimator capable of determining ventricular volume. When compared with manual segmentations, the correlation was 94%, and the approach holds promise for even better accuracy by incorporating the unlimited data available. The volume of any segmentable structure can be accurately determined utilizing the machine learning strategy presented, which runs fully automatically within the PACS.

  12. Shaping electromagnetic waves using software-automatically-designed metasurfaces.

    Science.gov (United States)

    Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie

    2017-06-15

    We present a fully digital procedure of designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g. two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of the macro coding units is formed by a discretely random arrangement of micro coding units. By combining an optimization algorithm and commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess constant phase difference for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units with a certain coding sequence. To experimentally verify the performance, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in an array to constitute a periodic coding metasurface to generate the required four-beam radiation with specific directions. Two complicated functional metasurfaces with circularly- and elliptically-shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing the excellent performance of the automatic designs by software. The proposed method provides a smart tool to realize various functional devices and systems automatically.
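
    The optimization of the digital unit patterns above relies on commercial electromagnetic software and is not reproduced here. Purely as a hedged illustration of why a coding sequence shapes the reflected field, the sketch below evaluates a scalar array factor for an ideal 1-bit sequence (phase 0 or pi per macro unit, assumed half-wavelength spacing) and finds the split-beam direction predicted by the grating equation.

        import numpy as np

        # 1-bit coding sequence "000111000111..." along one axis of the aperture; each
        # macro unit is assumed to reflect with phase 0 or pi and to sit half a wavelength apart.
        coding = np.array([0, 0, 0, 1, 1, 1] * 4)
        phase = coding * np.pi
        positions = np.arange(coding.size) * 0.5            # positions in wavelengths
        theta = np.radians(np.linspace(-90, 90, 1801))

        # scalar array factor: superpose the unit reflections with geometric + coding phases
        af = np.abs(np.exp(1j * (2 * np.pi * np.outer(np.sin(theta), positions) + phase)).sum(axis=1))
        print(round(np.degrees(theta[np.argmax(af)]), 1))   # ~ +/-19.5 deg: the two split beams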

  13. Theory of inelastic multiphonon scattering and carrier capture by defects in semiconductors: Application to capture cross sections

    Science.gov (United States)

    Barmparis, Georgios D.; Puzyrev, Yevgeniy S.; Zhang, X.-G.; Pantelides, Sokrates T.

    2015-12-01

    Inelastic scattering and carrier capture by defects in semiconductors are the primary causes of hot-electron-mediated degradation of power devices, which holds up their commercial development. At the same time, carrier capture is a major issue in the performance of solar cells and light-emitting diodes. A theory of nonradiative (multiphonon) inelastic scattering by defects, however, is nonexistent, while the theory for carrier capture by defects has had a long and arduous history. Here we report the construction of a comprehensive theory of inelastic scattering by defects, with carrier capture being a special case. We distinguish between capture under thermal equilibrium conditions and capture under nonequilibrium conditions, e.g., in the presence of an electrical current or hot carriers where carriers undergo scattering by defects and are described by a mean free path. In the thermal-equilibrium case, capture is mediated by a nonadiabatic perturbation Hamiltonian, originally identified by Huang and Rhys and by Kubo, which is equal to linear electron-phonon coupling to first order. In the nonequilibrium case, we demonstrate that the primary capture mechanism is within the Born-Oppenheimer approximation (adiabatic transitions), with coupling to the defect potential inducing Franck-Condon electronic transitions, followed by multiphonon dissipation of the transition energy, while the nonadiabatic terms are of secondary importance (they scale with the inverse of the mass of typical atoms in the defect complex). We report first-principles density-functional-theory calculations of the capture cross section for a prototype defect using the projector-augmented wave, which allows us to employ all-electron wave functions. We adopt a Monte Carlo scheme to sample multiphonon configurations and obtain converged results. The theory and the results represent a foundation upon which to build engineering-level models for hot-electron degradation of power devices and the performance

  14. Does imminent threat capture and hold attention?

    Science.gov (United States)

    Koster, Ernst H W; Crombez, Geert; Van Damme, Stefaan; Verschuere, Bruno; De Houwer, Jan

    2004-09-01

    According to models of attention and emotion, threat captures and holds attention. In behavioral tasks, robust evidence has been found for attentional holding but not for attentional capture by threat. An important explanation for the absence of attentional capture effects is that the visual stimuli used posed no genuine threat. The present study investigated whether visual cues that signal an aversive white noise can elicit attentional capture and holding effects. Cues presented in an attentional task were simultaneously provided with a threat value through an aversive conditioning procedure. Response latencies showed that threatening cues captured and held attention. These results support recent views on attention to threat, proposing that imminent threat captures attention in everyone. (c) 2004 APA, all rights reserved

  15. Automatic detection of subglacial lakes in radar sounder data acquired in Antarctica

    Science.gov (United States)

    Ilisei, Ana-Maria; Khodadadzadeh, Mahdi; Dalsasso, Emanuele; Bruzzone, Lorenzo

    2017-10-01

    Subglacial lakes decouple the ice sheet from the underlying bedrock, thus facilitating the sliding of the ice masses towards the borders of the continents and consequently raising the sea level. This has motivated increasing attention to the detection of subglacial lakes. So far, about 70% of the total number of subglacial lakes in Antarctica have been detected by analysing radargrams acquired by radar sounder (RS) instruments. Although the amount of radargrams is expected to increase drastically, from both airborne and possible future Earth observation RS missions, currently the main approach to the detection of subglacial lakes in radargrams is visual interpretation. This approach is subjective and extremely time consuming, and thus difficult to apply to a large number of radargrams. In order to address the limitations of visual interpretation and to assist glaciologists in better understanding the relationship between the subglacial environment and the climate system, in this paper we propose a technique for the automatic detection of subglacial lakes. The main contribution of the proposed technique is the extraction of features for discriminating between lake and non-lake basal interfaces. In particular, we propose the extraction of features that locally capture the topography of the basal interface, the shape and the correlation of the basal waveforms. Then, the extracted features are given as input to a supervised binary classifier based on Support Vector Machines to perform the automatic subglacial lake detection. The effectiveness of the proposed method is proven both quantitatively and qualitatively by applying it to a large dataset acquired in East Antarctica by the MultiChannel Coherent Radar Depth Sounder.
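
    The specific feature definitions and training data belong to the paper; the sketch below only shows the final supervised classification step, with placeholder features and labels standing in for quantities that would in practice be extracted from the radargrams and from visually interpreted reference lakes.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # One row per basal-interface segment; the three columns stand in for hypothetical
        # features of the kind described (local basal topography, waveform shape, waveform
        # correlation). Real values would come from the radar sounder data.
        rng = np.random.default_rng(0)
        X = rng.random((200, 3))
        y = (X[:, 2] > 0.7).astype(int)          # placeholder labels: 1 = lake, 0 = non-lake

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        clf.fit(X, y)
        print(clf.predict(X[:5]))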

  16. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.

  17. Recent development of capture of CO2

    CERN Document Server

    Chavez, Rosa Hilda

    2014-01-01

    "Recent Technologies in the capture of CO2" provides a comprehensive summary on the latest technologies available to minimize the emission of CO2 from large point sources like fossil-fuel power plants or industrial facilities. This ebook also covers various techniques that could be developed to reduce the amount of CO2 released into the atmosphere. The contents of this book include chapters on oxy-fuel combustion in fluidized beds, gas separation membrane used in post-combustion capture, minimizing energy consumption in CO2 capture processes through process integration, characterization and application of structured packing for CO2 capture, calcium looping technology for CO2 capture and many more. Recent Technologies in capture of CO2 is a valuable resource for graduate students, process engineers and administrative staff looking for real-case analysis of pilot plants. This eBook brings together the research results and professional experiences of the most renowned work groups in the CO2 capture field...

  18. Digital movie-based on automatic titrations.

    Science.gov (United States)

    Lima, Ricardo Alexandre C; Almeida, Luciano F; Lyra, Wellington S; Siqueira, Lucas A; Gaião, Edvaldo N; Paiva Junior, Sérgio S L; Lima, Rafaela L F C

    2016-01-15

    This study proposes the use of digital movies (DMs) in a flow-batch analyzer (FBA) to perform automatic, fast and accurate titrations. The term used for this process is "Digital movie-based on automatic titrations" (DMB-AT). A webcam records the DM during the addition of the titrant to the mixing chamber (MC). While the DM is recorded, it is decompiled into frames ordered sequentially at a constant rate of 26 frames per second (FPS). The first frame is used as a reference to define the region of interest (ROI) of 28×13 pixels and the R, G and B values, which are used to calculate the Hue (H) values for each frame. The Pearson's correlation coefficient (r) is calculated between the H values of the initial frame and each subsequent frame. The titration curves are plotted in real time using the r values and the opening time of the titrant valve. The end point is estimated by the second derivative method. Software written in C manages all analytical steps and data treatment in real time. The feasibility of the method was attested by application to acid/base test samples and edible oils. Results were compared with classical titration and did not present statistically significant differences when the paired t-test at the 95% confidence level was applied. The proposed method is able to process about 117-128 samples per hour for the test and edible oil samples, respectively, and its precision was confirmed by overall relative standard deviation (RSD) values, always less than 1.0%. Copyright © 2015 Elsevier B.V. All rights reserved.
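
    The frame decompiling, valve control and the actual C implementation are not reproduced here; the sketch below only mirrors the described data treatment in Python (per-frame hue from the ROI, Pearson r against the first frame, end point from the second derivative). Function names and signatures are illustrative.

        import numpy as np
        from matplotlib.colors import rgb_to_hsv

        def hue_values(roi):
            """Per-pixel hue of an RGB region of interest with channel values in 0-255."""
            return rgb_to_hsv(roi / 255.0)[..., 0].ravel()

        def titration_curve(frames, titrant_volumes):
            """Pearson r between the hue map of the first frame and every subsequent frame."""
            ref = hue_values(frames[0])
            r = np.array([np.corrcoef(ref, hue_values(f))[0, 1] for f in frames])
            return np.asarray(titrant_volumes, dtype=float), r

        def end_point(volumes, r):
            """Locate the end point at the extremum of the second derivative of the curve."""
            second = np.gradient(np.gradient(r, volumes), volumes)
            return volumes[np.argmax(np.abs(second))]

    Plotting r against titrant volume (or valve opening time) reproduces the titration curve described above, with the sharpest change marking the end point.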

  19. Automatic shadowing device for electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, F W; Bogitch, S

    1960-01-01

    For the past ten years in the laboratory of the Department of Nuclear Medicine and Radiation Biology at the University of California, and before that at Rochester, New York, every evaporation was done with the aid of an automatic shadowing device. For several months the automatic shadowing device has been available at the Atomic Bomb Casualty Commission (ABCC) Hiroshima, Japan with the modifications described. 1 reference.

  20. Enzymes in CO2 Capture

    DEFF Research Database (Denmark)

    Fosbøl, Philip Loldrup; Gladis, Arne; Thomsen, Kaj

    The enzyme Carbonic Anhydrase (CA) can accelerate the absorption rate of CO2 into aqueous solutions by several-fold. It exists in almost all living organisms and catalyses different important processes like CO2 transport, respiration and the acid-base balance. A new technology in the field of carbon capture is the application of enzymes for acceleration of typically slow ternary amines or inorganic carbonates. There is a hidden potential to revive currently infeasible amines which have an interestingly low energy consumption for regeneration but too slow kinetics for viable CO2 capture. The aim of this work is to discuss the measurements of kinetic properties for CA-promoted CO2 capture solvent systems. The development of a rate-based model for enzymes will be discussed, showing the principles of implementation and the results of using a well-known ternary amine for CO2 capture. Conclusions

  1. Partial radiative capture of resonance neutrons; Capture radiative partielle des neutrons de resonance

    Energy Technology Data Exchange (ETDEWEB)

    Samour, C [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1969-07-01

    The radiative capture of resonance neutrons has been studied near the Saclay linac between 0.5 and 700 eV with time-of-flight method and a Ge(Li) detector. {sup 195}Pt + n and {sup 183}W + n allow the study of the distribution of partial radiative widths and their eventual correlation and also the variation of < {gamma}{sub {gamma}{sub i}} > with E{sub {gamma}}. The mean values of Ml and El transition intensities are compared in several tin isotopes. Interference effects, either between resonances or between direct capture and resonant capture are found in {sup 195}Pt + n, {sup 197}Au + n and {sup 59}Co + n. The excited level schemes of a great deal of nuclei are obtained and compared with theoretical predictions. This study has been completed by an analysis of thermal spectrum. (author) [French] La capture radiative des neutrons de resonance a ete etudiee pres de l'accelerateur lineaire de Saclay entre 0,5 et 700 eV a l'aide de la methode du temps-de-vol et d'un detecteur Ge(Li). Les noyaux {sup 195}Pt + n et {sup 183}W + n permettent l'analyse de la distribution de resonance en resonance des largeurs radiatives partielles {gamma}{sub {gamma}{sub i}} et de leur eventuelle correlation, ainsi que l'etude de la variation de < {gamma}{sub {gamma}{sub i}} > en fonction de E{sub {gamma}}. Les intensites moyennes des transitions Ml et El sont comparees pour quelques isotopes de l'etain. Des effets d'interference, soit entre resonances, soit entre capture directe et capture resonnante sont mis en evidence dans {sup 195}Pt + n, {sup 197}Au + n et {sup 59}Co + n. Enfin les schemas des etats excites d'un grand nombre de noyaux sont obtenus et compares avec les predictions theoriques. Cette etude a ete completee par une analyse des spectres thermiques. (auteur)

  3. A synthesized mating pheromone component increases adult sea lamprey (Petromyzon marinus) trap capture in management scenarios

    Science.gov (United States)

    Johnson, Nicholas S.; Siefkes, Michael J.; Wagner, C. Michael; Dawson, Heather; Wang, Huiyong; Steeves, Todd; Twohey, Michael; Li, Weiming

    2013-01-01

    Application of chemical cues to manipulate adult sea lamprey (Petromyzon marinus) behavior is among the options considered for new sea lamprey control techniques in the Laurentian Great Lakes. A male mating pheromone component, 7α,12α,24-trihydroxy-3-one-5α-cholan-24-sulfate (3kPZS), lures ovulated female sea lamprey upstream into baited traps in experimental contexts with no odorant competition. A critical knowledge gap is whether this single pheromone component influences adult sea lamprey behavior in management contexts containing free-ranging sea lampreys. A solution of 3kPZS to reach a final in-stream concentration of 10⁻¹² mol·L⁻¹ was applied to eight Michigan streams at existing sea lamprey traps over 3 years, and catch rates were compared between paired 3kPZS-baited and unbaited traps. 3kPZS-baited traps captured significantly more sexually immature and mature sea lampreys, and overall yearly trapping efficiency within a stream averaged 10% higher during years when 3kPZS was applied. Video analysis of a trap funnel showed that the likelihood of sea lamprey trap entry after trap encounter was higher when the trap was 3kPZS baited. Our approach serves as a model for the development of similar control tools for sea lamprey and other aquatic invaders.

  4. Development of an automatic scaler

    International Nuclear Information System (INIS)

    He Yuehong

    2009-04-01

    A self-designed automatic scaler is introduced. A microcontroller LPC936 is used as the master chip in the scaler. A counter integrated with the microcontroller is configured to operate as an external pulse counter. The software employed in the scaler is based on an embedded real-time operating system kernel named Small RTOS. Data storage, calculation and some other functions are also provided. The scaler is designed as a low-cost, low-power-consumption solution. By now, the automatic scaler has been applied in a surface contamination instrument. (authors)

  5. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  6. A Unification of Inheritance and Automatic Program Specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2004-01-01

    , inheritance is used to control the automatic application of program specialization to class members during compilation to obtain an efficient implementation. This paper presents the language JUST, which integrates object-oriented concepts, block structure, and techniques from automatic program specialization......The object-oriented style of programming facilitates program adaptation and enhances program genericness, but at the expense of efficiency. Automatic program specialization can be used to generate specialized, efficient implementations for specific scenarios, but requires the program...... to be structured appropriately for specialization and is yet another new concept for the programmer to understand and apply. We have unified automatic program specialization and inheritance into a single concept, and implemented this approach in a modified version of Java named JUST. When programming in JUST...

  7. Electron capture and stellar collapse

    International Nuclear Information System (INIS)

    Chung, K.C.

    1979-01-01

    In order to investigate the role of electron capture in the phenomenon of pre-supernova gravitational collapse, a hydrodynamic calculation was carried out, coupling the capture, decay and nuclear reaction equation systems. A simplified stellar model (homogeneous model) was adopted, using the Fermi ideal gas approximation for the sea of free electrons and neutrons. The non-simplified treatment from quasi-static evolution to collapse is presented. The capture and beta decay rates, as well as delayed neutron emission, were calculated by the crude theory of beta decay, while the other reaction rates were determined by the usual theories. The preliminary results are presented. (M.C.K.) [pt

  8. Proton capture by magnetic monopoles

    International Nuclear Information System (INIS)

    Olaussen, K.; Olsen, H.A.; Oeverboe, I.; Osland, P.

    1983-09-01

    In the Kazama-Yang approximation, the lowest monopole-proton bound states have binding energies of 938 MeV, 263 keV, 105 eV, and 0.04 eV. The cross section for radiative capture to these states is, for velocities β = 10⁻⁵ - 10⁻³, found to be of the order of 10⁻²⁸ - 10⁻²⁶ cm². For the state that has a binding energy of 263 keV, the capture length in water is 171 × (β/10⁻⁴)^0.48 m. Observation of photons from the capture process would indicate the presence of monopoles. (orig.)

  9. CHAOTIC CAPTURE OF NEPTUNE TROJANS

    International Nuclear Information System (INIS)

    Nesvorny, David; Vokrouhlicky, David

    2009-01-01

    Neptune Trojans (NTs) are swarms of outer solar system objects that lead/trail planet Neptune during its revolutions around the Sun. Observations indicate that NTs form a thick cloud of objects with a population perhaps ∼10 times more numerous than that of Jupiter Trojans and orbital inclinations reaching ∼25 deg. The high inclinations of NTs are indicative of capture instead of in situ formation. Here we study a model in which NTs were captured by Neptune during planetary migration when secondary resonances associated with the mean-motion commensurabilities between Uranus and Neptune swept over Neptune's Lagrangian points. This process, known as chaotic capture, is similar to that previously proposed to explain the origin of Jupiter's Trojans. We show that chaotic capture of planetesimals from an ∼35 Earth-mass planetesimal disk can produce a population of NTs that is at least comparable in number to that inferred from current observations. The large orbital inclinations of NTs are a natural outcome of chaotic capture. To obtain the ∼4:1 ratio between high- and low-inclination populations suggested by observations, planetary migration into a dynamically excited planetesimal disk may be required. The required stirring could have been induced by Pluto-sized and larger objects that have formed in the disk.

  10. Equipment for fully automatic radiographic pipe inspection

    International Nuclear Information System (INIS)

    Basler, G.; Sperl, H.; Weinschenk, K.

    1977-01-01

    The patent describes a device for fully automatic radiographic testing of large pipes with longitudinal welds. Furthermore the invention enables automatic marking of films in radiographic inspection with regard to a ticketing of the test piece and of that part of it where testing took place. (RW) [de

  11. Metabolic changes in occipital lobe epilepsy with automatisms.

    Science.gov (United States)

    Wong, Chong H; Mohamed, Armin; Wen, Lingfeng; Eberl, Stefan; Somerville, Ernest; Fulham, Michael; Bleasel, Andrew F

    2014-01-01

    Some studies suggest that the pattern of glucose hypometabolism relates not only to the ictal-onset zone but also reflects seizure propagation. We investigated metabolic changes in patients with occipital lobe epilepsy (OLE) that may reflect propagation of ictal discharge during seizures with automatisms. Fifteen patients who had undergone epilepsy surgery for intractable OLE and had undergone interictal Fluorine-18-fluorodeoxyglucose positron-emission tomography ((18)F-FDG-PET) between 1994 and 2004 were divided into two groups (with and without automatisms during seizure). Significant regions of hypometabolism were identified by comparing (18)F-FDG-PET results from each group with 16 healthy controls by using statistical parametric mapping. Significant hypometabolism was confined largely to the epileptogenic occipital lobe in the patient group without automatisms. In patients with automatisms, glucose hypometabolism extended from the epileptogenic occipital lobe into the ipsilateral temporal lobe. We identified a distinctive hypometabolic pattern that was specific for OLE patients with automatisms during a seizure. This finding supports the postulate that seizure propagation is a cause of glucose hypometabolism beyond the region of seizure onset.

  12. 30 CFR 75.1403-4 - Criteria-Automatic elevators.

    Science.gov (United States)

    2010-07-01

    ... appropriate on automatic elevators which will automatically shut-off the power and apply the brakes in the... telephone or other effective communication system by which aid or assistance can be obtained promptly. ...

  13. Automatic positioning control device for automatic control rod exchanger

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

    Purpose: To attain accurate positioning for a control rod exchanger. Constitution: The present position of the automatic control rod exchanger is detected by a synchro generator. An aimed stopping position for the exchanger, a stop instruction range depending on the operation delay in the control system and the inertia-running distance of the mechanical system, and a coincidence confirmation range depending on the required positioning accuracy are previously set. If there is a difference between the present position and the aimed stopping position, the automatic exchanger is caused to run toward the aimed stopping position. A stop instruction is generated upon arrival at a position within said stop instruction range, and a coincidence confirmation signal is generated upon arrival at a position within the coincidence confirmation range. The uncertain factors that influence the positioning accuracy, such as the operation delay in the control system and the inertia-running distance of the mechanical system, are made definite by actual measurement or the like, and the stop instruction range and the coincidence confirmation range are set based on the measured data; the positioning accuracy can therefore be improved. (Ikeda, J.)

  14. Fish welfare in capture fisheries

    NARCIS (Netherlands)

    Veldhuizen, L.J.L.; Berentsen, P.B.M.; Boer, de I.J.M.; Vis, van de J.W.; Bokkers, E.A.M.

    2018-01-01

    Concerns about the welfare of production animals have extended from farm animals to fish, but an overview of the impact of especially capture fisheries on fish welfare is lacking. This review provides a synthesis of 85 articles, which demonstrates that research interest in fish welfare in capture

  15. Reactive or proactive approach towards sustainability? A conceptual framework based on sustainable business models to increase stakeholders' sustainable value capture

    DEFF Research Database (Denmark)

    Rosati, Francesco; Morioka, Sandra; Monteiro de Carvalho, Marly

    2016-01-01

    and challenging companies to seek for business opportunities with an entrepreneurial attitude to help solving sustainable development challenges. By combining both approaches, organizations have the opportunity to increase sustainable value capture by its stakeholders, acting on their institutional responsibility...... as instrument to help companies describe, analyze, manage and communicate their sustainable value proposition, creation, delivery and capture mechanism. In particular, this research focuses on value capture dynamics, aiming to explore how companies can increase their contribution to sustainable development...... sustainability. In this sense, a proactive approach to foster sustainable value capture can complement the reactive approach by delivering value beyond stakeholders' expectations. In this case, companies use their capabilities to identify opportunities to create and deliver sustainable value that stakeholders...

  16. Automatic Operation For A Robot Lawn Mower

    Science.gov (United States)

    Huang, Y. Y.; Cao, Z. L.; Oh, S. J.; Kattan, E. U.; Hall, E. L.

    1987-02-01

    A domestic mobile robot, a lawn mower that performs in automatic operation mode, has been built at the Center of Robotics Research, University of Cincinnati. The robot lawn mower automatically completes its work with the region-filling operation, a new kind of path planning for mobile robots. Several strategies for region-filling path planning have been developed for a partly known or an unknown environment. An advanced omnidirectional navigation system and a multisensor-based control system are also used in the automatic operation. Research on the robot lawn mower, especially on region-filling path planning, is significant for industrial and agricultural applications.
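
    The specific region-filling strategies for partly known or unknown environments are not detailed in the abstract; the sketch below shows only the simplest back-and-forth (boustrophedon) coverage order on a fully known, obstacle-free grid, as a generic illustration of what region filling means for path planning.

        def boustrophedon_path(rows, cols):
            """Back-and-forth visiting order that completely fills a rows x cols grid of cells."""
            path = []
            for r in range(rows):
                cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
                path.extend((r, c) for c in cs)
            return path

        print(boustrophedon_path(2, 4))
        # [(0, 0), (0, 1), (0, 2), (0, 3), (1, 3), (1, 2), (1, 1), (1, 0)]

    Obstacles or unknown regions would require decomposing the area into cells and replanning online, which is where the strategies mentioned above differ from this naive pattern.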

  17. Analysis of capture-recapture data

    CERN Document Server

    McCrea, Rachel S

    2014-01-01

    An important first step in studying the demography of wild animals is to identify the animals uniquely through applying markings, such as rings, tags, and bands. Once the animals are encountered again, researchers can study different forms of capture-recapture data to estimate features, such as the mortality and size of the populations. Capture-recapture methods are also used in other areas, including epidemiology and sociology.With an emphasis on ecology, Analysis of Capture-Recapture Data covers many modern developments of capture-recapture and related models and methods and places them in the historical context of research from the past 100 years. The book presents both classical and Bayesian methods.A range of real data sets motivates and illustrates the material and many examples illustrate biometry and applied statistics at work. In particular, the authors demonstrate several of the modeling approaches using one substantial data set from a population of great cormorants. The book also discusses which co...
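
    The book covers far more sophisticated capture-recapture models, including Bayesian approaches; as a minimal illustration of the underlying idea only, the sketch below implements the classical two-sample Lincoln-Petersen estimator in Chapman's bias-corrected form.

        def chapman_estimate(n1, n2, m2):
            """Chapman's bias-corrected Lincoln-Petersen estimate of population size.

            n1: animals marked and released on the first occasion
            n2: animals examined on the second occasion
            m2: marked animals found again (recaptured) on the second occasion
            """
            return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

        # 200 animals ringed, 150 caught later of which 30 carried rings
        print(round(chapman_estimate(200, 150, 30)))   # ~ 978 animals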

  18. Automatic face morphing for transferring facial animation

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Bui, T.D.; Poel, Mannes; Heylen, Dirk K.J.; Nijholt, Antinus; Hamza, H.M.

    2003-01-01

    In this paper, we introduce a novel method of automatically finding the training set of RBF networks for morphing a prototype face to represent a new face. This is done by automatically specifying and adjusting corresponding feature points on a target face. The RBF networks are then used to transfer

  19. Automatic needle insertion diminishes pain during growth hormone injection

    DEFF Research Database (Denmark)

    Main, K M; Jørgensen, J T; Hertel, N T

    1995-01-01

    prototype pens for GH administration, providing either manual or automatic sc needle insertion, using a combined visual analogue/facial scale and a five-item scale in 18 children. With the automatic pen there was a significantly lower maximum pain score compared with the manual pen (median 28.5 versus 52.......0 mm) as well as a lower mean pain score (mean 13.7 versus 23.5 mm). The five-item scale revealed that automatic needle insertion was significantly less painful than manual insertion and 13 patients chose to continue treatment with the automatic pen. In conclusion, pain during GH injection can...

  20. Automatic Matching of Large Scale Images and Terrestrial LIDAR Based on App Synergy of Mobile Phone

    Science.gov (United States)

    Xia, G.; Hu, C.

    2018-04-01

    The digitalization of Cultural Heritage based on ground laser scanning technology has been widely applied. High-precision scanning and high-resolution photography of cultural relics are the main methods of data acquisition. Reconstruction from a complete point cloud and high-resolution images requires the matching of image and point cloud, the acquisition of homonymous (corresponding) feature points, data registration, etc. However, the one-to-one correspondence between an image and its corresponding point cloud currently depends on inefficient manual search. The effective classification and management of a large number of images, and the matching of large images to their corresponding point clouds, are therefore the focus of this research. In this paper, we propose automatic matching of large-scale images and terrestrial LiDAR based on the APP synergy of a mobile phone. Firstly, we develop an Android-based APP to take pictures and record related classification information. Secondly, all the images are automatically grouped using the recorded information. Thirdly, a matching algorithm is used to match the global and local images. According to the one-to-one correspondence between the global image and the point cloud reflection intensity image, the automatic matching of an image and its corresponding laser radar point cloud is realized. Finally, the mapping relationship between the global image, the local images and the intensity image is established according to the homonymous feature points. We can thus establish the data structure of the global image, the local images within the global image, and the point cloud corresponding to each local image, and carry out visualization management and querying of the images.