WorldWideScience

Sample records for project automatic prediction

  1. Automatically explaining machine learning prediction results: a demonstration on type 2 diabetes risk prediction.

    Science.gov (United States)

    Luo, Gang

    2016-01-01

    Predictive modeling is a key component of solutions to many healthcare problems. Among all predictive modeling approaches, machine learning methods often achieve the highest prediction accuracy, but they suffer from a long-standing open problem that precludes their widespread use in healthcare: most machine learning models give no explanation for their prediction results, whereas interpretability is essential for a predictive model to be adopted in typical healthcare settings. This paper presents the first complete method for automatically explaining the results of any machine learning predictive model without degrading accuracy. We implemented the method in software. Using the electronic medical record data set from the Practice Fusion diabetes classification competition, which contains patient records from all 50 states of the United States, we demonstrated the method on predicting a type 2 diabetes diagnosis within the next year. For the competition's champion machine learning model, our method explained the prediction results for 87.4% of the patients whom the model correctly predicted to receive a type 2 diabetes diagnosis within the next year. Our demonstration showed the feasibility of automatically explaining the results of any machine learning predictive model without degrading accuracy.
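
    The abstract does not detail the explanation mechanism, but rule-based explanation of the kind described can be sketched as matching a patient's features against human-readable association rules; the feature names and rule format below are entirely hypothetical:

```python
def matching_rules(patient, rules):
    """Return the (conditions, outcome) rules whose conditions all hold
    for this patient; the matched rules serve as the explanation."""
    explanations = []
    for conditions, outcome in rules:
        if all(patient.get(feat) == val for feat, val in conditions.items()):
            explanations.append((conditions, outcome))
    return explanations

# Hypothetical rules and patient record for illustration only
rules = [
    ({"bmi_high": True, "hba1c_elevated": True}, "type 2 diabetes within 1 year"),
    ({"smoker": True}, "type 2 diabetes within 1 year"),
]
patient = {"bmi_high": True, "hba1c_elevated": True, "smoker": False}
print(matching_rules(patient, rules))
```

The black-box model still makes the prediction; the fired rules only justify it, which is how accuracy can be preserved.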

  2. Piloted Simulation Evaluation of a Model-Predictive Automatic Recovery System to Prevent Vehicle Loss of Control on Approach

    Science.gov (United States)

    Litt, Jonathan S.; Liu, Yuan; Sowers, Thomas S.; Owen, A. Karl; Guo, Ten-Huei

    2014-01-01

    This paper describes a model-predictive automatic recovery system for aircraft on the verge of a loss-of-control situation. The system determines when it must intervene to prevent an imminent accident resulting from a poor approach. It estimates the altitude loss that would result from a go-around maneuver at the current flight condition; if that loss is projected to violate a minimum altitude threshold, the maneuver is automatically triggered. The system deactivates to allow landing once several criteria are met. Piloted flight simulator evaluation showed that the system provides effective envelope protection during extremely unsafe landing attempts. The results demonstrate how flight and propulsion control can be integrated to recover control of the vehicle automatically and prevent a potential catastrophe.
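
    The trigger logic described (estimate the altitude loss of a go-around and intervene if the minimum altitude threshold would be violated) reduces to a comparison; this is a minimal sketch, with all names and units assumed:

```python
def should_trigger_go_around(current_altitude_m, predicted_altitude_loss_m,
                             minimum_altitude_m):
    """Trigger the automatic recovery if a go-around started now is
    predicted to dip below the minimum altitude threshold."""
    return current_altitude_m - predicted_altitude_loss_m < minimum_altitude_m

# 150 m up, maneuver predicted to cost 120 m, floor at 60 m -> intervene
print(should_trigger_go_around(150.0, 120.0, 60.0))  # True
```

The hard part in the actual system is producing `predicted_altitude_loss_m` from a model of the current flight condition; the comparison itself is this simple.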

  3. Prediction Governors for Input-Affine Nonlinear Systems and Application to Automatic Driving Control

    Directory of Open Access Journals (Sweden)

    Yuki Minami

    2018-04-01

    In recent years, automatic driving control has attracted attention. To achieve satisfactory driving control performance, the prediction accuracy of the traveling route is important: if a highly accurate prediction method can be used, an accurate traveling route can be obtained. Despite the considerable efforts that have been invested in improving prediction methods, prediction errors do occur in general. Thus, a method to minimize the influence of prediction errors on automatic driving control systems is required. This need motivated us to focus on the design of a mechanism for shaping prediction signals, called a prediction governor. In this study, we first extended our previous study to the input-affine nonlinear system case. Then, we analytically derived a solution to an optimal design problem for prediction governors. Finally, we applied the solution to an automatic driving control system and demonstrated its usefulness through a numerical example and an experiment using a radio-controlled car.

  4. Automatic stimulation of experiments and learning based on prediction failure recognition

    NARCIS (Netherlands)

    Juarez Cordova, A.G.; Kahl, B.; Henne, T.; Prassler, E.

    2009-01-01

    In this paper we focus on the task of automatically and autonomously initiating experimentation and learning based on the recognition of prediction failure. We present a mechanism that utilizes conceptual knowledge to predict the outcome of robot actions, observes their execution and indicates when

  5. Automatic Train Operation Using Autonomic Prediction of Train Runs

    Science.gov (United States)

    Asuka, Masashi; Kataoka, Kenji; Komaya, Kiyotoshi; Nishida, Syogo

    In this paper, we present an automatic train control method adaptable to disturbed train traffic conditions. The proposed method assumes that the detected time of a home-track clearance is transmitted to trains approaching the station, using Digital ATC (Automatic Train Control) equipment. Using this information, each train controls its acceleration with a method that consists of two approaches. First, by setting a designated restricted speed, the train controls its running time so as to arrive at the next station in accordance with the predicted delay. Second, the train predicts the time at which it will reach the current braking pattern generated by Digital ATC, along with the time when the braking pattern moves ahead. By comparing the two, the train chooses the coasting drive mode in advance to avoid deceleration by the current braking pattern. We evaluated the effectiveness of the proposed method by simulation, with respect to driving conditions, energy consumption, and delay reduction.
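
    The second approach, choosing the coasting mode by comparing the predicted time of reaching the braking pattern with the predicted time of the pattern moving ahead, amounts to a comparison like the following sketch (function and parameter names are hypothetical):

```python
def choose_drive_mode(time_to_reach_braking_pattern_s,
                      time_until_pattern_clears_s):
    """If the train would reach the current braking pattern before the
    pattern moves ahead (i.e. before the home track clears), coast now
    to avoid being decelerated by it; otherwise keep powering."""
    if time_to_reach_braking_pattern_s <= time_until_pattern_clears_s:
        return "coast"
    return "power"

print(choose_drive_mode(40.0, 55.0))  # reaches the pattern first -> "coast"
```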

  6. Do Judgments of Learning Predict Automatic Influences of Memory?

    Science.gov (United States)

    Undorf, Monika; Böhm, Simon; Cüpper, Lutz

    2016-01-01

    Current memory theories generally assume that memory performance reflects both recollection and automatic influences of memory. Research on people's predictions about the likelihood of remembering recently studied information on a memory test, that is, on judgments of learning (JOLs), suggests that both magnitude and resolution of JOLs are linked…

  7. Team collaborative innovation management based on primary pipes automatic welding project

    International Nuclear Information System (INIS)

    Li Jing; Wang Dong; Zhang Ke

    2012-01-01

    The welding quality of the primary pipe directly affects the safe operation of nuclear power plants. Primary pipe automatic welding, the first of its kind in China, is a complex, systematic project involving many facets, such as design, manufacturing, materials, and on-site construction. An R&D team was formed by China Guangdong Nuclear Power Engineering Co., Ltd. (CNPEC) together with other domestic nuclear power design institutes and manufacturing and construction enterprises. In line with the characteristics of nuclear power plant construction, and adopting a team collaborative innovation management mode, through project co-ordination, resource allocation, and the building of a production-education-research collaborative innovation platform, CNPEC successfully developed the primary pipe automatic welding technique, which has been widely applied to nuclear power plant construction, creating considerable economic benefits. (authors)

  8. Specific predictive power of automatic spider-related affective associations for controllable and uncontrollable fear responses toward spiders

    NARCIS (Netherlands)

    Huijding, J.; de Jong, P.J.

    This study examined the predictive power of automatically activated spider-related affective associations for automatic and controllable fear responses. The Extrinsic Affective Simon Task (EAST; De Houwer, 2003) was used to indirectly assess automatic spider fear-related associations. The EAST and

  9. Automatic transfer function design for medical visualization using visibility distributions and projective color mapping.

    Science.gov (United States)

    Cai, Lile; Tay, Wei-Liang; Nguyen, Binh P; Chui, Chee-Kong; Ong, Sim-Heng

    2013-01-01

    Transfer functions play a key role in volume rendering of medical data, but transfer function manipulation is unintuitive and can be time-consuming; achieving an optimal visualization of patient anatomy or pathology is difficult. To overcome this problem, we present a system for automatic transfer function design based on visibility distribution and projective color mapping. Instead of assigning opacity directly based on voxel intensity and gradient magnitude, the opacity transfer function is automatically derived by matching the observed visibility distribution to a target visibility distribution. An automatic color assignment scheme based on projective mapping is proposed to assign colors that allow for the visual discrimination of different structures, while also reflecting the degree of similarity between them. When our method was tested on several medical volumetric datasets, the key structures within the volume were clearly visualized with minimal user intervention. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Automatic detection of adverse events to predict drug label changes using text and data mining techniques.

    Science.gov (United States)

    Gurulingappa, Harsha; Toldo, Luca; Rajput, Abdul Mateen; Kors, Jan A; Taweel, Adel; Tayrouz, Yorki

    2013-11-01

    The aim of this study was to assess the impact of automatically detected adverse event signals from text and open-source data on the prediction of drug label changes. Open-source adverse effect data were collected from the FAERS, Yellow Card and SIDER databases. A shallow linguistic relation extraction system (JSRE) was applied to extract adverse effects from MEDLINE case reports. A statistical approach was applied to the extracted datasets for signal detection and the subsequent prediction of label changes issued for 29 drugs by the UK regulatory authority in 2009. 76% of drug label changes were automatically predicted; of these, 6% were detected only by text mining. JSRE enabled the precise identification of four adverse drug events from MEDLINE that were undetectable otherwise. Changes in drug labels can be predicted automatically using data and text mining techniques. Text mining technology is mature and well placed to support pharmacovigilance tasks. Copyright © 2013 John Wiley & Sons, Ltd.
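
    The abstract names only "a statistical approach". A standard disproportionality statistic used for signal detection in pharmacovigilance is the proportional reporting ratio (PRR), sketched here for illustration (the study may have used a different statistic):

```python
def proportional_reporting_ratio(a, b, c, d):
    """PRR for a drug/event 2x2 contingency table:
        a = reports with the drug and the event   b = the drug, other events
        c = other drugs with the event            d = other drugs, other events
    PRR = (a / (a + b)) / (c / (c + d)); values well above 1 suggest a signal."""
    return (a / (a + b)) / (c / (c + d))

# 20 of 120 reports for the drug mention the event, versus 50 of 5000
# reports for all other drugs
print(round(proportional_reporting_ratio(20, 100, 50, 4950), 2))  # 16.67
```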

  11. Automatic Offline Formulation of Robust Model Predictive Control Based on Linear Matrix Inequalities Method

    Directory of Open Access Journals (Sweden)

    Longge Zhang

    2013-01-01

    Two automatic robust model predictive control strategies are presented for uncertain polytopic linear plants with input and output constraints. First, a sequence of nested, geometrically proportioned, asymptotically stable ellipsoids and their controllers is constructed offline. In the first strategy, the feedback controllers are then automatically selected online with the receding horizon. Finally, a modified automatic offline robust MPC approach is constructed to improve the closed-loop system's performance. The proposed strategies not only reduce conservatism but also decrease the online computation. Numerical examples are given to illustrate their effectiveness.
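
    The online step of such offline strategies amounts to locating the current state in the precomputed sequence of nested ellipsoids and applying the controller associated with the smallest one containing it. A minimal sketch, with illustrative shape matrices (an ellipsoid is {x : x'Px <= 1}, so larger P means a smaller set):

```python
def select_controller(ellipsoids, x):
    """Given shape matrices P_i of nested ellipsoids ordered from largest
    (i = 0) to smallest, return the index of the smallest ellipsoid that
    still contains the state x, or None if x is outside all of them."""
    n = len(x)
    best = None
    for i, P in enumerate(ellipsoids):
        quad = sum(x[r] * P[r][c] * x[c] for r in range(n) for c in range(n))
        if quad <= 1.0:
            best = i
    return best

P_list = [[[0.01, 0.0], [0.0, 0.01]],   # radius 10
          [[1.0, 0.0], [0.0, 1.0]],     # radius 1
          [[100.0, 0.0], [0.0, 100.0]]] # radius 0.1
print(select_controller(P_list, [0.5, 0.0]))  # inside the first two -> 1
```

Because the lookup replaces solving an optimization online, the per-step cost drops to a handful of quadratic-form evaluations.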

  12. Automatic generation of bioinformatics tools for predicting protein-ligand binding sites.

    Science.gov (United States)

    Komiyama, Yusuke; Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-03-15

    Predictive tools that model protein-ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein-ligand binding predictive tools would be useful. We developed a system for automatically generating protein-ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5-1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  13. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings...... on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...

  14. CERAPP: Collaborative Estrogen Receptor Activity Prediction Project

    Data.gov (United States)

    U.S. Environmental Protection Agency — Data from a large-scale modeling project called CERAPP (Collaborative Estrogen Receptor Activity Prediction Project) demonstrating using predictive computational...

  15. Automatic tools for enhancing the collaborative experience in large projects

    International Nuclear Information System (INIS)

    Bourilkov, D; Rodriquez, J L

    2014-01-01

    With the explosion of big data in many fields, the efficient management of knowledge about all aspects of the data analysis gains in importance. A key feature of collaboration in large scale projects is keeping a log of what is being done and how - for private use, reuse, and for sharing selected parts with collaborators and peers, often distributed geographically on an increasingly global scale. Even better if the log is automatically created on the fly while the scientist or software developer is working in a habitual way, without the need for extra efforts. This saves time and enables a team to do more with the same resources. The CODESH - COllaborative DEvelopment SHell - and CAVES - Collaborative Analysis Versioning Environment System projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach for any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building working scalable systems for analysis of Petascale volumes of data as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.

  16. Machine learning in updating predictive models of planning and scheduling transportation projects

    Science.gov (United States)

    1997-01-01

    A method combining machine learning and regression analysis to automatically and intelligently update predictive models used in the Kansas Department of Transportation's (KDOT's) internal management system is presented. The predictive models used...

  17. Automatic meter reading and PowerPlus services: Concept to implementation

    Energy Technology Data Exchange (ETDEWEB)

    Perks, D.R. [Alberta Power Ltd., Edmonton, AB (Canada)]

    1995-12-31

    The Distribution Control System Inc.'s Two-Way Automatic Communication System (TWACS) was implemented with GE Canada's 170S automatic meter reader (AMR) at Alberta Power Ltd. Core and extended features are being marketed as PowerPlus™. The technology used in the systems, the design philosophy, system components, outbound communication, inbound communication, throughput, and AMR are described. The objectives of the pilot project were to test reliability, accuracy, and cost of implementation. The scope of the pilot and the project results are presented, and the business aspects of PowerPlus™ marketing are described. The implementation schedule, constraints, technical problems, training, communication plan, strategy, and 1994 year-end status of the project are reviewed, along with plans for continued development. It was predicted that the versatility of the TWACS system and the hard work of every department of Alberta Power will ensure that the implementation program is a complete success. 5 figs.

  18. The MELANIE project: from a biopsy to automatic protein map interpretation by computer.

    Science.gov (United States)

    Appel, R D; Hochstrasser, D F; Funk, M; Vargas, J R; Pellegrini, C; Muller, A F; Scherrer, J R

    1991-10-01

    The goals of the MELANIE project are to determine if disease-associated patterns can be detected in high resolution two-dimensional polyacrylamide gel electrophoresis (HR 2D-PAGE) images and if a diagnosis can be established automatically by computer. The ELSIE/MELANIE system is a set of computer programs which automatically detect, quantify, and compare protein spots shown on HR 2D-PAGE images. Classification programs help the physician to find disease-associated patterns from a given set of two-dimensional gel electrophoresis images and to form diagnostic rules. Prototype expert systems that use these rules to establish a diagnosis from new HR 2D-PAGE images have been developed. They successfully diagnosed cirrhosis of the liver and were able to distinguish a variety of cancer types from biopsies known to be cancerous.

  19. SU-F-T-342: Dosimetric Constraint Prediction Guided Automatic Multi-Objective Optimization for Intensity Modulated Radiotherapy

    International Nuclear Information System (INIS)

    Song, T; Zhou, L; Li, Y

    2016-01-01

    Purpose: For intensity modulated radiotherapy, plan optimization is time consuming, with difficulties in selecting objectives and constraints and their relative weights. A fast and automatic multi-objective optimization algorithm with the ability to predict optimal constraints and manage their trade-offs can help to solve this problem. Our purpose is to develop such a framework and algorithm for general inverse planning. Methods: The proposed multi-objective optimization framework contains three main components: prediction of initial dosimetric constraints, further adjustment of the constraints, and plan optimization. We first use our previously developed in-house geometry-dosimetry correlation model to predict the optimal patient-specific dosimetric endpoints and treat them as initial dosimetric constraints. Second, we build an endpoint (organ) priority list and a constraint adjustment rule to repeatedly tune these constraints from their initial values, until no single endpoint has room for further improvement. Last, we implement a voxel-independent FMO algorithm for the optimization; during the optimization, a model for tuning the voxel weighting factors with respect to the constraints is created. To evaluate the framework and algorithm, we randomly selected 20 IMRT prostate cases from the clinic and compared them with our automatically generated plans, in both efficiency and plan quality. Results: For each evaluated plan, the proposed multi-objective framework ran fluently and automatically. The voxel weighting factor iteration count varied from 10 to 30 under an updated constraint, and the constraint tuning count varied from 20 to 30 for every case until no stricter constraint was allowed. The average total time for the whole optimization procedure was ~30 min. Comparing the DVHs, better OAR dose sparing could be observed in the automatically generated plans for 13 of the 20 cases, while the others were competitive

  20. SU-F-T-342: Dosimetric Constraint Prediction Guided Automatic Multi-Objective Optimization for Intensity Modulated Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Song, T; Zhou, L [Southern Medical University, Guangzhou, Guangdong (China)]; Li, Y [Beihang University, Beijing, Beijing (China)]

    2016-06-15

    Purpose: For intensity modulated radiotherapy, plan optimization is time consuming, with difficulties in selecting objectives and constraints and their relative weights. A fast and automatic multi-objective optimization algorithm with the ability to predict optimal constraints and manage their trade-offs can help to solve this problem. Our purpose is to develop such a framework and algorithm for general inverse planning. Methods: The proposed multi-objective optimization framework contains three main components: prediction of initial dosimetric constraints, further adjustment of the constraints, and plan optimization. We first use our previously developed in-house geometry-dosimetry correlation model to predict the optimal patient-specific dosimetric endpoints and treat them as initial dosimetric constraints. Second, we build an endpoint (organ) priority list and a constraint adjustment rule to repeatedly tune these constraints from their initial values, until no single endpoint has room for further improvement. Last, we implement a voxel-independent FMO algorithm for the optimization; during the optimization, a model for tuning the voxel weighting factors with respect to the constraints is created. To evaluate the framework and algorithm, we randomly selected 20 IMRT prostate cases from the clinic and compared them with our automatically generated plans, in both efficiency and plan quality. Results: For each evaluated plan, the proposed multi-objective framework ran fluently and automatically. The voxel weighting factor iteration count varied from 10 to 30 under an updated constraint, and the constraint tuning count varied from 20 to 30 for every case until no stricter constraint was allowed. The average total time for the whole optimization procedure was ~30 min. Comparing the DVHs, better OAR dose sparing could be observed in the automatically generated plans for 13 of the 20 cases, while the others were competitive

  1. Predicting new-onset of postoperative atrial fibrillation in patients undergoing cardiac surgery using semi-automatic reading of perioperative electrocardiograms

    DEFF Research Database (Denmark)

    Gu, Jiwei; Graff, Claus; Melgaard, Jacob

    2015-01-01

    P10 Predicting new-onset of postoperative atrial fibrillation in patients undergoing cardiac surgery using semi-automatic reading of perioperative electrocardiograms. Jiwei Gu, Claus Graff, Jacob Melgaard, Søren Lundbye-Christensen, Erik Berg Schmidt, Christian Torp-Pedersen, Kristinn Thorsteinsson......, Jan Jesper Andreasen. Aalborg, Denmark. Background: Postoperative new-onset atrial fibrillation (POAF) is the most common arrhythmia after cardiac surgery. The aim of this study was to evaluate whether semi-automatic reading of perioperative electrocardiograms (ECGs) is of any value in predicting POAF after...... ECG monitoring. A semi-automatic machine capable of reading different parameters of digitized ECGs was used to read both lead-specific (P/QRS/T amplitudes/intervals) and global measurements (P-duration/QRS-duration/PR-interval/QT/heart rate/hypertrophy). Results: We divided the patients into two

  2. An Automatic Prediction of Epileptic Seizures Using Cloud Computing and Wireless Sensor Networks.

    Science.gov (United States)

    Sareen, Sanjay; Sood, Sandeep K; Gupta, Sunil Kumar

    2016-11-01

    Epilepsy is one of the most common neurological disorders and is characterized by the spontaneous and unforeseeable occurrence of seizures. Automatic seizure prediction can protect patients from accidents and save lives. In this article, we propose a mobile-based framework that automatically predicts seizures using the information contained in electroencephalography (EEG) signals. Wireless sensor technology is used to capture the EEG signals of patients, and cloud-based services collect and analyze the EEG data from the patient's mobile phone. Features are extracted from the EEG signal using the fast Walsh-Hadamard transform (FWHT), and higher-order spectral analysis (HOSA) is applied to the FWHT coefficients to select the feature set relevant to the normal, preictal, and ictal states of seizure. We subsequently use the selected features as input to a k-means classifier to detect epileptic seizure states in a reasonable time. The performance of the proposed model was tested on the Amazon EC2 cloud and compared in terms of execution time and accuracy. The findings show that with the selected HOS-based features, we were able to achieve a classification accuracy of 94.6%.
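
    The fast Walsh-Hadamard transform used for feature extraction is a standard butterfly recursion over signals whose length is a power of two; a minimal implementation:

```python
def fwht(signal):
    """Fast Walsh-Hadamard transform (natural order); the input length
    must be a power of 2. Runs in O(n log n) butterfly passes."""
    a = list(signal)
    h = 1
    while h < len(a):
        for i in range(0, len(a), h * 2):
            for j in range(i, i + h):
                # Butterfly: sum and difference of paired coefficients
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    return a

print(fwht([1.0, 0.0, 1.0, 0.0]))  # [2.0, 2.0, 0.0, 0.0]
```

Applying the transform twice recovers the input scaled by its length, a handy sanity check.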

  3. CERAPP: Collaborative estrogen receptor activity prediction project

    DEFF Research Database (Denmark)

    Mansouri, Kamel; Abdelaziz, Ahmed; Rybacka, Aleksandra

    2016-01-01

    ). Risk assessors need tools to prioritize chemicals for evaluation in costly in vivo tests, for instance, within the U.S. EPA Endocrine Disruptor Screening Program. Objectives: We describe a large-scale modeling project called CERAPP (Collaborative Estrogen Receptor Activity Prediction Project...... States and Europe to predict the ER activity of a common set of 32,464 chemical structures. Quantitative structure-activity relationship models and docking approaches were employed, mostly using a common training set of 1,677 chemical structures provided by the U.S. EPA, to build a total of 40 categorical......: Individual model scores ranged from 0.69 to 0.85, showing high prediction reliability. Of the 32,464 chemicals, the consensus model predicted 4,001 chemicals (12.3%) as high-priority actives and 6,742 (20.8%) as potential actives to be considered for further testing. Conclusion: This project demonstrated

  4. Gene prediction using the Self-Organizing Map: automatic generation of multiple gene models.

    Science.gov (United States)

    Mahony, Shaun; McInerney, James O; Smith, Terry J; Golden, Aaron

    2004-03-05

    Many current gene prediction methods use only one model to represent protein-coding regions in a genome, and so are less likely to predict the location of genes that have an atypical sequence composition. It is likely that future improvements in gene finding will involve the development of methods that can adequately deal with intra-genomic compositional variation. This work explores a new approach to gene prediction, based on the Self-Organizing Map, which has the ability to automatically identify multiple gene models within a genome. The current implementation, named RescueNet, uses relative synonymous codon usage as the indicator of protein-coding potential. While its raw accuracy can be lower than that of other methods, RescueNet consistently identifies some genes that other methods do not, and should therefore be of interest to gene-prediction software developers and genome annotation teams alike. RescueNet is recommended for use in conjunction with, or as a complement to, other gene prediction methods.
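
    Relative synonymous codon usage, RescueNet's indicator of protein-coding potential, is a standard statistic: each codon's observed count divided by the average count over its synonymous family, so RSCU > 1 marks a codon used more often than expected under uniform synonymous usage. A toy sketch:

```python
def rscu(counts, synonym_families):
    """Relative synonymous codon usage for each codon, given observed
    codon counts and the families of synonymous codons."""
    values = {}
    for family in synonym_families:
        total = sum(counts.get(c, 0) for c in family)
        mean = total / len(family)
        for c in family:
            values[c] = counts.get(c, 0) / mean if mean else 0.0
    return values

# Toy counts for the two-codon lysine family AAA/AAG
print(rscu({"AAA": 30, "AAG": 10}, [("AAA", "AAG")]))
```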

  5. The PredictAD project

    DEFF Research Database (Denmark)

    Antila, Kari; Lötjönen, Jyrki; Thurfjell, Lennart

    2013-01-01

    Alzheimer's disease (AD) is the most common cause of dementia, affecting 36 million people worldwide. As the demographic transition in the developed countries progresses towards an older population, the worsening ratio of workers per retiree and the growing number of patients with age-related illnes...... candidates and implement the framework in software. The results are currently used in several research projects, licensed for commercial use and being tested for clinical use in several trials....... objective of the PredictAD project was to find and integrate efficient biomarkers from heterogeneous patient data to make early diagnosis and to monitor the progress of AD in a more efficient, reliable and objective manner. The project focused on discovering biomarkers from biomolecular data

  6. Automatic Knowledge Extraction and Knowledge Structuring for a National Term Bank

    DEFF Research Database (Denmark)

    Lassen, Tine; Madsen, Bodil Nistrup; Erdman Thomsen, Hanne

    2011-01-01

    This paper gives an introduction to the plans and ongoing work in a project, the aim of which is to develop methods for automatic knowledge extraction and automatic construction and updating of ontologies. The project also aims at developing methods for automatic merging of terminological data from various existing sources, as well as methods for target-group-oriented knowledge dissemination. In this paper, we mainly focus on the plans for automatic knowledge extraction and knowledge structuring that will result in ontologies for a national term bank.

  7. Automatic Code Checking Applied to Fire Fighting and Panic Projects in a BIM Environment - BIMSCIP

    Directory of Open Access Journals (Sweden)

    Marcelo Franco Porto

    2017-06-01

    This work presents a computational implementation of automatic conformity verification for building projects, using a 3D modeling platform for BIM. The program was developed in the C# language and is based on the 9th Technical Instruction of the Military Fire Brigade of the State of Minas Gerais, which covers regulations on fire load in buildings and hazardous areas.

  8. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Automatic systems have brought many revolutions to existing technologies; one technology that has seen substantial development is the solar-powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative solution to the energy crisis while reducing manpower through automation. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar-powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system that uses a 10-hour timer, set to intervals preferred by the user, running as a continuous process. A magnetic contactor acts as a switch connected to the 10-hour timer and controls the activation or termination of the electrical loads; it is powered by a solar panel that outputs electrical power, with a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  9. The Predictive Validity of Projective Measures.

    Science.gov (United States)

    Suinn, Richard M.; Oskamp, Stuart

    Written for use by clinical practitioners as well as psychological researchers, this book surveys recent literature (1950-1965) on projective test validity by reviewing and critically evaluating studies which shed light on what may reliably be predicted from projective test results. Two major instruments are covered: the Rorschach and the Thematic…

  10. NERI PROJECT 99-119. TASK 2. DATA-DRIVEN PREDICTION OF PROCESS VARIABLES. FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyaya, B.R.

    2003-04-10

    This report describes the detailed results for Task 2 of DOE-NERI project number 99-119, entitled "Automatic Development of Highly Reliable Control Architecture for Future Nuclear Power Plants". The project is a collaborative effort between the Oak Ridge National Laboratory (ORNL), The University of Tennessee, Knoxville (UTK) and North Carolina State University (NCSU). UTK is the lead organization for Task 2 under contract number DE-FG03-99SF21906. Under Task 2, we completed the development of data-driven models for characterizing sub-system dynamics to predict state variables, control functions, and expected control actions. We also developed the Principal Component Analysis (PCA) approach for mapping system measurements, and a nonlinear system modeling approach called the Group Method of Data Handling (GMDH) with rational functions, which incorporates temporal data for transient characterization. The majority of the results are presented in detailed reports for Phases 1 through 3 of our research, which are attached to this report.
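    As a rough sketch of the PCA mapping step mentioned above (generic PCA via SVD, not the project's own code), system measurements can be projected onto their leading principal components:

```python
import numpy as np

def pca_map(X, k):
    """Project a measurement matrix X (samples x sensors) onto its
    first k principal components. Generic PCA sketch; illustrative
    only, not the report's implementation."""
    Xc = X - X.mean(axis=0)                        # center each channel
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                           # reduced-space scores
```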

  11. Bottlenecks in Software Defect Prediction Implementation in Industrial Projects

    OpenAIRE

    Hryszko Jarosław; Madeyski Lech

    2015-01-01

    Case studies focused on software defect prediction in real, industrial software development projects are extremely rare. We report on a dedicated R&D project established in cooperation between Wroclaw University of Technology and one of the leading automotive software development companies to research possibilities of introducing software defect prediction using an open source, extensible software measurement and defect prediction framework called DePress (Defect Prediction in Software Syst...

  12. Automatic evidence quality prediction to support evidence-based decision making.

    Science.gov (United States)

    Sarker, Abeed; Mollá, Diego; Paris, Cécile

    2015-06-01

    Evidence-based medicine practice requires practitioners to obtain the best available medical evidence, and appraise the quality of the evidence when making clinical decisions. Primarily due to the plethora of electronically available data from the medical literature, the manual appraisal of the quality of evidence is a time-consuming process. We present a fully automatic approach for predicting the quality of medical evidence in order to aid practitioners at point-of-care. Our approach extracts relevant information from medical article abstracts and utilises data from a specialised corpus to apply supervised machine learning for the prediction of the quality grades. Following an in-depth analysis of the usefulness of features (e.g., publication types of articles), they are extracted from the text via rule-based approaches and from the meta-data associated with the articles, and then applied in the supervised classification model. We propose the use of a highly scalable and portable approach using a sequence of high precision classifiers, and introduce a simple evaluation metric called average error distance (AED) that simplifies the comparison of systems. We also perform elaborate human evaluations to compare the performance of our system against human judgments. We test and evaluate our approaches on a publicly available, specialised, annotated corpus containing 1132 evidence-based recommendations. Our rule-based approach performs exceptionally well at the automatic extraction of publication types of articles, with F-scores of up to 0.99 for high-quality publication types. For evidence quality classification, our approach obtains an accuracy of 63.84% and an AED of 0.271. The human evaluations show that the performance of our system, in terms of AED and accuracy, is comparable to the performance of humans on the same data. 
Overall, the experiments suggest that our structured text classification framework achieves evaluation results comparable to those of human performance.
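    The average error distance (AED) metric introduced above can be read as the mean distance between predicted and true quality grades on an ordinal scale. A minimal sketch, assuming a simple A-C grade scale and unnormalised distances (the paper's exact definition may differ):

```python
def average_error_distance(predicted, actual, grades=("A", "B", "C")):
    """Mean absolute distance between predicted and true ordinal grades.
    Sketch of the AED idea; grade scale and normalisation are assumptions."""
    idx = {g: i for i, g in enumerate(grades)}
    errors = [abs(idx[p] - idx[a]) for p, a in zip(predicted, actual)]
    return sum(errors) / len(errors)
```

    A perfect system scores 0; predicting "A" when the truth is "C" costs 2, so AED penalises large grade errors more than near misses, unlike plain accuracy.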

  13. On the Relationship Between Automatic Attitudes and Self-Reported Sexual Assault in Men

    Science.gov (United States)

    Widman, Laura; Olson, Michael

    2013-01-01

    Research and theory suggest rape supportive attitudes are important predictors of sexual assault; yet, to date, rape supportive attitudes have been assessed exclusively through self-report measures that are methodologically and theoretically limited. To address these limitations, the objectives of the current project were to: (1) develop a novel implicit rape attitude assessment that captures automatic attitudes about rape and does not rely on self-reports, and (2) examine the association between automatic rape attitudes and sexual assault perpetration. We predicted that automatic rape attitudes would be a significant unique predictor of sexual assault even when self-reported rape attitudes (i.e., rape myth acceptance and hostility toward women) were controlled. We tested the generalizability of this prediction in two independent samples: a sample of undergraduate college men (n = 75, M age = 19.3 years) and a sample of men from the community (n = 50, M age = 35.9 years). We found the novel implicit rape attitude assessment was significantly associated with the frequency of sexual assault perpetration in both samples and contributed unique variance in explaining sexual assault beyond rape myth acceptance and hostility toward women. We discuss the ways in which future research on automatic rape attitudes may significantly advance measurement and theory aimed at understanding and preventing sexual assault. PMID:22618119

  14. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    DEFF Research Database (Denmark)

    Hudson, Lawrence N; Newbold, Tim; Contu, Sara

    2017-01-01

    The PREDICTS project-Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)-has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity ...

  15. Research on cross - Project software defect prediction based on transfer learning

    Science.gov (United States)

    Chen, Ya; Ding, Xiaoming

    2018-04-01

    To address two challenges in cross-project software defect prediction, namely the distribution differences between the source and target project datasets and the class imbalance in the dataset, we propose a cross-project software defect prediction method based on transfer learning, named NTrA. First, the class imbalance of the source project data is resolved with the Augmented Neighborhood Cleaning Algorithm. Second, the data gravity method is used to assign different weights on the basis of the attribute similarity of the source and target project data. Finally, a defect prediction model is constructed using the TrAdaBoost algorithm. Experiments were conducted using data from NASA and SOFTLAB, respectively, taken from the published PROMISE dataset. The results show that the method achieved good recall and F-measure values and good prediction results.
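    The data gravity weighting step can be illustrated with a simplified reading of the idea, weighting each source instance by its closeness to the target data; the formula below is an assumption for illustration, not NTrA's exact definition.

```python
import numpy as np

def gravity_weights(source, target, eps=1e-9):
    """Weight each source instance by similarity to the target data's
    attribute means (a simplified stand-in for the 'data gravity' idea)."""
    centre = target.mean(axis=0)
    d = np.linalg.norm(source - centre, axis=1)   # distance to target centroid
    return 1.0 / (1.0 + d + eps)                  # closer instances weigh more
```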

  16. Automatic prediction of facial trait judgments: appearance vs. structural models.

    Directory of Open Access Journals (Sweden)

    Mario Rojas

    Full Text Available Evaluating other individuals with respect to personality characteristics plays a crucial role in human relations, and it is the focus of attention for research in diverse fields such as psychology and interactive computer systems. In psychology, face perception has been recognized as a key component of this evaluation system. Multiple studies suggest that observers use face information to infer personality characteristics. Interactive computer systems are trying to take advantage of these findings to increase the naturalness of interaction and to improve system performance. Here, we experimentally test whether the automatic prediction of facial trait judgments (e.g., dominance) can be made by using the full appearance information of the face, and whether a reduced representation of its structure is sufficient. We evaluate two separate approaches: a holistic representation model using the facial appearance information, and a structural model constructed from the relations among facial salient points. State-of-the-art machine learning methods are applied to (a) derive a facial trait judgment model from training data and (b) predict a facial trait value for any face. Furthermore, we address the issue of whether there are specific structural relations among facial points that predict perception of facial traits. Experimental results over a set of labeled data (9 different trait evaluations) and classification rules (4 rules) suggest that (a) prediction of perception of facial traits is learnable by both holistic and structural approaches; (b) the most reliable prediction of facial trait judgments is obtained by certain types of holistic descriptions of the face appearance; and (c) for some traits, such as attractiveness and extroversion, there are relationships between specific structural features and social perceptions.

  17. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    NARCIS (Netherlands)

    Hudson, Lawrence N; Newbold, Tim; Contu, Sara; Hill, Samantha L L; Lysenko, Igor; De Palma, Adriana; Phillips, Helen R P; Alhusseini, Tamera I; Bedford, Felicity E; Bennett, Dominic J; Booth, Hollie; Burton, Victoria J; Chng, Charlotte W T; Choimes, Argyrios; Correia, David L P; Day, Julie; Echeverría-Londoño, Susy; Emerson, Susan R; Gao, Di; Garon, Morgan; Harrison, Michelle L K; Ingram, Daniel J; Jung, Martin; Kemp, Victoria; Kirkpatrick, Lucinda; Martin, Callum D; Pan, Yuan; Pask-Hale, Gwilym D; Pynegar, Edwin L; Robinson, Alexandra N; Sanchez-Ortiz, Katia; Senior, Rebecca A; Simmons, Benno I; White, Hannah J; Zhang, Hanbin; Aben, Job; Abrahamczyk, Stefan; Adum, Gilbert B; Aguilar-Barquero, Virginia; Aizen, Marcelo A; Albertos, Belén; Alcala, E L; Del Mar Alguacil, Maria; Alignier, Audrey; Ancrenaz, Marc; Andersen, Alan N; Arbeláez-Cortés, Enrique; Armbrecht, Inge; Arroyo-Rodríguez, Víctor; Aumann, Tom; Axmacher, Jan C; Azhar, Badrul; Azpiroz, Adrián B; Baeten, Lander; Bakayoko, Adama; Báldi, András; Banks, John E; Baral, Sharad K; Barlow, Jos; Barratt, Barbara I P; Barrico, Lurdes; Bartolommei, Paola; Barton, Diane M; Basset, Yves; Batáry, Péter; Bates, Adam J; Baur, Bruno; Bayne, Erin M; Beja, Pedro; Benedick, Suzan; Berg, Åke; Bernard, Henry; Berry, Nicholas J; Bhatt, Dinesh; Bicknell, Jake E; Bihn, Jochen H; Blake, Robin J; Bobo, Kadiri S; Bóçon, Roberto; Boekhout, Teun; Böhning-Gaese, Katrin; Bonham, Kevin J; Borges, Paulo A V; Borges, Sérgio H; Boutin, Céline; Bouyer, Jérémy; Bragagnolo, Cibele; Brandt, Jodi S; Brearley, Francis Q; Brito, Isabel; Bros, Vicenç; Brunet, Jörg; Buczkowski, Grzegorz; Buddle, Christopher M; Bugter, Rob; Buscardo, Erika; Buse, Jörn; Cabra-García, Jimmy; Cáceres, Nilton C; Cagle, Nicolette L; Calviño-Cancela, María; Cameron, Sydney A; Cancello, Eliana M; Caparrós, Rut; Cardoso, Pedro; Carpenter, Dan; Carrijo, Tiago F; Carvalho, Anelena L; Cassano, Camila R; Castro, Helena; Castro-Luna, Alejandro A; Rolando, Cerda B; Cerezo, 
Alexis; Chapman, Kim Alan; Chauvat, Matthieu; Christensen, Morten; Clarke, Francis M; Cleary, Daniel F R; Colombo, Giorgio; Connop, Stuart P; Craig, Michael D; Cruz-López, Leopoldo; Cunningham, Saul A; D'Aniello, Biagio; D'Cruze, Neil; da Silva, Pedro Giovâni; Dallimer, Martin; Danquah, Emmanuel; Darvill, Ben; Dauber, Jens; Davis, Adrian L V; Dawson, Jeff; de Sassi, Claudio; de Thoisy, Benoit; Deheuvels, Olivier; Dejean, Alain; Devineau, Jean-Louis; Diekötter, Tim; Dolia, Jignasu V; Domínguez, Erwin; Dominguez-Haydar, Yamileth; Dorn, Silvia; Draper, Isabel; Dreber, Niels; Dumont, Bertrand; Dures, Simon G; Dynesius, Mats; Edenius, Lars; Eggleton, Paul; Eigenbrod, Felix; Elek, Zoltán; Entling, Martin H; Esler, Karen J; de Lima, Ricardo F; Faruk, Aisyah; Farwig, Nina; Fayle, Tom M; Felicioli, Antonio; Felton, Annika M; Fensham, Roderick J; Fernandez, Ignacio C; Ferreira, Catarina C; Ficetola, Gentile F; Fiera, Cristina; Filgueiras, Bruno K C; Fırıncıoğlu, Hüseyin K; Flaspohler, David; Floren, Andreas; Fonte, Steven J; Fournier, Anne; Fowler, Robert E; Franzén, Markus; Fraser, Lauchlan H; Fredriksson, Gabriella M; Freire, Geraldo B; Frizzo, Tiago L M; Fukuda, Daisuke; Furlani, Dario; Gaigher, René; Ganzhorn, Jörg U; García, Karla P; Garcia-R, Juan C; Garden, Jenni G; Garilleti, Ricardo; Ge, Bao-Ming; Gendreau-Berthiaume, Benoit; Gerard, Philippa J; Gheler-Costa, Carla; Gilbert, Benjamin; Giordani, Paolo; Giordano, Simonetta; Golodets, Carly; Gomes, Laurens G L; Gould, Rachelle K; Goulson, Dave; Gove, Aaron D; Granjon, Laurent; Grass, Ingo; Gray, Claudia L; Grogan, James; Gu, Weibin; Guardiola, Moisès; Gunawardene, Nihara R; Gutierrez, Alvaro G; Gutiérrez-Lamus, Doris L; Haarmeyer, Daniela H; Hanley, Mick E; Hanson, Thor; Hashim, Nor R; Hassan, Shombe N; Hatfield, Richard G; Hawes, Joseph E; Hayward, Matt W; Hébert, Christian; Helden, Alvin J; Henden, John-André; Henschel, Philipp; Hernández, Lionel; Herrera, James P; Herrmann, Farina; Herzog, Felix; Higuera-Diaz, 
Diego; Hilje, Branko; Höfer, Hubert; Hoffmann, Anke; Horgan, Finbarr G; Hornung, Elisabeth; Horváth, Roland; Hylander, Kristoffer; Isaacs-Cubides, Paola; Ishida, Hiroaki; Ishitani, Masahiro; Jacobs, Carmen T; Jaramillo, Víctor J; Jauker, Birgit; Hernández, F Jiménez; Johnson, McKenzie F; Jolli, Virat; Jonsell, Mats; Juliani, S Nur; Jung, Thomas S; Kapoor, Vena; Kappes, Heike; Kati, Vassiliki; Katovai, Eric; Kellner, Klaus; Kessler, Michael; Kirby, Kathryn R; Kittle, Andrew M; Knight, Mairi E; Knop, Eva; Kohler, Florian; Koivula, Matti; Kolb, Annette; Kone, Mouhamadou; Kőrösi, Ádám; Krauss, Jochen; Kumar, Ajith; Kumar, Raman; Kurz, David J; Kutt, Alex S; Lachat, Thibault; Lantschner, Victoria; Lara, Francisco; Lasky, Jesse R; Latta, Steven C; Laurance, William F; Lavelle, Patrick; Le Féon, Violette; LeBuhn, Gretchen; Légaré, Jean-Philippe; Lehouck, Valérie; Lencinas, María V; Lentini, Pia E; Letcher, Susan G; Li, Qi; Litchwark, Simon A; Littlewood, Nick A; Liu, Yunhui; Lo-Man-Hung, Nancy; López-Quintero, Carlos A; Louhaichi, Mounir; Lövei, Gabor L; Lucas-Borja, Manuel Esteban; Luja, Victor H; Luskin, Matthew S; MacSwiney G, M Cristina; Maeto, Kaoru; Magura, Tibor; Mallari, Neil Aldrin; Malone, Louise A; Malonza, Patrick K; Malumbres-Olarte, Jagoba; Mandujano, Salvador; Måren, Inger E; Marin-Spiotta, Erika; Marsh, Charles J; Marshall, E J P; Martínez, Eliana; Martínez Pastur, Guillermo; Moreno Mateos, David; Mayfield, Margaret M; Mazimpaka, Vicente; McCarthy, Jennifer L; McCarthy, Kyle P; McFrederick, Quinn S; McNamara, Sean; Medina, Nagore G; Medina, Rafael; Mena, Jose L; Mico, Estefania; Mikusinski, Grzegorz; Milder, Jeffrey C; Miller, James R; Miranda-Esquivel, Daniel R; Moir, Melinda L; Morales, Carolina L; Muchane, Mary N; Muchane, Muchai; Mudri-Stojnic, Sonja; Munira, A Nur; Muoñz-Alonso, Antonio; Munyekenye, B F; Naidoo, Robin; Naithani, A; Nakagawa, Michiko; Nakamura, Akihiro; Nakashima, Yoshihiro; Naoe, Shoji; Nates-Parra, Guiomar; Navarrete Gutierrez, Dario 
A; Navarro-Iriarte, Luis; Ndang'ang'a, Paul K; Neuschulz, Eike L; Ngai, Jacqueline T; Nicolas, Violaine; Nilsson, Sven G; Noreika, Norbertas; Norfolk, Olivia; Noriega, Jorge Ari; Norton, David A; Nöske, Nicole M; Nowakowski, A Justin; Numa, Catherine; O'Dea, Niall; O'Farrell, Patrick J; Oduro, William; Oertli, Sabine; Ofori-Boateng, Caleb; Oke, Christopher Omamoke; Oostra, Vicencio; Osgathorpe, Lynne M; Otavo, Samuel Eduardo; Page, Navendu V; Paritsis, Juan; Parra-H, Alejandro; Parry, Luke; Pe'er, Guy; Pearman, Peter B; Pelegrin, Nicolás; Pélissier, Raphaël; Peres, Carlos A; Peri, Pablo L; Persson, Anna S; Petanidou, Theodora; Peters, Marcell K; Pethiyagoda, Rohan S; Phalan, Ben; Philips, T Keith; Pillsbury, Finn C; Pincheira-Ulbrich, Jimmy; Pineda, Eduardo; Pino, Joan; Pizarro-Araya, Jaime; Plumptre, A J; Poggio, Santiago L; Politi, Natalia; Pons, Pere; Poveda, Katja; Power, Eileen F; Presley, Steven J; Proença, Vânia; Quaranta, Marino; Quintero, Carolina; Rader, Romina; Ramesh, B R; Ramirez-Pinilla, Martha P; Ranganathan, Jai; Rasmussen, Claus; Redpath-Downing, Nicola A; Reid, J Leighton; Reis, Yana T; Rey Benayas, José M; Rey-Velasco, Juan Carlos; Reynolds, Chevonne; Ribeiro, Danilo Bandini; Richards, Miriam H; Richardson, Barbara A; Richardson, Michael J; Ríos, Rodrigo Macip; Robinson, Richard; Robles, Carolina A; Römbke, Jörg; Romero-Duque, Luz Piedad; Rös, Matthias; Rosselli, Loreta; Rossiter, Stephen J; Roth, Dana S; Roulston, T'ai H; Rousseau, Laurent; Rubio, André V; Ruel, Jean-Claude; Sadler, Jonathan P; Sáfián, Szabolcs; Saldaña-Vázquez, Romeo A; Sam, Katerina; Samnegård, Ulrika; Santana, Joana; Santos, Xavier; Savage, Jade; Schellhorn, Nancy A; Schilthuizen, Menno; Schmiedel, Ute; Schmitt, Christine B; Schon, Nicole L; Schüepp, Christof; Schumann, Katharina; Schweiger, Oliver; Scott, Dawn M; Scott, Kenneth A; Sedlock, Jodi L; Seefeldt, Steven S; Shahabuddin, Ghazala; Shannon, Graeme; Sheil, Douglas; Sheldon, Frederick H; Shochat, Eyal; Siebert, Stefan 
J; Silva, Fernando A B; Simonetti, Javier A; Slade, Eleanor M; Smith, Jo; Smith-Pardo, Allan H; Sodhi, Navjot S; Somarriba, Eduardo J; Sosa, Ramón A; Soto Quiroga, Grimaldo; St-Laurent, Martin-Hugues; Starzomski, Brian M; Stefanescu, Constanti; Steffan-Dewenter, Ingolf; Stouffer, Philip C; Stout, Jane C; Strauch, Ayron M; Struebig, Matthew J; Su, Zhimin; Suarez-Rubio, Marcela; Sugiura, Shinji; Summerville, Keith S; Sung, Yik-Hei; Sutrisno, Hari; Svenning, Jens-Christian; Teder, Tiit; Threlfall, Caragh G; Tiitsaar, Anu; Todd, Jacqui H; Tonietto, Rebecca K; Torre, Ignasi; Tóthmérész, Béla; Tscharntke, Teja; Turner, Edgar C; Tylianakis, Jason M; Uehara-Prado, Marcio; Urbina-Cardona, Nicolas; Vallan, Denis; Vanbergen, Adam J; Vasconcelos, Heraldo L; Vassilev, Kiril; Verboven, Hans A F; Verdasca, Maria João; Verdú, José R; Vergara, Carlos H; Vergara, Pablo M; Verhulst, Jort; Virgilio, Massimiliano; Vu, Lien Van; Waite, Edward M; Walker, Tony R; Wang, Hua-Feng; Wang, Yanping; Watling, James I; Weller, Britta; Wells, Konstans; Westphal, Catrin; Wiafe, Edward D; Williams, Christopher D; Willig, Michael R; Woinarski, John C Z; Wolf, Jan H D; Wolters, Volkmar; Woodcock, Ben A; Wu, Jihua; Wunderle, Joseph M; Yamaura, Yuichi; Yoshikura, Satoko; Yu, Douglas W; Zaitsev, Andrey S; Zeidler, Juliane; Zou, Fasheng; Collen, Ben; Ewers, Rob M; Mace, Georgina M; Purves, Drew W; Scharlemann, Jörn P W; Purvis, Andy

    The PREDICTS project-Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)-has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of

  18. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction

    Science.gov (United States)

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    The Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical properties of natural images, so its detection performance is unsatisfactory. PCA has been extended into kernel PCA in order to capture higher-order statistics. However, to date no work has explicitly proposed a kernel FKT (KFKT) or investigated its detection performance. To accurately detect potential small targets in infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. We then develop a framework based on Kalman prediction and KFKT that can automatically detect and track small targets. Experimental results show that KFKT outperforms FKT and that the proposed framework is capable of automatically detecting and tracking infrared point targets.
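    The Kalman prediction step used by such a tracking framework takes the generic textbook form below (a sketch of the standard predict equations, not the paper's tracker):

```python
import numpy as np

def kalman_predict(x, P, F, Q):
    """One Kalman prediction step: propagate the state estimate x and its
    covariance P through the motion model F with process noise Q."""
    return F @ x, F @ P @ F.T + Q
```

    With a constant-velocity model, F shifts position by the current velocity; the predicted position then tells the detector where to look for the target in the next frame.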

  19. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan; Calo, Victor M.

    2010-01-01

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques.

  20. What Automaticity Deficit? Activation of Lexical Information by Readers with Dyslexia in a Rapid Automatized Naming Stroop-Switch Task

    Science.gov (United States)

    Jones, Manon W.; Snowling, Margaret J.; Moll, Kristina

    2016-01-01

    Reading fluency is often predicted by rapid automatized naming (RAN) speed, which as the name implies, measures the automaticity with which familiar stimuli (e.g., letters) can be retrieved and named. Readers with dyslexia are considered to have less "automatized" access to lexical information, reflected in longer RAN times compared with…

  1. Fast, accurate, and robust automatic marker detection for motion correction based on oblique kV or MV projection image pairs

    International Nuclear Information System (INIS)

    Slagmolen, Pieter; Hermans, Jeroen; Maes, Frederik; Budiharto, Tom; Haustermans, Karin; Heuvel, Frank van den

    2010-01-01

    Purpose: A robust and accurate method is proposed for the automatic detection of fiducial markers in MV and kV projection image pairs. The method allows automatic correction for inter- or intrafraction motion. Methods: Intratreatment MV projection images are acquired during each of five treatment beams of prostate cancer patients with four implanted fiducial markers. The projection images are first preprocessed using a series of marker-enhancing filters. 2D candidate marker locations are generated for each of the filtered projection images, and 3D candidate marker locations are reconstructed by pairing candidates in subsequent projection images. The correct marker positions are retrieved in 3D by minimizing a cost function that combines 2D image intensity and 3D geometric or shape information for the entire marker configuration simultaneously. This optimization problem is solved using dynamic programming such that the globally optimal configuration for all markers is always found. Translational interfraction and intrafraction prostate motion and the required patient repositioning are assessed from the position of the centroid of the detected markers in different MV image pairs. The method was validated on a phantom using CT as ground truth and on clinical data sets of 16 patients using manual marker annotations as ground truth. Results: The entire setup was confirmed to be accurate to around 1 mm by the phantom measurements. The reproducibility of the manual marker selection was less than 3.5 pixels in the MV images. In patient images, markers were correctly identified in at least 99% of the cases for anterior projection images and 96% of the cases for oblique projection images. The average marker detection accuracy was 1.4±1.8 pixels in the projection images. The centroid of all four reconstructed marker positions in 3D was positioned within 2 mm of the ground-truth position in 99.73% of all cases. Detecting four markers in a pair of MV images
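    The dynamic-programming idea, picking one candidate per marker so that a global cost over the whole configuration is minimal, can be sketched as follows; the chain-structured cost and the `pair_cost` function are simplifying assumptions for illustration, not the paper's full 2D intensity plus 3D shape cost.

```python
def best_configuration(candidates, pair_cost):
    """Pick one candidate per marker so the summed pairwise cost along the
    chain of markers is minimal, via dynamic programming.
    candidates: one list of candidate positions per marker."""
    # dp[i][j]: best cost of a chain ending with candidate j for marker i
    dp = [[0.0] * len(candidates[0])]
    back = []
    for i in range(1, len(candidates)):
        row, brow = [], []
        for cj in candidates[i]:
            costs = [dp[-1][k] + pair_cost(ck, cj)
                     for k, ck in enumerate(candidates[i - 1])]
            k_best = min(range(len(costs)), key=costs.__getitem__)
            row.append(costs[k_best])
            brow.append(k_best)
        dp.append(row)
        back.append(brow)
    # backtrack the globally optimal chain of candidate indices
    j = min(range(len(dp[-1])), key=dp[-1].__getitem__)
    chain = [j]
    for brow in reversed(back):
        j = brow[j]
        chain.append(j)
    return list(reversed(chain))
```

    Because every candidate combination along the chain is scored exactly once, the global optimum is always found, which mirrors the guarantee claimed in the abstract.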

  2. The DanTermBank Project

    DEFF Research Database (Denmark)

    Lassen, Tine; Madsen, Bodil Nistrup; Pram Nielsen, Louise

    This paper gives an introduction to the plans and ongoing work in a project, the aim of which is to develop methods for automatic knowledge extraction and automatic construction and updating of ontologies. The project also aims at developing methods for automatic merging of terminological data from various existing sources, as well as methods for target group oriented knowledge dissemination. In this paper, we mainly focus on the plans for automatic knowledge extraction and knowledge structuring that will result in ontologies for a national term bank.

  3. Automatic Adviser on stationary devices status identification and anticipated change

    Science.gov (United States)

    Shabelnikov, A. N.; Liabakh, N. N.; Gibner, Ya M.; Pushkarev, E. A.

    2018-05-01

    A task is defined to synthesize an Automatic Adviser that identifies the status of stationary automation-system devices using an autoregressive model of changes in their key parameters. The choice of model type is substantiated, and a monitoring-process algorithm for the research objects is developed. A suite for simulating object status operation and analyzing prediction results is proposed. The research results are illustrated with the specific example of a hump yard compressor station. The work was supported by the Russian Fundamental Research Fund, project No. 17-20-01040.
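    An autoregressive model of a key parameter, as mentioned above, can be sketched generically: fit AR(p) coefficients by least squares and predict the next value (an illustrative sketch, not the paper's model):

```python
import numpy as np

def fit_ar(series, p):
    """Least-squares fit of an AR(p) model y_t = a1*y_(t-1) + ... + ap*y_(t-p)."""
    series = np.asarray(series, dtype=float)
    X = np.column_stack([series[p - i - 1:len(series) - i - 1]
                         for i in range(p)])      # lagged observations
    y = series[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def ar_predict(series, coef):
    """One-step-ahead prediction from the last p observations."""
    p = len(coef)
    recent = np.asarray(series, dtype=float)[-1:-p - 1:-1]  # y_(t-1)..y_(t-p)
    return float(np.dot(coef, recent))
```

    A monitored parameter drifting away from such a model's predictions would flag an anticipated status change.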

  4. Changes in automatic threat processing precede and predict clinical changes with exposure-based cognitive-behavior therapy for panic disorder.

    Science.gov (United States)

    Reinecke, Andrea; Waldenmaier, Lara; Cooper, Myra J; Harmer, Catherine J

    2013-06-01

    Cognitive behavioral therapy (CBT) is an effective treatment for emotional disorders such as anxiety or depression, but the mechanisms underlying successful intervention are far from understood. Although it has been a long-held view that psychopharmacological approaches work by directly targeting automatic emotional information processing in the brain, it is usually postulated that psychological treatments affect these processes only over time, through changes in more conscious thought cycles. This study explored the role of early changes in emotional information processing in CBT action. Twenty-eight untreated patients with panic disorder were randomized to a single session of exposure-based CBT or waiting group. Emotional information processing was measured on the day after intervention with an attentional visual probe task, and clinical symptoms were assessed on the day after intervention and at 4-week follow-up. Vigilance for threat information was decreased in the treated group, compared with the waiting group, the day after intervention, before reductions in clinical symptoms. The magnitude of this early effect on threat vigilance predicted therapeutic response after 4 weeks. Cognitive behavioral therapy rapidly affects automatic processing, and these early effects are predictive of later therapeutic change. Such results suggest very fast action on automatic processes mediating threat sensitivity, and they provide an early marker of treatment response. Furthermore, these findings challenge the notion that psychological treatments work directly on conscious thought processes before automatic information processing and imply a greater similarity between early effects of pharmacological and psychological treatments for anxiety than previously thought. Copyright © 2013 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  5. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    Science.gov (United States)

    Lawrence N. Hudson; Joseph Wunderle M.; And Others

    2016-01-01

    The PREDICTS project—Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)—has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of human impacts relating to land use. We have used this evidence base to...

  6. A Domain Specific Embedded Language in C++ for Automatic Differentiation, Projection, Integration and Variational Formulations

    Directory of Open Access Journals (Sweden)

    Christophe Prud'homme

    2006-01-01

    Full Text Available In this article, we present a domain specific embedded language in C++ that can be used in various contexts such as numerical projection onto a functional space, numerical integration, variational formulations and automatic differentiation. Although these tools operate in different ways, the language accommodates them all by decoupling expression construction from evaluation. The language is implemented using expression templates and meta-programming techniques and uses various Boost libraries. The language is exercised on a number of non-trivial examples, and a benchmark presents the performance behavior on a few test problems.
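    The construction/evaluation split the article describes can be illustrated outside C++ as well; the Python sketch below (an analogue chosen for brevity, not the article's library) records an expression tree on construction and computes nothing until evaluation:

```python
class Expr:
    """Tiny deferred-expression sketch: operators build a closure tree;
    nothing is computed until eval() is called, mirroring the
    construction/evaluation decoupling of expression templates."""
    def __init__(self, fn):
        self.fn = fn
    def __add__(self, other):
        return Expr(lambda env: self.fn(env) + other.fn(env))
    def __mul__(self, other):
        return Expr(lambda env: self.fn(env) * other.fn(env))
    def eval(self, env):
        return self.fn(env)

def var(name):
    """A leaf looked up in the evaluation environment."""
    return Expr(lambda env: env[name])

def const(v):
    """A constant leaf."""
    return Expr(lambda env: v)
```

    The same expression object can then be evaluated in different contexts (different environments), which is the property that lets one language serve projection, integration and differentiation alike.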

  7. PREDICTS: Projecting Responses of Ecological Diversity in Changing Terrestrial Systems

    Directory of Open Access Journals (Sweden)

    Georgina Mace

    2012-12-01

    Full Text Available The PREDICTS project (www.predicts.org.uk is a three-year NERC-funded project to model and predict at a global scale how local terrestrial diversity responds to human pressures such as land use, land cover, pollution, invasive species and infrastructure. PREDICTS is a collaboration between Imperial College London, the UNEP World Conservation Monitoring Centre, Microsoft Research Cambridge, UCL and the University of Sussex. In order to meet its aims, the project relies on extensive data describing the diversity and composition of biological communities at a local scale. Such data are collected on a vast scale through the committed efforts of field ecologists. If you have appropriate data that you would be willing to share with us, please get in touch (enquiries@predicts.org.uk. All contributions will be acknowledged appropriately and all data contributors will be included as co-authors on an open-access paper describing the database.

  8. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to the automatic incrementalization of analyses that are specified as tabled logic programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  9. The automatic component of habit in health behavior: habit as cue-contingent automaticity.

    Science.gov (United States)

    Orbell, Sheina; Verplanken, Bas

    2010-07-01

    Habit might be usefully characterized as a form of automaticity that involves the association of a cue and a response. Three studies examined habitual automaticity in regard to different aspects of the cue-response relationship characteristic of unhealthy and healthy habits. In each study, habitual automaticity was assessed by the Self-Report Habit Index (SRHI). In Study 1 SRHI scores correlated with attentional bias to smoking cues in a Stroop task. Study 2 examined the ability of a habit cue to elicit an unwanted habit response. In a prospective field study, habitual automaticity in relation to smoking when drinking alcohol in a licensed public house (pub) predicted the likelihood of cigarette-related action slips 2 months later after smoking in pubs had become illegal. In Study 3 experimental group participants formed an implementation intention to floss in response to a specified situational cue. Habitual automaticity of dental flossing was rapidly enhanced compared to controls. The studies provided three different demonstrations of the importance of cues in the automatic operation of habits. Habitual automaticity assessed by the SRHI captured aspects of a habit that go beyond mere frequency or consistency of the behavior. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  10. Accessories for Enhancement of the Semi-Automatic Welding Processes

    National Research Council Canada - National Science Library

    Wheeler, Douglas M; Sawhill, James M

    2000-01-01

    The project's objective is to identify specific areas of semi-automatic welding operations, as performed with the major semi-automatic processes, that would be more productive if a suitable...

  11. Summer Student Work Project Report: SCADA Bridge Tool Development Automatically Capturing Data from SCADA to the Maintenance System

    CERN Document Server

    Alhambra-Moron, Alfonso

    2015-01-01

    The main purpose of this report is to summarize the work project I have been doing at CERN during the last 3 months as a Summer Student. My name is Alfonso Alhambra Morón, and on 8 June 2015 I joined the EN-HE-LM team as a summer student supervised by Damien Lafarge in order to collaborate in automating the transfer of meter readings from SCADA1 to Infor EAM2, the computerized maintenance management system at CERN. The main objective of my project was to enable automatic updates of meters in Infor EAM by fetching data from SCADA, automating a process which was previously done manually and consumed resources: the meters had to be consulted physically, the readings imported into Infor EAM by hand, and the errors that occur when doing all of this manually detected and corrected. This problem is shared by several other teams at CERN apart from the Lift Maintenance team, and for this reason the main targets I had when developing my solution were flexibility and scalability so as to make...

  12. ECLogger: Cross-Project Catch-Block Logging Prediction Using Ensemble of Classifiers

    Directory of Open Access Journals (Sweden)

    Sangeeta Lal

    2017-01-01

    Full Text Available Background: Software developers insert log statements in the source code to record program execution information. However, optimizing the number of log statements in the source code is challenging. Machine learning based within-project logging prediction tools, proposed in previous studies, may not be suitable for new or small software projects. For such software projects, we can use cross-project logging prediction. Aim: The aim of the study presented here is to investigate cross-project logging prediction methods and techniques. Method: The proposed method is ECLogger, which is a novel, ensemble-based, cross-project, catch-block logging prediction model. In this research, nine base classifiers were used and combined using ensemble techniques. The performance of ECLogger was evaluated on three open-source Java projects: Tomcat, CloudStack and Hadoop. Results: ECLogger Bagging, ECLogger AverageVote, and ECLogger MajorityVote show a considerable improvement in the average Logged F-measure (LF) on 3, 5, and 4 source -> target project pairs, respectively, compared to the baseline classifiers. ECLogger AverageVote performs best and shows improvements of 3.12% (average LF) and 6.08% (average accuracy). Conclusion: Classifiers based on ensemble techniques, such as bagging, average vote, and majority vote, outperform the baseline classifiers. Overall, the ECLogger AverageVote model performs best. The results show that the CloudStack project is more generalizable than the other projects.
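
    The voting idea behind ensembles like the one above can be sketched in a few lines of Python. The catch-block feature names below (`exception`, `block_sloc`, `rethrows`) and the rule-based base classifiers are invented for illustration; they are not ECLogger's actual features or classifiers:

    ```python
    # Majority-vote ensemble sketch: each "base classifier" maps a feature
    # dict to 0/1 (1 = "insert a log statement"); the ensemble takes a vote.

    def majority_vote(classifiers, sample):
        """Return the label predicted by most base classifiers."""
        votes = [clf(sample) for clf in classifiers]
        return 1 if sum(votes) > len(votes) / 2 else 0

    # Three toy base classifiers keyed on hypothetical catch-block features.
    base = [
        lambda s: 1 if s["exception"] == "IOException" else 0,
        lambda s: 1 if s["block_sloc"] > 5 else 0,
        lambda s: 1 if s["rethrows"] == 0 else 0,
    ]

    sample = {"exception": "IOException", "block_sloc": 3, "rethrows": 0}
    print(majority_vote(base, sample))  # 2 of 3 base classifiers vote "log"
    ```

    In practice the base classifiers would be trained models, and the AverageVote variant would average predicted probabilities instead of counting hard votes.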

  13. A Game Theoretic Approach to Cyber Attack Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Peng Liu

    2005-11-28

    The area investigated by this project is cyber attack prediction. With a focus on correlation-based prediction, current attack prediction methodologies overlook the strategic nature of cyber attack-defense scenarios. As a result, current cyber attack prediction methodologies are very limited in predicting strategic behaviors of attackers in enforcing nontrivial cyber attacks such as DDoS attacks, and may result in low accuracy in correlation-based predictions. This project develops a game theoretic framework for cyber attack prediction, where an automatic game-theory-based attack prediction method is proposed. Being able to quantitatively predict the likelihood of (sequences of) attack actions, our attack prediction methodology can predict fine-grained strategic behaviors of attackers and may greatly improve the accuracy of correlation-based prediction. To the best of our knowledge, this project develops the first comprehensive framework for incentive-based modeling and inference of attack intent, objectives, and strategies, and the first method that can predict fine-grained strategic behaviors of attackers. The significance of this research and the benefit to the public can be demonstrated to a certain extent by (a) the severe threat of cyber attacks to the critical infrastructures of the nation, including many infrastructures overseen by the Department of Energy, (b) the importance of cyber security to critical infrastructure protection, and (c) the importance of cyber attack prediction to achieving cyber security.

  14. Predictors of Mental Health Symptoms, Automatic Thoughts, and Self-Esteem Among University Students.

    Science.gov (United States)

    Hiçdurmaz, Duygu; İnci, Figen; Karahan, Sevilay

    2017-01-01

    University youth is a risk group regarding mental health, and many mental health problems are frequent in this group. Sociodemographic factors such as level of income and familial factors such as the relationship with the father are reported to be associated with mental health symptoms, automatic thoughts, and self-esteem. Also, there are interrelations between mental health problems, automatic thoughts, and self-esteem. The extent of the predictive effect of each of these variables on automatic thoughts, self-esteem, and mental health symptoms is not known. We aimed to determine the predictive factors of mental health symptoms, automatic thoughts, and self-esteem in university students. Participants were 530 students enrolled at a university in Turkey during the 2014-2015 academic year. Data were collected using the student information form, the Brief Symptom Inventory, the Automatic Thoughts Questionnaire, and the Rosenberg Self-Esteem Scale. Mental health symptoms, self-esteem, perception of the relationship with the father, and level of income as a student significantly predicted automatic thoughts. Automatic thoughts, mental health symptoms, participation in family decisions, and age had significant predictive effects on self-esteem. Finally, automatic thoughts, self-esteem, age, and perception of the relationship with the father had significant predictive effects on mental health symptoms. The predictive factors revealed in our study provide important information to practitioners and researchers by showing the elements that need to be screened for the mental health of university students and the issues that need to be included in counseling activities.

  15. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    OpenAIRE

    Hudson, L. N.; Newbold, T.; Contu, S.; Hill, S. L.; Lysenko, I.; De Palma, A.; Phillips, H. R.; Alhusseini, T. I.; Bedford, F. E.; Bennett, D. J.; Booth, H.; Burton, V. J.; Chng, C. W.; Choimes, A.; Correia, D. L.

    2017-01-01

    The PREDICTS project-Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)-has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of human impacts relating to land use. We have used this evidence base to develop global and regional statistical models of how local biodiversity responds to these measures. We describe and make free...

  16. Automatic variable selection method and a comparison for quantitative analysis in laser-induced breakdown spectroscopy

    Science.gov (United States)

    Duan, Fajie; Fu, Xiao; Jiang, Jiajia; Huang, Tingting; Ma, Ling; Zhang, Cong

    2018-05-01

    In this work, an automatic variable selection method for quantitative analysis of soil samples using laser-induced breakdown spectroscopy (LIBS) is proposed, which is based on full spectrum correction (FSC) and modified iterative predictor weighting-partial least squares (mIPW-PLS). The method features automatic selection without manual intervention. To illustrate the feasibility and effectiveness of the method, a comparison with the genetic algorithm (GA) and the successive projections algorithm (SPA) for the detection of different elements (copper, barium and chromium) in soil was implemented. The experimental results showed that all three methods could accomplish variable selection effectively, among which FSC-mIPW-PLS required significantly shorter computation time (approximately 12 s for 40,000 initial variables) than the others. Moreover, improved quantification models were obtained with the variable selection approaches. The root mean square errors of prediction (RMSEP) of models utilizing the new method were 27.47 (copper), 37.15 (barium) and 39.70 (chromium) mg/kg, showing prediction performance comparable to GA and SPA.

  17. Increasing Prediction the Original Final Year Project of Student Using Genetic Algorithm

    Science.gov (United States)

    Saragih, Rijois Iboy Erwin; Turnip, Mardi; Sitanggang, Delima; Aritonang, Mendarissan; Harianja, Eva

    2018-04-01

    The final year project is very important for the graduation of a student. Unfortunately, many students do not take their final projects seriously; many ask someone else to do the work for them. In this paper, an application of genetic algorithms to predict whether the final year project of a student is original is proposed. In the simulation, final project data from the last 5 years was collected. The genetic algorithm has several operators, namely population, selection, crossover, and mutation. The results suggest that the genetic algorithm predicts better than other comparable models; experiments showed a prediction accuracy of 70%, higher than previous research.
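
    The operators the abstract lists (population, selection, crossover, mutation) can be sketched as a minimal genetic algorithm. The OneMax fitness function below (count of 1-bits) is a toy stand-in, not the paper's originality score, and all parameter values are illustrative:

    ```python
    import random

    random.seed(0)
    GENES, POP, GENERATIONS = 20, 30, 40

    def fitness(ind):
        # Toy fitness: number of 1-bits (OneMax), standing in for the
        # paper's unpublished originality measure.
        return sum(ind)

    def select(pop):
        # Tournament selection of size 2.
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    def crossover(p1, p2):
        # Single-point crossover.
        cut = random.randrange(1, GENES)
        return p1[:cut] + p2[cut:]

    def mutate(ind, rate=0.02):
        # Independent bit-flip mutation.
        return [g ^ 1 if random.random() < rate else g for g in ind]

    pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
    for _ in range(GENERATIONS):
        pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP)]

    best = max(pop, key=fitness)
    print(fitness(best))  # typically close to GENES after a few dozen generations
    ```

    A real application would replace `fitness` with a score derived from the collected project data.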

  18. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which has been successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  19. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that reanalyze the whole project whenever it changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...

  20. Piloted Simulation of a Model-Predictive Automated Recovery System

    Science.gov (United States)

    Liu, James (Yuan); Litt, Jonathan; Sowers, T. Shane; Owens, A. Karl; Guo, Ten-Huei

    2014-01-01

    This presentation describes a model-predictive automatic recovery system for aircraft on the verge of a loss-of-control situation. The system determines when it must intervene to prevent an imminent accident, resulting from a poor approach. It estimates the altitude loss that would result from a go-around maneuver at the current flight condition. If the loss is projected to violate a minimum altitude threshold, the maneuver is automatically triggered. The system deactivates to allow landing once several criteria are met. Piloted flight simulator evaluation showed the system to provide effective envelope protection during extremely unsafe landing attempts. The results demonstrate how flight and propulsion control can be integrated to recover control of the vehicle automatically and prevent a potential catastrophe.

  1. Developing a stochastic traffic volume prediction model for public-private partnership projects

    Science.gov (United States)

    Phong, Nguyen Thanh; Likhitruangsilp, Veerasak; Onishi, Masamitsu

    2017-11-01

    Transportation projects require an enormous amount of capital investment resulting from their tremendous size, complexity, and risk. Due to the limitation of public finances, the private sector is invited to participate in transportation project development. The private sector can entirely or partially invest in transportation projects in the form of the Public-Private Partnership (PPP) scheme, which has been an attractive option for several developing countries, including Vietnam. There are many factors affecting the success of PPP projects. The accurate prediction of traffic volume is considered one of the key success factors of PPP transportation projects. However, only a few research works have investigated how to predict traffic volume over a long period of time. Moreover, conventional traffic volume forecasting methods are usually based on deterministic models which predict a single value of traffic volume but do not consider risk and uncertainty. This knowledge gap makes it difficult for concessionaires to estimate PPP transportation project revenues accurately. The objective of this paper is to develop a probabilistic traffic volume prediction model. First, traffic volumes were estimated following the Geometric Brownian Motion (GBM) process. The Monte Carlo technique is then applied to simulate different scenarios. The results show that this stochastic approach can systematically analyze variations in the traffic volume and yield more reliable estimates for PPP projects.
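
    The GBM-plus-Monte-Carlo approach described above can be sketched as follows. The drift, volatility, initial-volume, and horizon values are illustrative assumptions, not figures from the paper:

    ```python
    import math
    import random

    random.seed(42)
    V0, mu, sigma = 20_000, 0.03, 0.10   # initial daily volume, drift, volatility (assumed)
    years, n_paths = 25, 5_000           # concession horizon, Monte Carlo sample size

    # Simulate many GBM paths: V_t = V_{t-1} * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z).
    finals = []
    for _ in range(n_paths):
        v = V0
        for _ in range(years):           # one GBM step per year (dt = 1)
            z = random.gauss(0.0, 1.0)
            v *= math.exp((mu - sigma ** 2 / 2) + sigma * z)
        finals.append(v)

    finals.sort()
    mean = sum(finals) / n_paths
    p5, p95 = finals[int(0.05 * n_paths)], finals[int(0.95 * n_paths)]
    print(f"mean {mean:,.0f}  5th pct {p5:,.0f}  95th pct {p95:,.0f}")
    ```

    The percentile band is what a deterministic single-value forecast cannot provide, and is the kind of uncertainty estimate a concessionaire would use for revenue risk analysis.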

  2. Predictability of monthly temperature and precipitation using automatic time series forecasting methods

    Science.gov (United States)

    Papacharalampous, Georgia; Tyralis, Hristos; Koutsoyiannis, Demetris

    2018-02-01

    We investigate the predictability of monthly temperature and precipitation by applying automatic univariate time series forecasting methods to a sample of 985 40-year-long monthly temperature and 1552 40-year-long monthly precipitation time series. The methods include a naïve one based on the monthly values of the last year, as well as the random walk (with drift), AutoRegressive Fractionally Integrated Moving Average (ARFIMA), exponential smoothing state-space model with Box-Cox transformation, ARMA errors, Trend and Seasonal components (BATS), simple exponential smoothing, Theta and Prophet methods. Prophet is a recently introduced model inspired by the nature of time series forecasted at Facebook and has not been applied to hydrometeorological time series before, while the use of random walk, BATS, simple exponential smoothing and Theta is rare in hydrology. The methods are tested in performing multi-step ahead forecasts for the last 48 months of the data. We further investigate how different choices of handling the seasonality and non-normality affect the performance of the models. The results indicate that: (a) all the examined methods apart from the naïve and random walk ones are accurate enough to be used in long-term applications; (b) monthly temperature and precipitation can be forecasted to a level of accuracy which can barely be improved using other methods; (c) the externally applied classical seasonal decomposition results mostly in better forecasts compared to the automatic seasonal decomposition used by the BATS and Prophet methods; and (d) Prophet is competitive, especially when it is combined with externally applied classical seasonal decomposition.
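
    Two of the simpler methods mentioned above, the naïve monthly method (repeat the last year's monthly values) and simple exponential smoothing, can be sketched in plain Python on synthetic data:

    ```python
    def naive_monthly(series, horizon):
        # Forecast by repeating the last 12 observed monthly values.
        last_year = series[-12:]
        return [last_year[i % 12] for i in range(horizon)]

    def ses(series, horizon, alpha=0.3):
        # Simple exponential smoothing: recursively updated level,
        # extended as a flat forecast.
        level = series[0]
        for x in series[1:]:
            level = alpha * x + (1 - alpha) * level
        return [level] * horizon

    # Ten synthetic years of monthly mean temperatures (degrees C).
    temps = [5, 7, 11, 15, 19, 23, 26, 25, 21, 15, 9, 6] * 10
    print(naive_monthly(temps, 3))   # -> [5, 7, 11]
    print(round(ses(temps, 1)[0], 1))
    ```

    The contrast is visible even here: the naïve method preserves seasonality, while plain SES produces a flat forecast, which is why the study pairs such methods with an external seasonal decomposition.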

  3. Predicting Software Projects Cost Estimation Based on Mining Historical Data

    OpenAIRE

    Najadat, Hassan; Alsmadi, Izzat; Shboul, Yazan

    2012-01-01

    In this research, a hybrid cost estimation model is proposed to produce a realistic prediction model that takes into consideration software project, product, process, and environmental elements. A cost estimation dataset is built from a large number of open source projects. Those projects are divided into three domains: communication, finance, and game projects. Several data mining techniques are used to classify software projects in terms of their development complexity. Data mining techniqu...

  4. Predicting Defects Using Information Intelligence Process Models in the Software Technology Project.

    Science.gov (United States)

    Selvaraj, Manjula Gandhi; Jayabal, Devi Shree; Srinivasan, Thenmozhi; Balasubramanie, Palanisamy

    2015-01-01

    A key differentiator in a competitive market place is customer satisfaction. As per a 2012 Gartner report, only 75%-80% of IT projects are successful. Customer satisfaction should be considered as a part of business strategy. The associated project parameters should be proactively managed and the project outcome needs to be predicted by a technical manager. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. Focus should instead be on proactively managing and shifting left in the software life cycle engineering model: identify the problem upfront in the project cycle, and do not wait for lessons to be learnt before taking reactive steps. This paper shows the practical applicability of predictive models and illustrates the use of these models in a project to predict system testing defects, thus helping to reduce residual defects.

  5. Usefulness of semi-automatic volumetry compared to established linear measurements in predicting lymph node metastases in MSCT

    Energy Technology Data Exchange (ETDEWEB)

    Buerke, Boris; Puesken, Michael; Heindel, Walter; Wessling, Johannes (Dept. of Clinical Radiology, Univ. of Muenster (Germany)), email: buerkeb@uni-muenster.de; Gerss, Joachim (Dept. of Medical Informatics and Biomathematics, Univ. of Muenster (Germany)); Weckesser, Matthias (Dept. of Nuclear Medicine, Univ. of Muenster (Germany))

    2011-06-15

    Background Volumetry of lymph nodes potentially better reflects asymmetric size alterations independently of lymph node orientation in comparison to metric parameters (e.g. long-axis diameter). Purpose To distinguish between benign and malignant lymph nodes by comparing 2D and semi-automatic 3D measurements in MSCT. Material and Methods FDG-18 PET-CT was performed in 33 patients prior to therapy for malignant melanoma at stage III/IV. One hundred and eighty-six cervico-axillary, abdominal and inguinal lymph nodes were evaluated independently by two radiologists, both manually and with the use of semi-automatic segmentation software. Long axis (LAD), short axis (SAD), maximal 3D diameter, volume and elongation were obtained. PET-CT, PET-CT follow-up and/or histology served as a combined reference standard. Statistics encompassed intra-class correlation coefficients and ROC curves. Results Compared to manual assessment, semi-automatic inter-observer variability was found to be lower, e.g. at 2.4% (95% CI 0.05-4.8) for LAD. The standard of reference revealed metastases in 90 (48%) of 186 lymph nodes. Semi-automatic prediction of lymph node metastases revealed the highest areas under the ROC curves for volume (reader 1 0.77, 95% CI 0.64-0.90; reader 2 0.76, 95% CI 0.59-0.86) and SAD (reader 1 0.76, 95% CI 0.64-0.88; reader 2 0.75, 95% CI 0.62-0.89). The findings for LAD (reader 1 0.73, 95% CI 0.60-0.86; reader 2 0.71, 95% CI 0.57-0.85) and maximal 3D diameter (reader 1 0.70, 95% CI 0.53-0.86; reader 2 0.76, 95% CI 0.50-0.80) were substantially lower, and those for elongation (reader 1 0.65, 95% CI 0.50-0.79; reader 2 0.66, 95% CI 0.52-0.81) significantly lower (p < 0.05). Conclusion Semi-automatic analysis of lymph nodes in malignant melanoma is supported by high segmentation quality and reproducibility. As compared to established SAD, semi-automatic lymph node volumetry does not have an additive role for categorizing lymph nodes as normal or metastatic in malignant

  6. Automatic assignment of prokaryotic genes to functional categories using literature profiling.

    Directory of Open Access Journals (Sweden)

    Raul Torrieri

    Full Text Available In the last years, there has been an exponential increase in the number of publicly available genomes. Once finished, most genome projects lack financial support to review annotations. A few of these gene annotations are based on a combination of bioinformatics evidence; however, in most cases, annotations are based solely on sequence similarity to a previously known gene, which was most probably annotated in the same way. As a result, a large number of predicted genes remain unassigned to any functional category despite the fact that there is enough evidence in the literature to predict their function. We developed a classifier trained with term-frequency vectors automatically disclosed from text corpora of an ensemble of genes representative of each functional category of the J. Craig Venter Institute Comprehensive Microbial Resource (JCVI-CMR) ontology. The classifier achieved up to 84% precision with 68% recall (for confidence ≥ 0.4), F-measure 0.76 (recall and precision equally weighted), in an independent set of 2,220 genes, from 13 bacterial species, previously classified by JCVI-CMR into unambiguous categories of its ontology. Finally, the classifier assigned (confidence ≥ 0.7) to functional categories a total of 5,235 out of the ∼24 thousand genes previously in the categories "Unknown function" or "Unclassified" for which there is literature in MEDLINE. Two biologists reviewed the literature of 100 of these genes, randomly picked, and assigned them to the same functional categories predicted by the automatic classifier. Our results confirmed the hypothesis that it is possible to confidently assign genes of a real-world repository to functional categories based exclusively on the automatic profiling of their associated literature. The LitProf--Gene Classifier web server is accessible at: www.cebio.org/litprofGC.
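
    The core profiling idea, a term-frequency vector per functional category matched against a gene's literature, can be sketched as follows. The categories, vocabulary, and the use of cosine similarity are illustrative assumptions, not the classifier actually trained in the study:

    ```python
    import math
    from collections import Counter

    def tf(text):
        # Term-frequency profile of a text.
        return Counter(text.lower().split())

    def cosine(a, b):
        # Cosine similarity between two term-frequency Counters
        # (missing keys in a Counter count as zero).
        dot = sum(a[t] * b[t] for t in a)
        norm = lambda c: math.sqrt(sum(v * v for v in c.values()))
        return dot / (norm(a) * norm(b))

    # Invented category profiles, each built from pooled abstracts of
    # genes already assigned to that functional category.
    profiles = {
        "transport": tf("membrane transporter efflux permease channel membrane"),
        "regulation": tf("transcription repressor promoter operon regulator binding"),
    }

    # Literature associated with an unclassified gene (also invented).
    gene_literature = tf("putative permease involved in membrane transport efflux")
    best = max(profiles, key=lambda c: cosine(gene_literature, profiles[c]))
    print(best)  # -> transport
    ```

    A real system would weight terms (e.g. TF-IDF), filter stop words, and attach a confidence threshold before assigning a category.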

  7. Cliff : the automatized zipper

    NARCIS (Netherlands)

    Baharom, M.Z.; Toeters, M.J.; Delbressine, F.L.M.; Bangaru, C.; Feijs, L.M.G.

    2016-01-01

    It is our strong belief that fashion - more specifically apparel - can support us so much more in our daily life than it currently does. The Cliff project takes the opportunity to create a generic automatized zipper. It is a response to the struggle of the elderly, people with physical disabilities, and

  8. US Climate Variability and Predictability Project

    Energy Technology Data Exchange (ETDEWEB)

    Patterson, Mike [University Corporation for Atmospheric Research (UCAR), Boulder, CO (United States)

    2017-11-14

    The US CLIVAR Project Office administers the US CLIVAR Program with its mission to advance understanding and prediction of climate variability and change across timescales with an emphasis on the role of the ocean and its interaction with other elements of the Earth system. The Project Office promotes and facilitates scientific collaboration within the US and international climate and Earth science communities, addressing priority topics from subseasonal to centennial climate variability and change; the global energy imbalance; the ocean’s role in climate, water, and carbon cycles; climate and weather extremes; and polar climate changes. This project provides essential one-year support of the Project Office, enabling the participation of US scientists in the meetings of the US CLIVAR bodies that guide scientific planning and implementation, including the scientific steering committee that establishes program goals and evaluates progress of activities to address them, the science team of funded investigators studying the ocean overturning circulation in the Atlantic, and two working groups tackling the priority research topics of Arctic change influence on midlatitude climate and weather extremes and the decadal-scale widening of the tropical belt.

  9. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.

  10. Improved predictive mapping of indoor radon concentrations using ensemble regression trees based on automatic clustering of geological units

    International Nuclear Information System (INIS)

    Kropat, Georg; Bochud, Francois; Jaboyedoff, Michel; Laedermann, Jean-Pascal; Murith, Christophe; Palacios, Martha; Baechler, Sébastien

    2015-01-01

    Purpose: According to estimates, around 230 people die as a result of radon exposure in Switzerland. This public health concern makes reliable indoor radon prediction and mapping methods necessary in order to improve risk communication to the public. The aim of this study was to develop an automated method to classify lithological units according to their radon characteristics and to develop mapping and predictive tools in order to improve local radon prediction. Method: About 240 000 indoor radon concentration (IRC) measurements in about 150 000 buildings were available for our analysis. The automated classification of lithological units was based on k-medoids clustering via pair-wise Kolmogorov distances between IRC distributions of lithological units. For IRC mapping and prediction we used random forests and Bayesian additive regression trees (BART). Results: The automated classification groups lithological units well in terms of their IRC characteristics. Especially the IRC differences in metamorphic rocks like gneiss are well revealed by this method. The maps produced by random forests soundly represent the regional differences of IRCs in Switzerland and improve the spatial detail compared to existing approaches. We could explain 33% of the variations in IRC data with random forests. Additionally, variable importance as evaluated by random forests shows that building characteristics are less important predictors for IRCs than spatial/geological influences. BART could explain 29% of IRC variability and produced maps that indicate the prediction uncertainty. Conclusion: Ensemble regression trees are a powerful tool to model and understand the multidimensional influences on IRCs. Automatic clustering of lithological units complements this method by facilitating the interpretation of the radon properties of rock types. This study provides an important element for radon risk communication. Future approaches should consider taking into account further variables
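
    The clustering step, k-medoids over pairwise Kolmogorov distances between IRC distributions, can be sketched as follows. The lithological unit names and the lognormal parameters of the synthetic samples are invented for illustration:

    ```python
    import random

    def ks_distance(a, b):
        # Two-sample Kolmogorov distance: max gap between empirical CDFs.
        pts = sorted(set(a) | set(b))
        cdf = lambda s, x: sum(v <= x for v in s) / len(s)
        return max(abs(cdf(a, x) - cdf(b, x)) for x in pts)

    def k_medoids(items, dist, k, iters=20):
        # Plain k-medoids: assign items to the nearest medoid, then pick
        # the member of each cluster minimizing total in-cluster distance.
        medoids = items[:k]
        for _ in range(iters):
            clusters = {m: [] for m in medoids}
            for it in items:
                clusters[min(medoids, key=lambda m: dist[it][m])].append(it)
            new = [min(c, key=lambda cand: sum(dist[cand][o] for o in c))
                   for c in clusters.values() if c]
            if sorted(new) == sorted(medoids):
                break
            medoids = new
        return medoids, clusters

    random.seed(1)
    # Synthetic IRC samples for four lithological units (lognormal, Bq/m3).
    units = {f"gneiss_{i}": [random.lognormvariate(5.5, 0.6) for _ in range(200)]
             for i in range(2)}
    units.update({f"molasse_{i}": [random.lognormvariate(4.0, 0.5) for _ in range(200)]
                  for i in range(2)})

    names = list(units)
    dist = {a: {b: ks_distance(units[a], units[b]) for b in names} for a in names}
    medoids, clusters = k_medoids(names, dist, k=2)
    for m in medoids:
        print(m, sorted(clusters[m]))
    ```

    Because the distance compares whole IRC distributions rather than means, units with similar medians but different spreads can still end up in different radon classes.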

  11. Exposure to violent video games increases automatic aggressiveness.

    Science.gov (United States)

    Uhlmann, Eric; Swanson, Jane

    2004-02-01

    The effects of exposure to violent video games on automatic associations with the self were investigated in a sample of 121 students. Playing the violent video game Doom led participants to associate themselves with aggressive traits and actions on the Implicit Association Test. In addition, self-reported prior exposure to violent video games predicted automatic aggressive self-concept, above and beyond self-reported aggression. Results suggest that playing violent video games can lead to the automatic learning of aggressive self-views.

  12. Lynx: Automatic Elderly Behavior Prediction in Home Telecare

    Directory of Open Access Journals (Sweden)

    Jose Manuel Lopez-Guede

    2015-01-01

    Full Text Available This paper introduces Lynx, an intelligent system for personal safety at home environments, oriented to elderly people living independently, which encompasses a decision support machine for automatic home risk prevention, tested in real-life environments to respond to real-time situations. The automatic system described in this paper prevents such risks by advanced analytic methods supported by an expert knowledge system. It is minimally intrusive, using plug-and-play sensors and machine learning algorithms to learn the elder’s daily activity, taking into account even his health records. If the system detects that something unusual happens (in a wide sense) or if something is wrong relative to the user’s health habits or medical recommendations, it sends a real-time alarm to the family, care center, or medical agents, without human intervention. The system feeds on information from sensors deployed in the home and knowledge of subject physical activities, which can be collected by mobile applications and enriched by personalized health information from clinical reports encoded in the system. The system usability and reliability have been tested in real-life conditions, with an accuracy larger than 81%.

  13. Lynx: Automatic Elderly Behavior Prediction in Home Telecare

    Science.gov (United States)

    Lopez-Guede, Jose Manuel; Moreno-Fernandez-de-Leceta, Aitor; Martinez-Garcia, Alexeiw; Graña, Manuel

    2015-01-01

    This paper introduces Lynx, an intelligent system for personal safety at home environments, oriented to elderly people living independently, which encompasses a decision support machine for automatic home risk prevention, tested in real-life environments to respond to real-time situations. The automatic system described in this paper prevents such risks by advanced analytic methods supported by an expert knowledge system. It is minimally intrusive, using plug-and-play sensors and machine learning algorithms to learn the elder's daily activity, taking into account even his health records. If the system detects that something unusual happens (in a wide sense) or if something is wrong relative to the user's health habits or medical recommendations, it sends a real-time alarm to the family, care center, or medical agents, without human intervention. The system feeds on information from sensors deployed in the home and knowledge of subject physical activities, which can be collected by mobile applications and enriched by personalized health information from clinical reports encoded in the system. The system usability and reliability have been tested in real-life conditions, with an accuracy larger than 81%. PMID:26783514

  14. Better Metrics to Automatically Predict the Quality of a Text Summary

    Directory of Open Access Journals (Sweden)

    Judith D. Schlesinger

    2012-09-01

    Full Text Available In this paper we demonstrate a family of metrics for estimating the quality of a text summary relative to one or more human-generated summaries. The improved metrics are based on features automatically computed from the summaries to measure content and linguistic quality. The features are combined using one of three methods—robust regression, non-negative least squares, or canonical correlation, an eigenvalue method. The new metrics significantly outperform the previous standard for automatic text summarization evaluation, ROUGE.

  15. Development of the automatic control rod operation system for JOYO. Verification of automatic control rod operation guide system

    International Nuclear Information System (INIS)

    Terakado, Tsuguo; Suzuki, Shinya; Kawai, Masashi; Aoki, Hiroshi; Ohkubo, Toshiyuki

    1999-10-01

    The automatic control rod operation system was developed to control the JOYO reactor power automatically in all operation modes (critical approach, cooling system heat-up, power ascent, and power descent); development began in 1989. Prior to applying the system, verification tests of the automatic control rod operation guide system were conducted during the 32nd duty cycle of JOYO, from Dec. 1997 to Feb. 1998. The automatic control rod operation guide system consists of the control rod operation guide function and the plant operation guide function. The control rod operation guide function provides information on control rod movement and position, while the plant operation guide function provides guidance for plant operations corresponding to reactor power changes (power ascent or power descent). Control rod insertion and withdrawal are predicted by fuzzy algorithms. (J.P.N.)

  16. Automatic map generalisation from research to production

    Science.gov (United States)

    Nyberg, Rose; Johansson, Mikael; Zhang, Yang

    2018-05-01

    The manual work of map generalisation is known to be a complex and time-consuming task. With the development of technology and societies, the demand for more flexible map products of higher quality is growing. The Swedish mapping, cadastral and land registration authority Lantmäteriet has manual production lines for databases in five different scales: 1 : 10 000 (SE10), 1 : 50 000 (SE50), 1 : 100 000 (SE100), 1 : 250 000 (SE250) and 1 : 1 million (SE1M). To streamline this work, Lantmäteriet started a project to automatically generalise geographic information, with a planned timespan of 2015-2022. Below, the project background and the methods for automatic generalisation are described, followed by results and conclusions.

  17. Acquisition of automatic imitation is sensitive to sensorimotor contingency.

    Science.gov (United States)

    Cook, Richard; Press, Clare; Dickinson, Anthony; Heyes, Cecilia

    2010-08-01

    The associative sequence learning model proposes that the development of the mirror system depends on the same mechanisms of associative learning that mediate Pavlovian and instrumental conditioning. To test this model, two experiments used the reduction of automatic imitation through incompatible sensorimotor training to assess whether mirror system plasticity is sensitive to contingency (i.e., the extent to which activation of one representation predicts activation of another). In Experiment 1, residual automatic imitation was measured following incompatible training in which the action stimulus was a perfect predictor of the response (contingent) or not at all predictive of the response (noncontingent). A contingency effect was observed: there was less automatic imitation, indicative of more learning, in the contingent group. Experiment 2 replicated this contingency effect and showed that, as predicted by associative learning theory, it can be abolished by signaling trials in which the response occurs in the absence of an action stimulus. These findings support the view that mirror system development depends on associative learning and indicate that this learning is not purely Hebbian. If this is correct, associative learning theory could be used to explain, predict, and intervene in mirror system development.

  18. Investigating the Relationship between Stable Personality Characteristics and Automatic Imitation.

    Science.gov (United States)

    Butler, Emily E; Ward, Robert; Ramsey, Richard

    2015-01-01

    Automatic imitation is a cornerstone of nonverbal communication that fosters rapport between interaction partners. Recent research has suggested that stable dimensions of personality are antecedents to automatic imitation, but the empirical evidence linking imitation with personality traits is restricted to a few studies with modest sample sizes. Additionally, atypical imitation has been documented in autism spectrum disorders and schizophrenia, but the mechanisms underpinning these behavioural profiles remain unclear. Using a larger sample than prior studies (N=243), the current study tested whether performance on a computer-based automatic imitation task could be predicted by personality traits associated with social behaviour (extraversion and agreeableness) and with disorders of social cognition (autistic-like and schizotypal traits). Further personality traits (narcissism and empathy) were assessed in a subsample of participants (N=57). Multiple regression analyses showed that personality measures did not predict automatic imitation. In addition, using a similar analytical approach to prior studies, no differences in imitation performance emerged when only the highest and lowest 20 participants on each trait variable were compared. These data weaken support for the view that stable personality traits are antecedents to automatic imitation and that neural mechanisms thought to support automatic imitation, such as the mirror neuron system, are dysfunctional in autism spectrum disorders or schizophrenia. In sum, the impact that personality variables have on automatic imitation is less universal than initial reports suggest.

  20. Experiences in automatic keywording of particle physics literature

    CERN Document Server

    Montejo Ráez, Arturo

    2001-01-01

    Attributing keywords can assist in the classification and retrieval of documents in the particle physics literature. As information services face a future with less available manpower and more and more documents being written, the possibility of keyword attribution being assisted by automatic classification software is explored. A project being carried out at CERN (the European Laboratory for Particle Physics) for the development and integration of automatic keywording is described.

  1. Geospatial application of the Water Erosion Prediction Project (WEPP) Model

    Science.gov (United States)

    D. C. Flanagan; J. R. Frankenberger; T. A. Cochrane; C. S. Renschler; W. J. Elliot

    2011-01-01

    The Water Erosion Prediction Project (WEPP) model is a process-based technology for prediction of soil erosion by water at hillslope profile, field, and small watershed scales. In particular, WEPP utilizes observed or generated daily climate inputs to drive the surface hydrology processes (infiltration, runoff, ET) component, which subsequently impacts the rest of the...

  2. Feature Subset Selection and Instance Filtering for Cross-project Defect Prediction - Classification and Ranking

    Directory of Open Access Journals (Sweden)

    Faimison Porto

    2016-12-01

    Full Text Available Defect prediction models can be a good tool for organizing a project's test resources. The models can be constructed with two main goals: (1) to classify the software parts as defective or not; or (2) to rank the most defective parts in decreasing order. However, not all companies maintain an appropriate set of historical defect data. In this case, a company can build an appropriate dataset from known external projects - an approach called Cross-project Defect Prediction (CPDP). CPDP models, however, present low prediction performance due to the heterogeneity of the data. Recently, Instance Filtering methods were proposed to reduce this heterogeneity by selecting the most similar instances from the training dataset. Originally, the similarity is calculated based on all the available dataset features (or independent variables). We propose that using only the most relevant features in the similarity calculation can result in more accurate filtered datasets and better prediction performance. In this study we extend our previous work. We analyse both prediction goals - classification and ranking. We present an empirical evaluation of 41 different methods formed by associating Instance Filtering methods with Feature Selection methods, using 36 versions of 11 open source projects in our experiments. The results show similar evidence for both prediction goals. First, the defect prediction performance of CPDP models can be improved by associating Feature Selection and Instance Filtering. Second, no evaluated method presented generally better performance; indeed, the most appropriate method can vary according to the characteristics of the project being predicted.
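    A minimal sketch of the instance-filtering idea under a feature subset, in the spirit of (though not identical to) the methods evaluated: a nearest-neighbour filter where, for each target-project instance, the k closest cross-project training instances are kept, with distance computed only on selected features. The metric names are hypothetical:

    ```python
    import numpy as np

    def filter_instances(train_X, target_X, feat_idx, k=2):
        """For every target-project instance, keep the k nearest training
        (cross-project) instances, measuring distance only on the selected
        feature subset. Returns the indices of the retained training rows."""
        keep = set()
        for t in target_X[:, feat_idx]:
            d = np.linalg.norm(train_X[:, feat_idx] - t, axis=1)
            keep.update(np.argsort(d)[:k].tolist())
        return np.array(sorted(keep))

    # Hypothetical module metrics: columns = LOC, complexity, churn.
    train = np.array([[10, 1, 2], [500, 30, 90], [12, 2, 3], [480, 28, 80]], float)
    target = np.array([[11, 1, 2]], float)

    # Distance uses only the first two (supposedly most relevant) features.
    idx = filter_instances(train, target, feat_idx=[0, 1], k=2)
    ```

    Only the two small modules survive the filter here; the large, dissimilar cross-project modules are discarded, which is exactly the heterogeneity reduction the abstract describes.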

  3. Automatic content linking: Speech-based just-in-time retrieval for multimedia archives

    NARCIS (Netherlands)

    Popescu-Belis, A.; Kilgour, J.; Poller, P.; Nanchen, A.; Boertjes, E.; Wit, J. de

    2010-01-01

    The Automatic Content Linking Device monitors a conversation and uses automatically recognized words to retrieve documents that are of potential use to the participants. The document set includes project related reports or emails, transcribed snippets of past meetings, and websites. Retrieval

  4. Prediction of breast cancer risk using a machine learning approach embedded with a locality preserving projection algorithm

    Science.gov (United States)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Hollingsworth, Alan B.; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qiu, Yuchen; Liu, Hong; Zheng, Bin

    2018-02-01

    In order to automatically identify a set of effective mammographic image features and build an optimal breast cancer risk stratification model, this study aims to investigate advantages of applying a machine learning approach embedded with a locality preserving projection (LPP) based feature combination and regeneration algorithm to predict short-term breast cancer risk. A dataset involving negative mammograms acquired from 500 women was assembled. This dataset was divided into two age-matched classes of 250 high risk cases in which cancer was detected in the next subsequent mammography screening and 250 low risk cases, which remained negative. First, a computer-aided image processing scheme was applied to segment fibro-glandular tissue depicted on mammograms and initially compute 44 features related to the bilateral asymmetry of mammographic tissue density distribution between left and right breasts. Next, a multi-feature fusion based machine learning classifier was built to predict the risk of cancer detection in the next mammography screening. A leave-one-case-out (LOCO) cross-validation method was applied to train and test the machine learning classifier embedded with an LPP algorithm, which generated a new operational vector with 4 features using a maximal variance approach in each LOCO process. Results showed a 9.7% increase in risk prediction accuracy when using this LPP-embedded machine learning approach. An increased trend of adjusted odds ratios was also detected in which odds ratios increased from 1.0 to 11.2. This study demonstrated that applying the LPP algorithm effectively reduced feature dimensionality, and yielded higher and potentially more robust performance in predicting short-term breast cancer risk.
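    For readers unfamiliar with LPP, a minimal numpy/scipy sketch of the projection step follows. It solves the standard LPP generalized eigenproblem; the neighbourhood size, heat-kernel width, ridge term, and random data are illustrative choices, not the paper's settings:

    ```python
    import numpy as np
    from scipy.linalg import eigh

    def lpp(X, n_components=4, t=1.0, n_neighbors=3):
        """Minimal Locality Preserving Projection.
        X: (n_samples, n_features). Returns a projection matrix
        of shape (n_features, n_components)."""
        n = X.shape[0]
        # Heat-kernel affinity restricted to k nearest neighbours.
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        W = np.exp(-d2 / t)
        order = np.argsort(d2, axis=1)
        mask = np.zeros_like(W, dtype=bool)
        rows = np.arange(n)[:, None]
        mask[rows, order[:, 1:n_neighbors + 1]] = True  # skip self (index 0)
        W *= (mask | mask.T)
        D = np.diag(W.sum(1))
        L = D - W  # graph Laplacian
        A = X.T @ L @ X
        B = X.T @ D @ X + 1e-8 * np.eye(X.shape[1])  # ridge keeps B positive definite
        vals, vecs = eigh(A, B)        # generalized symmetric eigenproblem
        return vecs[:, :n_components]  # smallest eigenvalues preserve locality

    rng = np.random.default_rng(1)
    X = rng.standard_normal((50, 10))  # stand-in for the 44 asymmetry features
    P = lpp(X, n_components=4)
    Z = X @ P  # reduced 4-feature representation, as in the operational vector
    ```

    The eigenvectors with the smallest eigenvalues define directions along which nearby samples stay nearby after projection, which is what "preserving locality" means here.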

  5. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    OpenAIRE

    Hudson, LN; Newbold, T; Contu, S; Hill, SLL; Lysenko, I; De Palma, A; Phillips, HRP; Alhusseini, TI; Bedford, FE; Bennett, DJ; Booth, H; Burton, VJ; Chng, CWT; Choimes, A; Correia, DLP

    2017-01-01

    The PREDICTS project—Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)—has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of human impacts relating to land use. We have used this evidence base to develop global and regional statistical models of how local biodiversity responds to these measures. We describe and make free...

  6. Development of new geoinformation methods for modelling and prediction of sea level change over different timescales - overview of the project

    Science.gov (United States)

    Niedzielski, T.; Włosińska, M.; Miziński, B.; Hewelt, M.; Migoń, P.; Kosek, W.; Priede, I. G.

    2012-04-01

    The poster aims to provide a broad scientific audience with a general overview of a project on sea level change modelling and prediction that has recently commenced at the University of Wrocław, Poland. The project is part of the Homing Plus programme, organised by the Foundation for Polish Science and financially supported by the European Union through the European Regional Development Fund and the Innovative Economy Programme. There are two key research objectives of the project that complement each other. First, emphasis is put on modern satellite altimetric gridded time series from the Archiving, Validation and Interpretation of Satellite Oceanographic data (AVISO) repository. Daily sea level anomaly maps, access to which in near-real time is courtesy of AVISO, are downloaded every day to our local server in Wrocław, Poland. These data will be processed within a general framework of modelling and prediction of sea level change in the short, medium and long term. Secondly, sea level change over geological time is scrutinised in order to cover very long time scales that go far beyond the history of altimetric and tide-gauge measurements. The aforementioned approaches comprise a few tasks that aim to solve the following detailed problems. Within the first one, our objective is to seek spatio-temporal dependencies in the gridded sea level anomaly time series. Subsequently, predictions that make use of such cross-correlations will be derived, and a near-real-time service for automatic update with validation will be implemented. Concurrently (i.e. apart from spatio-temporal dependencies and their use in forecasting variable sea level topography), threshold models will be utilised for predicting the El Niño/Southern Oscillation (ENSO) signal that is normally present in sea level anomaly time series of the equatorial Pacific. Within the second approach, entirely different methods are proposed. Links between

  7. 12th Portuguese Conference on Automatic Control

    CERN Document Server

    Soares, Filomena; Moreira, António

    2017-01-01

    The biennial CONTROLO conferences are the main events promoted by the Portuguese Association for Automatic Control (APCA), the national member organization of the International Federation of Automatic Control (IFAC). CONTROLO 2016, the 12th Portuguese Conference on Automatic Control (Guimarães, Portugal, September 14th to 16th), was organized by Algoritmi, School of Engineering, University of Minho, in partnership with INESC TEC. The seventy-five papers published in this volume cover a wide range of topics. Thirty-one of them, of a more theoretical nature, are distributed among the first five parts: Control Theory; Optimal and Predictive Control; Fuzzy, Neural and Genetic Control; Modeling and Identification; Sensing and Estimation. The papers range from cutting-edge theoretical research to innovative control applications and show clearly how Automatic Control can be used to increase the well-being of people.

  8. Development of a wind farm noise propagation prediction model - project progress to date

    International Nuclear Information System (INIS)

    Robinson, P.; Bullmore, A.; Bass, J.; Sloth, E.

    1998-01-01

    This paper describes a twelve-month measurement campaign which is part of a European project (CEC Project JOR3-CT95-0051) with the aim of substantially reducing the uncertainties involved in predicting environmentally radiated noise levels from wind farms (1). This will be achieved by comparing noise levels measured at varying distances from single and multiple sources over differing complexities of terrain with those predicted using a number of currently adopted sound propagation models. Specific objectives within the project are to: establish the important parameters controlling the propagation of wind farm noise to the far field; develop a planning tool for predicting wind farm noise emission levels under practically encountered conditions; place confidence limits on the upper and lower bounds of the noise levels predicted, thus enabling developers to quantify the risk that noise emissions from wind farms will cause nuisance to nearby residents. (Author)
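    For context, the simplest member of the family of propagation models such a campaign compares against is hemispherical spreading from a point source plus linear air absorption. The sketch below uses that textbook baseline with an illustrative absorption coefficient; it is not the project's model:

    ```python
    import math

    def received_level(lw_db, distance_m, alpha_db_per_m=0.005):
        """Baseline far-field prediction: hemispherical spreading from a
        point source over hard ground plus linear atmospheric absorption.
           Lp = Lw - 20*log10(d) - 8 - alpha*d
        (the -8 dB term is 10*log10(2*pi) for a hemisphere; alpha is an
        illustrative broadband absorption coefficient, not a measured one)."""
        return lw_db - 20 * math.log10(distance_m) - 8 - alpha_db_per_m * distance_m
    ```

    Real planning tools layer ground effect, barriers, and meteorological corrections on top of this, which is precisely where the uncertainties the project targets come from.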

  9. The CHilean Automatic Supernova sEarch

    DEFF Research Database (Denmark)

    Hamuy, M.; Pignata, G.; Maza, J.

    2012-01-01

    The CHilean Automatic Supernova sEarch (CHASE) project began in 2007 with the goal to discover young, nearby southern supernovae in order to (1) better understand the physics of exploding stars and their progenitors, and (2) refine the methods to derive extragalactic distances. During the first...

  10. Planning Complex Projects Automatically

    Science.gov (United States)

    Henke, Andrea L.; Stottler, Richard H.; Maher, Timothy P.

    1995-01-01

    Automated Manifest Planner (AMP) computer program applies combination of artificial-intelligence techniques to assist both expert and novice planners, reducing planning time by orders of magnitude. Gives planners flexibility to modify plans and constraints easily, without need for programming expertise. Developed specifically for planning space shuttle missions 5 to 10 years ahead, with modifications, applicable in general to planning other complex projects requiring scheduling of activities depending on other activities and/or timely allocation of resources. Adaptable to variety of complex scheduling problems in manufacturing, transportation, business, architecture, and construction.

  11. The Ensembl genome database project.

    Science.gov (United States)

    Hubbard, T; Barker, D; Birney, E; Cameron, G; Chen, Y; Clark, L; Cox, T; Cuff, J; Curwen, V; Down, T; Durbin, R; Eyras, E; Gilbert, J; Hammond, M; Huminiecki, L; Kasprzyk, A; Lehvaslaiho, H; Lijnzaad, P; Melsopp, C; Mongin, E; Pettett, R; Pocock, M; Potter, S; Rust, A; Schmidt, E; Searle, S; Slater, G; Smith, J; Spooner, W; Stabenau, A; Stalker, J; Stupka, E; Ureta-Vidal, A; Vastrik, I; Clamp, M

    2002-01-01

    The Ensembl (http://www.ensembl.org/) database project provides a bioinformatics framework to organise biology around the sequences of large genomes. It is a comprehensive source of stable automatic annotation of the human genome sequence, with confirmed gene predictions that have been integrated with external data sources, and is available as either an interactive web site or as flat files. It is also an open source software engineering project to develop a portable system able to handle very large genomes and associated requirements from sequence analysis to data storage and visualisation. The Ensembl site is one of the leading sources of human genome sequence annotation and provided much of the analysis for publication by the international human genome project of the draft genome. The Ensembl system is being installed around the world in both companies and academic sites on machines ranging from supercomputers to laptops.

  12. Colour transformations and K-means segmentation for automatic cloud detection

    Directory of Open Access Journals (Sweden)

    Martin Blazek

    2015-08-01

    Full Text Available The main aim of this work is to find simple criteria for automatic recognition of several meteorological phenomena using optical digital sensors (e.g., wide-field cameras, automatic DSLR cameras, or robotic telescopes). The output of those sensors is commonly represented in RGB channels containing information about both colour and luminosity, even when normalised. Transformation into other colour spaces (e.g., CIE 1931 xyz, CIE L*a*b*, YCbCr) can separate colour from luminosity, which is especially useful in image processing for automatic cloud boundary recognition. Different colour transformations provide different sectorizations of cloudy images. Hence, the analysed meteorological phenomena (cloud types, clear sky) project differently into the colour diagrams of each international colour system. In such diagrams, statistical tools can be applied in search of criteria that could distinguish clear sky from an overcast one and possibly even perform a meteorological classification of cloud types. For the purpose of this work, a database of sky images (both clear and cloudy) was acquired, with emphasis on a variety of different observation conditions (e.g., time, altitude, solar angle). The effectiveness of several colour transformations for meteorological application is discussed, and the representation of different clouds (or clear sky) in those colour systems is analysed. Utilisation of this algorithm would be useful in all-sky surveys, supplementary meteorological observations, solar cell effectiveness predictions, or daytime astronomical solar observations.
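    The colour-separation and clustering pipeline can be sketched as a chromaticity transform followed by a tiny k-means; the two-colour toy "sky" of blue and grey pixels stands in for real imagery, and gamma correction is ignored for simplicity:

    ```python
    import numpy as np

    M = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])  # linear sRGB -> CIE XYZ

    def chromaticity(rgb):
        """Map linear RGB in [0,1] to CIE 1931 xy, separating colour
        from luminosity (gamma handling omitted for brevity)."""
        xyz = rgb @ M.T
        s = xyz.sum(-1, keepdims=True)
        return (xyz / np.where(s == 0, 1, s))[..., :2]

    def kmeans(pts, k=2, iters=10):
        """Tiny k-means with deterministic (evenly spaced) initialisation."""
        centres = pts[np.linspace(0, len(pts) - 1, k).astype(int)].copy()
        for _ in range(iters):
            lbl = ((pts[:, None] - centres) ** 2).sum(-1).argmin(1)
            for j in range(k):
                if (lbl == j).any():
                    centres[j] = pts[lbl == j].mean(0)
        return lbl

    # Toy "sky": 50 blue pixels (clear) followed by 50 grey pixels (cloud).
    sky = np.vstack([np.tile([0.2, 0.4, 0.9], (50, 1)),
                     np.tile([0.7, 0.7, 0.7], (50, 1))])
    labels = kmeans(chromaticity(sky), k=2)
    ```

    Because the grey pixels sit near the white point in the xy diagram while blue sky sits far from it, the two classes separate cleanly even though their luminosities overlap, which is the motivation for clustering in chromaticity rather than RGB.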

  13. Predicting shrinkage and warpage in injection molding: Towards automatized mold design

    Science.gov (United States)

    Zwicke, Florian; Behr, Marek; Elgeti, Stefanie

    2017-10-01

    It is an inevitable part of any plastics molding process that the material undergoes some shrinkage during solidification. Mainly due to unavoidable inhomogeneities in the cooling process, the overall shrinkage cannot be assumed to be homogeneous in all volumetric directions. The direct consequence is warpage. The accurate prediction of such shrinkage and warpage effects has been the subject of a considerable amount of research, but it is important to note that this behavior depends greatly on the type of material that is used as well as the process details. Without limiting ourselves to any specific properties of certain materials or process designs, we aim to develop a method for the automatized design of a mold cavity that will produce correctly shaped moldings after solidification. Essentially, this can be stated as a shape optimization problem, where the cavity shape is optimized to fulfill some objective function that measures defects in the molding shape. In order to be able to develop and evaluate such a method, we first require simulation methods for the different steps involved in the injection molding process that can represent the phenomena responsible for shrinkage and warpage in a sufficiently accurate manner. As a starting point, we consider the solidification of purely amorphous materials. In this case, the material slowly transitions from fluid-like to solid-like behavior as it cools down. This behavior is modeled using adjusted viscoelastic material models. Once the material has passed a certain temperature threshold during cooling, any viscous effects are neglected and the behavior is assumed to be fully elastic. Non-linear elastic laws are used to predict shrinkage and warpage that occur after this point. We will present the current state of these simulation methods and show some first approaches towards optimizing the mold cavity shape based on these methods.

  14. Rainfall and Extratropical Transition of Tropical Cyclones: Simulation, Prediction, and Projection

    Science.gov (United States)

    Liu, Maofeng

    Rainfall and associated flood hazards are among the major threats tropical cyclones (TCs) pose to coastal and inland regions. The interaction of TCs with extratropical systems can lead to enhanced precipitation over enlarged areas through extratropical transition (ET). To achieve a comprehensive understanding of rainfall and ET associated with TCs, this thesis conducts weather-scale analyses focusing on individual storms and climate-scale analyses focusing on seasonal predictability and the changing properties of climatology under global warming. The temporal and spatial rainfall evolution of individual storms, including Hurricane Irene (2011), Hurricane Hanna (2008), and Hurricane Sandy (2012), is explored using the Weather Research and Forecasting (WRF) model and a variety of hydrometeorological datasets. ET and orographic mechanisms were two key players in the rainfall distribution of Irene over the regions that experienced the most severe flooding. The change of TC rainfall under global warming is explored with the Forecast-oriented Low Ocean Resolution (FLOR) climate model under the representative concentration pathway (RCP) 4.5 scenario. Despite decreased TC frequency, FLOR projects increased landfalling TC rainfall over most regions of the eastern United States, highlighting the risk of increased flood hazards. Increased storm rain rate is an important driver of increased landfalling TC rainfall. A higher-atmospheric-resolution version of FLOR (HiFLOR) projects increased TC rainfall at global scales. The increase of TC intensity and environmental water vapor content scaled by the Clausius-Clapeyron relation are two key factors that explain the projected increase of TC rainfall. Analyses of the simulation, prediction, and projection of ET activity with FLOR are conducted in the North Atlantic. The FLOR model exhibits good skill in simulating many aspects of present-day ET climatology. The 21st-century projection under the RCP4.5 scenario demonstrates the dominant role of ET

  15. A method for volumetric imaging in radiotherapy using single x-ray projection

    International Nuclear Information System (INIS)

    Xu, Yuan; Yan, Hao; Ouyang, Luo; Wang, Jing; Jiang, Steve B.; Jia, Xun; Zhou, Linghong; Cervino, Laura

    2015-01-01

    Purpose: It is an intriguing problem to generate an instantaneous volumetric image based on the corresponding x-ray projection. The purpose of this study is to develop a new method to achieve this goal via a sparse learning approach. Methods: To extract motion information hidden in projection images, the authors partitioned a projection image into small rectangular patches. The authors utilized a sparse learning method to automatically select patches that have a high correlation with principal component analysis (PCA) coefficients of a lung motion model. A model that maps the patch intensity to the PCA coefficients was built along with the patch selection process. Based on this model, a measured projection can be used to predict the PCA coefficients, which are then further used to generate a motion vector field and hence a volumetric image. The authors have also proposed an intensity baseline correction method based on the partitioned projection, in which the first and the second moments of pixel intensities at a patch in a simulated projection image are matched with those in a measured one via a linear transformation. The proposed method has been validated in both simulated data and real phantom data. Results: The algorithm is able to identify patches that contain relevant motion information such as the diaphragm region. It is found that an intensity baseline correction step is important to remove the systematic error in the motion prediction. For the simulation case, the sparse learning model reduced the prediction error for the first PCA coefficient to 5%, compared to the 10% error when sparse learning was not used, and the 95th percentile error for the predicted motion vector was reduced from 2.40 to 0.92 mm. In the phantom case with a regular tumor motion, the predicted tumor trajectory was successfully reconstructed with a 0.82 mm error for tumor center localization compared to a 1.66 mm error without using the sparse learning method. When the tumor motion
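    The sparse patch-selection step can be illustrated with a plain ISTA solver for the lasso problem; the synthetic "patches" and the (hypothetical) first PCA breathing coefficient below are invented for the sketch, and the regularisation weight is arbitrary:

    ```python
    import numpy as np

    def ista_lasso(A, y, lam=0.05, iters=500):
        """Plain ISTA for min 0.5*||A w - y||^2 + lam*||w||_1.
        Sparse w picks out the columns (patches) that carry information."""
        w = np.zeros(A.shape[1])
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / squared spectral norm
        for _ in range(iters):
            w = w - step * (A.T @ (A @ w - y))           # gradient step
            w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)  # shrink
        return w

    # Toy setup: 6 "patch intensities" per projection image; only patches
    # 1 and 4 actually co-vary with the first PCA motion coefficient.
    rng = np.random.default_rng(2)
    A = rng.standard_normal((80, 6))
    y = A @ np.array([0.0, 1.5, 0.0, 0.0, -1.0, 0.0]) \
        + 0.01 * rng.standard_normal(80)

    w = ista_lasso(A, y)
    selected = np.flatnonzero(np.abs(w) > 0.1)  # patches with motion information
    ```

    The L1 penalty drives the weights of uninformative patches to (near) zero, which is how a sparse learning model can automatically discover that, say, diaphragm-region patches predict the breathing coefficient.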

  16. Automatic and strategic measures as predictors of mirror gazing among individuals with body dysmorphic disorder symptoms.

    Science.gov (United States)

    Clerkin, Elise M; Teachman, Bethany A

    2009-08-01

    The current study tests cognitive-behavioral models of body dysmorphic disorder (BDD) by examining the relationship between cognitive biases and correlates of mirror gazing. To provide a more comprehensive picture, we investigated both relatively strategic (i.e., available for conscious introspection) and automatic (i.e., outside conscious control) measures of cognitive biases in a sample with either high (n = 32) or low (n = 31) BDD symptoms. Specifically, we examined the extent that (1) explicit interpretations tied to appearance, as well as (2) automatic associations and (3) strategic evaluations of the importance of attractiveness predict anxiety and avoidance associated with mirror gazing. Results indicated that interpretations tied to appearance uniquely predicted self-reported desire to avoid, whereas strategic evaluations of appearance uniquely predicted peak anxiety associated with mirror gazing, and automatic appearance associations uniquely predicted behavioral avoidance. These results offer considerable support for cognitive models of BDD, and suggest a dissociation between automatic and strategic measures.

  18. Oocytes Polar Body Detection for Automatic Enucleation

    Directory of Open Access Journals (Sweden)

    Di Chen

    2016-02-01

    Full Text Available Enucleation is a crucial step in cloning. In order to achieve automatic blind enucleation, the polar body of the oocyte must be detected automatically. Conventional polar body detection approaches have a low success rate or low efficiency. We propose a polar body detection method based on machine learning in this paper. On the one hand, an improved Histogram of Oriented Gradients (HOG) algorithm is employed to extract features from polar body images, which increases the success rate. On the other hand, a position prediction method is put forward to narrow the search range for the polar body, which improves efficiency. Experimental results show a success rate of 96% for various types of polar bodies. Furthermore, the method was applied to an enucleation experiment and improved the degree of automation of enucleation.
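    A drastically simplified HOG descriptor (a single orientation histogram over the whole patch, whereas the paper's improved HOG uses a richer cell/block layout) shows the feature-extraction idea:

    ```python
    import numpy as np

    def hog_features(img, bins=8):
        """Simplified HOG: one gradient-orientation histogram for the whole
        patch, weighted by gradient magnitude and normalised to sum to 1."""
        gy, gx = np.gradient(img.astype(float))
        mag = np.hypot(gx, gy)
        ang = np.arctan2(gy, gx) % np.pi  # unsigned orientation in [0, pi)
        hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi), weights=mag)
        return hist / (hist.sum() + 1e-9)

    # A horizontal intensity ramp concentrates all gradient energy at 0 rad,
    # while a flat patch yields an (almost) all-zero descriptor.
    ramp = np.tile(np.arange(16.0), (16, 1))
    flat = np.zeros((16, 16))
    ```

    In a full pipeline, descriptors like these would be computed over candidate windows and fed to a trained classifier, with the position-prediction step limiting how many windows need scoring.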

  19. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project.

    Science.gov (United States)

    Hudson, Lawrence N; Newbold, Tim; Contu, Sara; Hill, Samantha L L; Lysenko, Igor; De Palma, Adriana; Phillips, Helen R P; Alhusseini, Tamera I; Bedford, Felicity E; Bennett, Dominic J; Booth, Hollie; Burton, Victoria J; Chng, Charlotte W T; Choimes, Argyrios; Correia, David L P; Day, Julie; Echeverría-Londoño, Susy; Emerson, Susan R; Gao, Di; Garon, Morgan; Harrison, Michelle L K; Ingram, Daniel J; Jung, Martin; Kemp, Victoria; Kirkpatrick, Lucinda; Martin, Callum D; Pan, Yuan; Pask-Hale, Gwilym D; Pynegar, Edwin L; Robinson, Alexandra N; Sanchez-Ortiz, Katia; Senior, Rebecca A; Simmons, Benno I; White, Hannah J; Zhang, Hanbin; Aben, Job; Abrahamczyk, Stefan; Adum, Gilbert B; Aguilar-Barquero, Virginia; Aizen, Marcelo A; Albertos, Belén; Alcala, E L; Del Mar Alguacil, Maria; Alignier, Audrey; Ancrenaz, Marc; Andersen, Alan N; Arbeláez-Cortés, Enrique; Armbrecht, Inge; Arroyo-Rodríguez, Víctor; Aumann, Tom; Axmacher, Jan C; Azhar, Badrul; Azpiroz, Adrián B; Baeten, Lander; Bakayoko, Adama; Báldi, András; Banks, John E; Baral, Sharad K; Barlow, Jos; Barratt, Barbara I P; Barrico, Lurdes; Bartolommei, Paola; Barton, Diane M; Basset, Yves; Batáry, Péter; Bates, Adam J; Baur, Bruno; Bayne, Erin M; Beja, Pedro; Benedick, Suzan; Berg, Åke; Bernard, Henry; Berry, Nicholas J; Bhatt, Dinesh; Bicknell, Jake E; Bihn, Jochen H; Blake, Robin J; Bobo, Kadiri S; Bóçon, Roberto; Boekhout, Teun; Böhning-Gaese, Katrin; Bonham, Kevin J; Borges, Paulo A V; Borges, Sérgio H; Boutin, Céline; Bouyer, Jérémy; Bragagnolo, Cibele; Brandt, Jodi S; Brearley, Francis Q; Brito, Isabel; Bros, Vicenç; Brunet, Jörg; Buczkowski, Grzegorz; Buddle, Christopher M; Bugter, Rob; Buscardo, Erika; Buse, Jörn; Cabra-García, Jimmy; Cáceres, Nilton C; Cagle, Nicolette L; Calviño-Cancela, María; Cameron, Sydney A; Cancello, Eliana M; Caparrós, Rut; Cardoso, Pedro; Carpenter, Dan; Carrijo, Tiago F; Carvalho, Anelena L; Cassano, Camila R; Castro, Helena; Castro-Luna, Alejandro A; Rolando, Cerda B; Cerezo, 
Alexis; Chapman, Kim Alan; Chauvat, Matthieu; Christensen, Morten; Clarke, Francis M; Cleary, Daniel F R; Colombo, Giorgio; Connop, Stuart P; Craig, Michael D; Cruz-López, Leopoldo; Cunningham, Saul A; D'Aniello, Biagio; D'Cruze, Neil; da Silva, Pedro Giovâni; Dallimer, Martin; Danquah, Emmanuel; Darvill, Ben; Dauber, Jens; Davis, Adrian L V; Dawson, Jeff; de Sassi, Claudio; de Thoisy, Benoit; Deheuvels, Olivier; Dejean, Alain; Devineau, Jean-Louis; Diekötter, Tim; Dolia, Jignasu V; Domínguez, Erwin; Dominguez-Haydar, Yamileth; Dorn, Silvia; Draper, Isabel; Dreber, Niels; Dumont, Bertrand; Dures, Simon G; Dynesius, Mats; Edenius, Lars; Eggleton, Paul; Eigenbrod, Felix; Elek, Zoltán; Entling, Martin H; Esler, Karen J; de Lima, Ricardo F; Faruk, Aisyah; Farwig, Nina; Fayle, Tom M; Felicioli, Antonio; Felton, Annika M; Fensham, Roderick J; Fernandez, Ignacio C; Ferreira, Catarina C; Ficetola, Gentile F; Fiera, Cristina; Filgueiras, Bruno K C; Fırıncıoğlu, Hüseyin K; Flaspohler, David; Floren, Andreas; Fonte, Steven J; Fournier, Anne; Fowler, Robert E; Franzén, Markus; Fraser, Lauchlan H; Fredriksson, Gabriella M; Freire, Geraldo B; Frizzo, Tiago L M; Fukuda, Daisuke; Furlani, Dario; Gaigher, René; Ganzhorn, Jörg U; García, Karla P; Garcia-R, Juan C; Garden, Jenni G; Garilleti, Ricardo; Ge, Bao-Ming; Gendreau-Berthiaume, Benoit; Gerard, Philippa J; Gheler-Costa, Carla; Gilbert, Benjamin; Giordani, Paolo; Giordano, Simonetta; Golodets, Carly; Gomes, Laurens G L; Gould, Rachelle K; Goulson, Dave; Gove, Aaron D; Granjon, Laurent; Grass, Ingo; Gray, Claudia L; Grogan, James; Gu, Weibin; Guardiola, Moisès; Gunawardene, Nihara R; Gutierrez, Alvaro G; Gutiérrez-Lamus, Doris L; Haarmeyer, Daniela H; Hanley, Mick E; Hanson, Thor; Hashim, Nor R; Hassan, Shombe N; Hatfield, Richard G; Hawes, Joseph E; Hayward, Matt W; Hébert, Christian; Helden, Alvin J; Henden, John-André; Henschel, Philipp; Hernández, Lionel; Herrera, James P; Herrmann, Farina; Herzog, Felix; Higuera-Diaz, 
Diego; Hilje, Branko; Höfer, Hubert; Hoffmann, Anke; Horgan, Finbarr G; Hornung, Elisabeth; Horváth, Roland; Hylander, Kristoffer; Isaacs-Cubides, Paola; Ishida, Hiroaki; Ishitani, Masahiro; Jacobs, Carmen T; Jaramillo, Víctor J; Jauker, Birgit; Hernández, F Jiménez; Johnson, McKenzie F; Jolli, Virat; Jonsell, Mats; Juliani, S Nur; Jung, Thomas S; Kapoor, Vena; Kappes, Heike; Kati, Vassiliki; Katovai, Eric; Kellner, Klaus; Kessler, Michael; Kirby, Kathryn R; Kittle, Andrew M; Knight, Mairi E; Knop, Eva; Kohler, Florian; Koivula, Matti; Kolb, Annette; Kone, Mouhamadou; Kőrösi, Ádám; Krauss, Jochen; Kumar, Ajith; Kumar, Raman; Kurz, David J; Kutt, Alex S; Lachat, Thibault; Lantschner, Victoria; Lara, Francisco; Lasky, Jesse R; Latta, Steven C; Laurance, William F; Lavelle, Patrick; Le Féon, Violette; LeBuhn, Gretchen; Légaré, Jean-Philippe; Lehouck, Valérie; Lencinas, María V; Lentini, Pia E; Letcher, Susan G; Li, Qi; Litchwark, Simon A; Littlewood, Nick A; Liu, Yunhui; Lo-Man-Hung, Nancy; López-Quintero, Carlos A; Louhaichi, Mounir; Lövei, Gabor L; Lucas-Borja, Manuel Esteban; Luja, Victor H; Luskin, Matthew S; MacSwiney G, M Cristina; Maeto, Kaoru; Magura, Tibor; Mallari, Neil Aldrin; Malone, Louise A; Malonza, Patrick K; Malumbres-Olarte, Jagoba; Mandujano, Salvador; Måren, Inger E; Marin-Spiotta, Erika; Marsh, Charles J; Marshall, E J P; Martínez, Eliana; Martínez Pastur, Guillermo; Moreno Mateos, David; Mayfield, Margaret M; Mazimpaka, Vicente; McCarthy, Jennifer L; McCarthy, Kyle P; McFrederick, Quinn S; McNamara, Sean; Medina, Nagore G; Medina, Rafael; Mena, Jose L; Mico, Estefania; Mikusinski, Grzegorz; Milder, Jeffrey C; Miller, James R; Miranda-Esquivel, Daniel R; Moir, Melinda L; Morales, Carolina L; Muchane, Mary N; Muchane, Muchai; Mudri-Stojnic, Sonja; Munira, A Nur; Muoñz-Alonso, Antonio; Munyekenye, B F; Naidoo, Robin; Naithani, A; Nakagawa, Michiko; Nakamura, Akihiro; Nakashima, Yoshihiro; Naoe, Shoji; Nates-Parra, Guiomar; Navarrete Gutierrez, Dario 
A; Navarro-Iriarte, Luis; Ndang'ang'a, Paul K; Neuschulz, Eike L; Ngai, Jacqueline T; Nicolas, Violaine; Nilsson, Sven G; Noreika, Norbertas; Norfolk, Olivia; Noriega, Jorge Ari; Norton, David A; Nöske, Nicole M; Nowakowski, A Justin; Numa, Catherine; O'Dea, Niall; O'Farrell, Patrick J; Oduro, William; Oertli, Sabine; Ofori-Boateng, Caleb; Oke, Christopher Omamoke; Oostra, Vicencio; Osgathorpe, Lynne M; Otavo, Samuel Eduardo; Page, Navendu V; Paritsis, Juan; Parra-H, Alejandro; Parry, Luke; Pe'er, Guy; Pearman, Peter B; Pelegrin, Nicolás; Pélissier, Raphaël; Peres, Carlos A; Peri, Pablo L; Persson, Anna S; Petanidou, Theodora; Peters, Marcell K; Pethiyagoda, Rohan S; Phalan, Ben; Philips, T Keith; Pillsbury, Finn C; Pincheira-Ulbrich, Jimmy; Pineda, Eduardo; Pino, Joan; Pizarro-Araya, Jaime; Plumptre, A J; Poggio, Santiago L; Politi, Natalia; Pons, Pere; Poveda, Katja; Power, Eileen F; Presley, Steven J; Proença, Vânia; Quaranta, Marino; Quintero, Carolina; Rader, Romina; Ramesh, B R; Ramirez-Pinilla, Martha P; Ranganathan, Jai; Rasmussen, Claus; Redpath-Downing, Nicola A; Reid, J Leighton; Reis, Yana T; Rey Benayas, José M; Rey-Velasco, Juan Carlos; Reynolds, Chevonne; Ribeiro, Danilo Bandini; Richards, Miriam H; Richardson, Barbara A; Richardson, Michael J; Ríos, Rodrigo Macip; Robinson, Richard; Robles, Carolina A; Römbke, Jörg; Romero-Duque, Luz Piedad; Rös, Matthias; Rosselli, Loreta; Rossiter, Stephen J; Roth, Dana S; Roulston, T'ai H; Rousseau, Laurent; Rubio, André V; Ruel, Jean-Claude; Sadler, Jonathan P; Sáfián, Szabolcs; Saldaña-Vázquez, Romeo A; Sam, Katerina; Samnegård, Ulrika; Santana, Joana; Santos, Xavier; Savage, Jade; Schellhorn, Nancy A; Schilthuizen, Menno; Schmiedel, Ute; Schmitt, Christine B; Schon, Nicole L; Schüepp, Christof; Schumann, Katharina; Schweiger, Oliver; Scott, Dawn M; Scott, Kenneth A; Sedlock, Jodi L; Seefeldt, Steven S; Shahabuddin, Ghazala; Shannon, Graeme; Sheil, Douglas; Sheldon, Frederick H; Shochat, Eyal; Siebert, Stefan 
J; Silva, Fernando A B; Simonetti, Javier A; Slade, Eleanor M; Smith, Jo; Smith-Pardo, Allan H; Sodhi, Navjot S; Somarriba, Eduardo J; Sosa, Ramón A; Soto Quiroga, Grimaldo; St-Laurent, Martin-Hugues; Starzomski, Brian M; Stefanescu, Constanti; Steffan-Dewenter, Ingolf; Stouffer, Philip C; Stout, Jane C; Strauch, Ayron M; Struebig, Matthew J; Su, Zhimin; Suarez-Rubio, Marcela; Sugiura, Shinji; Summerville, Keith S; Sung, Yik-Hei; Sutrisno, Hari; Svenning, Jens-Christian; Teder, Tiit; Threlfall, Caragh G; Tiitsaar, Anu; Todd, Jacqui H; Tonietto, Rebecca K; Torre, Ignasi; Tóthmérész, Béla; Tscharntke, Teja; Turner, Edgar C; Tylianakis, Jason M; Uehara-Prado, Marcio; Urbina-Cardona, Nicolas; Vallan, Denis; Vanbergen, Adam J; Vasconcelos, Heraldo L; Vassilev, Kiril; Verboven, Hans A F; Verdasca, Maria João; Verdú, José R; Vergara, Carlos H; Vergara, Pablo M; Verhulst, Jort; Virgilio, Massimiliano; Vu, Lien Van; Waite, Edward M; Walker, Tony R; Wang, Hua-Feng; Wang, Yanping; Watling, James I; Weller, Britta; Wells, Konstans; Westphal, Catrin; Wiafe, Edward D; Williams, Christopher D; Willig, Michael R; Woinarski, John C Z; Wolf, Jan H D; Wolters, Volkmar; Woodcock, Ben A; Wu, Jihua; Wunderle, Joseph M; Yamaura, Yuichi; Yoshikura, Satoko; Yu, Douglas W; Zaitsev, Andrey S; Zeidler, Juliane; Zou, Fasheng; Collen, Ben; Ewers, Rob M; Mace, Georgina M; Purves, Drew W; Scharlemann, Jörn P W; Purvis, Andy

    2017-01-01

    The PREDICTS project-Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)-has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of human impacts relating to land use. We have used this evidence base to develop global and regional statistical models of how local biodiversity responds to these measures. We describe and make freely available this 2016 release of the database, containing more than 3.2 million records sampled at over 26,000 locations and representing over 47,000 species. We outline how the database can help in answering a range of questions in ecology and conservation biology. To our knowledge, this is the largest and most geographically and taxonomically representative database of spatial comparisons of biodiversity that has been collated to date; it will be useful to researchers and international efforts wishing to model and understand the global status of biodiversity.

  20. Automatic Differentiation and its Program Realization

    Czech Academy of Sciences Publication Activity Database

    Hartman, J.; Lukšan, Ladislav; Zítko, J.

    2009-01-01

    Roč. 45, č. 5 (2009), s. 865-883 ISSN 0023-5954 R&D Projects: GA AV ČR IAA1030405 Institutional research plan: CEZ:AV0Z10300504 Keywords : automatic differentiation * modeling languages * systems of optimization Subject RIV: BA - General Mathematics Impact factor: 0.445, year: 2009 http://dml.cz/handle/10338.dmlcz/140037

  1. Automatic digitization of SMA data

    Science.gov (United States)

    Väänänen, Mika; Tanskanen, Eija

    2017-04-01

    In the 1970s and 1980s the Scandinavian Magnetometer Array produced large amounts of excellent data from over 30 stations in Norway, Sweden and Finland. 620 film reels and 20 kilometers of film have been preserved, and the longest time series produced in the campaign spans almost five years nearly uninterrupted, but the data has never seen widespread use due to the choice of medium. Film is a difficult medium to digitize efficiently. Previously, events of interest were searched for by hand, and digitization was done by projecting the film on paper and plotting it by hand. We propose a method of automatically digitizing geomagnetic data stored on film and extracting the numerical values from the digitized data. The automatic digitization process helps preserve old, valuable data that might otherwise go unused.

  2. Markov random field based automatic image alignment for electron tomography.

    Science.gov (United States)

    Amat, Fernando; Moussavi, Farshid; Comolli, Luis R; Elidan, Gal; Downing, Kenneth H; Horowitz, Mark

    2008-03-01

    We present a method for automatic full-precision alignment of the images in a tomographic tilt series. Full-precision automatic alignment of cryo electron microscopy images has remained a difficult challenge to date, due to the limited electron dose and low image contrast. These facts lead to poor signal to noise ratio (SNR) in the images, which causes automatic feature trackers to generate errors, even with high contrast gold particles as fiducial features. To enable fully automatic alignment for full-precision reconstructions, we frame the problem probabilistically as finding the most likely particle tracks given a set of noisy images, using contextual information to make the solution more robust to the noise in each image. To solve this maximum likelihood problem, we use Markov Random Fields (MRF) to establish the correspondence of features in alignment and robust optimization for projection model estimation. The resulting algorithm, called Robust Alignment and Projection Estimation for Tomographic Reconstruction, or RAPTOR, has not needed any manual intervention for the difficult datasets we have tried, and has provided sub-pixel alignment that is as good as the manual approach by an expert user. We are able to automatically map complete and partial marker trajectories and thus obtain highly accurate image alignment. Our method has been applied to challenging cryo electron tomographic datasets with low SNR from intact bacterial cells, as well as several plastic section and X-ray datasets.
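The "robust optimization for projection model estimation" step can be illustrated in miniature with Huber-weighted iteratively reweighted least squares (IRLS). The sketch below estimates only a 2-D translation between marker positions, a toy stand-in for RAPTOR's full projection model, and the marker coordinates are invented; real tilt series need projective models, not a pure shift.

```python
def robust_shift(ref_pts, obs_pts, iters=20, delta=1.0):
    """Estimate a 2-D translation mapping ref_pts onto obs_pts using
    Huber-weighted IRLS, so one mistracked marker cannot dominate."""
    dx = dy = 0.0
    for _ in range(iters):
        wsum = wx = wy = 0.0
        for (rx, ry), (ox, oy) in zip(ref_pts, obs_pts):
            ex, ey = ox - (rx + dx), oy - (ry + dy)
            r = (ex * ex + ey * ey) ** 0.5
            w = 1.0 if r <= delta else delta / r   # Huber weight
            wsum += w
            wx += w * ex
            wy += w * ey
        dx += wx / wsum                            # weighted-mean update
        dy += wy / wsum
    return dx, dy

ref = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0), (5.0, 5.0)]
obs = [(x + 2.0, y - 1.0) for x, y in ref]   # true shift: (2, -1)
obs[4] = (40.0, 40.0)                        # one grossly mistracked marker
dx, dy = robust_shift(ref, obs)
# close to (2, -1): the Huber loss downweights, but does not fully
# reject, the outlier
print(round(dx, 1), round(dy, 1))
```

Plain least squares on the same data would be dragged far off by the single outlier, which is why robust estimators matter in low-SNR cryo-EM tracking.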

  3. Automatic generation of predictive dynamic models reveals nuclear phosphorylation as the key Msn2 control mechanism.

    Science.gov (United States)

    Sunnåker, Mikael; Zamora-Sillero, Elias; Dechant, Reinhard; Ludwig, Christina; Busetto, Alberto Giovanni; Wagner, Andreas; Stelling, Joerg

    2013-05-28

    Predictive dynamical models are critical for the analysis of complex biological systems. However, methods to systematically develop and discriminate among systems biology models are still lacking. We describe a computational method that incorporates all hypothetical mechanisms about the architecture of a biological system into a single model and automatically generates a set of simpler models compatible with observational data. As a proof of principle, we analyzed the dynamic control of the transcription factor Msn2 in Saccharomyces cerevisiae, specifically the short-term mechanisms mediating the cells' recovery after release from starvation stress. Our method determined that 12 of 192 possible models were compatible with available Msn2 localization data. Iterations between model predictions and rationally designed phosphoproteomics and imaging experiments identified a single-circuit topology with a relative probability of 99% among the 192 models. Model analysis revealed that the coupling of dynamic phenomena in Msn2 phosphorylation and transport could lead to efficient stress response signaling by establishing a rate-of-change sensor. Similar principles could apply to mammalian stress response pathways. Systematic construction of dynamic models may yield detailed insight into nonobvious molecular mechanisms.
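The core loop of such model discrimination, fitting every candidate mechanism and ranking its compatibility with the data, can be sketched generically. The paper's framework is far more sophisticated (it assigns relative probabilities over 192 mechanistic models); the sketch below uses simple Akaike weights over two hypothetical one-parameter models, with made-up data.

```python
import math

# Candidate mechanisms: each maps (time, parameter) to a prediction.
# The names and functional forms are illustrative, not the paper's models.
def model_linear(t, k):
    return k * t

def model_saturating(t, k):
    return 1.0 - math.exp(-k * t)

def fit_1param(model, times, data):
    """Crude 1-D grid search for the least-squares parameter."""
    best_k, best_rss = None, float("inf")
    for i in range(1, 500):
        k = i * 0.01
        rss = sum((model(t, k) - y) ** 2 for t, y in zip(times, data))
        if rss < best_rss:
            best_k, best_rss = k, rss
    return best_k, best_rss

def aic_weights(models, times, data):
    """Relative model probabilities from Akaike weights."""
    n = len(data)
    aics = []
    for m in models:
        _, rss = fit_1param(m, times, data)
        aics.append(n * math.log(max(rss, 1e-12) / n) + 2.0)  # 1 parameter
    best = min(aics)
    raw = [math.exp(-0.5 * (a - best)) for a in aics]
    z = sum(raw)
    return [r / z for r in raw]

times = [0.5, 1.0, 2.0, 4.0, 8.0]
data = [1.0 - math.exp(-0.7 * t) for t in times]   # generated by the saturating model
w_lin, w_sat = aic_weights([model_linear, model_saturating], times, data)
print(w_sat > w_lin)  # the generating mechanism wins
```

Scaling the same idea to hundreds of mechanism combinations, and iterating with newly designed experiments, is what lets a single circuit topology emerge with 99% relative probability.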

  4. PLC Based Automatic Multistoried Car Parking System

    OpenAIRE

    Swanand S .Vaze; Rohan S. Mithari

    2014-01-01

    This project work presents the study and design of a PLC based Automatic Multistoried Car Parking System. Multistoried car parking is an arrangement used to park a large number of vehicles in the least possible space. Realizing such an arrangement at full scale requires highly advanced technological instruments. In this project a prototype of such a model is made, accommodating twelve cars at a time. Availability of the space for parking is detecte...

  5. Automatic pitch detection for a computer game interface

    International Nuclear Information System (INIS)

    Fonseca Solis, Juan M.

    2015-01-01

    Software able to recognize notes played by musical instruments was created through automatic pitch recognition. A pitch recognition algorithm, the C implementation of SWIPEP, was embedded into a software project: a memory game in which the computer plays a sequence of notes that the user listens to and repeats on a soprano recorder flute. The basic concepts needed to understand the acoustic phenomena involved are explained. The paper is aimed at students who have basic programming knowledge and want to incorporate sound processing into their projects. (author) [es

  6. Commercial effectiveness assessment of implementation the energy efficiency raising of the building project due to introduction of automatic heat consumption control

    Directory of Open Access Journals (Sweden)

    Zvonareva Yu.N.

    2017-01-01

    Introduction of automated metering and control units (AUU) located directly in the heated building not only creates comfortable indoor conditions but also decreases the consumption of thermal energy. The expected annual effect of the proposed actions (installation of metering stations and automatic control) can reach up to 22% of the consumed, and no less importantly the paid-for, thermal energy. In general, the efficiency of a project introducing AUU can be characterized by a considerable decrease in the building's heat consumption and, correspondingly, a reduced payment for the consumed energy resources. In this paper we evaluated the effectiveness of an investment project to increase the energy efficiency of a building. We calculated the ratio of expenses to results of the considered actions for the inhabitants of an apartment house in Kazan after installation of weather-dependent regulation. From a simulation model created on the basis of the input data and the investment project plan, the main results determining the economic efficiency of the project were obtained. To analyze and increase the reliability of the efficiency assessment, calculations were performed for different sets of input data.

  7. Detecting accuracy of flaws by manual and automatic ultrasonic inspections

    International Nuclear Information System (INIS)

    Iida, K.

    1988-01-01

    As the final stage of the nine-year project on proving tests of the ultrasonic inspection technique applied to the ISI of LWR plants, automatic ultrasonic inspection tests were carried out on EDM notches, surface fatigue cracks, weld defects and stress corrosion cracks, which were deliberately introduced into full-size structural components simulating a 1,100 MWe BWR. The investigated items are the performance of a newly assembled automatic inspection apparatus, the detection limit of flaws, the detection resolution of adjacent collinear or parallel EDM notches, detection reproducibility and detection accuracy. Manual ultrasonic inspection of the same flaws was also carried out in order to obtain comparative data. This paper reports how it was confirmed that automatic ultrasonic inspection is much superior to manual inspection in flaw detection rate and detection reproducibility.

  8. Experience with automatic orientation from different data sets

    DEFF Research Database (Denmark)

    Potucková, Marketa

    2003-01-01

    Automatic orientation of aerial images based on existing databases was a topic of the OEEPE research project running in 1998 and 1999. Different approaches for solving this task have been published to date. The method developed at Aalborg University uses the existing topographic database...

  9. Design and construction of an automatic texture goniometer

    International Nuclear Information System (INIS)

    Lima, N.B. de; Pontes, E.W.; Monteiro, P.R.B.; Imakuma, K.

    1984-01-01

    The design and construction of a two-axis automatic goniometer, operated by step motors and adaptable to the scanning goniometer SG-7 or SG-8 fabricated by Rigaku-Deuki, are described. Computer codes have been developed to allow the operation of this texture goniometer. (E.G.) [pt

  10. Investigation of an automatic trim algorithm for restructurable aircraft control

    Science.gov (United States)

    Weiss, J.; Eterno, J.; Grunberg, D.; Looze, D.; Ostroff, A.

    1986-01-01

    This paper develops and solves an automatic trim problem for restructurable aircraft control. The trim solution is applied as a feed-forward control to reject measurable disturbances following control element failures. Disturbance rejection and command following performances are recovered through the automatic feedback control redesign procedure described by Looze et al. (1985). For this project the existence of a failure detection mechanism is assumed, and methods to cope with potential detection and identification inaccuracies are addressed.

  11. Revisiting EOR Projects in Indonesia through Integrated Study: EOR Screening, Predictive Model, and Optimisation

    KAUST Repository

    Hartono, A. D.; Hakiki, Farizal; Syihab, Z.; Ambia, F.; Yasutra, A.; Sutopo, S.; Efendi, M.; Sitompul, V.; Primasari, I.; Apriandi, R.

    2017-01-01

    Preliminary EOR analysis is pivotal at the early stage of assessment in order to elucidate EOR feasibility. This study proposes an in-depth analysis toolkit for preliminary EOR evaluation. The toolkit incorporates EOR screening, predictive, economic, risk analysis and optimisation modules. The screening module introduces algorithms that take both statistical and engineering notions into consideration. The United States Department of Energy (U.S. DOE) predictive models were implemented in the predictive module. The economic module is available to assess project attractiveness, while Monte Carlo simulation is applied to quantify the risk and uncertainty of the evaluated project. Optimisation scenarios of EOR practice can be evaluated using the optimisation module, in which the stochastic methods of Genetic Algorithms (GA), Particle Swarm Optimisation (PSO) and Evolutionary Strategy (ES) were applied. The modules were combined into an integrated package for preliminary EOR assessment. Finally, we utilised the toolkit to evaluate several Indonesian oil fields for EOR evaluation (past projects) and feasibility (future projects). The attempt was able to update previous considerations regarding EOR attractiveness and open new opportunities for EOR implementation in Indonesia.

  13. Automatic annotation of protein motif function with Gene Ontology terms

    Directory of Open Access Journals (Sweden)

    Gopalakrishnan Vanathi

    2004-09-01

    Abstract Background Conserved protein sequence motifs are short stretches of amino acid sequence patterns that potentially encode the function of proteins. Several sequence pattern searching algorithms and programs exist for identifying candidate protein motifs at the whole genome level. However, a much needed and important task is to determine the functions of newly identified protein motifs. The Gene Ontology (GO) project is an endeavor to annotate the function of genes or protein sequences with terms from a dynamic, controlled vocabulary, and these annotations serve well as a knowledge base. Results This paper presents methods to mine the GO knowledge base and use the association between the GO terms assigned to a sequence and the motifs matched by the same sequence as evidence for predicting the functions of novel protein motifs automatically. The task of assigning GO terms to protein motifs is viewed as both a binary classification and an information retrieval problem, where PROSITE motifs are used as samples for model training and functional prediction. The mutual information of a motif and GO term association is found to be a very useful feature. We take advantage of the known motifs to train a logistic regression classifier, which allows us to combine mutual information with other frequency-based features and obtain a probability of correct association. The trained logistic regression model has intuitively meaningful and logically plausible parameter values, and performs very well empirically according to our evaluation criteria. Conclusions In this research, different methods for automatic annotation of protein motifs have been investigated. Empirical results demonstrated that the methods have great potential for detecting and augmenting information about the functions of newly discovered candidate protein motifs.
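The mutual-information feature mentioned here is a standard quantity computable from a 2x2 contingency table of motif matches against GO annotations. A minimal sketch, with made-up counts rather than PROSITE data:

```python
import math

def mutual_information(n11, n10, n01, n00):
    """MI (in bits) between two binary events from a 2x2 contingency table.

    n11: sequences matching the motif AND annotated with the GO term,
    n10: motif only, n01: term only, n00: neither.
    """
    n = n11 + n10 + n01 + n00
    mi = 0.0
    for (a, b), nxy in {(1, 1): n11, (1, 0): n10, (0, 1): n01, (0, 0): n00}.items():
        if nxy == 0:
            continue
        px = (n11 + n10) / n if a else (n01 + n00) / n  # marginal of motif match
        py = (n11 + n01) / n if b else (n10 + n00) / n  # marginal of GO term
        mi += (nxy / n) * math.log2((nxy / n) / (px * py))
    return mi

# A motif strongly associated with a GO term scores higher than an
# independent one (counts are hypothetical).
assoc = mutual_information(40, 10, 10, 40)
indep = mutual_information(25, 25, 25, 25)
print(round(indep, 6))  # 0.0 for a perfectly independent table
```

In the paper's setting, MI values like `assoc` would be one input feature, alongside frequency-based features, to the logistic regression that outputs a probability of correct motif-term association.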

  14. Group Dynamics in Automatic Imitation.

    Science.gov (United States)

    Gleibs, Ilka H; Wilson, Neil; Reddy, Geetha; Catmur, Caroline

    Imitation, matching the configural body movements of another individual, plays a crucial part in social interaction. We investigated whether automatic imitation is influenced not only by who we imitate (ingroup vs. outgroup member) but also by the nature of an expected interaction situation (competitive vs. cooperative). In line with assumptions from Social Identity Theory, we predicted that both social group membership and the expected situation impact the level of automatic imitation. We adopted a 2 (group membership target: ingroup, outgroup) x 2 (situation: cooperative, competitive) design. The dependent variable was the degree to which participants imitated the target in a reaction-time automatic imitation task. Ninety-nine female students from two British universities participated. We found a significant two-way interaction on the imitation effect. When interacting in expectation of cooperation, imitation was stronger for an ingroup target than for an outgroup target. However, this was not the case in the competitive condition, where imitation did not differ between ingroup and outgroup targets. This demonstrates that the goal structure of an expected interaction determines the extent to which intergroup relations influence imitation, supporting a social identity approach.

  15. TMB: Automatic Differentiation and Laplace Approximation

    Directory of Open Access Journals (Sweden)

    Kasper Kristensen

    2016-04-01

    TMB is an open source R package that enables quick implementation of complex nonlinear random effects (latent variable) models in a manner similar to the established AD Model Builder package (ADMB, http://admb-project.org/; Fournier et al. 2011). In addition, it offers easy access to parallel computations. The user defines the joint likelihood for the data and the random effects as a C++ template function, while all other operations are done in R, e.g., reading in the data. The package evaluates and maximizes the Laplace approximation of the marginal likelihood, where the random effects are automatically integrated out. This approximation, and its derivatives, are obtained using automatic differentiation (up to order three) of the joint likelihood. The computations are designed to be fast for problems with many random effects (≈ 10^6) and parameters (≈ 10^3). Computation times using ADMB and TMB are compared on a suite of examples ranging from simple models to large spatial models where the random effects are a Gaussian random field. Speedups ranging from 1.5 to about 100 are obtained, with increasing gains for large problems. The package and examples are available at http://tmb-project.org/.
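The automatic differentiation TMB relies on can be illustrated with forward-mode dual numbers: carry a value and its derivative through every operation, applying the chain rule mechanically. This toy Python sketch shows only the principle; TMB itself uses C++ operator overloading (via CppAD) and computes derivatives up to third order.

```python
class Dual:
    """Forward-mode automatic differentiation with dual numbers."""

    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot  # value and derivative w.r.t. the input

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)  # sum rule
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.val * o.dot + self.dot * o.val)  # product rule
    __rmul__ = __mul__

def derivative(f, x):
    """Exact derivative of f at x, to machine precision, no finite differences."""
    return f(Dual(x, 1.0)).dot

# d/dx (x^2 + 3x) = 2x + 3, so the derivative at x = 2 is 7.
print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```

Because the derivative is propagated exactly rather than approximated, gradient-based maximization of a Laplace-approximated likelihood stays stable even with very many random effects.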

  16. Assess and Predict Automatic Generation Control Performances for Thermal Power Generation Units Based on Modeling Techniques

    Science.gov (United States)

    Zhao, Yan; Yang, Zijiang; Gao, Song; Liu, Jinbiao

    2018-02-01

    Automatic generation control (AGC) is a key technology for maintaining the real-time balance between power generation and load, and for ensuring the quality of the power supply. Power grids require each power generation unit to have satisfactory AGC performance, as specified in two detailed rules. The two rules provide a set of indices to measure the AGC performance of a power generation unit. However, the commonly used method of calculating these indices is based on particular data samples from AGC responses and will lead to incorrect results in practice. This paper proposes a new method to estimate the AGC performance indices via system identification techniques. In addition, a nonlinear regression model between the performance indices and the load command is built in order to predict the AGC performance indices. The effectiveness of the proposed method is validated through industrial case studies.

  17. SAFETY MANAGEMENT FOR WOMEN THROUGH AUTOMATIC GPS LOCATION TRACKER

    OpenAIRE

    P.Nivetha*1, S.Kiruthika2 & J.B.Kavitha3

    2018-01-01

    The project “SAFETY MANAGEMENT FOR WOMEN THROUGH AUTOMATIC GPS LOCATION TRACKER” is designed on the standard Android 4.0.3 platform. The application was developed in Eclipse IDE (Mars) with Java 1.6 Standard Edition. It is an Android app intended to help people in a crucial moment: for example, if a person is in trouble and needs help, they can contact someone by just clicking one button, and the app will automatically s...

  18. A Compact Methodology to Understand, Evaluate, and Predict the Performance of Automatic Target Recognition

    Science.gov (United States)

    Li, Yanpeng; Li, Xiang; Wang, Hongqiang; Chen, Yiping; Zhuang, Zhaowen; Cheng, Yongqiang; Deng, Bin; Wang, Liandong; Zeng, Yonghu; Gao, Lei

    2014-01-01

    This paper offers a compact methodology for carrying out performance evaluation of an automatic target recognition (ATR) system: (a) a standard description of the ATR system's output is suggested, a quantity indicating the operating condition is presented based on the principle of feature extraction in pattern recognition, and a series of indices to assess the output in different aspects are developed with the application of statistics; (b) the performance of the ATR system is interpreted by a quality factor based on knowledge of engineering mathematics; (c) through a novel utility called “context-probability” estimation, proposed on the basis of probability theory, performance prediction for an ATR system is realized. The simulation results show that the performance of an ATR system can be accounted for and forecast by the above-mentioned measures. Compared to existing technologies, the novel method offers more objective performance conclusions for an ATR system. These conclusions may be helpful in knowing the practical capability of the tested ATR system. At the same time, the generalization performance of the proposed method is good. PMID:24967605
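Statistical indices of the kind such a framework builds on can be as simple as per-class recognition rates from a confusion matrix. The sketch below shows generic indices of this kind; it is not the paper's quality factor or context-probability estimate, and the class names and counts are hypothetical.

```python
def atr_indices(confusion):
    """Overall accuracy and per-class recall from a confusion matrix.

    confusion[i][j] counts targets of true class i declared as class j.
    Generic pattern-recognition indices, used here only to illustrate
    the kind of statistics an ATR evaluation aggregates.
    """
    total = sum(sum(row) for row in confusion)
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    recalls = [row[i] / sum(row) for i, row in enumerate(confusion)]
    return correct / total, recalls

# Hypothetical 3-class ATR output (e.g. tank / truck / clutter),
# 50 test declarations per true class.
cm = [[45, 3, 2],
      [4, 40, 6],
      [1, 5, 44]]
accuracy, recalls = atr_indices(cm)
print(round(accuracy, 2))  # 129 correct of 150 declarations -> 0.86
```

An evaluation methodology like the paper's then layers on top of such raw indices: weighting them by operating condition and folding them into a single quality factor for cross-system comparison.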

  19. Characteristics and design improvement of AP1000 automatic depressurization system

    International Nuclear Information System (INIS)

    Jin Fei

    2012-01-01

    The automatic depressurization system, a distinctive feature of the AP1000 design, enhances the plant's capability to mitigate design basis accidents. The advancement of the system is discussed by comparison with traditional PWR designs and by analysis of system functions such as depressurization and venting. System design improvements made during the China Project are also described, and suggestions for the system in the China Project are listed at the end. (author)

  20. A Network of Automatic Control Web-Based Laboratories

    Science.gov (United States)

    Vargas, Hector; Sanchez Moreno, J.; Jara, Carlos A.; Candelas, F. A.; Torres, Fernando; Dormido, Sebastian

    2011-01-01

    This article presents an innovative project in the context of remote experimentation applied to control engineering education. Specifically, the authors describe their experience regarding the analysis, design, development, and exploitation of web-based technologies within the scope of automatic control. This work is part of an inter-university…

  1. Sparse encoding of automatic visual association in hippocampal networks

    DEFF Research Database (Denmark)

    Hulme, Oliver J; Skov, Martin; Chadwick, Martin J

    2014-01-01

    Intelligent action entails exploiting predictions about associations between elements of one's environment. The hippocampus and mediotemporal cortex are endowed with the network topology, physiology, and neurochemistry to automatically and sparsely code sensori-cognitive associations that can...

  2. Predicting heat stress index in Sasso hens using automatic linear modeling and artificial neural network

    Science.gov (United States)

    Yakubu, A.; Oluremi, O. I. A.; Ekpo, E. I.

    2018-03-01

    There is an increasing use of robust analytical algorithms in the prediction of heat stress. The present investigation, therefore, was carried out to forecast the heat stress index (HSI) in Sasso laying hens. One hundred and sixty-seven records on the thermo-physiological parameters of the birds were utilized. They were reared on deep litter and battery cage systems. Data were collected when the birds were 42 and 52 weeks of age. The independent variables fitted were housing system, age of birds, rectal temperature (RT), pulse rate (PR), and respiratory rate (RR); the response variable was HSI. Data were analyzed using automatic linear modeling (ALM) and artificial neural network (ANN) procedures. The ALM model building method involved forward stepwise selection using the F-statistic criterion. For the ANN, a multilayer perceptron (MLP) with a back-propagation network was used; the network was trained with 90% of the data set while 10% was dedicated to testing for model validation. RR and PR were the two parameters of utmost importance in the prediction of HSI; however, the fractional importance of RR was higher than that of PR in both the ALM (0.947 versus 0.053) and ANN (0.677 versus 0.274) models. The two models also predicted HSI effectively with a high degree of accuracy [r = 0.980, R² = 0.961, adjusted R² = 0.961, RMSE = 0.05168 (ALM); r = 0.983, R² = 0.966, adjusted R² = 0.966, RMSE = 0.04806 (ANN)]. The present information may be exploited in the development of a heat stress chart based largely on RR, which may aid detection of thermal discomfort in a poultry house under tropical and subtropical conditions.
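The ANN side of the comparison above can be sketched with scikit-learn. The records below are synthetic stand-ins (the Sasso data set is not reproduced here), with respiratory rate dominating the heat stress index as the paper reports; the ranges and coefficients are illustrative assumptions, not measured values.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Synthetic stand-in for the 167 hen records: respiratory rate (RR) and
# pulse rate (PR) driving a heat stress index (HSI). Ranges are assumptions.
n = 167
rr = rng.normal(55, 10, n)                       # breaths per minute
pr = rng.normal(280, 25, n)                      # beats per minute
hsi = 0.9 * rr + 0.1 * pr + rng.normal(0, 2, n)  # RR dominates, as reported

X = np.column_stack([rr, pr])
# 90% of the records train the network, 10% validate it, as in the paper.
X_tr, X_te, y_tr, y_te = train_test_split(X, hsi, test_size=0.1, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X_tr, y_tr)
r2 = model.score(X_te, y_te)
print(f"validation R^2 = {r2:.3f}")
```

Permutation importance on the fitted pipeline would give the RR-versus-PR ranking that the paper reports from the ANN's fractional importances.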

  3. Automatic Imitation

    Science.gov (United States)

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  4. COMPARISON OF TREND PROJECTION AND BACKPROPAGATION METHODS IN PREDICTING THE NUMBER OF VICTIMS WHO DIED IN TRAFFIC ACCIDENTS IN TIMOR TENGAH REGENCY, NUSA TENGGARA

    Directory of Open Access Journals (Sweden)

    Aleksius Madu

    2016-10-01

    Full Text Available The purpose of this study is to predict the number of traffic accident fatalities in Timor Tengah Regency with the Trend Projection method and the Backpropagation method, to compare the two methods based on their error rates, and to predict the number of victims for the coming years. The research was conducted in Timor Tengah Regency using data obtained from the Police Unit there: the number of traffic accidents in the regency from 2000 to 2013, analysed quantitatively with the Trend Projection and Backpropagation methods. The Trend Projection analysis yielded a quadratic trend as the best model, with equation Yk = 39.786 + 3.297X + 0.13X². The Backpropagation method yielded an optimal network consisting of 2 inputs, 3 hidden neurons, and 1 output. Based on the error rates obtained, Backpropagation outperforms Trend Projection, making it the better method for predicting the number of traffic accident victims in Timor Tengah Regency. The resulting predictions for the next 5 years (2014-2018) are 106, 115, 115, 119 and 120 victims, respectively.   Keywords: Trend Projection, Backpropagation, Predicting.
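The quadratic trend model can be reproduced with an ordinary least-squares polynomial fit; the sketch below uses numpy.polyfit on illustrative counts generated from the paper's fitted equation (the actual 2000-2013 accident data are not reproduced here).

```python
import numpy as np

# Illustrative yearly victim counts generated from the paper's fitted
# quadratic trend (the actual 2000-2013 data are not reproduced here).
x = np.arange(14)                        # X = 0 .. 13 indexes years 2000-2013
y_true = 39.786 + 3.297 * x + 0.13 * x**2
y_obs = y_true + np.random.default_rng(1).normal(0, 3, x.size)

# Fit a degree-2 trend; np.polyfit returns coefficients highest power first.
c2, c1, c0 = np.polyfit(x, y_obs, deg=2)
print(f"Yk = {c0:.3f} + {c1:.3f} X + {c2:.3f} X^2")

# Extrapolate the fitted trend five years ahead (2014-2018).
future = np.arange(14, 19)
forecast = np.polyval([c2, c1, c0], future)
print(np.round(forecast).astype(int))
```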

  5. On stylistic automatization of lexical units in various types of contexts

    Directory of Open Access Journals (Sweden)

    В В Зуева

    2009-12-01

    Full Text Available Stylistic automatization of lexical units in various types of contexts is investigated in this article. Following the works of Bohuslav Havránek and other linguists of the Prague Linguistic School, automatization is treated as a contextual narrowing of the meaning of a lexical unit to the level of its complete predictability in situational contexts and the lack of stylistic contradiction with other lexical units in speech.

  6. A SIMULATION ENVIRONMENT FOR AUTOMATIC NIGHT DRIVING AND VISUAL CONTROL

    OpenAIRE

    Arroyo Rubio, Fernando

    2012-01-01

    This project consists of developing an automatic night-driving system in a simulation environment. The simulator used is TORCS, an open-source car racing simulator written in C++, which serves as an ordinary car racing game, as an AI racing game and as a research platform. The goal of this thesis is to implement an automatic driving system that controls the car under night conditions using computer vision. A camera is mounted inside the vehicle and it will detect the reflective ...

  7. Automatic Texture and Orthophoto Generation from Registered Panoramic Views

    DEFF Research Database (Denmark)

    Krispel, Ulrich; Evers, Henrik Leander; Tamke, Martin

    2015-01-01

    In order to detect these elements, which are difficult to detect from range data only, we developed a method that utilizes range data and color information from high-resolution panoramic images of indoor scenes, taken at the scanner's position. A proxy geometry is derived from the point clouds; orthographic views of the scene are automatically identified from the geometry and an image per view is created via projection. We combine methods of computer vision to train a classifier to detect the objects of interest from these orthographic views. Furthermore, these views can be used for automatic texturing of the proxy geometry.

  8. Automatic quantification of subarachnoid hemorrhage on noncontrast CT

    NARCIS (Netherlands)

    Boers, Anna Maria Merel; Zijlstra, I.A.; Gathier, C.S.; van den Berg, R.; Slump, Cornelis H.; Marquering, H.A.; Majoie, C.B.

    2014-01-01

    Quantification of blood after SAH on initial NCCT is an important radiologic measure to predict patient outcome and guide treatment decisions. In current scales, hemorrhage volume and density are not accounted for. The purpose of this study was to develop and validate a fully automatic method for

  9. Early maladaptive schemas and social anxiety in adolescents: the mediating role of anxious automatic thoughts.

    Science.gov (United States)

    Calvete, Esther; Orue, Izaskun; Hankin, Benjamin L

    2013-04-01

    Cognitive models state that cognitions are organized hierarchically, so that the underlying schemas affect behavior via more automatic, superficial cognitive processes. This study aimed to demonstrate that early maladaptive schemas predict anxious automatic thoughts, and to show that such automatic thoughts act as mediators between schemas and prospective changes in social anxiety symptoms. The study also examined an alternative reverse model in which schemas acted as mediators between automatic thoughts and social anxiety. A total of 1052 adolescents (499 girls and 553 boys; M(age)=13.43; SD(age)=1.29) completed measures of early maladaptive schemas, socially anxious automatic thoughts, and social anxiety symptoms at Times 1, 2, and 3. The results revealed bidirectional longitudinal relationships among schemas and automatic thoughts that were consistent in content (e.g., the disconnection/rejection schemas and automatic thoughts of negative self-concept). Furthermore, the automatic thoughts of anticipatory negative evaluation by others at Time 2 mediated the relationship between the other-directedness schemas at Time 1 and social anxiety symptoms at Time 3. These findings are consistent with hierarchical cognitive models of social anxiety given that deeper schemas predict more surface-level thoughts. They also support that these more surface-level thoughts contribute to perpetuating schemas. Finally, results show that early maladaptive schemas of the other-directedness domain play a relevant role in the development and maintenance of social anxiety. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Automatic blood detection in capsule endoscopy video

    Czech Academy of Sciences Publication Activity Database

    Novozámský, Adam; Flusser, Jan; Tachecí, I.; Sulík, L.; Bureš, J.; Krejcar, O.

    2016-01-01

    Vol. 21, No. 12 (2016), pp. 1-8, article no. 126007. ISSN 1083-3668 R&D Projects: GA ČR GA15-16928S Institutional support: RVO:67985556 Keywords: automatic blood detection * capsule endoscopy video Subject RIV: JD - Computer Applications, Robotics Impact factor: 2.530, year: 2016 http://library.utia.cas.cz/separaty/2016/ZOI/flusser-0466936.pdf

  11. Geospatial application of the Water Erosion Prediction Project (WEPP) model

    Science.gov (United States)

    D. C. Flanagan; J. R. Frankenberger; T. A. Cochrane; C. S. Renschler; W. J. Elliot

    2013-01-01

    At the hillslope profile and/or field scale, a simple Windows graphical user interface (GUI) is available to easily specify the slope, soil, and management inputs for application of the USDA Water Erosion Prediction Project (WEPP) model. Likewise, basic small watershed configurations of a few hillslopes and channels can be created and simulated with this GUI. However,...

  12. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    Science.gov (United States)

    Serata, S.

    2006-12-01

    The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. The development work of the past 25 years has established the Stressmeter as an automatic stress measurement system for studying the timing of forthcoming major earthquakes, in support of current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area over a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress condition of the mine, the direct cause of the earthquakes was disclosed, and the cause was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development and the Stressmeter was introduced into the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of maximum and minimum lateral stresses in excellent agreement with the theoretical value, G = 0.25. The conventional methods of overcoring, hydrofracturing and deformation, which were introduced to compete with the Serata method, all failed, demonstrating their fundamental difficulties. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to agree with all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained by the major international stress measurement work in the Hot Dry Rock Projects conducted in the USA, England and Germany are also in good agreement with the Stressmeter results obtained in Japan. 
Based on this broad agreement, a solid geomechanical

  13. Automatic production of Iodine-123 with PLC 135/U

    International Nuclear Information System (INIS)

    Moghaddam-Banaem, L.; Afarideh, H.

    2004-01-01

    In this project, the automatic system for production of Iodine-123 with the Siemens PLC 135/U, designed and installed for the first time in Iran, is discussed. A PLC (Programmable Logic Controller) is used to control industrial processes; like a computer, it consists of a central processing unit, memory, and input/output units. The PLC receives input information from auxiliary units such as sensors and switches, processes the data in memory, and then sends commands to output units such as relays, motors, etc. The target section in iodine production consists of 8 stages. To ensure the automation works properly, the system can be operated both manually and automatically. The PLC first checks the Manual/Automatic switch; in automatic mode, it runs the program in memory and processing is done automatically. For this purpose, the PLC reads pressure and temperature values from analog inputs and, after processing them, sends commands to digital outputs to activate valves, vacuum pumps or heaters. The following subjects are discussed: 1) production of Iodine-123; 2) PLC structure and auxiliary boards; 3) sensors and actuators and their connection to the PLC; 4) software flowchart.
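The Manual/Automatic scan cycle described above (read the mode switch and the analog inputs, process, drive the digital outputs) can be sketched as follows; the thresholds, setpoints and I/O names are hypothetical illustrations, not values from the paper.

```python
# Hedged sketch of the scan cycle described above: check the Manual/Automatic
# switch, read analog pressures/temperatures, and drive digital outputs.
# All thresholds and I/O names are hypothetical, not from the paper.

def scan_cycle(inputs):
    """One PLC scan: returns the digital output commands."""
    outputs = {"vacuum_pump": False, "heater": False, "valve": False}
    if inputs["mode"] != "automatic":
        return outputs                     # manual mode: the operator drives outputs
    if inputs["pressure_mbar"] > 50:       # evacuate the target chamber
        outputs["vacuum_pump"] = True
    if inputs["temperature_C"] < 700:      # hold the process temperature
        outputs["heater"] = True
    else:
        outputs["valve"] = True            # route the product stream onward
    return outputs

cmd = scan_cycle({"mode": "automatic", "pressure_mbar": 80, "temperature_C": 650})
print(cmd)
```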

  14. Macroweather Predictions and Climate Projections using Scaling and Historical Observations

    Science.gov (United States)

    Hébert, R.; Lovejoy, S.; Del Rio Amador, L.

    2017-12-01

    There are two fundamental time scales that are pertinent to decadal forecasts and multidecadal projections. The first is the lifetime of planetary scale structures, about 10 days (equal to the deterministic predictability limit), and the second is - in the anthropocene - the scale at which the forced anthropogenic variability exceeds the internal variability (around 16 - 18 years). These two time scales define three regimes of variability: weather, macroweather and climate, characterized respectively by increasing, decreasing and then again increasing variability with scale. We discuss how macroweather temperature variability can be skilfully predicted up to its theoretical stochastic predictability limits by exploiting its long-range memory with the Stochastic Seasonal and Interannual Prediction System (StocSIPS). At multi-decadal timescales, the temperature response to forcing is approximately linear and this can be exploited to make projections with a Green's function, or Climate Response Function (CRF). To make the problem tractable, we exploit the temporal scaling symmetry and restrict our attention to global mean forcing and temperature response using a scaling CRF characterized by the scaling exponent H and an inner scale of linearity τ. An aerosol linear scaling factor α and a non-linear volcanic damping exponent ν were introduced to account for the large uncertainty in these forcings. We estimate the model and forcing parameters by Bayesian inference using historical data, which allows us to analytically calculate a median (and likely 66% range) for the transient climate response and for the equilibrium climate sensitivity: 1.6 K ([1.5, 1.8] K) and 2.4 K ([1.9, 3.4] K) respectively. Aerosol forcing typically has large uncertainty and we find a modern (2005) forcing very likely range (90%) of [-1.0, -0.3] Wm-2 with median at -0.7 Wm-2. Projecting to 2100, we find that to keep the warming below 1.5 K, future emissions must undergo cuts similar to Representative
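A minimal sketch of the scaling Climate Response Function idea: the temperature response is the convolution of the forcing with a power-law kernel G(t) ∝ t^(H-1), regularized below the inner scale τ. The exponent, inner scale and climate sensitivity used below are illustrative assumptions, not the paper's Bayesian estimates.

```python
import numpy as np

# Power-law response kernel G(t) ~ t^(H-1), flattened below the inner
# scale tau. H and tau are illustrative, not the paper's fitted values.
H, tau = -0.5, 2.0              # scaling exponent and inner scale (years)
lags = np.arange(1, 251)        # lag axis in years
G = np.maximum(lags, tau) ** (H - 1.0)
G /= G.sum()                    # normalize: a step forcing converges to its amplitude

# Linear ramp forcing as a crude stand-in for anthropogenic forcing.
forcing = np.linspace(0.0, 2.5, 250)     # W/m^2 over 250 years
lam = 0.6                                # K per (W/m^2), illustrative sensitivity

# Temperature response = lam * (G convolved with the forcing), causal part.
temp = lam * np.convolve(forcing, G)[:250]
print(f"warming at end of ramp: {temp[-1]:.2f} K")
```

Because the kernel has long power-law tails, the response at the end of the ramp lags well below the equilibrium value lam × forcing, which is the memory effect the projection method exploits.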

  15. Automatic generation of anatomic characteristics from cerebral aneurysm surface models.

    Science.gov (United States)

    Neugebauer, M; Lawonn, K; Beuing, O; Preim, B

    2013-03-01

    Computer-aided research on cerebral aneurysms often depends on a polygonal mesh representation of the vessel lumen. To support a differentiated, anatomy-aware analysis, it is necessary to derive anatomic descriptors from the surface model. We present an approach to the automatic decomposition of the adjacent vessels into near- and far-vessel regions and to the computation of the axial plane. We also present two exemplary applications of the geometric descriptors: automatic computation of a unique vessel order and automatic viewpoint selection. Approximation methods are employed to analyze vessel cross-sections and the vessel area profile along the centerline. The resulting transition zones between near- and far-vessel regions are used as input for an optimization process that computes the axial plane. The unique vessel order is defined via projection into the plane space of the axial plane, and the viewing direction for automatic viewpoint selection is derived from the normal vector of the axial plane. The approach was successfully applied to representative data sets exhibiting broad variability with respect to the configuration of their adjacent vessels. A robustness analysis showed that the automatic decomposition is stable against noise, and a survey of 4 medical experts showed broad agreement with the automatically defined transition zones. Due to the general nature of the underlying algorithms, the approach is applicable to most of the likely aneurysm configurations in the cerebral vasculature, and additional geometric information obtained during automatic decomposition can support correction in case the automatic approach fails. The resulting descriptors can be used for various applications in the visualization, exploration and analysis of cerebral aneurysms.
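The viewpoint step, deriving the viewing direction from the normal vector of a plane fitted to a point set, reduces to a least-squares plane fit. The sketch below uses an SVD-based fit on synthetic points as an illustration; it is not the authors' optimization process, and the camera distance is an arbitrary assumption.

```python
import numpy as np

# Synthetic point set lying almost in the z = 0 plane, standing in for the
# transition-zone points that define the axial plane.
rng = np.random.default_rng(0)
pts = np.column_stack([
    rng.uniform(-1, 1, 50),
    rng.uniform(-1, 1, 50),
    rng.normal(0, 0.01, 50),      # small out-of-plane noise
])

centroid = pts.mean(axis=0)
# The right-singular vector with the smallest singular value is the
# direction of least variance, i.e. the least-squares plane normal.
_, _, vt = np.linalg.svd(pts - centroid)
normal = vt[-1]

camera = centroid + 3.0 * normal  # viewpoint placed along the plane normal
view_dir = -normal                # look back toward the plane
print(np.round(np.abs(normal), 2))
```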

  16. Automatically Identifying and Predicting Unplanned Wind Turbine Stoppages Using SCADA and Alarms System Data: Case Study and Results

    Science.gov (United States)

    Leahy, Kevin; Gallagher, Colm; Bruton, Ken; O'Donovan, Peter; O'Sullivan, Dominic T. J.

    2017-11-01

    Using 10-minute wind turbine SCADA data for fault prediction offers an attractive way of gaining additional prognostic capabilities without needing to invest in extra hardware. To use these data-driven methods effectively, the historical SCADA data must be labelled with the periods when the turbine was in faulty operation, as well as the sub-system the fault was attributed to. Manually identifying faults using maintenance logs can be effective, but is also highly time-consuming and tedious due to the disparate nature of these logs across manufacturers, operators and even individual maintenance events. Turbine alarm systems can help to identify these periods, but the sheer volume of alarms and false positives generated makes analysing them on an individual basis ineffective. In this work, we present a new method for automatically identifying historical stoppages on the turbine using SCADA and alarms data. Each stoppage is associated with either a fault in one of the turbine's sub-systems, a routine maintenance activity, a grid-related event or a number of other categories. The labels are then checked against maintenance logs for accuracy, and the labelled data are fed into a classifier for predicting when these stoppages will occur. Results show that the automated labelling process correctly identifies each type of stoppage, and can be effectively used for SCADA-based prediction of turbine faults.
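The labelling step can be illustrated with pandas: associate each detected stoppage with the category of the alarm codes active during it. The codes, categories and timestamps below are invented for illustration and are not the paper's alarm taxonomy.

```python
import pandas as pd

# Hypothetical mapping from alarm code to stoppage category.
ALARM_CATEGORY = {
    100: "pitch fault", 200: "gearbox fault",
    300: "grid event", 900: "scheduled maintenance",
}

# Detected stoppage periods and the alarm log (invented data).
stops = pd.DataFrame({
    "start": pd.to_datetime(["2017-01-03 04:00", "2017-01-09 13:10"]),
    "end":   pd.to_datetime(["2017-01-03 07:30", "2017-01-09 13:40"]),
})
alarms = pd.DataFrame({
    "time": pd.to_datetime(["2017-01-03 04:02", "2017-01-09 13:11"]),
    "code": [200, 300],
})

def label_stoppage(row):
    """Attach the category of the first alarm raised during the stoppage."""
    active = alarms[(alarms.time >= row.start) & (alarms.time <= row.end)]
    if active.empty:
        return "unclassified"
    return ALARM_CATEGORY.get(active.iloc[0].code, "other")

stops["category"] = stops.apply(label_stoppage, axis=1)
print(stops[["start", "category"]])
```

The resulting labelled periods would then be cross-checked against maintenance logs before training the fault-prediction classifier, as the paper describes.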

  17. The automaticity of vantage point shifts within a synaesthete's spatial calendar.

    Science.gov (United States)

    Jarick, Michelle; Jensen, Candice; Dixon, Michael J; Smilek, Daniel

    2011-09-01

    Time-space synaesthetes report that time units (e.g., months, days, hours) occupy idiosyncratic spatial locations. For the synaesthete (L), the months of the year are projected out in external space in the shape of a 'scoreboard 7', where January to July extend across the top from left to right and August to December make up the vertical segment from top to bottom. Interestingly, L can change the mental vantage point (MVP) from where she views her month-space depending on whether she sees or hears the month name. We used a spatial cueing task to demonstrate that L's attention could be directed to locations within her time-space and change vantage points automatically - from trial to trial. We also sought to eliminate any influence of strategy on L's performance by shortening the interval between the cue and target onset to only 150 ms, and have the targets fall in synaesthetically cued locations on only 15% of trials. If L's performance was attributable to intentionally using the cue to predict target location, these manipulations should eliminate any cueing effects. In two separate experiments, we found that L still showed an attentional bias consistent with her synaesthesia. Thus, we attribute L's rapid and resilient cueing effects to the automaticity of her spatial forms. ©2011 The British Psychological Society.

  18. The Masculinity of Money: Automatic Stereotypes Predict Gender Differences in Estimated Salaries

    Science.gov (United States)

    Williams, Melissa J.; Paluck, Elizabeth Levy; Spencer-Rodgers, Julie

    2010-01-01

    We present the first empirical investigation of why men are assumed to earn higher salaries than women (the "salary estimation effect"). Although this phenomenon is typically attributed to conscious consideration of the national wage gap (i.e., real inequities in salary), we hypothesize instead that it reflects differential, automatic economic…

  19. Automatic document navigation for digital content remastering

    Science.gov (United States)

    Lin, Xiaofan; Simske, Steven J.

    2003-12-01

    This paper presents a novel method for automatically adding navigation capabilities to re-mastered electronic books. We first analyze the need for a generic and robust system to automatically construct navigation links in re-mastered books. We then introduce the core algorithm, based on text matching, for building the links. The proposed method utilizes a tree-structured dictionary and the directed graph of the table of contents to conduct the text matching efficiently. Information fusion further increases the robustness of the algorithm. The experimental results on the MIT Press digital library project are discussed and the key functional features of the system are illustrated. We have also investigated how the quality of the OCR engine affects the linking algorithm. In addition, the analogy between this work and Web link mining is pointed out.
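The core text-matching idea, linking each table-of-contents entry to the best-matching page heading despite OCR noise, can be sketched with fuzzy string matching. The book data and the use of difflib below are illustrative assumptions, not the authors' implementation.

```python
import difflib

# Invented table of contents and OCR'd page headings (with typical OCR noise).
toc = ["Introduction", "Neural Networks", "Conclusions"]
page_headings = {
    5: "INTRODUCTTON",
    41: "Neural Netw0rks",
    120: "Conclusions and Future Work",
}

def best_page(entry):
    """Return the page whose heading is most similar to the TOC entry."""
    score = lambda p: difflib.SequenceMatcher(
        None, entry.lower(), page_headings[p].lower()).ratio()
    return max(page_headings, key=score)

# Build one navigation link per TOC entry.
links = {entry: best_page(entry) for entry in toc}
print(links)
```

A threshold on the similarity ratio would let unmatched entries fall through for manual review rather than linking to a poor match.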

  20. Controlled cooling of an electronic system based on projected conditions

    Science.gov (United States)

    David, Milnes P.; Iyengar, Madhusudan K.; Schmidt, Roger R.

    2015-08-18

    Energy efficient control of a cooling system cooling an electronic system is provided based, in part, on projected conditions. The control includes automatically determining an adjusted control setting(s) for an adjustable cooling component(s) of the cooling system. The automatically determining is based, at least in part, on projected power consumed by the electronic system at a future time and projected temperature at the future time of a heat sink to which heat extracted is rejected. The automatically determining operates to reduce power consumption of the cooling system and/or the electronic system while ensuring that at least one targeted temperature associated with the cooling system or the electronic system is within a desired range. The automatically determining may be based, at least in part, on an experimentally obtained model(s) relating the targeted temperature and power consumption of the adjustable cooling component(s) of the cooling system.
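A minimal sketch of the control decision described above: among candidate settings of an adjustable cooling component, choose the cheapest one whose projected temperature stays within the limit, given projected power and heat-sink temperature. The linear thermal model, fan-law exponent and all coefficients below are hypothetical stand-ins for the experimentally obtained models.

```python
# Hypothetical thermal model: the rise over the heat sink shrinks as the
# adjustable component (a fan, say) speeds up.
def projected_temp(it_power_w, sink_temp_c, fan_speed):
    thermal_res = 0.05 / fan_speed          # K/W, drops with fan speed
    return sink_temp_c + thermal_res * it_power_w

def cooling_power(fan_speed):
    return 40.0 * fan_speed ** 3            # fan affinity law: power ~ speed^3

def choose_setting(it_power_w, sink_temp_c, limit_c=60.0):
    """Slowest (cheapest) fan setting keeping the projected temp in range."""
    candidates = [s / 10 for s in range(2, 11)]       # 0.2 .. 1.0 of full speed
    feasible = [s for s in candidates
                if projected_temp(it_power_w, sink_temp_c, s) <= limit_c]
    return min(feasible, key=cooling_power)

# Projected IT load of 400 W and a 25 C heat sink at the future time.
setting = choose_setting(it_power_w=400.0, sink_temp_c=25.0)
print(f"fan setting: {setting}")
```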

  1. Development project of an automatic sampling system for part time unmanned pipeline terminals

    Energy Technology Data Exchange (ETDEWEB)

    Moreira, Guilherme O.; De Almeida, Marcio M. G.; Ramos, Ricardo R. [Petrobras (Brazil)]; Potten, Gary [Cameron Measurement Systems (United States)]

    2010-07-01

    The Sao Paulo - Brasilia Pipeline (OSBRA) is a highly automated pipeline using a SCADA system which operates from a control room. A new quality management system standard was established for transportation and storage operations. The products had to be sampled on an automatic basis. This paper reports the development of an automatic sampling system (ASS) in accordance with the new quality control standard. The prototype was developed to be implemented through a human-machine interface (HMI) from the control room SCADA screens. A technical cooperation agreement(TCA) was drawn up for development of this new ASS product. The TCA was a joint cooperation between the Holding, the Operator and the cooperators. The prototype will be on-field tested at Senador Canedo tank farm to SPEC requirements. The current performance of the ASS establishes reasonable expectations for further successful development.

  2. Automated metadata--final project report

    International Nuclear Information System (INIS)

    Schissel, David

    2016-01-01

    This report summarizes the work of the Automated Metadata, Provenance Cataloging, and Navigable Interfaces: Ensuring the Usefulness of Extreme-Scale Data Project (MPO Project) funded by the United States Department of Energy (DOE), Offices of Advanced Scientific Computing Research and Fusion Energy Sciences. Initially funded for three years starting in 2012, it was extended for 6 months with additional funding. The project was a collaboration between scientists at General Atomics, Lawrence Berkeley National Laboratory (LBNL), and Massachusetts Institute of Technology (MIT). The group leveraged existing computer science technology where possible, and extended or created new capabilities where required. The MPO project was able to successfully create a suite of software tools that can be used by a scientific community to automatically document their scientific workflows. These tools were integrated into workflows for fusion energy and climate research, illustrating the general applicability of the project's toolkit. Feedback was very positive on the project's toolkit and the value of such automatic workflow documentation to the scientific endeavor.

  3. Automatic code generation for distributed robotic systems

    International Nuclear Information System (INIS)

    Jones, J.P.

    1993-01-01

    Hetero Helix is a software environment which supports relatively large robotic system development projects. The environment supports a heterogeneous set of message-passing LAN-connected common-bus multiprocessors, but the programming model seen by software developers is a simple shared memory. The conceptual simplicity of shared memory makes it an extremely attractive programming model, especially in large projects where coordinating a large number of people can itself become a significant source of complexity. We present results from three system development efforts conducted at Oak Ridge National Laboratory over the past several years. Each of these efforts used automatic software generation to create 10 to 20 percent of the system

  4. Physics-based Space Weather Forecasting in the Project for Solar-Terrestrial Environment Prediction (PSTEP) in Japan

    Science.gov (United States)

    Kusano, K.

    2016-12-01

    Project for Solar-Terrestrial Environment Prediction (PSTEP) is a Japanese nation-wide research collaboration, which was recently launched. PSTEP aims to develop a synergistic interaction between predictive and scientific studies of the solar-terrestrial environment and to establish the basis for next-generation space weather forecasting using the state-of-the-art observation systems and the physics-based models. For this project, we coordinate the four research groups, which develop (1) the integration of space weather forecast system, (2) the physics-based solar storm prediction, (3) the predictive models of magnetosphere and ionosphere dynamics, and (4) the model of solar cycle activity and its impact on climate, respectively. In this project, we will build the coordinated physics-based model to answer the fundamental questions concerning the onset of solar eruptions and the mechanism for radiation belt dynamics in the Earth's magnetosphere. In this paper, we will show the strategy of PSTEP, and discuss about the role and prospect of the physics-based space weather forecasting system being developed by PSTEP.

  5. Automatic Task Classification via Support Vector Machine and Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Hyungsik Shin

    2018-01-01

    Full Text Available Automatic task classification is a core part of personal assistant systems that are widely used in mobile devices such as smartphones and tablets. Even though many industry leaders are providing their own personal assistant services, their proprietary internals and implementations are not well known to the public. In this work, we show through a real implementation and evaluation that automatic task classification can be implemented for mobile devices by using the support vector machine algorithm and crowdsourcing. To train our task classifier, we collected our training data set via crowdsourcing using the Amazon Mechanical Turk platform. Our classifier can classify a short English sentence into one of thirty-two predefined tasks that are frequently requested while using personal mobile devices. Evaluation results show high prediction accuracy of our classifier, ranging from 82% to 99%. By using a large amount of crowdsourced data, we also illustrate the relationship between training data size and the prediction accuracy of our task classifier.
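The classifier described above can be sketched with scikit-learn: a linear SVM over TF-IDF features mapping short sentences to task labels. The tiny training set below stands in for the crowdsourced Mechanical Turk corpus (which is not reproduced here), and the task labels are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Invented stand-in for the crowdsourced corpus of (sentence, task) pairs.
train_sentences = [
    "set an alarm for seven tomorrow", "wake me up at 6 am",
    "what's the weather like today", "will it rain this afternoon",
    "call my mother", "dial John's number",
]
train_labels = ["alarm", "alarm", "weather", "weather", "call", "call"]

# TF-IDF features feeding a linear SVM, as in the paper's approach.
clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(train_sentences, train_labels)

print(clf.predict(["wake me at eight", "is it going to rain"]))
```

With a realistically sized crowdsourced corpus (thousands of sentences over thirty-two tasks), the same pipeline scales without modification.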

  6. Automatic Transformation of MPI Programs to Asynchronous, Graph-Driven Form

    Energy Technology Data Exchange (ETDEWEB)

    Baden, Scott B [University of California, San Diego; Weare, John H [University of California, San Diego; Bylaska, Eric J [Pacific Northwest National Laboratory

    2013-04-30

    The goals of this project are to develop new, scalable, high-fidelity algorithms for atomic-level simulations and program transformations that automatically restructure existing applications, enabling them to scale forward to Petascale systems and beyond. The techniques enable legacy MPI application code to exploit greater parallelism through increased latency hiding and improved workload assignment. The techniques were successfully demonstrated on high-end scalable systems located at DOE laboratories. Besides the automatic MPI program transformation efforts, the project also developed several new scalable algorithms for ab-initio molecular dynamics, including new massively parallel algorithms for hybrid DFT and new parallel-in-time algorithms for molecular dynamics and ab-initio molecular dynamics. These algorithms were shown to scale to very large numbers of cores, and they were designed to work in the latency-hiding framework developed in this project. The effectiveness of the developments was enhanced by direct application to real grand-challenge simulation problems covering a wide range of technologically important applications, time scales and accuracies. These included the simulation of the electronic structure of mineral/fluid interfaces, the very accurate simulation of chemical reactions in microsolvated environments, and the simulation of chemical behavior in very large enzyme reactions.

  7. Fully automatic time-window selection using machine learning for global adjoint tomography

    Science.gov (United States)

    Chen, Y.; Hill, J.; Lei, W.; Lefebvre, M. P.; Bozdag, E.; Komatitsch, D.; Tromp, J.

    2017-12-01

    Selecting time windows from seismograms such that the synthetic measurements (from simulations) and measured observations are sufficiently close is indispensable in a global adjoint tomography framework. The increasing amount of seismic data collected every day around the world demands "intelligent" algorithms for seismic window selection. While the traditional FLEXWIN algorithm is "automatic" to some extent, it still requires human input and expert knowledge, and thus is not fully automatic. The goal of intelligent window selection is to select windows automatically, based on an engine learnt from the huge number of existing windows generated through the adjoint tomography project. We have formulated automatic window selection as a classification problem: every candidate misfit-calculation window is classified as either usable or unusable. Given a large number of windows with a known selection mode (selected or not selected), we train a neural network to predict the selection mode of an arbitrary input window. Currently, we extract five features from each window: its cross-correlation value, cross-correlation time lag, amplitude ratio between observed and synthetic data, window length, and minimum STA/LTA value; more features can be included in the future. We use these features to characterize each window for training a multilayer perceptron neural network (MPNN). Training the MPNN amounts to solving a non-linear optimization problem: we use backward propagation to derive the gradient of the loss function with respect to the weighting matrices and bias vectors, and the mini-batch stochastic gradient method to iteratively optimize the MPNN.
Numerical tests show that, with a careful selection of the training data and a sufficient amount of it, we can train a robust neural network that detects the usable waveforms in arbitrary earthquake data with negligible error.
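The classification step described above can be sketched with a small multilayer perceptron over the five window features. The synthetic feature ranges and the labelling rule below are assumptions made purely for illustration; the real training labels come from existing adjoint-tomography window selections.

```python
# Sketch of window selection as binary classification: an MLP labels a
# candidate window usable (1) or not (0) from five features
# [cross-correlation, time lag, amplitude ratio, length, min STA/LTA].
# Data and the labelling rule are synthetic stand-ins.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 400
X = rng.uniform([0, -10, 0.2, 5, 0.5], [1, 10, 5.0, 120, 6.0], size=(n, 5))
# assumed rule: usable when correlation is high and the lag is small
y = ((X[:, 0] > 0.7) & (np.abs(X[:, 1]) < 5)).astype(int)

# adam (a mini-batch stochastic gradient method) trains the MLP weights
mlp = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
mlp.fit(X, y)
accuracy = mlp.score(X, y)
print(f"training accuracy: {accuracy:.2f}")
```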

  8. Adapting the Water Erosion Prediction Project (WEPP) model for forest applications

    Science.gov (United States)

    Shuhui Dun; Joan Q. Wu; William J. Elliot; Peter R. Robichaud; Dennis C. Flanagan; James R. Frankenberger; Robert E. Brown; Arthur C. Xu

    2009-01-01

    There has been an increasing public concern over forest stream pollution by excessive sedimentation due to natural or human disturbances. Adequate erosion simulation tools are needed for sound management of forest resources. The Water Erosion Prediction Project (WEPP) watershed model has proved useful in forest applications where Hortonian flow is the major form of...

  9. Autonomous Propellant Loading Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The AES Autonomous Propellant Loading (APL) project consists of three activities. The first is to develop software that will automatically control loading of...

  10. Effect of accuracy of wind power prediction on power system operator

    Science.gov (United States)

    Schlueter, R. A.; Sigari, G.; Costi, T.

    1985-01-01

    This research project proposed a modified unit commitment that schedules connection and disconnection of generating units in response to load. A modified generation control is also proposed that controls steam units under automatic generation control, fast-responding diesels, gas turbines and hydro units under a feedforward control, and wind turbine array output under a closed-loop array control. This modified generation control and unit commitment require prediction of the trend wind power variation one hour ahead and prediction of the error in this trend prediction half an hour ahead. An improved meter for predicting trend wind speed variation was developed. Methods for accurately simulating the wind array power from a limited number of wind speed prediction records were developed. Finally, two methods for predicting the error in the trend wind power prediction were developed. This research provides a foundation for testing and evaluating the modified unit commitment and generation control that was developed to maintain operating reliability at a greatly reduced overall production cost for utilities with wind generation capacity.

  11. Visual Benefits in Apparent Motion Displays: Automatically Driven Spatial and Temporal Anticipation Are Partially Dissociated.

    Directory of Open Access Journals (Sweden)

    Merle-Marie Ahrens

    Many behaviourally relevant sensory events, such as motion stimuli and speech, have an intrinsic spatio-temporal structure. This engages intentional and most likely unintentional (automatic) prediction mechanisms that enhance the perception of upcoming stimuli in the event stream. Here we sought to probe the anticipatory processes that are automatically driven by rhythmic input streams, in terms of their spatial and temporal components. To this end, we employed an apparent visual motion paradigm testing the effects of pre-target motion on lateralized visual target discrimination. The motion stimuli either moved towards or away from peripheral target positions (valid vs. invalid spatial motion cueing) at a rhythmic or arrhythmic pace (valid vs. invalid temporal motion cueing). Crucially, we emphasized automatic motion-induced anticipatory processes by rendering the motion stimuli non-predictive of upcoming target position (by design) and task-irrelevant (by instruction), and by creating instead endogenous (orthogonal) expectations using symbolic cueing. Our data revealed that the apparent motion cues automatically engaged both spatial and temporal anticipatory processes, but that these processes were dissociated. We further found evidence for lateralisation of anticipatory temporal but not spatial processes. This indicates that distinct mechanisms may drive automatic spatial and temporal extrapolation of upcoming events from rhythmic event streams. This contrasts with previous findings that instead suggest an interaction between spatial and temporal attention processes when endogenously driven. Our results further highlight the need to isolate intentional from unintentional processes to better understand the various anticipatory mechanisms engaged in processing behaviourally relevant stimuli with predictable spatio-temporal structure, such as motion and speech.

  12. Automated metadata--final project report

    Energy Technology Data Exchange (ETDEWEB)

    Schissel, David [General Atomics, San Diego, CA (United States)

    2016-04-01

    This report summarizes the work of the Automated Metadata, Provenance Cataloging, and Navigable Interfaces: Ensuring the Usefulness of Extreme-Scale Data Project (MPO Project) funded by the United States Department of Energy (DOE), Offices of Advanced Scientific Computing Research and Fusion Energy Sciences. Initially funded for three years starting in 2012, it was extended for six months with additional funding. The project was a collaboration between scientists at General Atomics, Lawrence Berkeley National Laboratory (LBNL), and Massachusetts Institute of Technology (MIT). The group leveraged existing computer science technology where possible, and extended or created new capabilities where required. The MPO project was able to successfully create a suite of software tools that can be used by a scientific community to automatically document their scientific workflows. These tools were integrated into workflows for fusion energy and climate research, illustrating the general applicability of the project’s toolkit. Feedback was very positive on the project’s toolkit and the value of such automatic workflow documentation to the scientific endeavor.

  13. Robust methods for automatic image-to-world registration in cone-beam CT interventional guidance

    International Nuclear Information System (INIS)

    Dang, H.; Otake, Y.; Schafer, S.; Stayman, J. W.; Kleinszig, G.; Siewerdsen, J. H.

    2012-01-01

    Purpose: Real-time surgical navigation relies on accurate image-to-world registration to align the coordinate systems of the image and patient. Conventional manual registration can present a workflow bottleneck and is prone to manual error and intraoperator variability. This work reports alternative means of automatic image-to-world registration, each method involving an automatic registration marker (ARM) used in conjunction with C-arm cone-beam CT (CBCT). The first involves a Known-Model registration method in which the ARM is a predefined tool, and the second is a Free-Form method in which the ARM is freely configurable. Methods: Studies were performed using a prototype C-arm for CBCT and a surgical tracking system. A simple ARM was designed with markers comprising a tungsten sphere within infrared reflectors to permit detection of markers both in x-ray projections and by an infrared tracker. The Known-Model method exercised a predefined specification of the ARM in combination with 3D-2D registration to estimate the transformation that yields the optimal match between forward projection of the ARM and the measured projection images. The Free-Form method localizes markers individually in projection data by a robust Hough transform approach extended from previous work, backprojected to 3D image coordinates based on C-arm geometric calibration. Image-domain point sets were transformed to world coordinates by rigid-body point-based registration. The robustness and registration accuracy of each method were tested in comparison to manual registration across a range of body sites (head, thorax, and abdomen) of interest in CBCT-guided surgery, including cases with interventional tools in the radiographic scene. Results: The automatic methods exhibited similar target registration error (TRE) and were comparable or superior to manual registration for placement of the ARM within ∼200 mm of C-arm isocenter. Marker localization in projection data was robust across all
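The rigid-body point-based registration step mentioned in the abstract (mapping marker positions localized in image coordinates onto their tracker-measured world coordinates) can be sketched with the standard SVD-based (Kabsch) solution. The marker coordinates and transform below are synthetic, chosen only to verify that the estimator recovers a known rotation and translation.

```python
# Sketch of rigid-body point-based registration: estimate rotation R and
# translation t minimizing ||(R p + t) - q|| over paired points (Kabsch/SVD).
import numpy as np

def rigid_register(image_pts, world_pts):
    """Return R (3x3) and t (3,) mapping image_pts onto world_pts."""
    p_mean = image_pts.mean(axis=0)
    q_mean = world_pts.mean(axis=0)
    H = (image_pts - p_mean).T @ (world_pts - q_mean)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))              # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

# synthetic check: recover a known transform from six markers
rng = np.random.default_rng(1)
pts = rng.normal(size=(6, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([10.0, -4.0, 2.5])
R_est, t_est = rigid_register(pts, pts @ R_true.T + t_true)
tre = np.abs(pts @ R_est.T + t_est - (pts @ R_true.T + t_true)).max()
print(f"max residual: {tre:.2e}")
```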

  14. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. [comp.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least-squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
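A tiny worked example of the technique the bibliography covers: forward-mode automatic differentiation with dual numbers, which propagates exact derivative values via the chain rule, neither symbolically nor by finite differences.

```python
# Forward-mode automatic differentiation with dual numbers: each value
# carries its derivative, and arithmetic applies the chain rule exactly.
import math

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__
    def __mul__(self, other):                  # product rule
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)
    __rmul__ = __mul__

def dsin(x):                                   # chain rule for sin
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

# d/dx [x * sin(x) + 3x] at x = 2.0
x = Dual(2.0, 1.0)                             # seed derivative dx/dx = 1
f = x * dsin(x) + 3 * x
print(f.value, f.deriv)                        # f(2) and f'(2) = sin(2) + 2 cos(2) + 3
```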

  15. EVA: continuous automatic evaluation of protein structure prediction servers.

    Science.gov (United States)

    Eyrich, V A; Martí-Renom, M A; Przybylski, D; Madhusudhan, M S; Fiser, A; Pazos, F; Valencia, A; Sali, A; Rost, B

    2001-12-01

    Evaluation of protein structure prediction methods is difficult and time-consuming. Here, we describe EVA, a web server for assessing protein structure prediction methods, in an automated, continuous and large-scale fashion. Currently, EVA evaluates the performance of a variety of prediction methods available through the internet. Every week, the sequences of the latest experimentally determined protein structures are sent to prediction servers, results are collected, performance is evaluated, and a summary is published on the web. EVA has so far collected data for more than 3000 protein chains. These results may provide valuable insight to both developers and users of prediction methods. http://cubic.bioc.columbia.edu/eva. eva@cubic.bioc.columbia.edu

  16. Face Prediction Model for an Automatic Age-invariant Face Recognition System

    OpenAIRE

    Yadav, Poonam

    2015-01-01

    Automated face recognition and identification software is becoming part of our daily life; it finds its abode not only in Facebook's auto photo tagging, Apple's iPhoto, Google's Picasa, and Microsoft's Kinect, but also in the Homeland Security Department's dedicated biometric face detection systems. Most of these automatic face identification systems fail where the effects of aging come into...

  17. [Usefulness and limitations of rapid automatized naming to predict reading difficulties after school entry in preschool children].

    Science.gov (United States)

    Kaneko, Masato; Uno, Akira; Haruhara, Noriko; Awaya, Noriko

    2012-01-01

    We investigated the usefulness and limitations of Rapid Automatized Naming (RAN) results in 6-year-old Japanese preschool children for estimating whether reading difficulties will be encountered after school entry. We administered a RAN task to 1,001 preschool children. After they had entered school, we performed yearly follow-up surveys to assess their reading performance in the first, second, third and fourth grades. We also examined Hiragana non-word and Kanji word reading at each time point to detect the children who were having difficulty reading Hiragana and Kanji. Receiver operating characteristic analysis showed that the RAN result in 6-year-old preschool children was predictive of Kanji reading difficulty in the lower grades of elementary school, with an area under the curve of 0.86 in the second grade and 0.84 in the third grade. These results suggest that the RAN task is useful as a screening tool.

  18. a Method for the Seamlines Network Automatic Selection Based on Building Vector

    Science.gov (United States)

    Li, P.; Dong, Y.; Hu, Y.; Li, X.; Tan, P.

    2018-04-01

    In order to improve the efficiency of large-scale orthophoto production for cities, this paper presents a method for automatic selection of a seamline network based on building vectors. First, a simple building model is built by combining the building vector, its height, and the DEM, and the imaging area of the building on a single DOM is obtained. Then, the initial Voronoi network of the survey area is generated automatically from the bottom positions of all images. Finally, the final seamline network is obtained by automatically optimizing all nodes and seamlines in the network based on the imaging areas of the buildings. Experimental results show that the proposed method not only routes the seamline network around buildings quickly, but also retains the minimal-projection-distortion property of the Voronoi network, effectively solving the problem of automatic seamline-network selection in image mosaicking.
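The initial network-generation step can be illustrated by a discrete Voronoi partition: each output pixel is assigned to its nearest image position, and the region boundaries form the initial seamline network. The image positions and raster size below are invented for the example.

```python
# Discrete Voronoi partition of a mosaic area from image positions:
# every output pixel is labelled with the index of its nearest "image",
# so region boundaries are the initial seamlines.  Coordinates are invented.
import numpy as np

centers = np.array([[20, 30], [80, 40], [50, 90], [15, 85]], float)  # (y, x)
h, w = 120, 100
yy, xx = np.mgrid[0:h, 0:w]
grid = np.stack([yy.ravel(), xx.ravel()], axis=1).astype(float)

# squared distance from every pixel to every centre; nearest centre wins
d2 = ((grid[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
labels = d2.argmin(axis=1).reshape(h, w)

print("pixels per region:", np.bincount(labels.ravel()))
```

In the paper's method this initial partition is then refined so that seamlines avoid the imaged building footprints; the sketch stops at the Voronoi stage.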

  19. Finite element fatigue analysis of rectangular clutch spring of automatic slack adjuster

    Science.gov (United States)

    Xu, Chen-jie; Luo, Zai; Hu, Xiao-feng; Jiang, Wen-song

    2015-02-01

    The failure of the rectangular clutch spring of an automatic slack adjuster directly affects the adjuster's operation. We establish a structural mechanics model of the automatic slack adjuster's rectangular clutch spring based on its working principle and mechanical structure. We then load this model into the ANSYS Workbench FEA system to predict the fatigue life of the spring. FEA results show that the fatigue life of the rectangular clutch spring is 2.0403×10⁵ cycles under braking loads. Fatigue tests of 20 automatic slack adjusters were carried out on a fatigue test bench to verify this conclusion. The experimental results show a mean fatigue life of 1.9101×10⁵ cycles for the rectangular clutch spring, which agrees with the finite element results from ANSYS Workbench.

  20. The Automatic Test Features of the IDiPS Reactor Protection System

    International Nuclear Information System (INIS)

    Hur, Seop; Kim, Dong-Hoon; Hwang, In-Koo; Lee, Cheol-Kwon; Lee, Dong-Young

    2007-01-01

    The reactor protection system (RPS) is designed to minimize the propagation of abnormal or accident conditions in nuclear power plants. A digital RPS (the Integrated Digital Protection System (IDiPS) RPS) is being developed in the Korea Nuclear Instrumentation and Control System (KNICS) R and D project. To make good use of the advantages of digital technology, it is necessary to improve the reliability and availability of the system through automatic test features, including on-line testing, self-diagnostics, and automatic calibration. This paper summarizes the system test strategy and the automatic test features of the IDiPS RPS.

  1. Predicting RNA Structure Using Mutual Information

    DEFF Research Database (Denmark)

    Freyhult, E.; Moulton, V.; Gardner, P. P.

    2005-01-01

    Background: With the ever-increasing number of sequenced RNAs and the establishment of new RNA databases, such as the Comparative RNA Web Site and Rfam, there is a growing need for accurately and automatically predicting RNA structures from multiple alignments. Since RNA secondary structure ... to display and predict conserved RNA secondary structure (including pseudoknots) from an alignment. Results: We show that MIfold can be used to predict simple pseudoknots, and that the performance can be adjusted to make it either more sensitive or more selective. We also demonstrate that the overall ... package. Conclusion: MIfold provides a useful supplementary tool to programs such as RNA Structure Logo, RNAalifold and COVE, and should be useful for automatically generating structural predictions for databases such as Rfam. Availability: MIfold is freely available from http...
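The core quantity behind MIfold can be sketched as the mutual information between two alignment columns, which is high when the columns co-vary the way base-paired positions do. The toy columns below are illustrative, not real alignment data.

```python
# Mutual information between two columns of a multiple alignment:
# high MI suggests covariation, as expected for base-paired positions.
from collections import Counter
from math import log2

def column_mi(col_a, col_b):
    n = len(col_a)
    pa, pb = Counter(col_a), Counter(col_b)
    pab = Counter(zip(col_a, col_b))
    return sum((c / n) * log2((c / n) / ((pa[x] / n) * (pb[y] / n)))
               for (x, y), c in pab.items())

# perfectly covarying columns (G-C, A-U, C-G) vs. an invariant column
paired_a = "GACGAC"
paired_b = "CUGCUG"          # always the Watson-Crick partner of paired_a
random_c = "AAAAAA"

print(column_mi(paired_a, paired_b))   # high: columns co-vary
print(column_mi(paired_a, random_c))   # 0: no covariation
```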

  2. EBFA project

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    An engineering project office was established during the fall of 1976 to manage and coordinate all of the activities of the Electron Beam Fusion Project. The goal of the project is to develop the Electron Beam Fusion Accelerator (EBFA) and its supporting systems, and integrate these systems into the new Electron Beam Fusion Facility (EBFF). Supporting systems for EBFA include a control/monitor system, a data acquisition/automatic data processing system, the liquid transfer systems, the insulating gas transfer systems, etc. Engineers and technicians were assigned to the project office to carry out the engineering design, initiate procurement, monitor the fabrication, perform the assembly and assist the pulsed power research group in the activation of the EBFA.

  3. GI-POP: a combinational annotation and genomic island prediction pipeline for ongoing microbial genome projects.

    Science.gov (United States)

    Lee, Chi-Ching; Chen, Yi-Ping Phoebe; Yao, Tzu-Jung; Ma, Cheng-Yu; Lo, Wei-Cheng; Lyu, Ping-Chiang; Tang, Chuan Yi

    2013-04-10

    Sequencing of microbial genomes is important because microbes carry genes for antibiotic resistance and pathogenesis. However, even with the help of new assembly software, finishing a whole genome is a time-consuming task. In most bacteria, pathogenic or antibiotic genes are carried in genomic islands, so a quick genomic island (GI) prediction method is useful for genomes whose sequencing is still in progress. In this work, we built a Web server called GI-POP (http://gipop.life.nthu.edu.tw) which integrates a sequence assembly tool, a functional annotation pipeline, and a high-performance GI-predicting module using a support vector machine (SVM)-based method called genomic island genomic profile scanning (GI-GPS). The draft genomes of ongoing genome projects, in contigs or scaffolds, can be submitted to our Web server, which returns functional annotation and high-confidence GI predictions. GI-POP is a comprehensive annotation Web server designed for ongoing genome project analysis. Researchers can perform annotation and obtain pre-analytic information, including possible GIs, coding/non-coding sequences and functional analysis, from their draft genomes. This pre-analytic system can provide useful information for finishing a genome sequencing project. Copyright © 2012 Elsevier B.V. All rights reserved.
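A hedged sketch of the profile-scanning idea: slide a window along a sequence and flag windows whose composition deviates from the genome-wide average, since horizontally acquired islands often differ in composition. The SVM scoring used by GI-GPS is replaced here by a simple GC-content threshold, and the sequence is synthetic.

```python
# Sliding-window compositional scan for island-like regions: flag windows
# whose GC content deviates strongly from the genome mean.  The sequence
# is synthetic and the threshold is an assumption for illustration.
import random

random.seed(0)
host = "".join(random.choice("ATGC") for _ in range(9000))        # ~50% GC
island = "".join(random.choice("ATATATGC") for _ in range(1000))  # GC-poor
genome = host[:4000] + island + host[4000:]                       # island at 4000-4999

window, step = 500, 250
gc = lambda s: (s.count("G") + s.count("C")) / len(s)
mean_gc = gc(genome)

flagged = [start for start in range(0, len(genome) - window + 1, step)
           if abs(gc(genome[start:start + window]) - mean_gc) > 0.10]
print("flagged window starts:", flagged)
```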

  4. Computer vision for automatic inspection of agricultural produce

    Science.gov (United States)

    Molto, Enrique; Blasco, Jose; Benlloch, Jose V.

    1999-01-01

    Fruit and vegetables undergo various manipulations on the way from the field to the final consumer, basically oriented towards cleaning and sorting the product into homogeneous categories. For this reason, several research projects aimed at fast, reliable produce sorting and quality control are currently under development around the world. Moreover, it is possible to find manual and semi-automatic commercial systems capable of reasonably performing these tasks. However, in many cases, their accuracy is incompatible with current European market demands, which are constantly increasing. IVIA, the Valencian Research Institute of Agriculture, located in Spain, has been involved in several European projects related to machine vision for real-time inspection of various agricultural produce. This paper focuses on the work related to two products with different requirements: fruit and olives. In the case of fruit, the Institute has developed a vision system capable of providing an assessment of the external quality of a single fruit to a robot that also receives information from other sensors. The system uses four different views of each fruit and has been tested on peaches, apples and citrus. Processing time for each image is under 500 ms using a conventional PC. The system provides information about primary and secondary color, blemishes and their extent, and stem presence and position, which allows further automatic orientation of the fruit in the final box using a robotic manipulator. The work carried out on olives was devoted to fast sorting of table olives. A prototype has been developed to demonstrate the feasibility of a machine vision system capable of automatically sorting 2500 kg/h of olives using low-cost conventional hardware.

  5. Managing Returnable Containers Logistics - A Case Study Part II - Improving Visibility through Using Automatic Identification Technologies

    Directory of Open Access Journals (Sweden)

    Gretchen Meiser

    2011-05-01

    This case study is the result of a project conducted on behalf of a company that uses its own returnable containers to transport purchased parts from suppliers. The objective of this project was to develop a proposal to enable the company to more effectively track and manage its returnable containers. The research activities in support of this project included (1) the analysis and documentation of the physical flow and the information flow associated with the containers and (2) the investigation of new technologies to improve the automatic identification and tracking of containers. This paper explains the automatic identification technologies and important criteria for selection. A companion paper details the flow of information and containers within the logistics chain, and it identifies areas for improving the management of the containers.

  6. Automatic Migration from PARMACS to MPI in Parallel Fortran Applications

    Directory of Open Access Journals (Sweden)

    Rolf Hempel

    1999-01-01

    The PARMACS message passing interface has been in widespread use by application projects, especially in Europe. With the new MPI standard for message passing, many projects face the problem of replacing PARMACS with MPI. An automatic translation tool has been developed which replaces all PARMACS 6.0 calls in an application program with their corresponding MPI calls. In this paper we describe the mapping of the PARMACS programming model onto MPI. We then present some implementation details of the converter tool.

  7. Decadal climate prediction (project GCEP).

    Science.gov (United States)

    Haines, Keith; Hermanson, Leon; Liu, Chunlei; Putt, Debbie; Sutton, Rowan; Iwi, Alan; Smith, Doug

    2009-03-13

    Decadal prediction uses climate models forced by changing greenhouse gases, as in Intergovernmental Panel on Climate Change projections, but unlike longer-range predictions it also requires initialization with observations of the current climate. In particular, the upper-ocean heat content and circulation have a critical influence. Decadal prediction is still in its infancy and there is an urgent need to understand the important processes that determine predictability on these timescales. We have taken the first Hadley Centre Decadal Prediction System (DePreSys) and implemented it on several NERC institute compute clusters in order to study a wider range of initial-condition impacts on decadal forecasting, eventually including the state of the land and cryosphere. eScience methods are used to manage submission and output from the many ensemble model runs required to assess predictive skill. Early results suggest initial-condition skill may extend for several years, even over land areas, but this depends sensitively on the definition used to measure skill, and alternatives are presented. The Grid for Coupled Ensemble Prediction (GCEP) system will allow the UK academic community to contribute to international experiments being planned to explore decadal climate predictability.

  8. Towards Automatic Personalized Content Generation for Platform Games

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Togelius, Julian

    2010-01-01

    In this paper, we show that personalized levels can be automatically generated for platform games. We build on previous work, where models were derived that predicted player experience based on features of level design and on playing styles. These models are constructed using preference learning ... mechanism using both algorithmic and human players. The results indicate that the adaptation mechanism effectively optimizes level design parameters for particular players.

  9. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, a progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.
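A toy numerical illustration of the dampening mechanism, with invented tax brackets: because the average tax rate rises with income under a progressive schedule, disposable income swings less than gross income over the cycle.

```python
# Toy progressive income tax: bracket thresholds and rates are invented.
# Disposable income varies less than gross income, dampening the cycle.
def tax(income):
    brackets = [(10_000, 0.10), (30_000, 0.20), (float("inf"), 0.40)]
    owed, lower = 0.0, 0.0
    for upper, rate in brackets:
        owed += rate * max(0.0, min(income, upper) - lower)
        lower = upper
        if income <= upper:
            break
    return owed

boom, recession = 60_000, 40_000
swing_gross = boom - recession
swing_net = (boom - tax(boom)) - (recession - tax(recession))
print(swing_gross, swing_net)   # the net swing is smaller: built-in dampening
```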

  10. ATLAAS: an automatic decision tree-based learning algorithm for advanced image segmentation in positron emission tomography.

    Science.gov (United States)

    Berthon, Beatrice; Marshall, Christopher; Evans, Mererid; Spezi, Emiliano

    2016-07-07

    Accurate and reliable tumour delineation on positron emission tomography (PET) is crucial for radiotherapy treatment planning. PET automatic segmentation (PET-AS) eliminates intra- and interobserver variability, but there is currently no consensus on the optimal method to use, as different algorithms appear to perform better for different types of tumours. This work aimed to develop a predictive segmentation model, trained to automatically select and apply the best PET-AS method according to the tumour characteristics. ATLAAS, the automatic decision tree-based learning algorithm for advanced segmentation, is based on supervised machine learning using decision trees. The model includes nine PET-AS methods and was trained on 100 PET scans with known true contour. A decision tree was built for each PET-AS algorithm to predict its accuracy, quantified using the Dice similarity coefficient (DSC), from the tumour volume, tumour peak-to-background SUV ratio and a regional texture metric. The performance of ATLAAS was evaluated on 85 PET scans obtained from fillable and printed subresolution sandwich phantoms. ATLAAS showed excellent accuracy across a wide range of phantom data and predicted the best or near-best segmentation algorithm in 93% of cases. ATLAAS outperformed all single PET-AS methods on fillable phantom data with a DSC of 0.881, while the DSC for H&N phantom data was 0.819. DSCs higher than 0.650 were achieved in all cases. ATLAAS is an advanced automatic image segmentation algorithm based on decision tree predictive modelling, which can be trained on images with known true contour to predict the best PET-AS method when the true contour is unknown. ATLAAS provides robust and accurate image segmentation with potential applications to radiation oncology.
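The selection logic described above can be sketched as one regression tree per segmentation method, each predicting that method's Dice score from tumour features, with the highest-predicted method selected. The two stand-in "methods", the feature ranges, and the synthetic ground-truth scores below are all assumptions for illustration.

```python
# Sketch of decision-tree method selection: per-method trees predict the
# Dice similarity coefficient (DSC) from tumour features
# [volume, peak-to-background SUV ratio, texture]; pick the best method.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 300
X = rng.uniform([1, 2, 0], [60, 12, 1], size=(n, 3))

# synthetic ground truth: "thresholding" shines at high contrast,
# "clustering" degrades less at low contrast
dsc = {
    "thresholding": 0.55 + 0.035 * X[:, 1] + rng.normal(0, 0.02, n),
    "clustering":   0.85 - 0.015 * X[:, 1] + rng.normal(0, 0.02, n),
}

trees = {m: DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, y)
         for m, y in dsc.items()}

def select_method(features):
    """Return the method whose tree predicts the highest DSC."""
    return max(trees, key=lambda m: trees[m].predict([features])[0])

print(select_method([30.0, 10.0, 0.5]))   # high-contrast tumour
print(select_method([30.0, 3.0, 0.5]))    # low-contrast tumour
```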

  11. Automatic generation of groundwater model hydrostratigraphy from AEM resistivity and boreholes

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; Christiansen, A. V.

    2014-01-01

    Regional hydrological models are important tools in water resources management. Model prediction uncertainty is primarily due to structural (geological) non-uniqueness which makes sampling of the structural model space necessary to estimate prediction uncertainties. Geological structures and hete...... and discharge observations. The method was applied to field data collected at a Danish field site. Our results show that a competitive hydrological model can be constructed from the AEM dataset using the automatic procedure outlined above....

  12. Automatic segmentation of mandible in panoramic x-ray

    OpenAIRE

    Abdi, Amir Hossein; Kasaei, Shohreh; Mehdizadeh, Mojdeh

    2015-01-01

    As the panoramic x-ray is the most common extraoral radiography in dentistry, segmentation of its anatomical structures facilitates diagnosis and registration of dental records. This study presents a fast and accurate method for automatic segmentation of mandible in panoramic x-rays. In the proposed four-step algorithm, a superior border is extracted through horizontal integral projections. A modified Canny edge detector accompanied by morphological operators extracts the inferior border of t...
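The first step of the algorithm, extracting a border from horizontal integral projections, can be sketched on a synthetic image: sum intensities along each row and find where the profile first rises above background. The image below is an invented stand-in for a radiograph.

```python
# Horizontal integral projection for border extraction: sum each row of
# the image and locate the first row whose sum exceeds a threshold.
# The "radiograph" here is a synthetic bright block.
import numpy as np

img = np.zeros((100, 200))
img[40:80, 50:150] = 1.0          # bright region standing in for the mandible

row_projection = img.sum(axis=1)  # horizontal integral projection

# border = first row where the projection jumps above background
threshold = 0.5 * row_projection.max()
superior_row = int(np.argmax(row_projection > threshold))
print("superior border row:", superior_row)
```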

  13. WE-AB-BRA-05: Fully Automatic Segmentation of Male Pelvic Organs On CT Without Manual Intervention

    International Nuclear Information System (INIS)

    Gao, Y; Lian, J; Chen, R; Wang, A; Shen, D

    2015-01-01

    Purpose: We aim to develop a fully automatic tool for accurate contouring of major male pelvic organs in CT images for radiotherapy, without any manual initialization, that still achieves performance superior to existing tools. Methods: A learning-based 3D deformable shape model was developed for automatic contouring. Specifically, we utilized a recent machine learning method, the random forest, to jointly learn both an image regressor and a classifier for each organ. In particular, the image regressor is trained to predict the 3D displacement from each vertex of the 3D shape model towards the organ boundary, based on the local image appearance around the location of this vertex. The predicted 3D displacements are then used to drive the 3D shape model towards the target organ. Once the shape model is deformed close to the target organ, it is further refined by an organ likelihood map estimated by the learned classifier. As the organ likelihood map provides a good guideline for the organ boundary, a precise contouring result can be achieved by deforming the 3D shape model locally to fit boundaries in the organ likelihood map. Results: We applied our method to 29 previously-treated prostate cancer patients, each with one planning CT scan. Compared with manually delineated pelvic organs, our method obtains overlap ratios of 85.2%±3.74% for the prostate, 94.9%±1.62% for the bladder, and 84.7%±1.97% for the rectum, respectively. Conclusion: This work demonstrated the feasibility of a novel machine-learning based approach for accurate and automatic contouring of major male pelvic organs. It shows the potential to replace time-consuming and inconsistent manual contouring in the clinic. Compared with existing works, our method is more accurate and more efficient, since it does not require any manual intervention such as manual landmark placement. Moreover, our method obtained contouring results very similar to those of the clinical experts. Project is partially support
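A hedged sketch of the regression component described above: a random forest maps an appearance feature vector at a model vertex to the displacement that moves the vertex toward the organ boundary. The one-dimensional setting and the "appearance" features below are synthetic stand-ins for the paper's 3D image-appearance features.

```python
# Random-forest displacement regression: from a (toy) appearance feature
# vector at a vertex position, predict the signed displacement toward a
# boundary.  Positions, features and the boundary are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
boundary = 50.0
positions = rng.uniform(0, 100, size=300)

# stand-in appearance features derived from position
features = np.stack([positions, positions ** 2 / 100.0], axis=1)
displacements = boundary - positions          # ground-truth displacement

forest = RandomForestRegressor(n_estimators=50, random_state=0)
forest.fit(features, displacements)

# drive a "vertex" at position 20 toward the boundary
pos = 20.0
pred = forest.predict([[pos, pos ** 2 / 100.0]])[0]
print(f"predicted displacement: {pred:.1f}")
```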

  14. WE-AB-BRA-05: Fully Automatic Segmentation of Male Pelvic Organs On CT Without Manual Intervention

    Energy Technology Data Exchange (ETDEWEB)

    Gao, Y; Lian, J; Chen, R; Wang, A; Shen, D [Univ North Carolina, Chapel Hill, NC (United States)

    2015-06-15

    Purpose: We aim to develop a fully automatic tool for accurate contouring of major male pelvic organs in CT images for radiotherapy, without any manual initialization, while still achieving performance superior to that of existing tools. Methods: A learning-based 3D deformable shape model was developed for automatic contouring. Specifically, we utilized a recent machine learning method, random forest, to jointly learn both an image regressor and a classifier for each organ. In particular, the image regressor is trained to predict the 3D displacement from each vertex of the 3D shape model towards the organ boundary, based on the local image appearance around the location of that vertex. The predicted 3D displacements are then used to drive the 3D shape model towards the target organ. Once the shape model is deformed close to the target organ, it is further refined by an organ likelihood map estimated by the learned classifier. As the organ likelihood map provides a good guideline for the organ boundary, a precise contouring result can be achieved by deforming the 3D shape model locally to fit boundaries in the organ likelihood map. Results: We applied our method to 29 previously-treated prostate cancer patients, each with one planning CT scan. Compared with manually delineated pelvic organs, our method obtains overlap ratios of 85.2%±3.74% for the prostate, 94.9%±1.62% for the bladder, and 84.7%±1.97% for the rectum. Conclusion: This work demonstrated the feasibility of a novel machine-learning-based approach for accurate and automatic contouring of major male pelvic organs. It shows the potential to replace the time-consuming and inconsistent manual contouring in the clinic. Compared with existing works, our method is both more accurate and more efficient, since it does not require any manual intervention such as manual landmark placement. Moreover, our method obtained contouring results very similar to those of clinical experts. Project is partially support
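    As a sketch of the displacement-regression idea described above (not the authors' implementation): a multi-output random forest maps local appearance features around each shape-model vertex to a 3D displacement, and the shape is deformed by adding the predicted displacements. All data and dimensions here are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for local appearance features sampled around each
# shape-model vertex (rows) and the true 3D displacement toward the
# organ boundary (targets). Feature/target choices are illustrative only.
X = rng.normal(size=(500, 24))            # 24 appearance features per vertex
true_w = rng.normal(size=(24, 3))
y = X @ true_w                            # 3D displacement (dx, dy, dz)

# One multi-output random forest regressor per organ, as in the abstract.
reg = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# Deformation step: move each vertex by its predicted displacement.
vertices = rng.normal(size=(500, 3))
deformed = vertices + reg.predict(X)
print(deformed.shape)  # (500, 3)
```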

  15. Efficient Semi-Automatic 3D Segmentation for Neuron Tracing in Electron Microscopy Images

    Science.gov (United States)

    Jones, Cory; Liu, Ting; Cohan, Nathaniel Wood; Ellisman, Mark; Tasdizen, Tolga

    2015-01-01

    Background: In the area of connectomics, there is a significant gap between the time required for data acquisition and that required for dense reconstruction of the neural processes contained in the same dataset. Automatic methods are able to eliminate this gap, but the state-of-the-art accuracy is so far insufficient for use without user corrections. If done naively, this correction process can be tedious and time consuming. New Method: We present a new semi-automatic method that can be used to perform 3D segmentation of neurites in EM image stacks. It utilizes an automatic method that creates a hierarchical structure of recommended merges of superpixels. The user is then guided through each predicted region to quickly identify errors and establish correct links. Results: We tested our method on three datasets with both novice and expert users. Accuracy and timing were compared with published automatic, semi-automatic, and manual results. Comparison with Existing Methods: Post-automatic correction methods have also been used in [1] and [2]; these methods do not provide navigation or suggestions in the manner we present. Other semi-automatic methods, such as [3] and [4], require user input prior to the automatic segmentation and are inherently different from our method. Conclusion: Using this method on the three datasets, novice users achieved accuracy exceeding state-of-the-art automatic results, and expert users achieved accuracy on par with full manual labeling but with a 70% time improvement when compared with other examples in publication. PMID:25769273
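    The guided-correction loop described above can be sketched as follows (a toy stand-in, not the published tool): automatic confidences order the candidate superpixel merges, the user accepts or rejects each in turn, and accepted merges are applied with union-find.

```python
# Each candidate merge links two superpixels with an automatic confidence.
# The user reviews merges from most to least confident, accepting or
# rejecting each; accepted merges are applied via union-find.
class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def guided_correction(n_superpixels, candidate_merges, user_accepts):
    """candidate_merges: list of (confidence, id_a, id_b);
    user_accepts: callable simulating the user's accept/reject decision."""
    uf = UnionFind(n_superpixels)
    for conf, a, b in sorted(candidate_merges, reverse=True):
        if user_accepts(conf, a, b):
            uf.union(a, b)
    return [uf.find(i) for i in range(n_superpixels)]

# Accept only high-confidence merges in this toy run.
labels = guided_correction(4, [(0.9, 0, 1), (0.2, 2, 3)],
                           lambda c, a, b: c > 0.5)
print(labels)  # superpixels 0 and 1 share a label; 2 and 3 stay separate
```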

  16. Data Flow for the TERRA-REF project

    Science.gov (United States)

    Kooper, R.; Burnette, M.; Maloney, J.; LeBauer, D.

    2017-12-01

    The Transportation Energy Resources from Renewable Agriculture Phenotyping Reference Platform (TERRA-REF) program aims to identify crop traits that are best suited to producing high-energy sustainable biofuels and to match those plant characteristics to their genes, to speed the plant breeding process. One tool used to achieve this goal is a high-throughput phenotyping robot outfitted with sensors and cameras to monitor the growth of 1.25 acres of sorghum. Data types range from hyperspectral imaging to 3D reconstructions and thermal profiles, all at 1 mm resolution. This system produces thousands of daily measurements with high spatiotemporal resolution. The team at NCSA processes, annotates, organizes and stores the massive amounts of data produced by this system - up to 5 TB per day. Data from the sensors are streamed to a local gantry-cache server. The standardized raw sensor data stream is automatically and securely delivered to NCSA using the Globus Connect service. Once files have been successfully received by the Globus endpoint, they are removed from the gantry-cache server. As each dataset arrives or is created, the Clowder system automatically triggers different software tools to analyze each file, extract information, and convert files to a common format. Other tools can be triggered to run after all required data are uploaded. For example, a stitched image of the entire field is created after all images of the field become available. Some of these tools were developed by external collaborators based on predictive models and algorithms; others were developed as part of other projects and could be leveraged by the TERRA project. Data will be stored for the lifetime of the project and are estimated to reach 10 PB over 3 years. The Clowder system, BETY and other systems will allow users to easily find data by browsing or searching the extracted information.
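    The trigger-on-arrival behaviour described above can be sketched as a minimal extractor registry (illustrative only; the real pipeline uses Clowder's extractor framework, and all names below are invented): each uploaded file triggers the extractors registered for its type, and a whole-field stitch runs only once all expected tiles have arrived.

```python
# Minimal event-driven extractor registry in the spirit of the pipeline
# described above. All file names, types, and extractor names are invented.
extractors = {}

def register(filetype):
    def wrap(fn):
        extractors.setdefault(filetype, []).append(fn)
        return fn
    return wrap

@register("jpg")
def extract_metadata(name):
    return f"metadata:{name}"

uploaded, expected_tiles = [], {"tile1.jpg", "tile2.jpg"}

def on_upload(name):
    # Trigger every extractor registered for this file's type.
    results = [fn(name) for fn in extractors.get(name.rsplit(".", 1)[-1], [])]
    uploaded.append(name)
    if expected_tiles <= set(uploaded):          # all field tiles present?
        results.append("stitched-field-image")   # run the whole-field tool
    return results

print(on_upload("tile1.jpg"))  # ['metadata:tile1.jpg']
print(on_upload("tile2.jpg"))  # ['metadata:tile2.jpg', 'stitched-field-image']
```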

  17. The Role of Automatic Obesity Stereotypes in Real Hiring Discrimination

    Science.gov (United States)

    Agerstrom, Jens; Rooth, Dan-Olof

    2011-01-01

    This study examined whether automatic stereotypes captured by the implicit association test (IAT) can predict real hiring discrimination against the obese. In an unobtrusive field experiment, job applications were sent to a large number of real job vacancies. The applications were matched on credentials but differed with respect to the applicant's…

  18. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

    Kim, Jae Hee; Eom, Heung Seop; Lee, Jae Cheol; Choi, Yoo Raek; Moon, Soon Seung

    2002-02-01

    Using recent technologies in mobile robotics and computer science, we developed an automatic inspection system for the weld lines of the reactor vessel. The ultrasonic inspection of the reactor pressure vessel is currently performed by commercialized robot manipulators. However, since the conventional fixed-type robot manipulator is huge, heavy and expensive, it requires a long inspection time and is hard to handle and maintain. In order to resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. According to our conceptual design, we developed the reactor inspection system, including an underwater inspection robot, a laser position control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on the reactor vessel mockup, and on a real reactor ready for Ulchin nuclear power plant unit 6 at Doosan Heavy Industry in Korea. After this project, we plan to commercialize our inspection system. Using this system, we can expect a substantial reduction of inspection time, performance enhancement, automatic management of inspection history, etc. From an economic point of view, we can also expect import substitution of more than 4 million dollars. The established essential technologies for intelligent control and automation are expected to be applied to the automation of similar systems in nuclear power plants.

  19. Prediction of optimal deployment projection for transcatheter aortic valve replacement: angiographic 3-dimensional reconstruction of the aortic root versus multidetector computed tomography.

    Science.gov (United States)

    Binder, Ronald K; Leipsic, Jonathon; Wood, David; Moore, Teri; Toggweiler, Stefan; Willson, Alex; Gurvitch, Ronen; Freeman, Melanie; Webb, John G

    2012-04-01

    Identifying the optimal fluoroscopic projection of the aortic valve is important for successful transcatheter aortic valve replacement (TAVR). Various imaging modalities, including multidetector computed tomography (MDCT), have been proposed for prediction of the optimal deployment projection. We evaluated a method that provides 3-dimensional angiographic reconstructions (3DA) of the aortic root for prediction of the optimal deployment angle and compared it with MDCT. Forty patients undergoing transfemoral TAVR at St Paul's Hospital, Vancouver, Canada, were evaluated. All underwent preimplant 3DA and 68% underwent preimplant MDCT. Three-dimensional angiographic reconstructions were generated from images of a C-arm rotational aortic root angiogram during breath-hold, rapid ventricular pacing, and injection of 32 mL contrast medium at 8 mL/s. Two independent operators prospectively predicted perpendicular valve projections. The implant angle was chosen at the discretion of the physician performing TAVR. The angles from 3DA, from MDCT, the implant angle, and the postdeployment perpendicular prosthesis view were compared. The shortest distance from the postdeployment perpendicular prosthesis projection to the regression line of predicted perpendicular projections was calculated. All but 1 patient had adequate image quality for reproducible angle predictions. There was a significant correlation between 3DA and MDCT for prediction of perpendicular valve projections (r=0.682). The shortest distance from the regression line of predicted angles to the postdeployment prosthesis view was 5.1±4.6° for 3DA and 7.9±4.9° for MDCT (P=0.01). Three-dimensional angiographic reconstructions and MDCT are safe, practical, and accurate imaging modalities for identifying the optimal perpendicular valve deployment projection during TAVR.
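    The comparison metric reported above, the shortest distance from the post-deployment view to the regression line of predicted perpendicular projections in C-arm angle space, can be computed as follows (a sketch with made-up angle values, not the study's data):

```python
import numpy as np

# Perpendicular-projection predictions form a curve in (LAO/RAO,
# cranial/caudal) angle space; fit a least-squares line through them and
# report the shortest distance from the post-deployment view to that line.
def distance_to_regression_line(predicted, deployed):
    x, y = predicted[:, 0], predicted[:, 1]
    slope, intercept = np.polyfit(x, y, 1)       # least-squares line fit
    # distance from point (x0, y0) to the line y = slope*x + intercept
    x0, y0 = deployed
    return abs(slope * x0 - y0 + intercept) / np.hypot(slope, 1.0)

# Invented angle pairs (degrees) for illustration only.
predicted = np.array([[-10.0, 5.0], [0.0, 10.0], [10.0, 15.0]])
d = distance_to_regression_line(predicted, (0.0, 14.0))
print(round(d, 2))  # 3.58
```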

  20. Identification with video game characters as automatic shift of self-perceptions

    NARCIS (Netherlands)

    Klimmt, C.; Hefner, D.; Vorderer, P.A.; Roth, C.; Blake, C.

    2010-01-01

    Two experiments tested the prediction that video game players identify with the character or role they are assigned, which leads to automatic shifts in implicit self-perceptions. Video game identification, thus, is considered as a kind of altered self-experience. In Study 1 (N = 61), participants

  1. Interactive vs. automatic ultrasound image segmentation methods for staging hepatic lipidosis.

    Science.gov (United States)

    Weijers, Gert; Starke, Alexander; Haudum, Alois; Thijssen, Johan M; Rehage, Jürgen; De Korte, Chris L

    2010-07-01

    to predict TAG level in the liver. Receiver-operating-characteristic (ROC) analysis was applied to assess the performance and area under the curve (AUC) of predicting TAG and to compare the sensitivity and specificity of the methods. The best speckle-size estimates and overall performance (R2 = 0.71, AUC = 0.94) were achieved using an SNR-based adaptive automatic-segmentation method (TAG threshold used: 50 mg/g liver wet weight). Automatic segmentation is thus feasible and profitable.
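    The ROC analysis described above can be sketched with scikit-learn on synthetic data (the predictor, noise level, and sample size are invented; only the 50 mg/g TAG threshold comes from the abstract):

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)

# Illustrative stand-in: a continuous ultrasound-derived predictor of liver
# triacylglycerol (TAG), dichotomized at 50 mg/g wet weight as in the study.
tag = rng.uniform(0, 120, size=200)                  # mg/g liver wet weight
predictor = tag + rng.normal(scale=15, size=200)     # noisy surrogate score
diseased = (tag > 50).astype(int)                    # TAG threshold: 50 mg/g

auc = roc_auc_score(diseased, predictor)
fpr, tpr, thresholds = roc_curve(diseased, predictor)
# Youden's J picks the operating point balancing sensitivity and specificity.
best = thresholds[np.argmax(tpr - fpr)]
print(f"AUC={auc:.2f}, operating threshold={best:.1f}")
```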

  2. Genomic Prediction from Whole Genome Sequence in Livestock: The 1000 Bull Genomes Project

    DEFF Research Database (Denmark)

    Hayes, Benjamin J; MacLeod, Iona M; Daetwyler, Hans D

    Advantages of using whole genome sequence data to predict genomic estimated breeding values (GEBV) include better persistence of accuracy of GEBV across generations and more accurate GEBV across breeds. The 1000 Bull Genomes Project provides a database of whole genome sequenced key ancestor bulls....... In a dairy data set, predictions using BayesRC and imputed sequence data from 1000 Bull Genomes were 2% more accurate than with 800k data. We could demonstrate the method identified causal mutations in some cases. Further improvements will come from more accurate imputation of sequence variant genotypes...

  3. Automatic initialization and quality control of large-scale cardiac MRI segmentations.

    Science.gov (United States)

    Albà, Xènia; Lekadir, Karim; Pereañez, Marco; Medrano-Gracia, Pau; Young, Alistair A; Frangi, Alejandro F

    2018-01-01

    Continuous advances in imaging technologies enable ever more comprehensive phenotyping of human anatomy and physiology. Concomitant reduction of imaging costs has resulted in widespread use of imaging in large clinical trials and population imaging studies. Magnetic Resonance Imaging (MRI), in particular, offers one-stop-shop multidimensional biomarkers of cardiovascular physiology and pathology. A wide range of analysis methods offer sophisticated cardiac image assessment and quantification for clinical and research studies. However, most methods have only been evaluated on relatively small databases often not accessible for open and fair benchmarking. Consequently, published performance indices are not directly comparable across studies and their translation and scalability to large clinical trials or population imaging cohorts is uncertain. Most existing techniques still rely on considerable manual intervention for the initialization and quality control of the segmentation process, becoming prohibitive when dealing with thousands of images. The contributions of this paper are three-fold. First, we propose a fully automatic method for initializing cardiac MRI segmentation, by using image features and random forests regression to predict an initial position of the heart and key anatomical landmarks in an MRI volume. In processing a full imaging database, the technique predicts the optimal corrective displacements and positions in relation to the initial rough intersections of the long and short axis images. Second, we introduce for the first time a quality control measure capable of identifying incorrect cardiac segmentations with no visual assessment. The method uses statistical, pattern and fractal descriptors in a random forest classifier to detect failures to be corrected or removed from subsequent statistical analysis. Finally, we validate these new techniques within a full pipeline for cardiac segmentation applicable to large-scale cardiac MRI databases. 
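    The second contribution, a random-forest quality-control classifier over segmentation descriptors, can be sketched as follows (the features and the failure rule below are synthetic stand-ins for the statistical, pattern and fractal descriptors used in the paper):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Stand-in descriptors (statistical / pattern / fractal features would be
# computed from each segmentation in the real pipeline) and a binary label:
# 1 = segmentation failed and should be flagged, 0 = acceptable.
X = rng.normal(size=(400, 12))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic failure rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
qc = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = qc.score(X_te, y_te)
print(f"QC accuracy: {accuracy:.2f}")   # flags failures without visual review
```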

  4. Automatic Emotional State Detection using Facial Expression Dynamic in Videos

    Directory of Open Access Journals (Sweden)

    Hongying Meng

    2014-11-01

    In this paper, an automatic emotion detection system is built to let a computer or machine detect the emotional state from facial expressions in human-computer communication. First, dynamic motion features are extracted from facial expression videos; then advanced machine learning methods for classification and regression are used to predict the emotional states. The system is evaluated on two publicly available datasets, GEMEP_FERA and AVEC2013, and satisfactory performance is achieved in comparison with the provided baseline results. With this emotional state detection capability, a machine can read the facial expression of its user automatically. This technique can be integrated into applications such as smart robots, interactive games and smart surveillance systems.
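    A minimal sketch of the pipeline described above, assuming simple frame-difference motion features and a support-vector regressor standing in for the "advanced machine learning methods" (the paper's actual features and models may differ; all data are simulated):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(3)

# Toy dynamic motion features: mean absolute frame-to-frame difference per
# facial region, summarized over a clip. A real system would extract these
# from video; here they are simulated, and the regressor predicts a
# continuous emotion dimension (e.g. arousal).
def clip_features(frames):
    diffs = np.abs(np.diff(frames, axis=0))      # frame-to-frame motion
    return diffs.mean(axis=0)                    # one feature per region

clips = rng.normal(size=(120, 30, 8))   # 120 clips, 30 frames, 8 regions
X = np.array([clip_features(c) for c in clips])
arousal = X @ rng.normal(size=8)        # synthetic continuous target

model = SVR().fit(X[:100], arousal[:100])
preds = model.predict(X[100:])
print(preds.shape)  # (20,)
```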

  5. Techniques for Automatic Creation of Terrain Databases for Training and Mission Preparation

    NARCIS (Netherlands)

    Kuijper, F.; Son, R. van; Meurs, F. van; Smelik, R.M.; Kraker, J.K. de

    2010-01-01

    In support of defense agencies and civil authorities, TNO runs a research program that strives for automatic generation of terrain databases for a variety of simulation applications. Earlier papers by TNO at the IMAGE conference have reported in depth on specific projects within this program.

  6. Predicting outcome following psychological therapy in IAPT (PROMPT): a naturalistic project protocol.

    Science.gov (United States)

    Grant, Nina; Hotopf, Matthew; Breen, Gerome; Cleare, Anthony; Grey, Nick; Hepgul, Nilay; King, Sinead; Moran, Paul; Pariante, Carmine M; Wingrove, Janet; Young, Allan H; Tylee, André

    2014-06-09

    Depression and anxiety are highly prevalent and represent a significant and well described public health burden. Whilst first line psychological treatments are effective for nearly half of attenders, there remain a substantial number of patients who do not benefit. The main objective of the present project is to establish an infrastructure platform for the identification of factors that predict lack of response to psychological treatment for depression and anxiety, in order to better target treatments as well as to support translational and experimental medicine research in mood and anxiety disorders. Predicting outcome following psychological therapy in IAPT (PROMPT) is a naturalistic observational project that began patient recruitment in January 2014. The project is currently taking place in Southwark Psychological Therapies Service, an Improving Access to Psychological Therapies (IAPT) service currently provided by the South London and Maudsley NHS Foundation Trust (SLaM). However, the aim is to roll-out the project across other IAPT services. Participants are approached before beginning treatment and offered a baseline interview whilst they are waiting for therapy to begin. This allows us to test for relationships between predictor variables and patient outcome measures. At the baseline interview, participants complete a diagnostic interview; are asked to give blood and hair samples for relevant biomarkers, and complete psychological and social questionnaire measures. Participants then complete their psychological therapy as offered by Southwark Psychological Therapies Service. Response to psychological therapy will be measured using standard IAPT outcome data, which are routinely collected at each appointment. This project addresses a need to understand treatment response rates in primary care psychological therapy services for those with depression and/or anxiety. Measurement of a range of predictor variables allows for the detection of bio

  7. Design and Simulation of Two Robotic Systems for Automatic Artichoke Harvesting

    Directory of Open Access Journals (Sweden)

    Domenico Longo

    2013-12-01

    The target of this research project was a feasibility study for the development of a robot for automatic or semi-automatic artichoke harvesting. During this project, different solutions for the mechanical parts of the machine, its control system and the harvesting tools were investigated. Moreover, in cooperation with the DISPA department of the University of Catania, different field structures with different kinds of artichoke cultivars were studied and tested. The results of this research could improve artichoke production for the preserves industry. As a first step, an investigation of existing machines was carried out, which showed that very few machines exist for this purpose. Based also on previous experience, proposals for several different robotic systems were made, while the mobile platform itself was developed within another research project. At the current stage, several different configurations of machines and harvesting end-effectors have been designed and simulated using a 3D CAD environment interfaced with Matlab®. Moreover, in support of one of the proposed machines, an artificial vision algorithm has been developed to locate the artichokes on the plant with respect to the robot, using images taken with a standard webcam.

  8. Implicit self-esteem compensation: automatic threat defense.

    Science.gov (United States)

    Rudman, Laurie A; Dohn, Matthew C; Fairchild, Kimberly

    2007-11-01

    Four experiments demonstrated implicit self-esteem compensation (ISEC) in response to threats involving gender identity (Experiment 1), implicit racism (Experiment 2), and social rejection (Experiments 3-4). Under conditions in which people might be expected to suffer a blow to self-worth, they instead showed high scores on 2 implicit self-esteem measures. There was no comparable effect on explicit self-esteem. However, ISEC was eliminated following self-affirmation (Experiment 3). Furthermore, threat manipulations increased automatic intergroup bias, but ISEC mediated these relationships (Experiments 2-3). Thus, a process that serves as damage control for the self may have negative social consequences. Finally, pretest anxiety mediated the relationship between threat and ISEC (Experiment 3), whereas ISEC negatively predicted anxiety among high-threat participants (Experiment 4), suggesting that ISEC may function to regulate anxiety. The implications of these findings for automatic emotion regulation, intergroup bias, and implicit self-esteem measures are discussed. (c) 2007 APA, all rights reserved.

  9. Semi Automatic Ontology Instantiation in the domain of Risk Management

    Science.gov (United States)

    Makki, Jawad; Alquier, Anne-Marie; Prince, Violaine

    One of the challenging tasks in the context of Ontological Engineering is to automatically or semi-automatically support the process of Ontology Learning and Ontology Population from semi-structured documents (texts). In this paper we describe a Semi-Automatic Ontology Instantiation method from natural language text, in the domain of Risk Management. This method is composed of three steps: 1) annotation with part-of-speech tags, 2) extraction of semantic relation instances, 3) ontology instantiation. It is based on combined NLP techniques, with human intervention between steps 2 and 3 for control and validation. Since it heavily relies on linguistic knowledge, it is not domain dependent, which is a good feature for portability between the different fields of risk management application. The proposed methodology uses the ontology of the PRIMA1 project (supported by the European community) as a Generic Domain Ontology and populates it via an available corpus. A first validation of the approach is done through an experiment with Chemical Fact Sheets from Environmental Protection Agency2.
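    The three-step pipeline can be sketched as follows (the tagger, extraction pattern and validation callback are toy stand-ins, not the NLP components used in the paper):

```python
# Toy sketch of the three-step pipeline: (1) part-of-speech annotation,
# (2) extraction of semantic relation instances from patterns over tags,
# (3) ontology instantiation gated by human validation.
def pos_tag(sentence):
    lexicon = {"flooding": "NOUN", "causes": "VERB", "damage": "NOUN"}
    return [(w, lexicon.get(w, "X")) for w in sentence.split()]

def extract_relations(tagged):
    # pattern: NOUN VERB NOUN  ->  (subject, relation, object)
    rels = []
    for i in range(len(tagged) - 2):
        (s, t1), (v, t2), (o, t3) = tagged[i:i + 3]
        if (t1, t2, t3) == ("NOUN", "VERB", "NOUN"):
            rels.append((s, v, o))
    return rels

def instantiate(relations, validate):
    # step 3: only human-validated instances enter the ontology
    return [r for r in relations if validate(r)]

tagged = pos_tag("flooding causes damage")
instances = instantiate(extract_relations(tagged), validate=lambda r: True)
print(instances)  # [('flooding', 'causes', 'damage')]
```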

  10. Finding weak points automatically

    International Nuclear Information System (INIS)

    Archinger, P.; Wassenberg, M.

    1999-01-01

    Operators of nuclear power stations have to carry out material tests on selected components at regular intervals. A fully automated test, which achieves clearly higher reproducibility than partially automated variants, would therefore provide a solution. In addition, the fully automated test reduces the radiation dose received by the test personnel. (orig.) [de

  11. On the reliability of predictions of geomechanical response - project Cosa in perspective

    International Nuclear Information System (INIS)

    Knowles, N.C.; Lowe, M.J.S.; Come, B.

    1990-01-01

    Project COSA (Comparison of computer codes for Salt) was set up by the CEC as an international benchmark exercise to compare the reliability of predictions of the thermo-mechanical response of HLW repositories in salt. The first phase (COSA I) was conducted between 1984 and 1986, and attention was directed at code verification issues. The second phase (COSA II), carried out in the period 1986-1988, addressed code validation and other issues. Specifically, a series of experimental heat and pressure tests carried out at the Asse Mine in West Germany were modelled and predictions of the thermo-mechanical behaviour were compared. Ten European organisations participated. A key feature of this exercise was that, as far as possible, the calculations were performed blind (i.e. without any knowledge of the observed behaviour), using the best information available a priori to describe the physical situation to be modelled. Interest centred on the various constitutive models (of material behaviour) for rock-salt and on the assumptions about the in situ state of stress. The paper gives an overview of the project, presents some broad conclusions and attempts to assess their significance. 17 refs., 6 figs., 2 tabs

  12. Automatized distribution systems in IBERDROLA. Sistemas de automatizacion de distribucion en Iberdrola

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez Madariaga, J.A.

    1994-01-01

    This article presents the distribution automation systems at IBERDROLA. These systems make it possible to improve the management of energy demand. The automated distribution system is applied both by the industrial sector and by small users. Iberdrola has developed a project to offer telemanagement to energy users.

  13. USING AFFORDABLE DATA CAPTURING DEVICES FOR AUTOMATIC 3D CITY MODELLING

    Directory of Open Access Journals (Sweden)

    B. Alizadehashrafi

    2017-11-01

    In this research project, movies of UTM Kolej 9, Skudai, Johor Bahru (see Figure 1) were taken by an AR.Drone 2.0. Since the AR.Drone 2.0 has a liquid lens, there were significant distortions and deformations in the pictures converted from the movies while flying. Passive remote sensing (RS) applications based on image matching and epipolar lines, such as Agisoft PhotoScan, were tested to create the point clouds and mesh along with 3D models and textures. As the result was not acceptable (see Figure 2), the previous Dynamic Pulse Function, based on the Ruby programming language, was enhanced and utilized to create the 3D models automatically in LoD3. The accuracy of the final 3D model is approximately 10 to 20 cm. After rectification and parallel projection of the photos based on some tie points and targets, all the parameters were measured and used as input to the system to create the 3D model automatically in LoD3 with very high accuracy.

  14. Using Affordable Data Capturing Devices for Automatic 3d City Modelling

    Science.gov (United States)

    Alizadehashrafi, B.; Abdul-Rahman, A.

    2017-11-01

    In this research project, movies of UTM Kolej 9, Skudai, Johor Bahru (see Figure 1) were taken by an AR.Drone 2.0. Since the AR.Drone 2.0 has a liquid lens, there were significant distortions and deformations in the pictures converted from the movies while flying. Passive remote sensing (RS) applications based on image matching and epipolar lines, such as Agisoft PhotoScan, were tested to create the point clouds and mesh along with 3D models and textures. As the result was not acceptable (see Figure 2), the previous Dynamic Pulse Function, based on the Ruby programming language, was enhanced and utilized to create the 3D models automatically in LoD3. The accuracy of the final 3D model is approximately 10 to 20 cm. After rectification and parallel projection of the photos based on some tie points and targets, all the parameters were measured and used as input to the system to create the 3D model automatically in LoD3 with very high accuracy.

  15. Automatic imitation effects are influenced by experience of synchronous action in children.

    Science.gov (United States)

    O'Sullivan, Eoin P; Bijvoet-van den Berg, Simone; Caldwell, Christine A

    2018-07-01

    By their fourth year of life, children are expert imitators, but it is unclear how this ability develops. One approach suggests that certain types of experience might forge associations between the sensory and motor representations of an action that may facilitate imitation at a later time. Sensorimotor experience of this sort may occur when an infant's action is imitated by a caregiver or when socially synchronous action occurs. This learning approach, therefore, predicts that the strength of sensory-motor associations should depend on the frequency and quality of previous experience. Here, we tested this prediction by examining automatic imitation, that is, the tendency of an action stimulus to facilitate the performance of that action and interfere with the performance of an incompatible action. We required children (aged between 3 years 8 months and 7 years 11 months) to respond to actions performed by an experimenter (e.g., two hands clapping) with both compatible actions (i.e., two hands clapping) and incompatible actions (i.e., two hands waving) at different stages in the experimental procedure. As predicted by a learning account, actions thought to be performed in synchrony (i.e., clapping/waving) produced stronger automatic imitation effects when compared with actions where previous sensorimotor experience is likely to be more limited (e.g., pointing/hand closing). Furthermore, these automatic imitation effects were not found to vary with age, with both compatible and incompatible responses quickening with age. These findings suggest a role for sensorimotor experience in the development of imitative ability. Copyright © 2018 Elsevier Inc. All rights reserved.
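    Automatic imitation in such tasks is typically quantified as a compatibility effect on reaction times; a sketch with simulated RTs (all values are invented, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(7)

# Automatic imitation is quantified as the compatibility effect: mean
# reaction time on incompatible trials (the observed action differs from
# the required response) minus mean RT on compatible trials.
compatible_rt = rng.normal(480, 40, size=60)      # ms, simulated
incompatible_rt = rng.normal(520, 40, size=60)    # ms, simulated

effect = incompatible_rt.mean() - compatible_rt.mean()
print(f"automatic imitation effect: {effect:.0f} ms")
```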

  16. Automatic construction of a recurrent neural network based classifier for vehicle passage detection

    Science.gov (United States)

    Burnaev, Evgeny; Koptelov, Ivan; Novikov, German; Khanipov, Timur

    2017-03-01

    Recurrent Neural Networks (RNNs) are extensively used for time-series modeling and prediction. We propose an approach for automatic construction of a binary classifier based on Long Short-Term Memory RNNs (LSTM-RNNs) for detection of a vehicle passage through a checkpoint. As input to the classifier we use multidimensional signals from various sensors that are installed on the checkpoint. The obtained results demonstrate that the previous approach of handcrafting a classifier, consisting of a set of deterministic rules, can be successfully replaced by automatic RNN training on appropriately labelled data.
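    A sketch of the sequence-classification shape described above, with a plain tanh RNN cell standing in for the LSTM and untrained random weights (illustrative only): the last hidden state of the recurrence over the multichannel sensor signal is mapped to a passage probability.

```python
import numpy as np

rng = np.random.default_rng(4)

# Minimal recurrent binary classifier over multidimensional sensor signals.
# A plain tanh RNN cell stands in for the LSTM; weights are random and
# untrained, so this only illustrates the data flow, not a working detector.
def rnn_classify(seq, Wx, Wh, w_out):
    h = np.zeros(Wh.shape[0])
    for x_t in seq:                       # one multichannel sample per step
        h = np.tanh(Wx @ x_t + Wh @ h)
    logit = w_out @ h                     # last hidden state -> passage score
    return 1 / (1 + np.exp(-logit))       # probability of a vehicle passage

n_sensors, hidden = 6, 16
Wx = rng.normal(scale=0.3, size=(hidden, n_sensors))
Wh = rng.normal(scale=0.3, size=(hidden, hidden))
w_out = rng.normal(size=hidden)

signal = rng.normal(size=(50, n_sensors))   # 50 time steps of sensor readings
p = rnn_classify(signal, Wx, Wh, w_out)
print(f"passage probability: {p:.3f}")
```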

  17. Automatic design of decision-tree induction algorithms tailored to flexible-receptor docking data.

    Science.gov (United States)

    Barros, Rodrigo C; Winck, Ana T; Machado, Karina S; Basgalupp, Márcio P; de Carvalho, André C P L F; Ruiz, Duncan D; de Souza, Osmar Norberto

    2012-11-21

    This paper addresses the prediction of the free energy of binding of a drug candidate with enzyme InhA, associated with Mycobacterium tuberculosis. This problem is found within rational drug design, where interactions between drug candidates and target proteins are verified through molecular docking simulations. In this application, it is important not only to correctly predict the free energy of binding, but also to provide a comprehensible model that could be validated by a domain specialist. Decision-tree induction algorithms have been successfully used in drug-design related applications, especially considering that decision trees are simple to understand, interpret, and validate. There are several decision-tree induction algorithms available for general use, but each one has a bias that makes it more suitable for a particular data distribution. In this article, we propose and investigate the automatic design of decision-tree induction algorithms tailored to particular drug-enzyme binding data sets. We investigate the performance of our new method for evaluating binding conformations of different drug candidates to InhA, and we analyze our findings with respect to decision tree accuracy, comprehensibility, and biological relevance. The empirical analysis indicates that our method is capable of automatically generating decision-tree induction algorithms that significantly outperform the traditional C4.5 algorithm with respect to both accuracy and comprehensibility. In addition, we provide the biological interpretation of the rules generated by our approach, reinforcing the importance of comprehensible predictive models in this particular bioinformatics application. We conclude that automatically designing a decision-tree algorithm tailored to molecular docking data is a promising alternative for the prediction of the free energy from the binding of a drug candidate with a flexible-receptor.
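    As a sketch of the general idea, a fixed decision-tree regressor (here scikit-learn's CART, not the automatically designed induction algorithm of the paper) can predict binding free energy from docking-style features while yielding human-readable rules; all data below are synthetic.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(5)

# Illustrative docking-style data: each row is one binding conformation,
# features are residue-ligand distances, target is free energy of binding.
# All values and the linear generating rule are invented for the sketch.
X = rng.uniform(1.0, 12.0, size=(300, 4))            # distances in angstroms
fe = -8.0 + 0.6 * X[:, 0] - 0.4 * X[:, 2] + rng.normal(scale=0.3, size=300)

tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, fe)
rules = export_text(tree, feature_names=[f"dist_res{i}" for i in range(4)])
print(rules.splitlines()[0])   # the root split, readable by a domain expert
```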

  18. Cognitive-Affective Dimensions of Female Orgasm: The Role of Automatic Thoughts and Affect During Sexual Activity.

    Science.gov (United States)

    Tavares, Inês M; Laan, Ellen T M; Nobre, Pedro J

    2017-06-01

    Cognitive-affective factors contribute to female sexual dysfunctions, defined as clinically significant difficulties in the ability to respond sexually or to experience sexual pleasure. Automatic thoughts and affect presented during sexual activity are acknowledged as maintenance factors for these difficulties. However, there is a lack of studies on the influence of these cognitive-affective dimensions regarding female orgasm. To assess the role of automatic thoughts and affect during sexual activity in predicting female orgasm occurrence and to investigate the mediator role of these variables in the relation between sexual activity and orgasm occurrence. Nine hundred twenty-six sexually active heterosexual premenopausal women reported on frequency of sexual activities and frequency of orgasm occurrence, cognitive factors, and social desirability. Participants completed the Sexual Modes Questionnaire-Automatic Thoughts Subscale, the Positive and Negative Affect Schedule, and the Socially Desirable Response Set. Multiple linear regressions and mediation analyses were performed, controlling for the effect of covariates such as social desirability, sociodemographic and medical characteristics, and relationship factors. The main outcome measurement was orgasm frequency as predicted and mediated by automatic thoughts and affect experienced during sexual activities. The presence of failure thoughts and lack of erotic thoughts during sexual activity significantly and negatively predicted female orgasm, whereas positive affect experienced during sexual activity significantly and positively predicted female orgasm. Moreover, negative automatic thoughts and positive affect during sexual activity were found to mediate the relation between sexual activity and female orgasm occurrence. 
These data suggest that the cognitive aspects of sexual involvement are critical to enhancing female orgasm experience and can aid the development of strategies that contemplate the central role

  19. High-Speed Automatic Microscopy for Real Time Tracks Reconstruction in Nuclear Emulsion

    Science.gov (United States)

    D'Ambrosio, N.

    2006-06-01

    The Oscillation Project with Emulsion-tRacking Apparatus (OPERA) experiment will use a massive nuclear emulsion detector to search for ν_μ → ν_τ oscillation by identifying τ leptons through the direct detection of their decay topology. The feasibility of experiments using a large mass emulsion detector is linked to the impressive progress under way in the development of automatic emulsion analysis. A new generation of scanning systems requires the development of fast automatic microscopes for emulsion scanning and image analysis to reconstruct tracks of elementary particles. The paper presents the European Scanning System (ESS) developed in the framework of the OPERA collaboration.

  20. Automatic three-dimensional model for protontherapy of the eye: Preliminary results

    International Nuclear Information System (INIS)

    Bondiau, Pierre-Yves; Malandain, Gregoire; Chauvel, Pierre; Peyrade, Frederique; Courdi, Adel; Iborra, Nicole; Caujolle, Jean-Pierre; Gastaud, Pierre

    2003-01-01

    Recently, radiotherapy possibilities have been dramatically increased by software and hardware developments. Improvements in medical imaging devices have increased the importance of three-dimensional (3D) images, as complete examination of these data by a physician is not possible. Computer techniques are needed to present only the pertinent information for clinical applications. We describe a technique for automatic 3D reconstruction of the eye and CT scan merging with fundus photographs (retinography). The final result is a 'virtual eye' to guide ocular tumor protontherapy. First, we developed software to automatically detect the position of the eyeball, the optic nerve, and the lens in the CT scan, and obtained a 3D eye reconstruction using this automatic method. Second, we describe the retinography and demonstrate the projection of this modality. We then combine the retinography with the eye reconstructed from the CT scan to obtain a virtual eye. The result is a computer-rendered 3D scene placing a virtual eye within a skull reconstruction. The virtual eye can be useful for the simulation, planning, and control of ocular tumor protontherapy, and can be adapted to treatment planning to automatically detect the positions of the eye and organs at risk. It should be highlighted that all the image processing is fully automatic, allowing reproducible results, a useful property for conducting a consistent clinical validation. The automatic localization of organs at risk in a CT scan or an MRI could be of great interest for radiotherapy in the future: comparison of one patient at different times, comparison between treatment centers, pooling of results from different treatment centers, automatic generation of dose-volume histograms, comparison between different treatment plans for the same patient, and comparison between different patients at the same time. It will also be less time consuming.

  1. Predicting automatic speech recognition performance over communication channels from instrumental speech quality and intelligibility scores

    NARCIS (Netherlands)

    Gallardo, L.F.; Möller, S.; Beerends, J.

    2017-01-01

    The performance of automatic speech recognition based on coded-decoded speech heavily depends on the quality of the transmitted signals, determined by channel impairments. This paper examines relationships between speech recognition performance and measurements of speech quality and intelligibility

  2. Automatic Tortuosity-Based Retinopathy of Prematurity Screening System

    Science.gov (United States)

    Sukkaew, Lassada; Uyyanonvara, Bunyarit; Makhanov, Stanislav S.; Barman, Sarah; Pangputhipong, Pannet

    Retinopathy of Prematurity (ROP) is an infant disease characterized by increased dilation and tortuosity of the retinal blood vessels. Automatic tortuosity evaluation from retinal digital images is very useful to facilitate an ophthalmologist in the ROP screening and to prevent childhood blindness. This paper proposes a method to automatically classify the image into tortuous and non-tortuous. The process imitates expert ophthalmologists' screening by searching for clearly tortuous vessel segments. First, a skeleton of the retinal blood vessels is extracted from the original infant retinal image using a series of morphological operators. Next, we propose to partition the blood vessels recursively using an adaptive linear interpolation scheme. Finally, the tortuosity is calculated based on the curvature of the resulting vessel segments. The retinal images are then classified into two classes using segments characterized by the highest tortuosity. For an optimal set of training parameters the prediction is as high as 100%.
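The abstract above classifies vessel segments by a curvature-based tortuosity measure. As a rough illustration of the general idea (not the authors' actual metric), a minimal sketch can score a vessel centreline by the ratio of its arc length to its chord length, which is 1.0 for a straight segment and grows as the segment winds:

```python
import math

def tortuosity(points):
    """Toy tortuosity of a vessel centreline given as (x, y) points:
    arc length divided by straight-line (chord) distance. This is a
    simplified stand-in for the paper's curvature-based measure."""
    arc = sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))
    chord = math.dist(points[0], points[-1])
    return arc / chord if chord > 0 else float("inf")

straight = [(0, 0), (1, 0), (2, 0)]
wavy = [(0, 0), (1, 1), (2, 0), (3, 1), (4, 0)]
print(tortuosity(straight))  # 1.0
print(tortuosity(wavy))      # ~1.414
```

An image would first be skeletonised and partitioned into segments, as the abstract describes, before a score like this is applied to each segment.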

  3. Controlled versus automatic processes: which is dominant to safety? The moderating effect of inhibitory control.

    Directory of Open Access Journals (Sweden)

    Yaoshan Xu

    This study explores the precursors of employees' safety behaviors based on a dual-process model, which suggests that human behaviors are determined by both controlled and automatic cognitive processes. Employees' responses to a self-reported survey on safety attitudes capture their controlled cognitive processes, while the automatic associations concerning safety measured by an Implicit Association Test (IAT) reflect employees' automatic cognitive processes about safety. In addition, this study investigates the moderating effects of inhibition on the relationship between self-reported safety attitude and safety behavior, and on that between automatic associations towards safety and safety behavior. The results suggest significant main effects of self-reported safety attitude and automatic association on safety behaviors. Further, the interaction between self-reported safety attitude and inhibition and that between automatic association and inhibition each predict unique variance in safety behavior. Specifically, the safety behaviors of employees with a lower level of inhibitory control are influenced more by automatic association, whereas those of employees with a higher level of inhibitory control are guided more by self-reported safety attitudes. These results suggest that safety behavior is the joint outcome of both controlled and automatic cognitive processes, and that the relative importance of these processes depends on employees' individual differences in inhibitory control. The implications of these findings for theoretical and practical issues are discussed at the end.

  4. Microprocessor controlled system for automatic and semi-automatic syntheses of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Ruth, T.J.; Adam, M.J.; Morris, D.; Jivan, S.

    1986-01-01

    A computer based system has been constructed to control the automatic synthesis of 2-deoxy-2-[¹⁸F]fluoro-D-glucose and is also being used in the development of an automatic synthesis of L-6-[¹⁸F]fluorodopa. (author)

  5. Design and Fabrication of Automatic Glass Cutting Machine

    Science.gov (United States)

    Veena, T. R.; Kadadevaramath, R. S.; Nagaraj, P. M.; Madhusudhan, S. V.

    2016-09-01

    This paper deals with the design and fabrication of an automatic glass or mirror cutting machine. In order to increase the accuracy of cut and the production rate, and to decrease the production time and the accidents caused by manual cutting of mirror or glass, this project aims at the development of an automatic machine which uses a programmable logic controller (PLC) for controlling the movement of the conveyor and also for controlling the pneumatic circuit. In this machine, the work of the operator is to load and unload the mirror. The cutter used in this machine is a carbide wheel with its cutting edge ground to a V-shaped profile. The PLC controls the pneumatic cylinder, which in turn actuates the cutter along the glass; a fracture layer is formed, with a rib mark below the fracture layer and a crack below the rib mark. The machine elements are designed using CATIA V5R20 and the pneumatic circuit is designed using FESTO FLUID SIM software.

  6. The Rationalization of Automatic Units for HPDC Technology

    Directory of Open Access Journals (Sweden)

    A. Herman

    2012-04-01

    The paper deals with the problem of optimally using an automatic workplace for HPDC technology, mainly from the aspects of operation sequence, efficiency of the work cycle, and planning of the use and servicing of the HPDC casting machine. Possible ways to analyse automatic units for HPDC are presented. The experimental part was focused on the rationalization of the current work cycle time for die casting of an aluminium alloy. The workplace is described in detail in the project. The measurements were carried out in detail, with charts and graphs mapping the cycle of the casting workplace. Other parameters and settings were also identified. Proposals for improvements were made after the first measurements, and these improvements were subsequently verified. The main actions were mainly software modifications of the casting center. This is because today's sophisticated workplaces allow a relatively wide range of modifications without any physical changes to the machines themselves. It is possible to change settings or unlock some unsatisfactory parameters.

  7. Shape: automatic conformation prediction of carbohydrates using a genetic algorithm

    Directory of Open Access Journals (Sweden)

    Rosen Jimmy

    2009-09-01

    Background: Detailed experimental three dimensional structures of carbohydrates are often difficult to acquire. Molecular modelling and computational conformation prediction are therefore commonly used tools for three dimensional structure studies. Modelling procedures generally require significant training and computing resources, which is often impractical for most experimental chemists and biologists. Shape has been developed to improve the availability of modelling in this field. Results: The Shape software package has been developed for simplicity of use and conformation prediction performance. A trivial user interface coupled to an efficient genetic algorithm conformation search makes it a powerful tool for automated modelling. Carbohydrates up to a few hundred atoms in size can be investigated on common computer hardware. It has been shown to perform well for the prediction of over four hundred bioactive oligosaccharides, and to compare favourably with previously published studies on carbohydrate conformation prediction. Conclusion: The Shape fully automated conformation prediction can be used by scientists who lack significant modelling training, and performs well on computing hardware such as laptops and desktops. It can also be deployed on computer clusters for increased capacity. The prediction accuracy under the default settings is good, as it agrees well with experimental data and previously published conformation prediction studies. This software is available both as open source and under commercial licenses.
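To give a feel for the genetic-algorithm search strategy the abstract refers to, here is a minimal, generic sketch (not Shape's actual implementation) that evolves a set of torsion angles to minimise a caller-supplied energy function; the population size, operators, and toy energy are all placeholder assumptions:

```python
import random

def evolve(energy, n_genes, pop_size=30, gens=100, mut=0.3, seed=1):
    """Minimal genetic algorithm over torsion angles (degrees).
    Truncation selection keeps the best half, one-point crossover
    recombines parents, and mutation randomly resets one angle."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-180, 180) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=energy)
        parents = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_genes) if n_genes > 1 else 0
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < mut:              # random-reset mutation
                child[rng.randrange(n_genes)] = rng.uniform(-180, 180)
            children.append(child)
        pop = parents + children
    return min(pop, key=energy)

# Toy "energy" surface: minimised when every torsion sits at 60 degrees.
best = evolve(lambda t: sum((x - 60) ** 2 for x in t), n_genes=3)
print(best)
```

In a real conformation search the energy function would be a molecular force field evaluated on the full 3D structure rather than this toy quadratic.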

  8. Automatic sprinkler system performance and reliability in United States Department of Energy Facilities, 1952 to 1980

    International Nuclear Information System (INIS)

    1982-06-01

    The automatic sprinkler system experiences of the United States Department of Energy and its predecessor agencies are analyzed. Based on accident and incident files in the Office of Operational Safety and on supplementary responses, 587 incidents, including over 100 fires, are analyzed. Tables and figures, with supplementary narratives, discuss fire experience by various categories such as number of heads operating, type of system, dollar losses, failures, extinguished vs. controlled, and types of sprinkler heads. Use is made of extreme value projections and frequency-severity plots to compare past experience and predict future experience. Non-fire incidents are analyzed in a similar manner by cause, system type and failure type. Discussion of no-loss incidents and non-fire protection water systems is included. The author's conclusions and recommendations, and appendices listing the survey methodology, major incidents, and a bibliography, are included.

  9. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e., automatically controlling the virtual...

  10. Man-system interface based on automatic speech recognition: integration to a virtual control desk

    Energy Technology Data Exchange (ETDEWEB)

    Jorge, Carlos Alexandre F.; Mol, Antonio Carlos A.; Pereira, Claudio M.N.A.; Aghina, Mauricio Alves C., E-mail: calexandre@ien.gov.b, E-mail: mol@ien.gov.b, E-mail: cmnap@ien.gov.b, E-mail: mag@ien.gov.b [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Nomiya, Diogo V., E-mail: diogonomiya@gmail.co [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil)

    2009-07-01

    This work reports the implementation of a man-system interface based on automatic speech recognition, and its integration into a virtual nuclear power plant control desk. The latter is intended to reproduce a real control desk using virtual reality technology, for operator training and ergonomic evaluation purposes. An automatic speech recognition system was developed to serve as a new interface with users, replacing the computer keyboard and mouse. Users can operate this virtual control desk in front of a computer monitor or a projection screen through spoken commands. The automatic speech recognition interface developed is based on a well-known signal processing technique named cepstral analysis, and on artificial neural networks. The speech recognition interface is described, along with its integration with the virtual control desk, and results are presented. (author)
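Cepstral analysis, the front-end technique named in the abstract, turns a speech frame into a compact set of coefficients that a neural network can classify. A hedged sketch of the basic real-cepstrum computation (illustrative only; the paper's exact feature pipeline is not specified here):

```python
import numpy as np

def real_cepstrum(frame):
    """Real cepstrum of one windowed speech frame: the inverse FFT of
    the log magnitude spectrum. The low-order coefficients summarise
    the spectral envelope and are typical inputs to a recogniser."""
    spectrum = np.fft.fft(frame)
    log_mag = np.log(np.abs(spectrum) + 1e-12)   # epsilon avoids log(0)
    return np.fft.ifft(log_mag).real

# A synthetic 30 ms "frame" at 8 kHz: a 200 Hz tone under a Hamming window.
fs, dur = 8000, 0.03
t = np.arange(int(fs * dur)) / fs
frame = np.sin(2 * np.pi * 200 * t) * np.hamming(len(t))
coeffs = real_cepstrum(frame)[:13]               # keep the first 13 coefficients
print(coeffs.shape)  # (13,)
```

Feature vectors like `coeffs`, computed frame by frame, would then be fed to the artificial neural network that maps them to spoken commands.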

  11. Man-system interface based on automatic speech recognition: integration to a virtual control desk

    International Nuclear Information System (INIS)

    Jorge, Carlos Alexandre F.; Mol, Antonio Carlos A.; Pereira, Claudio M.N.A.; Aghina, Mauricio Alves C.; Nomiya, Diogo V.

    2009-01-01

    This work reports the implementation of a man-system interface based on automatic speech recognition, and its integration into a virtual nuclear power plant control desk. The latter is intended to reproduce a real control desk using virtual reality technology, for operator training and ergonomic evaluation purposes. An automatic speech recognition system was developed to serve as a new interface with users, replacing the computer keyboard and mouse. Users can operate this virtual control desk in front of a computer monitor or a projection screen through spoken commands. The automatic speech recognition interface developed is based on a well-known signal processing technique named cepstral analysis, and on artificial neural networks. The speech recognition interface is described, along with its integration with the virtual control desk, and results are presented. (author)

  12. Test of a potential link between analytic and nonanalytic category learning and automatic, effortful processing.

    Science.gov (United States)

    Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J

    2001-08-01

    The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that, contrary to prediction, strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.

  13. Motor automaticity in Parkinson’s disease

    Science.gov (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

    Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor automaticity associated motor deficits in PD, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigations on the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of using measures of automaticity in early diagnosis of PD would be valuable. PMID:26102020

  14. Study of an automatic dosing of neptunium in the industrial process of separation neptunium 237-plutonium 238

    International Nuclear Information System (INIS)

    Ros, Pierre

    1973-01-01

    The objective is to study and adapt a method of automatic dosing of neptunium to the industrial process of separation and purification of plutonium 238, while taking information quality and economic aspects into account. After a recall of some generalities on the production of plutonium 238 and the plutonium-neptunium separation process, the author addresses the dosing of neptunium. The adopted measurement technique is spectrophotometry (of neptunium, of neptunium peroxide), which is the most flexible and economical to adapt to automatic control. The author proposes a design for an automatic chemical analyser, and discusses the complex (stoichiometry, form) and some aspects of neptunium dosing (redox reactions, process control). [fr]

  15. Automatic determination of L/H transition times in DIII-D through a collaborative distributed environment

    International Nuclear Information System (INIS)

    Farias, G.; Vega, J.; González, S.; Pereira, A.; Lee, X.; Schissel, D.; Gohil, P.

    2012-01-01

    Highlights: ► An automatic predictor of L/H transition times has been implemented for the DIII-D tokamak. ► The system predicts the transition combining two techniques: a morphological pattern recognition algorithm and a support vector machines multi-layer model. ► The predictor is employed within a collaborative distributed computing environment. The system is trained remotely in the Ciemat computer cluster and operated on the DIII-D site. - Abstract: An automatic predictor of L/H transition times has been implemented for the DIII-D tokamak. The system predicts the transition combining two techniques: A morphological pattern recognition algorithm, which estimates the transition based on the waveform of a Dα emission signal, and a support vector machines multi-layer model, which predicts the L/H transition using a non-parametric model. The predictor is employed within a collaborative distributed computing environment. The system is trained remotely in the Ciemat computer cluster and operated on the DIII-D site.
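The morphological part of the predictor above looks for the characteristic change in the Dα emission waveform at the L/H transition. As a rough, hedged illustration of that step only (the real system also uses a multi-layer SVM model, and this synthetic signal is an assumption), a transition candidate can be located as the largest drop between adjacent averaging windows:

```python
def transition_candidate(signal, times, window=5):
    """Toy stand-in for the morphological step: the L-H transition
    shows up as a sharp drop in D-alpha emission, so return the time
    at which the mean over the preceding window exceeds the mean over
    the following window by the largest amount."""
    best_t, best_drop = None, 0.0
    for i in range(window, len(signal) - window):
        before = sum(signal[i - window:i]) / window
        after = sum(signal[i:i + window]) / window
        drop = before - after
        if drop > best_drop:
            best_t, best_drop = times[i], drop
    return best_t

# Synthetic D-alpha trace: high recycling in L-mode, sharp drop at t = 50.
times = list(range(100))
signal = [1.0] * 50 + [0.3] * 50
print(transition_candidate(signal, times))  # 50
```

On real, noisy discharges the candidate from a detector like this would be combined with the SVM prediction rather than used alone.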

  16. Automaticity and localisation of concurrents predicts colour area activity in grapheme-colour synaesthesia.

    Science.gov (United States)

    Gould van Praag, Cassandra D; Garfinkel, Sarah; Ward, Jamie; Bor, Daniel; Seth, Anil K

    2016-07-29

    In grapheme-colour synaesthesia (GCS), the presentation of letters or numbers induces an additional 'concurrent' experience of colour. Early functional MRI (fMRI) investigations of GCS reported activation in colour-selective area V4 during the concurrent experience. However, others have failed to replicate this key finding. We reasoned that individual differences in synaesthetic phenomenology might explain this inconsistency in the literature. To test this hypothesis, we examined fMRI BOLD responses in a group of grapheme-colour synaesthetes (n=20) and matched controls (n=20) while characterising the individual phenomenology of the synaesthetes along dimensions of 'automaticity' and 'localisation'. We used an independent functional localiser to identify colour-selective areas in both groups. Activations in these areas were then assessed during achromatic synaesthesia-inducing, and non-inducing conditions; we also explored whole brain activations, where we sought to replicate the existing literature regarding synaesthesia effects. Controls showed no significant activations in the contrast of inducing > non-inducing synaesthetic stimuli, in colour-selective ROIs or at the whole brain level. In the synaesthete group, we correlated activation within colour-selective ROIs with individual differences in phenomenology using the Coloured Letters and Numbers (CLaN) questionnaire which measures, amongst other attributes, the subjective automaticity/attention in synaesthetic concurrents, and their spatial localisation. Supporting our hypothesis, we found significant correlations between individual measures of synaesthetic phenomenology and BOLD responses in colour-selective areas, when contrasting inducing against non-inducing stimuli. Specifically, left-hemisphere colour area responses were stronger for synaesthetes scoring high on phenomenological localisation and automaticity/attention, while right-hemisphere colour area responses showed a relationship with localisation

  17. A Hybrid Instance Selection Using Nearest-Neighbor for Cross-Project Defect Prediction

    Institute of Scientific and Technical Information of China (English)

    Duksan Ryu; Jong-In Jang; Jongmoon Baik

    2015-01-01

    Software defect prediction (SDP) is an active research field in software engineering that aims to identify defect-prone modules. Thanks to SDP, limited testing resources can be effectively allocated to defect-prone modules. Although SDP requires sufficient local data within a company, there are cases where local data are not available, e.g., pilot projects. Companies without local data can employ cross-project defect prediction (CPDP) using external data to build classifiers. The major challenge of CPDP is the different distributions between training and test data. To tackle this, instances of source data similar to target data are selected to build classifiers. Software datasets have a class imbalance problem, meaning the ratio of the defective class to the clean class is very low, which usually lowers the performance of classifiers. We propose a Hybrid Instance Selection Using Nearest-Neighbor (HISNN) method that performs a hybrid classification, selectively learning local knowledge (via k-nearest neighbor) and global knowledge (via naïve Bayes). Instances having strong local knowledge are identified via nearest neighbors with the same class label. Previous studies showed low PD (probability of detection) or high PF (probability of false alarm), which is impractical to use. The experimental results show that HISNN produces high overall performance as well as high PD and low PF.
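The core idea in the abstract, trusting local nearest-neighbor knowledge when it is strong and falling back to a global model otherwise, can be sketched in a few lines. This is an illustrative simplification under stated assumptions (Euclidean distance, unanimity as the "strong local knowledge" test, a caller-supplied global classifier), not the authors' implementation:

```python
import math
from collections import Counter

def knn_labels(x, data, k):
    """Labels of the k training instances nearest to x (Euclidean)."""
    ranked = sorted(data, key=lambda d: math.dist(x, d[0]))
    return [label for _, label in ranked[:k]]

def hybrid_predict(x, data, k=3, global_model=None):
    """Trust local knowledge when x's neighbourhood is unanimous,
    otherwise fall back to a global classifier (here any function;
    the paper uses naive Bayes), or to a majority vote if none given."""
    counts = Counter(knn_labels(x, data, k))
    label, votes = counts.most_common(1)[0]
    if votes == k:                  # strong local knowledge: unanimous neighbours
        return label
    return global_model(x) if global_model else label

train = [((0, 0), "clean"), ((0, 1), "clean"), ((1, 0), "clean"),
         ((5, 5), "defect"), ((5, 6), "defect"), ((6, 5), "defect")]
print(hybrid_predict((0.2, 0.3), train))  # clean
print(hybrid_predict((5.5, 5.2), train))  # defect
```

In a CPDP setting, `train` would be the cross-project source instances selected for their similarity to the target project's data.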

  18. Implementation of a microcontroller-based semi-automatic coagulator.

    Science.gov (United States)

    Chan, K; Kirumira, A; Elkateeb, A

    2001-01-01

    The coagulator is an instrument used in hospitals to detect clot formation as a function of time. Generally, these coagulators are very expensive and therefore not affordable by a doctors' office and small clinics. The objective of this project is to design and implement a low cost semi-automatic coagulator (SAC) prototype. The SAC is capable of assaying up to 12 samples and can perform the following tests: prothrombin time (PT), activated partial thromboplastin time (APTT), and PT/APTT combination. The prototype has been tested successfully.

  19. Automatic Photoelectric Telescope Service

    International Nuclear Information System (INIS)

    Genet, R.M.; Boyd, L.J.; Kissell, K.E.; Crawford, D.L.; Hall, D.S. (BDM Corp., McLean, VA; Kitt Peak National Observatory, Tucson, AZ; Dyer Observatory, Nashville, TN)

    1987-01-01

    Automatic observatories have the potential of gathering sizable amounts of high-quality astronomical data at low cost. The Automatic Photoelectric Telescope Service (APT Service) has realized this potential and is routinely making photometric observations of a large number of variable stars. However, without observers to provide on-site monitoring, it was necessary to incorporate special quality checks into the operation of the APT Service at its multiple automatic telescope installation on Mount Hopkins. 18 references

  20. Automatic imitation: A meta-analysis.

    Science.gov (United States)

    Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel

    2018-05-01

    Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of gz = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  1. FURTHER CONSIDERATIONS ON SPREADSHEET-BASED AUTOMATIC TREND LINES

    Directory of Open Access Journals (Sweden)

    DANIEL HOMOCIANU

    2015-12-01

    Most of today's business applications working with data sets allow exports to spreadsheet format. This reflects both the familiarity of typical business users with such products and the possibility of coupling their data with a tool offering many models, functions and ways to process and represent data, thereby obtaining something dynamic and far more useful than a simple static report. The purpose of Business Intelligence is to identify clusters, profiles, association rules, decision trees and many other patterns or even behaviours, but also to generate alerts for exceptions, determine trends and make predictions about the future based on historical data. In this context, the paper shows some practical results obtained after testing both the automatic creation of scatter charts and trend lines corresponding to the user's preferences and the automatic suggestion of the most appropriate trend for the tested data, based mostly on a statistical measure of how closely the data fit the regression function.
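The "suggest the most appropriate trend" idea above amounts to fitting several candidate trend models and keeping the one whose regression fits the data most closely. A minimal sketch under simplifying assumptions (only two candidate trends, compared by R², with the exponential trend fitted on log y as spreadsheet trend lines do):

```python
import math

def linfit(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b, R^2)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot

def best_trend(xs, ys):
    """Name the closer of a linear and an exponential trend by R^2.
    Note: comparing R^2 across the raw and log scales is a heuristic,
    which is all this toy suggester needs."""
    _, _, r2_lin = linfit(xs, ys)
    _, _, r2_exp = linfit(xs, [math.log(y) for y in ys])
    return "exponential" if r2_exp > r2_lin else "linear"

xs = [1, 2, 3, 4, 5]
print(best_trend(xs, [2, 4, 8, 16, 32]))   # exponential
print(best_trend(xs, [3, 5, 7, 9, 11]))    # linear
```

A fuller suggester would add polynomial, logarithmic and power trends to the candidate set in the same way.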

  2. GSM Web-Based Centralized Remote Wireless Automatic Controlling and Monitoring of Aquafeeder

    Science.gov (United States)

    Wong, C. L.; Idris, A.; Hasan, Z.

    2016-03-01

    This project is about producing a prototype to feed fishes at fish ponds in remote locations with the use of a GSM mobile phone. An automatic fish feeder is an electric device that has been designed to give out the right amount of pellets at the designated time. In this project, the automatic feeder designed consists of photovoltaic solar cells that are used to generate electricity and store it in batteries. Solar charge controllers can be used to determine the rate at which current is drawn from and added to the batteries. GSM cellular communication is used to allow the user to control from a distance. Commands or instructions are sent to the operating system, which in turn runs the servomotor and blower, blowing a certain amount of fish pellets into the pond to feed the fishes. The duration of the feeding process is fixed by the user, hence the amount of fish food pellets released is precisely the same each time. This technology is especially useful for fish farmers where they can remotely feed their fishes.

  3. AD Model Builder: using automatic differentiation for statistical inference of highly parameterized complex nonlinear models

    DEFF Research Database (Denmark)

    Fournier, David A.; Skaug, Hans J.; Ancheta, Johnoel

    2011-01-01

    Many criteria for statistical parameter estimation, such as maximum likelihood, are formulated as a nonlinear optimization problem. Automatic Differentiation Model Builder (ADMB) is a programming framework based on automatic differentiation, aimed at highly nonlinear models with a large number...... of such a feature is the generic implementation of Laplace approximation of high-dimensional integrals for use in latent variable models. We also review the literature in which ADMB has been used, and discuss future development of ADMB as an open source project. Overall, the main advantages of ADMB are flexibility...
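
    ADMB itself is a C++ framework built on (reverse-mode) automatic differentiation. As a minimal, language-agnostic illustration of the underlying idea — derivatives computed exactly alongside values, with no finite differences — here is a forward-mode sketch using dual numbers; it is not ADMB code.

```python
class Dual:
    """Minimal forward-mode AD value: carries f(x) and f'(x) together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f(x) and df/dx in one pass by seeding the derivative with 1."""
    out = f(Dual(x, 1.0))
    return out.val, out.dot

# d/dx [x * (x + 3)] = 2x + 3, so at x = 2 the derivative is 7
val, grad = derivative(lambda x: x * (x + 3), 2.0)
```

    Gradient-based optimizers for maximum likelihood use exactly such machine-accurate derivatives; reverse mode extends the idea efficiently to many parameters.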

  4. First Steps Towards the Automatic Construction of Argument-Diagrams from Real Discussions

    NARCIS (Netherlands)

    Verbree, Daan; Rienks, R.J.; Heylen, Dirk K.J.; Dunne, P.; Bench-Capon, T.J.E.

    This paper presents our efforts to create argument structures from meeting transcripts automatically. We show that unit labels of argument diagrams can be learnt and predicted by a computer with an accuracy of 78.52% and 51.43% on an unbalanced and balanced set respectively. We used a corpus of over

  5. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre; Fluhr, Christian.

    1975-06-01

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet team from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of the automatic building of thesauri and automatic classification are examined [fr

  6. Diagnosis - Using automatic test equipment and artificial intelligence expert systems

    Science.gov (United States)

    Ramsey, J. E., Jr.

    Three expert systems (ATEOPS, ATEFEXPERS, and ATEFATLAS), which were created to direct automatic test equipment (ATE), are reviewed. The purpose of the project was to develop an expert system to troubleshoot the converter-programmer power supply card for the F-15 aircraft and have that expert system direct the automatic test equipment. Each expert system uses a different knowledge base or inference engine, basing the testing on the circuit schematic, test requirements document, or ATLAS code. Implementing generalized modules allows the expert systems to be used for any different unit under test. Using converted ATLAS to LISP code allows the expert system to direct any ATE using ATLAS. The constraint propagated frame system allows for the expansion of control by creating the ATLAS code, checking the code for good software engineering techniques, directing the ATE, and changing the test sequence as needed (planning).

  7. Automatic and strategic effects in the guidance of attention by working memory representations.

    Science.gov (United States)

    Carlisle, Nancy B; Woodman, Geoffrey F

    2011-06-01

    Theories of visual attention suggest that working memory representations automatically guide attention toward memory-matching objects. Some empirical tests of this prediction have produced results consistent with working memory automatically guiding attention. However, others have shown that individuals can strategically control whether working memory representations guide visual attention. Previous studies have not independently measured automatic and strategic contributions to the interactions between working memory and attention. In this study, we used a classic manipulation of the probability of valid, neutral, and invalid cues to tease apart the nature of such interactions. This framework utilizes measures of reaction time (RT) to quantify the costs and benefits of attending to memory-matching items and infer the relative magnitudes of automatic and strategic effects. We found both costs and benefits even when the memory-matching item was no more likely to be the target than other items, indicating an automatic component of attentional guidance. However, the costs and benefits essentially doubled as the probability of a trial with a valid cue increased from 20% to 80%, demonstrating a potent strategic effect. We also show that the instructions given to participants led to a significant change in guidance distinct from the actual probability of events during the experiment. Together, these findings demonstrate that the influence of working memory representations on attention is driven by both automatic and strategic interactions. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. Control System Design for Automatic Cavity Tuning Machines

    Energy Technology Data Exchange (ETDEWEB)

    Carcagno, R.; Khabiboulline, T.; Kotelnikov, S.; Makulski, A.; Nehring, R.; Nogiec, J.; Ross, M.; Schappert, W.; /Fermilab; Goessel, A.; Iversen, J.; Klinke, D.; /DESY

    2009-05-01

    A series of four automatic tuning machines for 9-cell TESLA-type cavities are being developed and fabricated in a collaborative effort among DESY, FNAL, and KEK. These machines are intended to support high-throughput cavity fabrication for construction of large SRF-based accelerator projects. Two of these machines will be delivered to cavity vendors for the tuning of XFEL cavities. The control system for these machines must support a high level of automation adequate for industrial use by non-expert operators. This paper describes the control system hardware and software design for these machines.

  9. Control System Design for Automatic Cavity Tuning Machines

    International Nuclear Information System (INIS)

    Carcagno, R.; Khabiboulline, T.; Kotelnikov, S.; Makulski, A.; Nehring, R.; Nogiec, J.; Ross, M.; Schappert, W.; Goessel, A.; Iversen, J.; Klinke, D.

    2009-01-01

    A series of four automatic tuning machines for 9-cell TESLA-type cavities are being developed and fabricated in a collaborative effort among DESY, FNAL, and KEK. These machines are intended to support high-throughput cavity fabrication for construction of large SRF-based accelerator projects. Two of these machines will be delivered to cavity vendors for the tuning of XFEL cavities. The control system for these machines must support a high level of automation adequate for industrial use by non-expert operators. This paper describes the control system hardware and software design for these machines.

  10. Device for the automatic evaluation of pencil dosimeters

    International Nuclear Information System (INIS)

    Schallopp, B.

    1976-01-01

    In connection with the automation of radiation protection in nuclear power plants, an automatic reading device has been developed for the direct input of the readings of pencil dosimeters into a computer. Voltage measurements would be simple but are excluded, because the internal electrode of the dosimeter must not be touched, for operational reasons. This paper describes an optical/electronic conversion device in which the reading of the dosimeter is projected onto a Vidicon, scanned, and converted into a digital signal for output to the computer. (orig.) [de

  11. Automatic and controlled processing and the Broad Autism Phenotype.

    Science.gov (United States)

    Camodeca, Amy; Voelker, Sylvia

    2016-01-30

    Research related to verbal fluency in the Broad Autism Phenotype (BAP) is limited and dated, but generally suggests intact abilities in the context of weaknesses in other areas of executive function (Hughes et al., 1999; Wong et al., 2006; Delorme et al., 2007). Controlled processing, the generation of search strategies after initial, automated responses are exhausted (Spat, 2013), has yet to be investigated in the BAP, and may be evidenced in verbal fluency tasks. One hundred twenty-nine participants completed the Delis-Kaplan Executive Function System Verbal Fluency test (D-KEFS; Delis et al., 2001) and the Broad Autism Phenotype Questionnaire (BAPQ; Hurley et al., 2007). The BAP group (n=53) produced significantly fewer total words during the 2nd 15" interval compared to the Non-BAP (n=76) group. Partial correlations indicated similar relations between verbal fluency variables for each group. Regression analyses predicting 2nd 15" interval scores suggested differentiation between controlled and automatic processing skills in both groups. Results suggest adequate automatic processing, but slowed development of controlled processing strategies in the BAP, and provide evidence for similar underlying cognitive constructs for both groups. Controlled processing was predictive of Block Design score for Non-BAP participants, and was predictive of Pragmatic Language score on the BAPQ for BAP participants. These results are similar to past research related to strengths and weaknesses in the BAP, respectively, and suggest that controlled processing strategy use may be required in instances of weak lower-level skills. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  12. Short-term prediction of local wind conditions

    DEFF Research Database (Denmark)

    Landberg, L.

    2001-01-01

    This paper will describe a system which predicts the expected power output of a number of wind farms. The system is automatic and operates on-line. The paper will quantify the accuracy of the predictions and will also give examples of the performance for specific storm events. An actual...

  13. Development of a chain limber and its measuring automatics; Karsimakoneen ja sen mittausautomatiikan kehittaeminen

    Energy Technology Data Exchange (ETDEWEB)

    Poeytaesaari, E [Eskon Paja, Kinnula (Finland)

    1997-12-01

    A new control system and measuring automatics are developed for a patented chain limber mountable to a farm tractor. The chain limber produces pulp wood and also limbed fuel logs. The project will be carried out in three stages: definition of the control system, development of the control system, and operational testing of the control system and the chain limber. The final stage of the project will be carried out in co-operation with the Work Efficiency Association. (orig.)

  14. Automatic gender detection of dream reports: A promising approach.

    Science.gov (United States)

    Wong, Christina; Amini, Reza; De Koninck, Joseph

    2016-08-01

    A computer program was developed in an attempt to differentiate the dreams of males from females. Hypothesized gender predictors were based on previous literature concerning both dream content and written language features. Dream reports from home-collected dream diaries of 100 male (144 dreams) and 100 female (144 dreams) adolescent Anglophones were matched for equal length. They were first scored with the Hall and Van de Castle (HVDC) scales and quantified using DreamSAT. Two male and two female undergraduate students were asked to read all dreams and predict the dreamer's gender. They averaged a pairwise percent correct gender prediction of 75.8% (κ=0.516), while the Automatic Analysis showed that the computer program's accuracy was 74.5% (κ=0.492), both of which were higher than chance of 50% (κ=0.00). The prediction levels were maintained when dreams containing obvious gender identifiers were eliminated and integration of HVDC scales did not improve prediction. Copyright © 2016 Elsevier Inc. All rights reserved.
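
    The reported κ values are chance-corrected agreement. A minimal sketch of the computation, assuming the task is balanced two-class classification so that chance accuracy p_e = 0.5 (the study matched 100 male and 100 female dreamers):

```python
def cohens_kappa(observed_accuracy, chance_accuracy):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)."""
    return (observed_accuracy - chance_accuracy) / (1 - chance_accuracy)

# With two balanced classes, chance accuracy p_e is 0.5, so the judges'
# 75.8% correct corresponds to kappa = (0.758 - 0.5) / 0.5 = 0.516
human_kappa = cohens_kappa(0.758, 0.5)
program_kappa = cohens_kappa(0.745, 0.5)
```

    This reproduces the human judges' κ = 0.516 exactly and the program's κ = 0.492 to rounding, and makes clear why 50% accuracy corresponds to κ = 0.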

  15. Predicting the 10-Year Risks of Atherosclerotic Cardiovascular Disease in Chinese Population: The China-PAR Project (Prediction for ASCVD Risk in China).

    Science.gov (United States)

    Yang, Xueli; Li, Jianxin; Hu, Dongsheng; Chen, Jichun; Li, Ying; Huang, Jianfeng; Liu, Xiaoqing; Liu, Fangchao; Cao, Jie; Shen, Chong; Yu, Ling; Lu, Fanghong; Wu, Xianping; Zhao, Liancheng; Wu, Xigui; Gu, Dongfeng

    2016-11-08

    The accurate assessment of individual risk can be of great value to guiding and facilitating the prevention of atherosclerotic cardiovascular disease (ASCVD). However, prediction models in common use were formulated primarily in white populations. The China-PAR project (Prediction for ASCVD Risk in China) is aimed at developing and validating 10-year risk prediction equations for ASCVD from 4 contemporary Chinese cohorts. Two prospective studies followed up together with a unified protocol were used as the derivation cohort to develop 10-year ASCVD risk equations in 21 320 Chinese participants. The external validation was evaluated in 2 independent Chinese cohorts with 14 123 and 70 838 participants. Furthermore, model performance was compared with the Pooled Cohort Equations reported in the American College of Cardiology/American Heart Association guideline. Over 12 years of follow-up in the derivation cohort with 21 320 Chinese participants, 1048 subjects developed a first ASCVD event. Sex-specific equations had C statistics of 0.794 (95% confidence interval, 0.775-0.814) for men and 0.811 (95% confidence interval, 0.787-0.835) for women. The predicted rates were similar to the observed rates, as indicated by a calibration χ² of 13.1 for men (P=0.16) and 12.8 for women (P=0.17). Good internal and external validations of our equations were achieved in subsequent analyses. Compared with the Chinese equations, the Pooled Cohort Equations had lower C statistics and much higher calibration χ² values in men. Our project developed effective tools with good performance for 10-year ASCVD risk prediction among a Chinese population that will help to improve the primary prevention and management of cardiovascular disease. © 2016 American Heart Association, Inc.
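
    The C statistic quoted above is a concordance probability. The sketch below computes it for the simplified case of a binary outcome without censoring (the study itself uses time-to-event data, which this ignores); it is an illustration, not the China-PAR analysis code.

```python
def c_statistic(scores, outcomes):
    """Concordance (C statistic; equals AUC for binary outcomes): the
    probability that a randomly chosen subject with the event received a
    higher predicted risk than one without it. Ties count half."""
    events = [s for s, y in zip(scores, outcomes) if y == 1]
    nonevents = [s for s, y in zip(scores, outcomes) if y == 0]
    concordant = sum((e > n) + 0.5 * (e == n)
                     for e in events for n in nonevents)
    return concordant / (len(events) * len(nonevents))
```

    For example, with risks [0.9, 0.8, 0.3, 0.2] and outcomes [1, 0, 1, 0], three of the four event/non-event pairs are concordant, giving C = 0.75.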

  16. Automatic assessment of average diaphragm motion trajectory from 4DCT images through machine learning.

    Science.gov (United States)

    Li, Guang; Wei, Jie; Huang, Hailiang; Gaebler, Carl Philipp; Yuan, Amy; Deasy, Joseph O

    2015-12-01

    To automatically estimate average diaphragm motion trajectory (ADMT) based on four-dimensional computed tomography (4DCT), facilitating clinical assessment of respiratory motion and motion variation and retrospective motion study. We have developed an effective motion extraction approach and a machine-learning-based algorithm to estimate the ADMT. Eleven patients with 22 sets of 4DCT images (4DCT1 at simulation and 4DCT2 at treatment) were studied. After automatically segmenting the lungs, the differential volume-per-slice (dVPS) curves of the left and right lungs were calculated as a function of slice number for each phase with respect to the full-exhalation. After a 5-slice moving average was performed, the discrete cosine transform (DCT) was applied to analyze the dVPS curves in the frequency domain. The dimensionality of the spectrum data was reduced by using several lowest frequency coefficients (f_v) to account for most of the spectrum energy (Σf_v²). The multiple linear regression (MLR) method was then applied to determine the weights of these frequencies by fitting the ground truth, the measured ADMT, represented by three pivot points of the diaphragm on each side. The 'leave-one-out' cross validation method was employed to analyze the statistical performance of the prediction results in three image sets: 4DCT1, 4DCT2, and 4DCT1 + 4DCT2. Seven lowest frequencies in the DCT domain were found to be sufficient to approximate the patient dVPS curves (R = 91%-96% in MLR fitting). The mean error in the predicted ADMT using the leave-one-out method was 0.3 ± 1.9 mm for the left-side diaphragm and 0.0 ± 1.4 mm for the right-side diaphragm. The prediction error is lower in 4DCT2 than 4DCT1, and is the lowest in 4DCT1 and 4DCT2 combined. This frequency-analysis-based machine learning technique was employed to predict the ADMT automatically with an acceptable error (0.2 ± 1.6 mm). This volumetric approach is not affected by the presence of the lung tumors
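
    The frequency-analysis step can be sketched as follows: compute the lowest DCT-II coefficients of a sampled curve and reconstruct from them. This is an illustrative reimplementation of the dimensionality-reduction idea only; the lung segmentation and the MLR weighting against measured pivot points are omitted.

```python
import math

def dct_coefficients(signal, n_keep):
    """First n_keep DCT-II coefficients; for smooth curves such as dVPS
    profiles, the lowest frequencies carry most of the spectrum energy."""
    N = len(signal)
    return [sum(signal[n] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n in range(N))
            for k in range(n_keep)]

def idct(coeffs, N):
    """Inverse (DCT-III) reconstruction of N samples from kept coefficients."""
    return [coeffs[0] / N + (2.0 / N) * sum(
                coeffs[k] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for k in range(1, len(coeffs)))
            for n in range(N)]
```

    Truncating a smooth curve to its 7 lowest coefficients and reconstructing gives a close approximation; the kept coefficients then serve as the regression features.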

  17. Implementation of automatic protection switching in an optical cross connect

    OpenAIRE

    Uy, Jason

    2005-01-01

    Having a reliable network is a hard requirement for telecommunication companies when deploying new networks. Service providers and enterprise customers lose a lot of money any time an interruption of internet service occurs. The SONET/SDH specification specifies several different types of topology that support redundancy. An Automatic Protection Switching (APS) mechanism is specified for each topology to dictate how a network behaves in a failure event. For this project, a software implementa...

  18. Effect of automatic recirculation flow control on the transient response for Lungmen ABWR plant

    Energy Technology Data Exchange (ETDEWEB)

    Tzang, Y.-C., E-mail: yctzang@aec.gov.t [National Tsing Hua University, Department of Engineering and System Science, Hsinchu 30013, Taiwan (China); Chiang, R.-F.; Ferng, Y.-M.; Pei, B.-S. [National Tsing Hua University, Department of Engineering and System Science, Hsinchu 30013, Taiwan (China)

    2009-12-15

    In this study the automatic mode of the recirculation flow control system (RFCS) for the Lungmen ABWR plant has been modeled and incorporated into the basic RETRAN-02 system model. The integrated system model is then used to perform the analyses for the two transients in which the automatic RFCS is involved. The two transients selected are: (1) one reactor internal pump (RIP) trip, and (2) loss of feedwater heating. In general, the integrated system model can predict well the response of key system parameters, including neutron flux, steam dome pressure, heat flux, RIP flow, core inlet flow, feedwater flow, steam flow, and reactor water level. The transients are also analyzed for the manual RFCS case; comparisons of the transient response of the key system parameters between the automatic and manual RFCS cases show clearly identifiable differences. Also, the results show that the DELTACPR (delta critical power ratio) for the transients analyzed may not be less limiting for the automatic RFCS case under certain combinations of control system settings.

  19. Automatic Control of Reactor Temperature and Power Distribution for a Daily Load following Operation

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Keuk Jong; Kim, Han Gon [Korea Hydro and Nuclear Power Institute, Daejeon (Korea, Republic of)

    2010-10-15

    An automatic control method of reactor power and power distribution was developed for a daily load following operation of APR1400. This method used a model predictive control (MPC) methodology based on second-order plant data, with the reactor power ratio and axial shape index as control variables. However, the reactor regulating system of APR1400 is operated by the difference between the average temperature of the reactor core and the reference temperature, which is proportional to the turbine load. Thus, this paper reports on a model predictive control methodology using fourth-order plant data and the reactor temperature instead of the reactor power shape. The purpose of this study is to develop a revised automatic controller and analyze the behavior of the nuclear reactor temperature (Tavg) and the axial shape index (ASI) using the MPC method during a daily load following operation.
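
    As a toy illustration of the receding-horizon idea behind MPC (emphatically not the APR1400 controller: the scalar first-order plant, weights, horizon, and grid search over candidate inputs are all assumptions chosen for brevity):

```python
def mpc_step(x, ref, a=0.9, b=0.1, horizon=4, effort_weight=0.001):
    """One receding-horizon step for the toy scalar plant x' = a*x + b*u:
    simulate each candidate (constant) input over the horizon and return
    the one minimizing tracking error plus a small control-effort penalty."""
    candidates = [i / 10 for i in range(-50, 51)]  # u in [-5.0, 5.0]

    def cost(u):
        total, xk = 0.0, x
        for _ in range(horizon):
            xk = a * xk + b * u                  # predicted state
            total += (xk - ref) ** 2 + effort_weight * u * u
        return total

    return min(candidates, key=cost)

# Closed loop: apply only the first move of each horizon, then re-plan.
# The temperature-like state is driven from 0 toward the reference 1.0.
x = 0.0
for _ in range(30):
    x = 0.9 * x + 0.1 * mpc_step(x, 1.0)
```

    Real MPC solves the horizon optimization analytically or with a QP solver rather than by grid search, but the plan-apply-replan loop is the same.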

  20. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples
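
    A one-dimensional example of the finite-automaton characterization: the Thue-Morse sequence mentioned above is computed by a two-state automaton reading the binary digits of the index. This sketch illustrates the general mechanism, not the paper's two-dimensional constructions.

```python
def thue_morse(n):
    """Value of the Thue-Morse sequence at index n, computed by the
    two-state automaton that reads the base-2 digits of n: the state is
    the parity of 1-bits seen so far. Automatic sequences are exactly
    those computable by such an automaton from base-k digit strings."""
    state = 0
    for digit in format(n, "b"):
        if digit == "1":
            state ^= 1       # the transition on digit 1 flips the state
    return state

prefix = [thue_morse(n) for n in range(8)]  # 0, 1, 1, 0, 1, 0, 0, 1
```

    The corresponding automatic set {n : t(n) = 1} = {1, 2, 4, 7, ...} is the kind of object whose Delone property the paper characterizes (in higher dimensions).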

  1. Impacts of Process and Prediction Uncertainties on Projected Hanford Waste Glass Amount

    Energy Technology Data Exchange (ETDEWEB)

    Gervasio, Vivianaluxa [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vienna, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kim, Dong-Sang [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kruger, Albert A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2018-02-19

    Analyses were performed to evaluate the impacts of using the advanced glass models, constraints (Vienna et al. 2016), and uncertainty descriptions on projected Hanford glass mass. The maximum allowable WOL was estimated for waste compositions while simultaneously satisfying all applicable glass property and composition constraints with sufficient confidence. Different components of prediction and composition/process uncertainties were systematically included in the calculations to evaluate their impacts on glass mass. The analyses estimated the production of 23,360 MT of IHLW glass when no uncertainties were taken into account. Accounting for prediction and composition/process uncertainties resulted in a 5.01 relative percent increase in estimated glass mass (24,531 MT). Roughly equal impacts were found for prediction uncertainties (2.58 RPD) and composition/process uncertainties (2.43 RPD). ILAW mass was predicted to be 282,350 MT without uncertainty and with waste loading “line” rules in place. Accounting for prediction and composition/process uncertainties resulted in only a 0.08 relative percent increase in estimated glass mass (282,562 MT). Without application of line rules the glass mass decreases by 10.6 relative percent (252,490 MT) for the case with no uncertainties. Addition of prediction uncertainties increases glass mass by 1.32 relative percent and the addition of composition/process uncertainties increases glass mass by an additional 7.73 relative percent (9.06 relative percent increase combined). The glass mass estimate without line rules (275,359 MT) was 2.55 relative percent lower than that with the line rules (282,562 MT), after accounting for all applicable uncertainties.

  2. Schools water efficiency and awareness project

    African Journals Online (AJOL)

    driniev

    2002-04-23

    Apr 23, 2002 ... Schools Water Efficiency Project in February 2003, which supports several of the City's Water Demand ... of automatic flushing urinals (AFUs) alone in schools can save up .... to go back into the bag as the cistern is filling.

  3. Automatic Processing of Metallurgical Abstracts for the Purpose of Information Retrieval. Final Report.

    Science.gov (United States)

    Melton, Jessica S.

    Objectives of this project were to develop and test a method for automatically processing the text of abstracts for a document retrieval system. The test corpus consisted of 768 abstracts from the metallurgical section of Chemical Abstracts (CA). The system, based on a subject indexing rationale, had two components: (1) a stored dictionary of words…

  4. An application of artificial intelligence to automatic telescopes

    Science.gov (United States)

    Swanson, Keith; Drummond, Mark; Bresina, John

    1992-01-01

    Automatic Photoelectric Telescopes (APT's) allow an astronomer to be removed from the telescope site in both time and space. APT's 'execute' an observation program (a set of observation requests) expressed in an ASCII-based language (ATIS) and collect observation results expressed in this same language. The observation program is currently constructed by a Principal Astronomer from the requests of multiple users; the execution is currently controlled by a simple heuristic dispatch scheduler. Research aimed at improving the use of APT's is being carried out by the Entropy Reduction Engine (ERE) project at NASA Ames. The overall goal of the ERE project is the study and construction of systems that integrate planning, scheduling, and control. This paper discusses the application of some ERE technical results to the improvement of both the scheduling and the operation of APT's.

  5. Automatic testing in the integration phase of mobile work machine (TINAT) - MASIT31

    Energy Technology Data Exchange (ETDEWEB)

    Multanen, P.; Hyvoenen, M. (Tampere University of Technology, Department of Intelligent Hydraulics and Automation, Tampere (Finland)); Ellman, A. (Tampere University of Technology, Department of Mechanics and Design, Tampere (Finland)); Rantala, S.; Alanen, J. (VTT Technical Research Centre of Finland, Espoo (Finland))

    2008-07-01

    The performance and reliability of mobile work machines are significantly affected by the control systems of the machines and their characteristics. Currently the testing of control systems and verification of their properties is often carried out only in the integration phase of the controls and the mechanical structure of the machine. This is very time consuming, requires a lot of test personnel and is not extensive enough. In the TINAT project a test concept will be developed for the testing of entire control systems of work machines without the real mechanical structures of the machines, by utilizing modelling and real-time hardware-in-the-loop simulation. The simulator system enables automatic generation of test scenarios and automatic analysis and reporting of test results. (orig.)

  6. Automatic prediction of rheumatoid arthritis disease activity from the electronic medical records.

    Directory of Open Access Journals (Sweden)

    Chen Lin

    Full Text Available We aimed to mine the data in the Electronic Medical Record to automatically discover patients' Rheumatoid Arthritis disease activity at discrete rheumatology clinic visits. We cast the problem as a document classification task where the feature space includes concepts from the clinical narrative and lab values as stored in the Electronic Medical Record. The Training Set consisted of 2792 clinical notes and associated lab values. Test Set 1 included 1749 clinical notes and associated lab values. Test Set 2 included 344 clinical notes for which there were no associated lab values. The Apache clinical Text Analysis and Knowledge Extraction System was used to analyze the text and transform it into informative features to be combined with relevant lab values. Experiments over a range of machine learning algorithms and features were conducted. The best performing combination was linear kernel Support Vector Machines with Unified Medical Language System Concept Unique Identifier features with feature selection and lab values. The Area Under the Receiver Operating Characteristic Curve (AUC) is 0.831 (σ = 0.0317), statistically significant as compared to two baselines (AUC = 0.758, σ = 0.0291). Algorithms demonstrated superior performance on cases clinically defined as extreme categories of disease activity (Remission and High) compared to those defined as intermediate categories (Moderate and Low) and included laboratory data on inflammatory markers. Automatic Rheumatoid Arthritis disease activity discovery from Electronic Medical Record data is a learnable task approximating human performance. As a result, this approach might have several research applications, such as the identification of patients for genome-wide pharmacogenetic studies that require large sample sizes with precise definitions of disease activity and response to therapies.

  7. Human Activity Recognition in AAL Environments Using Random Projections

    Directory of Open Access Journals (Sweden)

    Robertas Damaševičius

    2016-01-01

    Full Text Available Automatic human activity recognition systems aim to capture the state of the user and its environment by exploiting heterogeneous sensors attached to the subject’s body and permit continuous monitoring of numerous physiological signals reflecting the state of human actions. Successful identification of human activities can be immensely useful in healthcare applications for Ambient Assisted Living (AAL), for automatic and intelligent activity monitoring systems developed for elderly and disabled people. In this paper, we propose the method for activity recognition and subject identification based on random projections from high-dimensional feature space to low-dimensional projection space, where the classes are separated using the Jaccard distance between probability density functions of projected data. Two HAR domain tasks are considered: activity identification and subject identification. The experimental results using the proposed method with Human Activity Dataset (HAD) data are presented.

  8. Human Activity Recognition in AAL Environments Using Random Projections.

    Science.gov (United States)

    Damaševičius, Robertas; Vasiljevas, Mindaugas; Šalkevičius, Justas; Woźniak, Marcin

    2016-01-01

    Automatic human activity recognition systems aim to capture the state of the user and its environment by exploiting heterogeneous sensors attached to the subject's body and permit continuous monitoring of numerous physiological signals reflecting the state of human actions. Successful identification of human activities can be immensely useful in healthcare applications for Ambient Assisted Living (AAL), for automatic and intelligent activity monitoring systems developed for elderly and disabled people. In this paper, we propose the method for activity recognition and subject identification based on random projections from high-dimensional feature space to low-dimensional projection space, where the classes are separated using the Jaccard distance between probability density functions of projected data. Two HAR domain tasks are considered: activity identification and subject identification. The experimental results using the proposed method with Human Activity Dataset (HAD) data are presented.
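
    The random-projection step can be sketched as follows. This illustrates the distance-preservation property that makes such projections useful for classification; it does not reproduce the paper's Jaccard distance between projected density functions, and the dimensions and Gaussian construction are illustrative choices.

```python
import math
import random

def random_projection_matrix(d_high, d_low, seed=0):
    """Rows of Gaussian noise scaled by 1/sqrt(d_low): by the
    Johnson-Lindenstrauss lemma, projecting with such a matrix
    approximately preserves pairwise Euclidean distances."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) / math.sqrt(d_low) for _ in range(d_high)]
            for _ in range(d_low)]

def project(matrix, x):
    """Map a high-dimensional feature vector to the projection space."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in matrix]

def dist(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
```

    Projecting two 256-dimensional sensor-feature vectors down to 64 dimensions typically changes their distance by only a modest factor, so classes that are separated before projection tend to remain separated after it.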

  9. Semi-supervised learning based probabilistic latent semantic analysis for automatic image annotation

    Institute of Scientific and Technical Information of China (English)

    Tian Dongping

    2017-01-01

    In recent years, the multimedia annotation problem has been attracting significant research attention in the multimedia and computer vision areas, especially automatic image annotation, whose purpose is to provide an efficient and effective searching environment for users to query their images more easily. In this paper, a semi-supervised learning based probabilistic latent semantic analysis (PLSA) model for automatic image annotation is presented. Since it's often hard to obtain or create labeled images in large quantities while unlabeled ones are easier to collect, a transductive support vector machine (TSVM) is exploited to enhance the quality of the training image data. Then, different image features with different magnitudes will result in different performance for automatic image annotation. To this end, a Gaussian normalization method is utilized to normalize different features extracted from effective image regions segmented by the normalized cuts algorithm so as to preserve the intrinsic content of images as completely as possible. Finally, a PLSA model with asymmetric modalities is constructed based on the expectation maximization (EM) algorithm to predict a candidate set of annotations with confidence scores. Extensive experiments on the general-purpose Corel5k dataset demonstrate that the proposed model can significantly improve the performance of traditional PLSA for the task of automatic image annotation.

  10. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  11. Fully automatic adjoints: a robust and efficient mechanism for generating adjoint ocean models

    Science.gov (United States)

    Ham, D. A.; Farrell, P. E.; Funke, S. W.; Rognes, M. E.

    2012-04-01

    The problem of generating and maintaining adjoint models is sufficiently difficult that typically only the most advanced and well-resourced community ocean models achieve it. There are two current technologies, each of which suffers from its own limitations. Algorithmic differentiation, also called automatic differentiation, is employed by models such as the MITGCM [2] and the Alfred Wegener Institute model FESOM [3]. This technique is very difficult to apply to existing code and requires a major initial investment to prepare the code for automatic adjoint generation. AD tools may also have difficulty with code employing modern software constructs such as derived data types. An alternative is to formulate the adjoint differential equation and to discretise it separately. This approach, known as the continuous adjoint and employed in ROMS [4], has the disadvantage that two different model code bases must be maintained and manually kept synchronised as the model develops. Moreover, the discretisation of the continuous adjoint is not automatically consistent with that of the forward model, producing an additional source of error. The alternative presented here is to formulate the flow model in the high-level language UFL (Unified Form Language) and to generate the model automatically using the software of the FEniCS project. In this approach it is the high-level code specification which is differentiated, a task very similar to the formulation of the continuous adjoint [5]. However, since the forward and adjoint models are generated automatically, the difficulty of maintaining them vanishes and the software engineering process is therefore robust. The scheduling and execution of the adjoint model, including the application of an appropriate checkpointing strategy, are managed by libadjoint [1]. In contrast to the conventional algorithmic differentiation description of a model as a series of primitive mathematical operations, libadjoint employs a new abstraction of the simulation
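The consistency that a discrete adjoint must satisfy can be illustrated on a toy linear model: for a forward step u_{n+1} = A u_n with objective J = ½‖u_N‖², the adjoint sweep runs Aᵀ backwards and reproduces the exact gradient dJ/du₀. This is a generic sketch of the discrete-adjoint principle, not FEniCS or libadjoint code:

```python
import numpy as np

# Forward model: u_{n+1} = A u_n for N steps; objective J = 0.5 * ||u_N||^2.
# The discrete adjoint propagates lam_n = A^T lam_{n+1} backwards, seeded
# with lam_N = dJ/du_N = u_N; lam_0 is then dJ/du_0.
rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3)) * 0.5
u0 = rng.normal(size=3)
N = 5

u = u0.copy()
for _ in range(N):
    u = A @ u            # forward sweep

lam = u.copy()           # lam_N = u_N
for _ in range(N):
    lam = A.T @ lam      # adjoint sweep

# Direct gradient for comparison: (A^N)^T A^N u0
AN = np.linalg.matrix_power(A, N)
grad_direct = AN.T @ (AN @ u0)
```

Because the adjoint is derived from the same discrete operator A, the two gradients agree to machine precision — the consistency that a hand-maintained continuous adjoint cannot guarantee.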

  12. Mizunami Underground Research Laboratory project. A project on research stage of investigating prediction from ground surface. Project report at fiscal year of 2000 to 2004

    International Nuclear Information System (INIS)

    2000-04-01

    This is a detailed plan, for fiscal year 2000 onward, of the first stage (the 'research stage of investigating prediction from the ground surface') of the three research stages carried out at the Mizunami Underground Research Laboratory (MIU) according to the 'Basic plan on research of underground science at MIU', based on the progress of investigation and research before fiscal year 1999. The project has the following three general targets: establishment of general investigation techniques for the geological environment, collection of information on the deep underground environment, and development of a foundation of engineering technology for the super-deep underground. The targets of the surface-based investigation stage include acquisition of geological environment data through investigations from the ground surface, in order to predict the changes of the environment accompanying the underground geological environment and the construction of the experimental tunnel; determination of a method for evaluating the prediction results; and preparation of plans for the investigation stage accompanying excavation of the tunnel, by carrying out the detailed design of the tunnel. The report introduces the results and problems of the first-phase investigation, the integration of investigation results, and the investigations and research on geology/geological structure, hydrology and geochemistry of groundwater, mechanical properties of rocks, and mass transfer. (G.K.)

  13. A web-based semi-automatic framework for astrobiological research

    Directory of Open Access Journals (Sweden)

    P.V. Arun

    2013-12-01

    Full Text Available Astrobiology addresses the possibility of extraterrestrial life and explores measures towards its recognition. Research in this context is founded upon the premise that indicators of life encountered in space will be recognizable. However, effective recognition can be accomplished through a universal adaptation of life signatures, without restricting attention solely to those attributes that represent local solutions to the challenges of survival. The life indicators should be modelled with reference to the temporal and environmental variations specific to each planet and time. In this paper, we investigate a semi-automatic open-source framework for the accurate detection and interpretation of life signatures by facilitating public participation, in a similar way to that adopted by the SETI@home project. The involvement of the public in identifying patterns can give a thrust to the mission and is implemented using a semi-automatic framework. Different advanced intelligent methodologies may augment the integration of this human-machine analysis. Automatic and manual evaluations, along with a dynamic learning strategy, have been adopted to provide accurate results. The system also helps to provide a deeper public understanding of space agencies' work and facilitates mass involvement in astrobiological studies. It will surely help to motivate eager young minds to pursue a career in this field.

  14. Automatic Detection of Storm Damages Using High-Altitude Photogrammetric Imaging

    Science.gov (United States)

    Litkey, P.; Nurminen, K.; Honkavaara, E.

    2013-05-01

    The risks of storms that cause damage in forests are increasing due to climate change. Quickly detecting fallen trees, assessing their amount and collecting them efficiently are of great importance for economic and environmental reasons. Visually detecting and delineating storm damage is a laborious and error-prone process; thus, it is important to develop cost-efficient and highly automated methods. The objective of our research project is to investigate and develop a reliable and efficient method for automatic storm damage detection based on airborne imagery collected after a storm. The method requires before-storm and after-storm surface models. A difference surface is calculated from the two DSMs, and the locations where significant changes have appeared are automatically detected. In our previous research we used a four-year-old airborne laser scanning surface model as the before-storm surface. The after-storm DSM was derived from the photogrammetric images using the Next Generation Automatic Terrain Extraction (NGATE) algorithm of the Socet Set software. We obtained 100% accuracy in the detection of major storm damage. In this investigation we will further evaluate the sensitivity of the storm-damage detection process. We will investigate the potential of national airborne photography, collected during the leaf-off season, to automatically produce a before-storm DSM using image matching. We will also compare the impact of the terrain extraction algorithm on the results. Our results will also promote the potential of national open source data sets in the management of natural disasters.
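The DSM-differencing idea reduces to thresholding the height drop between the two surface models. The 5 m threshold below is an illustrative value, not the one used in the project:

```python
import numpy as np

def storm_damage_mask(dsm_before, dsm_after, min_drop=5.0):
    """Flag raster cells whose surface height dropped by more than
    min_drop metres between the before-storm and after-storm DSMs."""
    return (dsm_before - dsm_after) > min_drop
```

Connected regions of flagged cells would then be reported as candidate areas of fallen trees.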

  15. An Approach for Implementation of Project Management Information Systems

    Science.gov (United States)

    Běrziša, Solvita; Grabis, Jānis

    Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on analysis of typical project management concepts and processes and of existing XML-based representations of project management. A demonstration example of a project management information system's configuration is provided.
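A transformation from a standardized XML specification to a system configuration might look like the following sketch. The element names (`methodology`, `phase`, `artifact`) are hypothetical, not the chapter's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment of a project-management methodology specification.
SPEC = """
<methodology name="ExampleMethod">
  <phase name="Initiation"><artifact>Charter</artifact></phase>
  <phase name="Execution"><artifact>StatusReport</artifact></phase>
</methodology>
"""

def configure_pmis(spec_xml):
    """Transform the XML specification into a configuration mapping
    each phase to the artifacts the information system must track."""
    root = ET.fromstring(spec_xml)
    return {phase.get("name"): [a.text for a in phase.findall("artifact")]
            for phase in root.findall("phase")}
```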

  16. Automatic correspondence detection in mammogram and breast tomosynthesis images

    Science.gov (United States)

    Ehrhardt, Jan; Krüger, Julia; Bischof, Arpad; Barkhausen, Jörg; Handels, Heinz

    2012-02-01

    Two-dimensional mammography is the major imaging modality in breast cancer detection. A disadvantage of mammography is the projective nature of this imaging technique. Tomosynthesis is an attractive modality with the potential to combine the high contrast and high resolution of digital mammography with the advantages of 3D imaging. In order to facilitate diagnostics and treatment in the current clinical workflow, correspondences between tomosynthesis images and previous mammographic exams of the same woman have to be determined. In this paper, we propose a method to detect correspondences in 2D mammograms and 3D tomosynthesis images automatically. In general, this 2D/3D correspondence problem is ill-posed, because a point in the 2D mammogram corresponds to a line in the 3D tomosynthesis image. The goal of our method is to detect the "most probable" 3D position in the tomosynthesis images corresponding to a selected point in the 2D mammogram. We present two alternative approaches to solve this 2D/3D correspondence problem: a 2D/3D registration method, and a 2D/2D mapping between mammogram and tomosynthesis projection images followed by a back projection. The advantages and limitations of both approaches are discussed, and the performance of the methods is evaluated qualitatively and quantitatively using a software phantom and clinical breast image data. Although the proposed 2D/3D registration method can compensate for moderate breast deformations caused by different breast compressions, this approach is not suitable for clinical tomosynthesis data due to the limited resolution and blurring effects perpendicular to the direction of projection. The quantitative results show that the proposed 2D/2D mapping method is capable of detecting corresponding positions in mammograms and tomosynthesis images automatically for 61 out of 65 landmarks. The proposed method can facilitate diagnosis, visual inspection and comparison of 2D mammograms and 3D tomosynthesis images for
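The ill-posedness described above — a 2D point maps to a line of 3D candidates — can be made concrete under an idealized parallel projection along the slice axis. The scoring function that selects the "most probable" slice is left abstract here, since the paper's actual approaches use registration or patch mapping:

```python
def candidate_positions(point_2d, n_slices):
    """Under an idealized parallel projection along the slice axis, a 2D
    mammogram point corresponds to one candidate position per slice."""
    x, y = point_2d
    return [(x, y, z) for z in range(n_slices)]

def most_probable_position(point_2d, n_slices, score):
    """Pick the candidate whose local appearance best matches the 2D point;
    `score` is any similarity function of a candidate (x, y, z) position."""
    return max(candidate_positions(point_2d, n_slices), key=score)
```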

  17. Automatic and controlled attentional orienting in the elderly: A dual-process view of the positivity effect.

    Science.gov (United States)

    Gronchi, G; Righi, S; Pierguidi, L; Giovannelli, F; Murasecco, I; Viggiano, M P

    2018-04-01

    The positivity effect in the elderly consists of an attentional preference for positive information as well as avoidance of negative information. Extant theories predict that the positivity effect depends either on controlled attentional processes (socio-emotional selectivity theory) or on an automatic gating selection mechanism (dynamic integration theory). This study examined the role of automatic and controlled attention in the positivity effect. Two dot-probe tasks (with stimulus durations of 100 ms and 500 ms, respectively) were employed to compare the attentional bias of 35 elderly people with that of 35 young adults. The stimuli were expressive faces displaying neutral, disgusted, fearful, and happy expressions. In comparison to young people, the elderly allocated more attention to happy faces at 100 ms and tended to avoid fearful faces at 500 ms. The findings are not predicted by either theory taken alone, but support the hypothesis that the positivity effect in the elderly is driven by two different processes: an automatic attention bias toward positive stimuli, and a controlled mechanism that diverts attention away from negative stimuli. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. ROBIN: a platform for evaluating automatic target recognition algorithms: I. Overview of the project and presentation of the SAGEM DS competition

    Science.gov (United States)

    Duclos, D.; Lonnoy, J.; Guillerm, Q.; Jurie, F.; Herbin, S.; D'Angelo, E.

    2008-04-01

    The last five years have seen a renewal of Automatic Target Recognition applications, mainly because of the latest advances in machine learning techniques. In this context, large collections of image datasets are essential for training algorithms as well as for their evaluation. Indeed, the recent proliferation of recognition algorithms, generally applied to slightly different problems, makes their comparison through clean evaluation campaigns necessary. The ROBIN project tries to fulfil these two needs by putting unclassified datasets, ground truths, competitions and metrics for the evaluation of ATR algorithms at the disposal of the scientific community. The scope of this project includes single- and multi-class generic target detection and generic target recognition, in military and security contexts. To our knowledge, it is the first time that a database of this importance (several hundred thousand visible and infrared hand-annotated images) has been publicly released. Funded by the French Ministry of Defence (DGA) and by the French Ministry of Research, ROBIN is one of the ten Techno-vision projects. Techno-vision is a large and ambitious government initiative for building evaluation means for computer vision technologies, for various application contexts. ROBIN's consortium includes major companies and research centres involved in Computer Vision R&D in the field of defence: Bertin Technologies, CNES, ECA, DGA, EADS, INRIA, ONERA, MBDA, SAGEM, THALES. This paper, which first gives an overview of the whole project, is focused on one of ROBIN's key competitions, the SAGEM Defence Security database. This dataset contains more than eight hundred ground and aerial infrared images of six different vehicles in cluttered scenes including distracters. Two different sets of data are available for each target. The first set includes different views of each vehicle at close range in a "simple" background, and can be used to train algorithms. The second set

  19. Clinical performance of a new hepatitis B surface antigen quantitative assay with automatic dilution

    Directory of Open Access Journals (Sweden)

    Ta-Wei Liu

    2015-01-01

    Full Text Available Hepatitis B virus surface antigen (HBsAg) levels reflect disease status and can predict the clinical response to antiviral treatment; however, the emergence of HBsAg mutant strains has become a challenge. The Abbott HBsAg quantification assay provides enhanced detection of HBsAg and HBsAg mutants. We aimed to evaluate the performance of the Abbott HBsAg quantification assay with automatic sample dilution (shortened as the automatic Architect assay), compared with the Abbott HBsAg quantification assay with manual sample dilution (shortened as the manual Architect assay) and the Roche HBsAg quantification assay with automatic sample dilution (shortened as Elecsys). A total of 130 serum samples obtained from 87 hepatitis B virus (HBV)-infected patients were collected to assess the correlation between the automatic and manual Architect assays. Among the 87 patients, 41 provided 42 serum samples to confirm the linearity and reproducibility of the automatic Architect assay, and to determine the correlation among the Elecsys and the two Architect assays. The coefficients of variation (0.44–9.53%) and R2 = 0.996–1, both determined using values obtained from the automatic Architect assay, showed good reproducibility and linearity. Results of the two Architect assays demonstrated a feasible correlation (n = 130 samples; R = 0.898, p  0.93 in all cases. In conclusion, the correlation between the automatic and manual dilution Architect assays was feasible, particularly in the HBeAg-negative and low-DNA groups. With lower labor costs and less human error than the manual version, the Abbott automatic dilution Architect assay provided good clinical performance with regard to HBsAg levels.

  20. Impacts of Process and Prediction Uncertainties on Projected Hanford Waste Glass Amount

    Energy Technology Data Exchange (ETDEWEB)

    Gervasio, V.; Kim, D. S.; Vienna, J. D.; Kruger, A. A.

    2018-03-08

    Analyses were performed to evaluate the impacts of using the advanced glass models, constraints (Vienna et al. 2016), and uncertainty descriptions on the projected Hanford glass mass. The maximum allowable waste oxide loading (WOL) was estimated for waste compositions while simultaneously satisfying all applicable glass property and composition constraints with sufficient confidence. Different components of prediction and composition/process uncertainty were systematically included in the calculations to evaluate their impacts on glass mass. The analyses estimated the production of 23,360 MT of immobilized high-level waste (IHLW) glass when no uncertainties were taken into account. Accounting for prediction and composition/process uncertainties resulted in a 5.01 relative percent increase, to an estimated glass mass of 24,531 MT. Roughly equal impacts were found for prediction uncertainties (2.58 RPD) and composition/process uncertainties (2.43 RPD). The immobilized low-activity waste (ILAW) mass was predicted to be 282,350 MT without uncertainty and with the waste loading "line" rules in place. Accounting for prediction and composition/process uncertainties resulted in only a 0.08 relative percent increase, to an estimated glass mass of 282,562 MT. Without application of the line rules, the glass mass decreases by 10.6 relative percent (252,490 MT) for the case with no uncertainties. Adding prediction uncertainties increases glass mass by 1.32 relative percent, and adding composition/process uncertainties increases glass mass by a further 7.73 relative percent (9.06 relative percent increase combined). The glass mass estimate without line rules (275,359 MT) was 2.55 relative percent lower than that with the line rules (282,562 MT), after accounting for all applicable uncertainties.
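The reported relative-percent figures can be checked directly from the tonnage estimates:

```python
def rpd(value, reference):
    """Relative percent difference of `value` with respect to `reference`."""
    return (value - reference) / reference * 100.0

# IHLW: 23,360 MT without uncertainties vs 24,531 MT with them
ihlw_increase = rpd(24531, 23360)      # about 5.01 relative percent

# ILAW with line rules: 282,350 MT vs 282,562 MT
ilaw_increase = rpd(282562, 282350)    # about 0.08 relative percent
```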

  1. Predicting human activities in sequences of actions in RGB-D videos

    Science.gov (United States)

    Jardim, David; Nunes, Luís.; Dias, Miguel

    2017-03-01

    In our daily activities we perform prediction or anticipation when interacting with other humans or with objects. Prediction of human activity by computers has several potential applications: surveillance systems, human-computer interfaces, sports video analysis, human-robot collaboration, games and health-care. We propose a system capable of recognizing and predicting human actions using supervised classifiers trained with automatically labeled data, evaluated on our human activity RGB-D dataset (recorded with a Kinect sensor) and using only the positions of the main skeleton joints to extract features. Conditional random fields (CRFs) have been used before to model the sequential nature of actions in a sequence, but where other approaches try to predict an outcome or anticipate ahead in time (seconds), we try to predict what the next action of a subject will be. Our results show an activity prediction accuracy of 89.9% using an automatically labeled dataset.
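As a minimal baseline for next-action prediction — far simpler than the CRFs the paper builds on — a first-order transition table already captures the idea of predicting the most likely successor action; the action names below are invented for illustration:

```python
from collections import Counter, defaultdict

def train_transitions(sequences):
    """Count first-order action transitions across labeled sequences."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return counts

def predict_next(counts, action):
    """Most frequent successor of `action` seen during training.
    (Assumes `action` was observed with at least one successor.)"""
    return counts[action].most_common(1)[0][0]
```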

  2. Predicting epileptic seizures in advance.

    Directory of Open Access Journals (Sweden)

    Negin Moghim

    Full Text Available Epilepsy is the second most common neurological disorder, affecting 0.6-0.8% of the world's population. In this neurological disorder, abnormal activity of the brain causes seizures, the nature of which tends to be sudden. Antiepileptic Drugs (AEDs) are used as long-term therapeutic solutions that control the condition. Of those treated with AEDs, 35% become resistant to medication. The unpredictable nature of seizures poses risks for the individual with epilepsy. It is clearly desirable to find more effective ways of preventing seizures for such patients. The automatic detection of oncoming seizures, before their actual onset, can facilitate timely intervention and hence minimize these risks. In addition, advance prediction of seizures can enrich our understanding of the epileptic brain. In this study, drawing on the body of work behind automatic seizure detection and prediction from digitised invasive electroencephalography (EEG) data, a prediction algorithm, ASPPR (Advance Seizure Prediction via Pre-ictal Relabeling), is described. ASPPR facilitates the learning of predictive models targeted at recognizing patterns in EEG activity that lie in a specific time window in advance of a seizure. It then exploits advanced machine learning coupled with the design and selection of appropriate features from EEG signals. Results from evaluating ASPPR independently on 21 different patients suggest that seizures for many patients can be predicted up to 20 minutes in advance of their onset. Compared to benchmark performance represented by a mean S1-Score (harmonic mean of Sensitivity and Specificity) of 90.6% for predicting seizure onset between 0 and 5 minutes in advance, ASPPR achieves mean S1-Scores of: 96.30% for prediction between 1 and 6 minutes in advance, 96.13% for prediction between 8 and 13 minutes in advance, 94.5% for prediction between 14 and 19 minutes in advance, and 94.2% for prediction between 20 and 25 minutes in advance.
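The pre-ictal relabeling idea — marking EEG windows that fall in a fixed band ahead of seizure onset as positive training examples — can be sketched as follows. The 1-6 minute band mirrors the first reported prediction interval; the function name and time units are mine:

```python
def relabel_preictal(window_times, onset_time, lead=6 * 60, horizon=1 * 60):
    """Label a window 1 (pre-ictal) if its start time falls between `horizon`
    and `lead` seconds before seizure onset, else 0. Times are in seconds."""
    return [1 if horizon <= onset_time - t <= lead else 0 for t in window_times]
```

A classifier trained on windows relabeled this way learns to fire in that specific advance window rather than at onset itself.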

  3. Design and construction of a graphical interface for automatic generation of simulation code GEANT4

    International Nuclear Information System (INIS)

    Driss, Mozher; Bouzaine Ismail

    2007-01-01

    This work is set in the context of an engineering studies final project, accomplished at the center of nuclear sciences and technologies in Sidi Thabet. The project is about conceiving and developing a system based on a graphical user interface which allows automatic code generation for simulation under the GEANT4 engine. This system aims to facilitate the use of GEANT4 by scientists who are not necessarily experts in this engine, and to be used in different areas: research, industry and education. The implementation of this project uses the ROOT library and several languages such as XML and XSL. (Author). 5 refs

  4. Affective decision-making moderates the effects of automatic associations on alcohol use among drug offenders.

    Science.gov (United States)

    Cappelli, Christopher; Ames, Susan; Shono, Yusuke; Dust, Mark; Stacy, Alan

    2017-09-01

    This study used a dual-process model of cognition in order to investigate the possible influence of automatic and deliberative processes on lifetime alcohol use in a sample of drug offenders. The objective was to determine if automatic/implicit associations in memory can exert an influence over an individual's alcohol use and if decision-making ability could potentially modify the influence of these associations. 168 participants completed a battery of cognitive tests measuring implicit alcohol associations in memory (verb generation) as well as their affective decision-making ability (Iowa Gambling Task). Structural equation modeling procedures were used to test the relationship between implicit associations, decision-making, and lifetime alcohol use. Results revealed that among participants with lower levels of decision-making, implicit alcohol associations more strongly predicted higher lifetime alcohol use. These findings provide further support for the interaction between a specific decision function and its influence over automatic processes in regulating alcohol use behavior in a risky population. Understanding the interaction between automatic associations and decision processes may aid in developing more effective intervention components.

  5. Automatically rating trainee skill at a pediatric laparoscopic suturing task.

    Science.gov (United States)

    Oquendo, Yousi A; Riddle, Elijah W; Hiller, Dennis; Blinman, Thane A; Kuchenbecker, Katherine J

    2018-04-01

    Minimally invasive surgeons must acquire complex technical skills while minimizing patient risk, a challenge that is magnified in pediatric surgery. Trainees need realistic practice with frequent detailed feedback, but human grading is tedious and subjective. We aim to validate a novel motion-tracking system and algorithms that automatically evaluate trainee performance of a pediatric laparoscopic suturing task. Subjects (n = 32) ranging from medical students to fellows performed two trials of intracorporeal suturing in a custom pediatric laparoscopic box trainer after watching a video of ideal performance. The motions of the tools and endoscope were recorded over time using a magnetic sensing system, and both tool grip angles were recorded using handle-mounted flex sensors. An expert rated the 63 trial videos on five domains from the Objective Structured Assessment of Technical Skill (OSATS), yielding summed scores from 5 to 20. Motion data from each trial were processed to calculate 280 features. We used regularized least squares regression to identify the most predictive features from different subsets of the motion data and then built six regression tree models that predict summed OSATS score. Model accuracy was evaluated via leave-one-subject-out cross-validation. The model that used all sensor data streams performed best, achieving 71% accuracy at predicting summed scores within 2 points, 89% accuracy within 4, and a correlation of 0.85 with human ratings. 59% of the rounded average OSATS score predictions were perfect, and 100% were within 1 point. This model employed 87 features, including none based on completion time, 77 from tool tip motion, 3 from tool tip visibility, and 7 from grip angle. Our novel hardware and software automatically rated previously unseen trials with summed OSATS scores that closely match human expert ratings. Such a system facilitates more feedback-intensive surgical training and may yield insights into the fundamental
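The regularized least squares step used above for feature selection and score prediction has a closed form. This generic ridge-regression sketch (not the authors' exact pipeline or regularization constant) shows the computation:

```python
import numpy as np

def fit_ridge(X, y, lam=1.0):
    """Regularized least squares: w = (X^T X + lam * I)^-1 X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def predict_scores(X, w):
    """Predicted (summed OSATS-style) scores for feature matrix X."""
    return X @ w
```

In the paper's setting X would hold the 280 motion features per trial and y the expert's summed OSATS scores; features with near-zero weights can then be pruned.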

  6. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  7. Automatic illumination compensation device based on a photoelectrochemical biofuel cell driven by visible light

    Science.gov (United States)

    Yu, You; Han, Yanchao; Xu, Miao; Zhang, Lingling; Dong, Shaojun

    2016-04-01

    Inverted illumination compensation is important in energy-saving projects, artificial photosynthesis and some forms of agriculture, such as hydroponics. However, only a few illumination adjustments based on self-powered biodetectors that quantitatively detect the intensity of visible light have been reported. We constructed an automatic illumination compensation device based on a photoelectrochemical biofuel cell (PBFC) driven by visible light. The PBFC consisted of a glucose dehydrogenase modified bioanode and a p-type semiconductor cuprous oxide photocathode. The PBFC had a high power output of 161.4 μW cm-2 and an open circuit potential that responded rapidly to visible light. It adjusted the amount of illumination inversely irrespective of how the external illumination was changed. This rational design of utilizing PBFCs provides new insights into automatic light adjustable devices and may be of benefit to intelligent applications. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr00759g
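The inverse-adjustment behaviour — supplying more light as ambient illumination falls — is, in control terms, a clamped set-point difference. This sketch is an abstraction of the device's feedback logic, not its actual electronics; the lux values and function name are illustrative:

```python
def compensation(target_lux, measured_lux):
    """Inverted compensation: supply whatever illumination the ambient
    light lacks relative to the target, never a negative amount."""
    return max(0.0, target_lux - measured_lux)
```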

  8. VP-Nets : Efficient automatic localization of key brain structures in 3D fetal neurosonography.

    Science.gov (United States)

    Huang, Ruobing; Xie, Weidi; Alison Noble, J

    2018-04-23

    Three-dimensional (3D) fetal neurosonography is used clinically to detect cerebral abnormalities and to assess growth in the developing brain. However, manual identification of key brain structures in 3D ultrasound images requires expertise and even then is tedious. Inspired by how sonographers view and interact with volumes during real-time clinical scanning, we propose an efficient automatic method to simultaneously localize multiple brain structures in 3D fetal neurosonography. The proposed View-based Projection Networks (VP-Nets) use three view-based Convolutional Neural Networks (CNNs) to simplify 3D localization by directly predicting 2D projections of the key structures onto three anatomical views. While designed for efficient use of data and GPU memory, the proposed VP-Nets allow for full-resolution 3D prediction. We investigated parameters that influence the performance of VP-Nets, e.g. depth and number of feature channels. Moreover, we demonstrate that the model can pinpoint the structure in 3D space by visualizing the trained VP-Nets, despite only 2D supervision being provided for a single stream during training. For comparison, we implemented two other baseline solutions based on Random Forests and 3D U-Nets. In the reported experiments, VP-Nets consistently outperformed the other methods on localization. To test the importance of the loss function, two identical models were trained with binary cross-entropy and Dice coefficient losses, respectively. Our best VP-Net model achieved a prediction center deviation of 1.8 ± 1.4 mm, a size difference of 1.9 ± 1.5 mm, and a 3D Intersection Over Union (IOU) of 63.2 ± 14.7% when compared to the ground truth. To make the whole pipeline intervention-free, we also implemented a skull-stripping tool using a 3D CNN, which achieves high segmentation accuracy. As a result, the proposed processing pipeline takes a raw ultrasound brain image as input, and outputs a skull-stripped image with five detected key brain
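The projection targets such view-based networks regress, and the way three 2D views jointly pin down a 3D location, can be illustrated on a binary mask; the array axis convention below is an assumption:

```python
import numpy as np

def view_projections(mask):
    """Max projections of a 3D binary mask onto the three orthogonal planes,
    mimicking the 2D targets for three view-based networks."""
    return mask.max(axis=0), mask.max(axis=1), mask.max(axis=2)

def bounding_box(mask):
    """Recover the 3D bounding box from per-axis occupancy, i.e. the
    localization information the three 2D projections jointly carry."""
    box = []
    for ax in range(3):
        other = tuple(i for i in range(3) if i != ax)
        occ = np.where(mask.any(axis=other))[0]
        box.append((int(occ[0]), int(occ[-1])))
    return box
```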

  9. Prediction of optimal deployment projection for transcatheter aortic valve replacement: angiographic 3-dimensional reconstruction of the aortic root versus multidetector computed tomography.

    OpenAIRE

    Binder Ronald K; Leipsic Jonathon; Wood David; Moore Teri; Toggweiler Stefan; Willson Alex; Gurvitch Ronen; Freeman Melanie; Webb John G

    2012-01-01

    BACKGROUND Identifying the optimal fluoroscopic projection of the aortic valve is important for successful transcatheter aortic valve replacement (TAVR). Various imaging modalities, including multidetector computed tomography (MDCT), have been proposed for prediction of the optimal deployment projection. We evaluated a method that provides 3-dimensional angiographic reconstructions (3DA) of the aortic root for prediction of the optimal deployment angle and compared it with MDCT. METHODS AND RES...

  10. Automatization Project for the Carl-Zeiss-Jena Coudè Telescope of the Simón Bolívar Planetarium I. The Electro-Mechanic System

    Science.gov (United States)

    Núñez, A.; Maharaj, A.; Muñoz, A. G.

    2009-05-01

    The ``Complejo Científico, Cultural y Turístico Simón Bolívar'' (CCCTSB), located in Maracaibo, Venezuela, houses the Simón Bolívar Planetarium and a 150 mm aperture, 2250 mm focal length Carl-Zeiss-Jena Coudè refractor telescope. In this work we discuss the schematics for the automatization project of this telescope, the planned improvements, methodology, motors, micro-controllers, interfaces and the up-to-date status of the project. The project addresses the first two levels of the automation pyramid: the sensor-actuator level and the control (plant floor) level. The process control level corresponds to the software-related work. This means that the project deals directly with the electrical, electronic and mechanical components, and with assembler micro-controller programming. All the PC-related components, such as graphical user interfaces (GUIs), remote control, the Grid database, and others, correspond to the next two levels of the automation pyramid. The goal is that little human intervention will be required to operate the telescope beyond supplying a pair of coordinates to locate and track an object in the sky. A set of three servomotors, each coupled to the telescope through a gear box, will drive the right-ascension, declination and focus movements. A three-phase induction motor will rotate the dome, and a DC motor powered by solar panels is suggested for dome aperture/closure. All these actuators are controlled by an 8-bit micro-controller, which receives the coordinate input and the signals from the position sensors and implements the PID control algorithm. This algorithm is tuned based on the mathematical model of the telescope's electro-mechanical instrumentation.

  11. Compatibility of automatic exposure control with new screen phosphors in diagnostic roentgenography

    International Nuclear Information System (INIS)

    Mulvaney, J.A.

    1982-01-01

    Automatic exposure control systems are used in diagnostic roentgenography to obtain proper film density for a variety of patient examinations and roentgenographic techniques. Most automatic exposure control systems have been designed for use with par speed, calcium tungstate intensifying screens. The use of screens with faster speeds and new phosphor materials has put extreme demands on present systems. The performance of a representative automatic exposure control system is investigated to determine its ability to maintain constant film density over a wide range of x-ray tube voltages and acrylic phantom thicknesses with four different intensifying screen phosphors. The effects of x-ray energy dependence, generator switching time and stored charge are investigated. The system is able to maintain film density to within plus or minus 0.2 optical density units for techniques representing adult patients. A single nonadjustable tube voltage compensation circuit is adequate for the four different screen phosphors for x-ray tube voltages above sixty peak kilovolts. For techniques representing pediatric patients at high x-ray tube voltages, excess film density occurs due to stored charge in the transformer and high-voltage cables. An anticipation circuit in the automatic exposure control circuit can be modified to correct for stored charge effects. In a separate experiment the energy dependence of three different ionization chamber detectors used in automatic exposure control systems is compared directly with the energy dependence of three different screen phosphors. The data on detector sensitivity and screen speed are combined to predict the best tube voltage compensation for each combination of screen and detector

  12. Automatic treatment planning implementation using a database of previously treated patients

    International Nuclear Information System (INIS)

    Moore, J A; Evans, K; Yang, W; Herman, J; McNutt, T

    2014-01-01

    Purpose: Using a database of prior treated patients, it is possible to predict the dose to critical structures for future patients. Automatic treatment planning speeds the planning process by generating a good initial plan from predicted dose values. Methods: A SQL relational database of previously approved treatment plans is populated via an automated export from Pinnacle 3. This script outputs dose and machine information and selected Regions of Interest (ROIs), along with their associated Dose-Volume Histograms (DVHs) and Overlap Volume Histograms (OVHs) with respect to the target structures. Toxicity information is exported from Mosaiq and added to the database for each patient. The SQL query is designed to ask the system for the lowest achievable dose for a specified ROI, over all prior patients in whom a given volume of that ROI was as close or closer to the target than in the current patient. Results: The additional time needed to calculate OVHs is approximately 1.5 minutes for a typical patient. Database lookup of planning objectives takes approximately 4 seconds. The combined additional time is less than that of a typical single plan optimization (2.5 mins). Conclusions: An automatic treatment planning interface has been successfully used by dosimetrists to quickly produce a number of SBRT pancreas treatment plans. The database can be used to compare dose to individual structures with the toxicity experienced and predict toxicities before planning for future patients.
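    The lookup described above can be sketched against a hypothetical schema: for a given ROI, return the lowest dose achieved among prior patients whose OVH-derived distance to the target was as close or closer than the current patient's. Table and column names below are invented for illustration, not the authors' actual database:

    ```python
    import sqlite3

    # Hypothetical schema: one row per prior patient giving, for an ROI,
    # its OVH-derived distance-to-target and the dose actually achieved.
    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE plans (
        patient_id INTEGER, roi TEXT, ovh_distance REAL, achieved_dose REAL)""")
    con.executemany("INSERT INTO plans VALUES (?, ?, ?, ?)", [
        (1, "duodenum", 0.5, 18.0),
        (2, "duodenum", 1.2, 22.0),
        (3, "duodenum", 2.0, 25.0),
    ])

    def predicted_dose(roi, current_distance):
        """Lowest dose achieved in any prior patient whose ROI was as close
        or closer to the target than the current patient's."""
        row = con.execute(
            "SELECT MIN(achieved_dose) FROM plans "
            "WHERE roi = ? AND ovh_distance <= ?",
            (roi, current_distance)).fetchone()
        return row[0]
    ```

    A patient whose duodenum overlaps the target more than every prior case returns `None`, signalling that no comparable precedent exists in the database.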

  13. Brand and automaticity

    OpenAIRE

    Liu, J.

    2008-01-01

    A presumption of most consumer research is that consumers endeavor to maximize the utility of their choices and are in complete control of their purchasing and consumption behavior. However, everyday life experience suggests that many of our choices are not all that reasoned or conscious. Indeed, automaticity, one facet of behavior, is indispensable to complete the portrait of consumers. Despite its importance, little attention is paid to how the automatic side of behavior can be captured and...

  14. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his...... honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers...... by members of the IFIP Working Group 2.1 of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers...

  15. Development of advanced automatic operation system for nuclear ship. 1. Perfect automatic normal operation

    International Nuclear Information System (INIS)

    Nakazawa, Toshio; Yabuuti, Noriaki; Takahashi, Hiroki; Shimazaki, Junya

    1999-02-01

    Development of operation support systems, such as automatic operating systems and anomaly diagnosis systems for the nuclear reactor, is very important for a practical nuclear ship, because the number of operators is limited and conditions are severe, making it very difficult to receive support from others in the case of an accident. The goal of development of the operation support systems is to realize a perfectly automatic control system for the series of normal operations from reactor start-up to shutdown. The automatic control system for normal operation has been developed based on operating experience with the first Japanese nuclear ship 'Mutsu'. The automation technique was verified against 'Mutsu' plant data from manual operation. Fully automatic control of start-up and shutdown operations was achieved by setting the desired operating values and the limiting values of parameter fluctuation, and by preparing operating programs for the principal equipment such as the main coolant pump and the heaters. This report presents the automatic operation system developed for reactor start-up and shutdown and the verification of the system using the Nuclear Ship Engineering Simulator System. (author)

  16. OpenTox predictive toxicology framework: toxicological ontology and semantic media wiki-based OpenToxipedia.

    Science.gov (United States)

    Tcheremenskaia, Olga; Benigni, Romualdo; Nikolova, Ivelina; Jeliazkova, Nina; Escher, Sylvia E; Batke, Monika; Baier, Thomas; Poroikov, Vladimir; Lagunin, Alexey; Rautenberg, Micha; Hardy, Barry

    2012-04-24

    The OpenTox Framework, developed by the partners in the OpenTox project (http://www.opentox.org), aims at providing unified access to toxicity data, predictive models and validation procedures. Interoperability of resources is achieved using a common information model, based on the OpenTox ontologies, describing predictive algorithms, models and toxicity data. As toxicological data may come from different, heterogeneous sources, a deployed ontology, unifying the terminology and the resources, is critical for the rational and reliable organization of the data, and its automatic processing. The following related ontologies have been developed for OpenTox: a) Toxicological ontology - listing the toxicological endpoints; b) Organs system and Effects ontology - addressing organs, targets/examinations and effects observed in in vivo studies; c) ToxML ontology - representing semi-automatic conversion of the ToxML schema; d) OpenTox ontology - representation of OpenTox framework components: chemical compounds, datasets, types of algorithms, models and validation web services; e) ToxLink-ToxCast assays ontology and f) OpenToxipedia community knowledge resource on toxicology terminology. OpenTox components are made available through standardized REST web services, where every compound, data set, and predictive method has a unique resolvable address (URI), used to retrieve its Resource Description Framework (RDF) representation, or to initiate the associated calculations and generate new RDF-based resources. The services support the integration of toxicity and chemical data from various sources, the generation and validation of computer models for toxic effects, seamless integration of new algorithms and scientifically sound validation routines and provide a flexible framework, which allows building an arbitrary number of applications, tailored to solving different problems by end users (e.g. toxicologists). The OpenTox toxicological ontology projects may be accessed via the Open

  17. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 23.1329 Section 23...: Installation § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can— (1) Be quickly and positively...

  18. 46 CFR 52.01-10 - Automatic controls.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Automatic controls. 52.01-10 Section 52.01-10 Shipping... Requirements § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  19. Operational results from a physical power prediction model

    Energy Technology Data Exchange (ETDEWEB)

    Landberg, L [Risoe National Lab., Meteorology and Wind Energy Dept., Roskilde (Denmark)

    1999-03-01

    This paper will describe a prediction system which predicts the expected power output of a number of wind farms. The system is automatic and operates on-line. The paper will quantify the accuracy of the predictions and will also give examples of the performance for specific storm events. An actual implementation of the system will be described and the robustness demonstrated. (au) 11 refs.

  20. Automatic welding and cladding in heavy fabrication

    International Nuclear Information System (INIS)

    Altamer, A. de

    1980-01-01

    A description is given of the automatic welding processes used by an Italian fabricator of pressure vessels for petrochemical and nuclear plant. The automatic submerged arc welding, submerged arc strip cladding, pulsed TIG, hot wire TIG and MIG welding processes have proved satisfactory in terms of process reliability, metal deposition rate, and cost effectiveness for low alloy and carbon steels. An example shows sequences required during automatic butt welding, including heat treatments. Factors which govern satisfactory automatic welding include automatic anti-drift rotator device, electrode guidance and bead programming system, the capability of single and dual head operation, flux recovery and slag removal systems, operator environment and controls, maintaining continuity of welding and automatic reverse side grinding. Automatic welding is used for: joining vessel sections; joining tubes to tubeplate; cladding of vessel rings and tubes, dished ends and extruded nozzles; nozzle to shell and butt welds, including narrow gap welding. (author)

  1. A prediction rule for the development of delirium among patients in medical wards: Chi-Square Automatic Interaction Detector (CHAID) decision tree analysis model.

    Science.gov (United States)

    Kobayashi, Daiki; Takahashi, Osamu; Arioka, Hiroko; Koga, Shinichiro; Fukui, Tsuguya

    2013-10-01

    To predict development of delirium among patients in medical wards by a Chi-Square Automatic Interaction Detector (CHAID) decision tree model. This was a retrospective cohort study of all adult patients admitted to medical wards at a large community hospital. The subject patients were randomly assigned to either a derivation or validation group (2:1) by computed random number generation. Baseline data and clinically relevant factors were collected from the electronic chart. Primary outcome was the development of delirium during hospitalization. All potential predictors were included in a forward stepwise logistic regression model. CHAID decision tree analysis was also performed to make another prediction model with the same group of patients. Receiver operating characteristic curves were drawn, and the area under the curves (AUCs) were calculated for both models. In the validation group, these receiver operating characteristic curves and AUCs were calculated based on the rules from derivation. A total of 3,570 patients were admitted: 2,400 patients assigned to the derivation group and 1,170 to the validation group. A total of 91 and 51 patients, respectively, developed delirium. Statistically significant predictors were delirium history, age, underlying malignancy, and activities of daily living impairment in CHAID decision tree model, resulting in six distinctive groups by the level of risk. AUC was 0.82 in derivation and 0.82 in validation with CHAID model and 0.78 in derivation and 0.79 in validation with logistic model. We propose a validated CHAID decision tree prediction model to predict the development of delirium among medical patients. Copyright © 2013 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
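    A CHAID tree ultimately yields a small set of risk groups reachable by successive splits. A toy stratification over the four significant predictors named above, with six leaves mirroring the abstract's six groups, might look like the sketch below; the split order, thresholds, and group labels are hypothetical, not the published rule:

    ```python
    def delirium_risk_group(history, age, malignancy, adl_impaired):
        """Toy CHAID-style stratification using the paper's four predictors.
        Thresholds and labels are illustrative only (six terminal groups)."""
        if history:                      # prior delirium dominates
            return "very high"
        if adl_impaired:                 # next split: functional impairment
            return "high" if age >= 75 else "moderate"
        if malignancy:                   # underlying malignancy
            return "moderate"
        return "high" if age >= 85 else "low"  # age alone as last split
    ```

    A decision-tree rule like this is easy to apply at the bedside, which is one reason the authors report it alongside the logistic model despite similar AUCs.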

  2. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are automatically found. The method is illustrated by simple examples. Source code in FORTRAN is provided
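    The idea can be sketched with dual numbers: carrying a derivative component through each elementary operation yields derivatives with no truncation error, as the abstract states. A minimal forward-mode version in Python (the paper itself provides FORTRAN source):

    ```python
    class Dual:
        """Dual number a + b*eps (eps**2 == 0); the 'dot' part carries
        the derivative through every elementary operation."""
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.dot + o.dot)
        __radd__ = __add__
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val,              # value
                        self.val * o.dot + self.dot * o.val)  # product rule
        __rmul__ = __mul__

    def derivative(f, x):
        """Evaluate df/dx exactly by seeding the derivative part with 1."""
        return f(Dual(x, 1.0)).dot

    # d/dx (3x^2 + 2x + 1) at x = 2 is 6x + 2 = 14, exactly
    d = derivative(lambda x: 3 * x * x + 2 * x + 1, 2.0)
    ```

    Unlike finite differences, no step size is chosen, so the result is exact to machine precision for any function built from elementary operations.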

  3. Automatic mental health assistant: monitoring and measuring nonverbal behavior of the crew during long-term missions

    NARCIS (Netherlands)

    Voynarovskaya, N.; Gorbunov, R.D.; Barakova, E.I.; Rauterberg, G.W.M.; Barakova, E.I.; Ruyter, B.; Spink, A.

    2010-01-01

    This paper presents a method for monitoring the mental state of small isolated crews during long-term missions (such as space missions, polar expeditions, submarine crews, and meteorological stations). The research is done as part of the Automatic Mental Health Assistant (AMHA) project, which aims

  4. Online Questionnaires Use with Automatic Feedback for e-Innovation in University Students

    OpenAIRE

    Remesal, Ana; Colomina, Rosa M.; Mauri, Teresa; Rochera, M. José

    2017-01-01

    Technological tools have permeated higher education programs. However, their mere introduction does not guarantee instructional quality. This article presents the results of an innovation project aimed at fostering autonomous learning among students at a Pre-School and Primary Teacher Grade. For one semester all freshmen students used a system for autonomous learning embedded in the institutional online platform (Moodle), which included automatic formative feedback. The system was part of a c...

  5. The ECLSS Advanced Automation Project Evolution and Technology Assessment

    Science.gov (United States)

    Dewberry, Brandon S.; Carnes, James R.; Lukefahr, Brenda D.; Rogers, John S.; Rochowiak, Daniel M.; Mckee, James W.; Benson, Brian L.

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) advanced automation project evolution and technology assessment are presented. Topics covered include: the ECLSS advanced automation project; automatic fault diagnosis of ECLSS subsystems descriptions; in-line, real-time chemical and microbial fluid analysis; and object-oriented, distributed chemical and microbial modeling of regenerative environmental control systems description.

  6. Automatic Peak Selection by a Benjamini-Hochberg-Based Algorithm

    KAUST Repository

    Abbas, Ahmed; Kong, Xin-Bing; Liu, Zhi; Jing, Bing-Yi; Gao, Xin

    2013-01-01

    A common issue in bioinformatics is that computational methods often generate a large number of predictions sorted according to certain confidence scores. A key problem is then determining how many predictions must be selected to include most of the true predictions while maintaining reasonably high precision. In nuclear magnetic resonance (NMR)-based protein structure determination, for instance, computational peak picking methods are becoming more and more common, although expert-knowledge remains the method of choice to determine how many peaks among thousands of candidate peaks should be taken into consideration to capture the true peaks. Here, we propose a Benjamini-Hochberg (B-H)-based approach that automatically selects the number of peaks. We formulate the peak selection problem as a multiple testing problem. Given a candidate peak list sorted by either volumes or intensities, we first convert the peaks into p-values and then apply the B-H-based algorithm to automatically select the number of peaks. The proposed approach is tested on the state-of-the-art peak picking methods, including WaVPeak [1] and PICKY [2]. Compared with the traditional fixed number-based approach, our approach returns significantly more true peaks. For instance, by combining WaVPeak or PICKY with the proposed method, the missing peak rates are on average reduced by 20% and 26%, respectively, in a benchmark set of 32 spectra extracted from eight proteins. The consensus of the B-H-selected peaks from both WaVPeak and PICKY achieves 88% recall and 83% precision, which significantly outperforms each individual method and the consensus method without using the B-H algorithm. The proposed method can be used as a standard procedure for any peak picking method and straightforwardly applied to some other prediction selection problems in bioinformatics. The source code, documentation and example data of the proposed method is available at http://sfb.kaust.edu.sa/pages/software.aspx. © 2013
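    The step-up procedure at the heart of this method is the standard Benjamini-Hochberg rule: accept the largest k whose k-th smallest p-value is at most (k/m)·alpha. A sketch of that rule alone, omitting the paper's conversion of peak volumes/intensities to p-values (the example p-values below are invented):

    ```python
    def benjamini_hochberg(pvalues, alpha=0.05):
        """Return the number of hypotheses (here, peaks) to accept:
        the largest k with p_(k) <= (k / m) * alpha."""
        m = len(pvalues)
        best = 0
        for k, p in enumerate(sorted(pvalues), start=1):
            if p <= k / m * alpha:
                best = k          # step-up: keep the largest passing k
        return best

    # Candidate peaks already converted to p-values
    pvals = [0.001, 0.008, 0.012, 0.04, 0.2, 0.5]
    k = benjamini_hochberg(pvals, alpha=0.05)
    ```

    Note the step-up character: even if an intermediate p-value fails its threshold, a later one that passes rescues everything below it, which is why the procedure tends to accept more true peaks than a fixed cutoff.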

  7. Automatic Peak Selection by a Benjamini-Hochberg-Based Algorithm

    KAUST Repository

    Abbas, Ahmed

    2013-01-07

    A common issue in bioinformatics is that computational methods often generate a large number of predictions sorted according to certain confidence scores. A key problem is then determining how many predictions must be selected to include most of the true predictions while maintaining reasonably high precision. In nuclear magnetic resonance (NMR)-based protein structure determination, for instance, computational peak picking methods are becoming more and more common, although expert-knowledge remains the method of choice to determine how many peaks among thousands of candidate peaks should be taken into consideration to capture the true peaks. Here, we propose a Benjamini-Hochberg (B-H)-based approach that automatically selects the number of peaks. We formulate the peak selection problem as a multiple testing problem. Given a candidate peak list sorted by either volumes or intensities, we first convert the peaks into p-values and then apply the B-H-based algorithm to automatically select the number of peaks. The proposed approach is tested on the state-of-the-art peak picking methods, including WaVPeak [1] and PICKY [2]. Compared with the traditional fixed number-based approach, our approach returns significantly more true peaks. For instance, by combining WaVPeak or PICKY with the proposed method, the missing peak rates are on average reduced by 20% and 26%, respectively, in a benchmark set of 32 spectra extracted from eight proteins. The consensus of the B-H-selected peaks from both WaVPeak and PICKY achieves 88% recall and 83% precision, which significantly outperforms each individual method and the consensus method without using the B-H algorithm. The proposed method can be used as a standard procedure for any peak picking method and straightforwardly applied to some other prediction selection problems in bioinformatics. The source code, documentation and example data of the proposed method is available at http://sfb.kaust.edu.sa/pages/software.aspx. © 2013

  8. Intelligent Materials Tracking System for Construction Projects Management

    Directory of Open Access Journals (Sweden)

    Narimah Kasim

    2015-05-01

    Full Text Available An essential factor adversely affecting the performance of construction projects is the improper handling of materials during site activities. In addition, paper-based reports are mostly used to record and exchange information related to the material components within the supply chain, which is problematic and inefficient. Generally, technologies (such as wireless systems and RFID are not being adequately used to overcome human errors and are not well integrated with project management systems to make tracking and management of materials easier and faster. Findings from a literature review and surveys showed that there is a lack of positive examples of such tools having been used effectively. Therefore, this research focused on the development of a materials tracking system that integrates RFID-based materials management with resources modelling to improve on-site materials tracking. Rapid prototyping was used to develop the system, and testing was carried out to confirm that it functioned appropriately. The proposed system is intended to promote the employment of RFID for automatic materials tracking, integrated with resource modelling (Microsoft® Office Project) in the project management system, in order to establish which of the tagged components are required resources for certain project tasks. In conclusion, the system provides an automatic and easy tracking method for managing materials during materials delivery and inventory management processes in construction projects.

  9. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of); Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)

    2012-05-15

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression test that is needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a lot of time and human resources are needed during the development period of a code, which may delay development. To reduce the cost and the human resources and to prevent wasted time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we have used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel)
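    The authors implemented their tool in VBA on Excel; the core of a non-regression test can be sketched generically as a field-by-field comparison of reference versus updated code outputs. The CSV format, column names, and tolerance below are illustrative assumptions, not the SPACE code's actual output format:

    ```python
    import csv, io

    def non_regression_diff(ref_csv, new_csv, tol=1e-6):
        """Compare two runs of a code (reference vs. updated) field by field;
        return a list of (row, column, ref, new) mismatches beyond tol."""
        ref = list(csv.reader(io.StringIO(ref_csv)))
        new = list(csv.reader(io.StringIO(new_csv)))
        diffs = []
        for i, (r_row, n_row) in enumerate(zip(ref, new)):
            for j, (r, n) in enumerate(zip(r_row, n_row)):
                try:
                    ok = abs(float(r) - float(n)) <= tol
                except ValueError:
                    ok = (r == n)      # non-numeric fields: exact match
                if not ok:
                    diffs.append((i, j, r, n))
        return diffs

    ref = "time,pressure\n0.0,15.5\n1.0,15.4\n"
    new = "time,pressure\n0.0,15.5\n1.0,15.9\n"
    mismatches = non_regression_diff(ref, new)
    ```

    An empty mismatch list means the update did not perturb previously verified results; anything else flags a potential regression for review, which is exactly what the automated NRT run reports after each code change.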

  10. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    International Nuclear Information System (INIS)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W.; Suh, J. S.; Cho, Y. S.; Jeong, J. J.

    2012-01-01

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression test that is needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a lot of time and human resources are needed during the development period of a code, which may delay development. To reduce the cost and the human resources and to prevent wasted time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we have used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel)

  11. Subjective fear, interference by threat, and fear associations independently predict fear-related behavior in children

    NARCIS (Netherlands)

    Klein, A.M.; Kleinherenbrink, A.V.; Simons, C.; de Gier, E.; Klein, S.; Allart, E.; Bögels, S.M.; Becker, E.S.; Rinck, M.

    2012-01-01

    Background and objectives: Several information-processing models highlight the independent roles of controlled and automatic processes in explaining fearful behavior. Therefore, we investigated whether direct measures of controlled processes and indirect measures of automatic processes predict

  12. Scanner OPC signatures: automatic vendor-to-vendor OPE matching

    Science.gov (United States)

    Renwick, Stephen P.

    2009-03-01

    As 193nm lithography continues to be stretched and the k1 factor decreases, optical proximity correction (OPC) has become a vital part of the lithographer's tool kit. Unfortunately, as is now well known, the design variations of lithographic scanners from different vendors cause them to have slightly different optical-proximity effect (OPE) behavior, meaning that they print features through pitch in distinct ways. This in turn means that their response to OPC is not the same, and that an OPC solution designed for a scanner from Company 1 may or may not work properly on a scanner from Company 2. Since OPC is not inexpensive, that causes trouble for chipmakers using more than one brand of scanner. Clearly a scanner-matching procedure is needed to meet this challenge. Previously, automatic matching has only been reported for scanners of different tool generations from the same manufacturer. In contrast, scanners from different companies have been matched using expert tuning and adjustment techniques, frequently requiring laborious test exposures. Automatic matching between scanners from Company 1 and Company 2 has remained an unsettled problem. We have recently solved this problem and introduce a novel method to perform the automatic matching. The success in meeting this challenge required three enabling factors. First, we recognized the strongest drivers of OPE mismatch and are thereby able to reduce the information needed about a tool from another supplier to that information readily available from all modern scanners. Second, we developed a means of reliably identifying the scanners' optical signatures, minimizing dependence on process parameters that can cloud the issue. Third, we carefully employed standard statistical techniques, checking for robustness of the algorithms used and maximizing efficiency. The result is an automatic software system that can predict an OPC matching solution for scanners from different suppliers without requiring expert intervention.

  13. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    Full Text Available This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.

  14. Operationalization of Prediction, Hindcast, and Evaluation Systems using the Freie Univ Evaluation System Framework (Freva) incl. a Showcase in Decadal Climate Prediction

    Science.gov (United States)

    Kadow, Christopher; Illing, Sebastian; Schartner, Thomas; Ulbrich, Uwe; Cubasch, Ulrich

    2017-04-01

    Operationalization processes are important for Weather and Climate Services. Complex data and work flows need to be combined quickly to fulfill the needs of service centers. Standards in data and software formats help in automatic solutions. In this study we show a software solution spanning hindcasts, forecasts, and validation that can be operationalized. Freva (see below) structures data and evaluation procedures and can easily be monitored. Especially in the development process of operationalized services, Freva supports scientists and project partners. The showcase of the decadal climate prediction project MiKlip (fona-miklip.de) shows such a complex development process. Different predictions, scientists' input, tasks, and time-evolving adjustments need to be combined to host precise climate information in a web environment without losing track of its evolution. The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science. Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the meta data information of the self-describing model, reanalysis and observational data sets in a database. This implemented meta data system with its advanced but easy-to-handle search tool supports users, developers and their plugins to retrieve the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the

  15. Power Factor Improvement Using Automatic Power Factor Compensation (APFC Device for Medical Industries in Malaysia

    Directory of Open Access Journals (Sweden)

    Zaidi Maryam Nabihah

    2018-01-01

    Full Text Available This paper presents a project designed to correct the power factor for medical industries in Malaysia automatically, with the aim of making cost and energy usage more efficient, since energy sources are depleting as the population increases. Power factor is the ratio of real power to apparent power. This definition is mathematically represented as kW/kVA, where kW is active power and kVA is apparent power (active + reactive). Reactive power is the non-working power generated by magnetic and inductive loads to produce magnetic flux. An increase in reactive power increases the apparent power, so the power factor decreases. A low power factor causes the industry to draw a high demand, making it less efficient. The main aim of this project is to increase the current power factor of medical industries from 0.85 to 0.90. Power factor compensation contributes to a reduction in current-dependent losses and increases energy efficiency, while improving the reliability of planning for the future energy network. As technology develops, the cost and efficiency penalties should gradually reduce; therefore, automatic power factor compensation devices should become more cost-effective and smaller over time. That is the reason this project uses a programmable device with a miniature architecture.
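The sizing rule behind such a compensation device is standard: to raise the power factor of an active load P from pf1 to pf2, the capacitor bank must supply Qc = P·(tan(arccos pf1) − tan(arccos pf2)) of reactive power. The 0.85 → 0.90 targets come from the abstract; the 500 kW load is a hypothetical example:

```python
import math

# APFC sizing sketch: reactive power (kvar) a capacitor bank must supply
# to raise the power factor of a p_kw active load from pf_from to pf_to.

def capacitor_kvar(p_kw, pf_from, pf_to):
    """Qc = P * (tan(acos(pf_from)) - tan(acos(pf_to)))."""
    return p_kw * (math.tan(math.acos(pf_from)) - math.tan(math.acos(pf_to)))

p = 500.0  # hypothetical active load, kW
qc = capacitor_kvar(p, 0.85, 0.90)
print(f"Required compensation: {qc:.1f} kvar")
```

An automatic device applies this calculation continuously, switching capacitor stages in and out as the measured load changes.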

  16. KIT/KPS of Qinshan phase-II and a discussion on integrated information management and automatic control

    International Nuclear Information System (INIS)

    Yan Changhui

    2001-01-01

    The Centralized Data Processing and Safety Panel (KIT/KPS) of the Qinshan Phase-II power project is described, and the necessity of, and an engineering scheme for, the integrated information management and automatic control to be achieved in the power plant are presented, according to the technical scheme and characteristics of KIT/KPS

  17. The Development of Automatic Sequences for the RF and Cryogenic Systems at the Spallation Neutron Source

    International Nuclear Information System (INIS)

    Gurd, Pamela; Casagrande, Fabio; Mccarthy, Michael; Strong, William; Ganni, Venkatarao

    2005-01-01

    Automatic sequences both ease the task of operating a complex machine and ensure procedural consistency. At the Spallation Neutron Source project (SNS), a set of automatic sequences has been developed to perform the start up and shut down of the high power RF systems. Similarly, sequences have been developed to perform backfill, pump down, automatic valve control and energy management in the cryogenic system. The sequences run on Linux soft input-output controllers (IOCs), which are similar to ordinary EPICS (Experimental Physics and Industrial Control System) IOCs in terms of data sharing with other EPICS processes, but which share a Linux processor with other such processes. Each sequence waits for a command from an operator console and starts the corresponding set of instructions, allowing operators to follow the sequences either from an overview screen or from detail screens. We describe each system and our operational experience with it.

  18. Automatic processing of unattended lexical information in visual oddball presentation: neurophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yury eShtyrov

    2013-08-01

    Full Text Available Previous electrophysiological studies of automatic language processing revealed early (100-200 ms) reflections of access to lexical characteristics of the speech signal using the so-called mismatch negativity (MMN, a negative ERP deflection elicited by infrequent irregularities in unattended repetitive auditory stimulation. In those studies, lexical processing of spoken stimuli became manifest as an enhanced ERP in response to unattended real words as opposed to phonologically matched but meaningless pseudoword stimuli. This lexical ERP enhancement was explained by automatic activation of word memory traces realised as distributed strongly intra-connected neuronal circuits, whose robustness guarantees memory trace activation even in the absence of attention to the spoken input. Such an account would predict the automatic activation of these memory traces upon any presentation of linguistic information, irrespective of the presentation modality. As previous lexical MMN studies exclusively used auditory stimulation, we here adapted the lexical MMN paradigm to investigate early automatic lexical effects in the visual modality. In a visual oddball sequence, matched short word and pseudoword stimuli were presented tachistoscopically in the perifoveal area outside the visual focus of attention, as the subjects’ attention was concentrated on a concurrent non-linguistic visual dual task in the centre of the screen. Using EEG, we found a visual analogue of the lexical ERP enhancement effect, with unattended written words producing larger brain response amplitudes than matched pseudowords, starting at ~100 ms. Furthermore, we also found significant visual MMN, reported here for the first time for unattended lexical stimuli presented perifoveally. The data suggest early automatic lexical processing of visually presented language outside the focus of attention.

  19. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations for and the different algorithms of automatic document summarization (ADS), and presents a recent state of the art. The book shows the main problems of ADS, the difficulties involved, and the solutions provided by the community. It presents recent advances in ADS as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been developed
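A minimal frequency-based extractive summarizer in the spirit of the classical statistical approaches such books survey (Luhn-style sentence scoring) can be sketched as follows; this is a generic illustration, not an algorithm taken from the book:

```python
# Minimal extractive summarizer: score sentences by the corpus frequency
# of their content words, then emit the top-scoring sentences in order.

import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "and", "is", "in", "to", "it", "this"}

def summarize(text, n_sentences=1):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    freq = Counter(words)
    def score(s):
        toks = [w for w in re.findall(r"[a-z']+", s.lower()) if w not in STOPWORDS]
        return sum(freq[w] for w in toks) / (len(toks) or 1)
    ranked = sorted(sentences, key=score, reverse=True)[:n_sentences]
    # keep the selected sentences in their original document order
    return " ".join(s for s in sentences if s in ranked)

doc = ("Automatic summarization selects the most informative sentences. "
       "The weather was pleasant yesterday. "
       "Sentence scoring uses word frequency to rank informative sentences.")
print(summarize(doc))
```

The off-topic "weather" sentence scores lowest and is dropped; real systems add positional cues, redundancy control, and linguistic features on top of this core.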

  20. Automatic Shadow Detection and Removal from a Single Image.

    Science.gov (United States)

    Khan, Salman H; Bennamoun, Mohammed; Sohel, Ferdous; Togneri, Roberto

    2016-03-01

    We present a framework to automatically detect and remove shadows in real world scenes from a single image. Previous works on shadow detection put a lot of effort in designing shadow variant and invariant hand-crafted features. In contrast, our framework automatically learns the most relevant features in a supervised manner using multiple convolutional deep neural networks (ConvNets). The features are learned at the super-pixel level and along the dominant boundaries in the image. The predicted posteriors based on the learned features are fed to a conditional random field model to generate smooth shadow masks. Using the detected shadow masks, we propose a Bayesian formulation to accurately extract shadow matte and subsequently remove shadows. The Bayesian formulation is based on a novel model which accurately models the shadow generation process in the umbra and penumbra regions. The model parameters are efficiently estimated using an iterative optimization procedure. Our proposed framework consistently performed better than the state-of-the-art on all major shadow databases collected under a variety of conditions.

  1. SU-F-R-05: Multidimensional Imaging Radiomics-Geodesics: A Novel Manifold Learning Based Automatic Feature Extraction Method for Diagnostic Prediction in Multiparametric Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, V [The Johns Hopkins University, Computer Science. Baltimore, MD (United States); Jacobs, MA [The Johns Hopkins University School of Medicine, Dept of Radiology and Oncology. Baltimore, MD (United States)

    2016-06-15

    Purpose: Multiparametric radiological imaging is used for diagnosis in patients. Automatically extracting useful features specific to a patient’s pathology would be a crucial step towards personalized medicine and assessing treatment options. In order to extract features directly from multiparametric radiological imaging datasets, we developed an advanced unsupervised machine learning algorithm called multidimensional imaging radiomics-geodesics (MIRaGe). Methods: Seventy-six breast tumor patients who underwent 3T breast MRI were used for this study. We tested the ability of the MIRaGe algorithm to extract features for classification of breast tumors as benign or malignant. The MRI parameters used were T1-weighted, T2-weighted, dynamic contrast enhanced MR imaging (DCE-MRI) and diffusion weighted imaging (DWI). The MIRaGe algorithm extracted the radiomics-geodesics features (RGFs) from the multiparametric MRI datasets. This enables our method to learn the intrinsic manifold representations corresponding to the patients. To determine the informative RGFs, a modified Isomap algorithm (t-Isomap) was created for a radiomics-geodesics feature space (tRGFS) to avoid overfitting. Final classification was performed using an SVM. The predictive power of the RGFs was tested and validated using k-fold cross validation. Results: The RGFs extracted by the MIRaGe algorithm successfully classified malignant lesions from benign lesions with a sensitivity of 93% and a specificity of 91%. The top 50 RGFs identified as the most predictive by the t-Isomap procedure were consistent with the radiological parameters known to be associated with breast cancer diagnosis and were categorized as kinetic curve characterizing RGFs, wash-in rate characterizing RGFs, wash-out rate characterizing RGFs and morphology characterizing RGFs. Conclusion: In this paper, we developed a novel feature extraction algorithm for multiparametric radiological imaging. The results demonstrated the power of the MIRa
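The reported validation metrics, sensitivity and specificity, are simple to compute from a binary confusion matrix; the sketch below uses made-up benign/malignant labels for illustration:

```python
# Sensitivity (true-positive rate) and specificity (true-negative rate)
# for a binary classifier, computed from paired true/predicted labels.

def sensitivity_specificity(y_true, y_pred, positive=1):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    return tp / (tp + fn), tn / (tn + fp)

y_true = [1, 1, 1, 0, 0, 0, 1, 0]   # 1 = malignant, 0 = benign (hypothetical)
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
sens, spec = sensitivity_specificity(y_true, y_pred)
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```

In a k-fold cross-validation setting, as used in the abstract, these values are averaged over the held-out folds.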

  2. SU-F-R-05: Multidimensional Imaging Radiomics-Geodesics: A Novel Manifold Learning Based Automatic Feature Extraction Method for Diagnostic Prediction in Multiparametric Imaging

    International Nuclear Information System (INIS)

    Parekh, V; Jacobs, MA

    2016-01-01

    Purpose: Multiparametric radiological imaging is used for diagnosis in patients. Automatically extracting useful features specific to a patient’s pathology would be a crucial step towards personalized medicine and assessing treatment options. In order to extract features directly from multiparametric radiological imaging datasets, we developed an advanced unsupervised machine learning algorithm called multidimensional imaging radiomics-geodesics (MIRaGe). Methods: Seventy-six breast tumor patients who underwent 3T breast MRI were used for this study. We tested the ability of the MIRaGe algorithm to extract features for classification of breast tumors as benign or malignant. The MRI parameters used were T1-weighted, T2-weighted, dynamic contrast enhanced MR imaging (DCE-MRI) and diffusion weighted imaging (DWI). The MIRaGe algorithm extracted the radiomics-geodesics features (RGFs) from the multiparametric MRI datasets. This enables our method to learn the intrinsic manifold representations corresponding to the patients. To determine the informative RGFs, a modified Isomap algorithm (t-Isomap) was created for a radiomics-geodesics feature space (tRGFS) to avoid overfitting. Final classification was performed using an SVM. The predictive power of the RGFs was tested and validated using k-fold cross validation. Results: The RGFs extracted by the MIRaGe algorithm successfully classified malignant lesions from benign lesions with a sensitivity of 93% and a specificity of 91%. The top 50 RGFs identified as the most predictive by the t-Isomap procedure were consistent with the radiological parameters known to be associated with breast cancer diagnosis and were categorized as kinetic curve characterizing RGFs, wash-in rate characterizing RGFs, wash-out rate characterizing RGFs and morphology characterizing RGFs. Conclusion: In this paper, we developed a novel feature extraction algorithm for multiparametric radiological imaging. The results demonstrated the power of the MIRa

  3. ESIP's Earth Science Knowledge Graph (ESKG) Testbed Project: An Automatic Approach to Building Interdisciplinary Earth Science Knowledge Graphs to Improve Data Discovery

    Science.gov (United States)

    McGibbney, L. J.; Jiang, Y.; Burgess, A. B.

    2017-12-01

    Big Earth observation data have been produced, archived and made available online, but discovering the right data in a manner that precisely and efficiently satisfies user needs presents a significant challenge to the Earth Science (ES) community. An emerging trend in the information retrieval community is to utilize knowledge graphs to assist users in quickly finding desired information from across knowledge sources. This is particularly prevalent within the fields of social media and complex multimodal information processing, to name but a few; however, building a domain-specific knowledge graph is labour-intensive and hard to keep up-to-date. In this work, we update our progress on the Earth Science Knowledge Graph (ESKG) project; an ESIP-funded testbed project which provides an automatic approach to building a dynamic knowledge graph for ES to improve interdisciplinary data discovery by leveraging implicit, latent existing knowledge present across several U.S. Federal Agencies, e.g. NASA, NOAA and USGS. ESKG strengthens ties between observations and user communities by: 1) developing a knowledge graph derived from various sources, e.g. Web pages, Web Services, etc., via natural language processing and knowledge extraction techniques; 2) allowing users to traverse, explore, query, reason and navigate ES data via knowledge graph interaction. ESKG has the potential to revolutionize the way in which ES communities interact with ES data in the open world through the entity, spatial and temporal linkages and characteristics that make it up. This project enables the advancement of ESIP collaboration areas, including both Discovery and Semantic Technologies, by putting graph information right at our fingertips in an interactive, modern manner and reducing the effort of constructing ontologies. To demonstrate the ESKG concept, we will demonstrate use of our framework across NASA JPL's PO.DAAC, NOAA's Earth Observation Requirements Evaluation System (EORES) and various USGS

  4. 30 CFR 77.314 - Automatic temperature control instruments.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic temperature control instruments. 77... UNDERGROUND COAL MINES Thermal Dryers § 77.314 Automatic temperature control instruments. (a) Automatic temperature control instruments for thermal dryer system shall be of the recording type. (b) Automatic...

  5. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, state space, state-space analysis of electrical systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, the procedure for drawing a root locus, frequency response, and the design of control systems.
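Several of the listed topics (time response, stability, control design) come together in a small numeric example: a discrete PI controller driving a first-order plant toward a unit-step setpoint, simulated with forward Euler. The gains and time constant are illustrative choices, not values from the book:

```python
# Discrete-time simulation of a PI controller on a first-order plant
# y' = (-y + u)/tau, stepped with forward Euler.

def simulate_pi(kp=2.0, ki=1.0, tau=1.0, dt=0.01, t_end=10.0, setpoint=1.0):
    y, integ, t = 0.0, 0.0, 0.0
    while t < t_end:
        err = setpoint - y
        integ += err * dt
        u = kp * err + ki * integ          # PI control law
        y += dt * (-y + u) / tau           # forward-Euler plant update
        t += dt
    return y

final = simulate_pi()
print(f"output after 10 s: {final:.4f}")   # settles near the setpoint
```

The integral term removes the steady-state error a pure proportional controller would leave; with these gains the closed-loop poles are real and stable, so the output converges without oscillation.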

  6. Development of Lightning Observation Network in the Western Pacific Region for the Intensity Prediction of Severe Weather

    Science.gov (United States)

    Sato, M.; Takahashi, Y.; Yamashita, K.; Kubota, H.; Hamada, J. I.; Momota, E.; Marciano, J. J.

    2017-12-01

    Lightning activity represents thunderstorm activity, that is, the precipitation and/or updraft intensity and area. Thunderstorm activity is also an important parameter in terms of the energy input from the ocean to the atmosphere inside a tropical cyclone, which is one type of severe weather event. Recent studies suggest that it is possible to predict the maximum wind velocity and minimum pressure near the center of a tropical cyclone one or two days in advance if the lightning activity in the tropical cyclone is monitored. Many countries in the western Pacific region suffer from tropical cyclone (typhoon) strikes and have a strong demand for predicting the intensity development of typhoons. Thus, we started developing a new lightning observation system and installing it at Guam, Palau, and Manila in the Philippines this summer. The lightning observation system consists of a VLF sensor detecting lightning-excited electromagnetic waves in the frequency range of 1-5 kHz, an automatic data-processing unit, solar panels, and batteries. Lightning-excited pulse signals detected by the VLF sensor are automatically analyzed by the data-processing unit, and only the extracted information on the trigger time and pulse amplitude is transmitted to a data server via 3G data communications. In addition, we are now developing an upgraded lightning and weather observation system, which will be installed at 50 automated weather stations in Metro Manila and 10 radar sites in the Philippines under the 5-year project (SATREPS) scheme. At the presentation, we will show the initial results derived from the lightning observation system in detail and the detailed future plan of the SATREPS project.

  7. An Automated Defect Prediction Framework using Genetic Algorithms: A Validation of Empirical Studies

    Directory of Open Access Journals (Sweden)

    Juan Murillo-Morera

    2016-05-01

    Full Text Available Today, it is common for software projects to collect measurement data through their development processes. With these data, defect prediction software can try to estimate the defect proneness of a software module, with the objective of assisting and guiding software practitioners. With timely and accurate defect predictions, practitioners can focus their limited testing resources on higher-risk areas. This paper reports the results of three empirical studies that use an automated genetic defect prediction framework. This framework generates and compares different learning schemes (preprocessing + attribute selection + learning algorithm) and selects the best one using a genetic algorithm, with the objective of estimating the defect proneness of a software module. The first empirical study is a performance comparison of our framework with the most important framework in the literature. The second is a performance and runtime comparison between our framework and an exhaustive framework. The third is a sensitivity analysis, and it is our main contribution in this paper. Performance of the software defect prediction models (using AUC, Area Under the Curve) was validated using the NASA-MDP and PROMISE data sets. Seventeen data sets from NASA-MDP (13) and PROMISE (4) projects were analyzed running an NxM-fold cross-validation. A genetic algorithm was used to select the components of the learning schemes automatically, and to assess and report the results. Our results showed similar performance between the frameworks, with our framework achieving a better runtime than the exhaustive framework. Finally, we report the best configuration according to the sensitivity analysis.
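The framework's central idea, a genetic algorithm searching the space of learning schemes, can be sketched as follows; the fitness function here is a made-up deterministic AUC stand-in, not real NxM-fold cross-validated training, and the component lists are illustrative:

```python
# Toy GA over learning schemes (preprocessing + attribute selection +
# learning algorithm). Fitness is a hypothetical AUC lookup.

import random

PREPROC = ["none", "log", "normalize"]
SELECT  = ["none", "chi2", "infogain"]
LEARNER = ["nb", "knn", "tree", "logreg"]

def fitness(scheme):
    """Deterministic pseudo-AUC standing in for cross-validated evaluation."""
    p, s, l = scheme
    return 0.5 + 0.04 * p + 0.03 * s + 0.05 * l - 0.01 * ((p + s + l) % 3)

def random_scheme(rng):
    return (rng.randrange(len(PREPROC)), rng.randrange(len(SELECT)),
            rng.randrange(len(LEARNER)))

def evolve(generations=30, pop_size=12, seed=1):
    rng = random.Random(seed)
    pop = [random_scheme(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]            # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, 3)               # one-point crossover
            child = list(a[:cut] + b[cut:])
            if rng.random() < 0.2:                  # mutation
                g = rng.randrange(3)
                child[g] = rng.randrange(len((PREPROC, SELECT, LEARNER)[g]))
            children.append(tuple(child))
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print("best scheme:", PREPROC[best[0]], SELECT[best[1]], LEARNER[best[2]],
      "AUC =", round(fitness(best), 3))
```

In the real framework each fitness evaluation trains and cross-validates a model, which is exactly why a GA is attractive compared with the exhaustive alternative.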

  8. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    International Nuclear Information System (INIS)

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software requiring high reliability. In SFTA, experts predict failures of a system through HAZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees for the failures. The quality and cost of the software fault tree therefore depend on the knowledge and experience of the experts. This paper proposes a CASE tool, NuFTA, to assist experts in safety analysis. NuFTA automatically generates software fault trees from NuSCR formal requirements specifications. NuSCR is a formal specification language used for specifying the software requirements of the KNICS RPS (Reactor Protection System) in Korea. We used previously proposed SFTA templates in order to generate the fault trees automatically. NuFTA also generates logical formulae summarizing each failure's cause, and we plan to make use of these formulae through formal verification techniques

  9. [Landmark-based automatic registration of serial cross-sectional images of Chinese digital human using Photoshop and Matlab software].

    Science.gov (United States)

    Su, Xiu-yun; Pei, Guo-xian; Yu, Bin; Hu, Yan-ling; Li, Jin; Huang, Qian; Li, Xu; Zhang, Yuan-zhi

    2007-12-01

    This paper describes automatic registration of serial cross-sectional images of the Chinese digital human by a projective registration method based on landmarks, using the commercially available software Photoshop and Matlab. During cadaver embedment for acquisition of the Chinese digital human images, 4 rods were placed parallel to the vertical axis of the frozen cadaver to allow orientation. Projective distortion of the rod positions on the cross-sectional images was inevitable due to even slight changes in the relative position of the camera. The original cross-sectional images were first processed using Photoshop to obtain the images of the orientation rods, and the centroid coordinate of every rod image was acquired with Matlab. With the average coordinate value of the rods as the fiducial point, the two-dimensional projective transformation coefficients of each image were determined. Projective transformation was then carried out and the projective distortion in each original serial image was eliminated. The rectified cross-sectional images were again processed using Photoshop to obtain the image of the first orientation rod, the coordinate value of the first rod image was calculated using Matlab, and the cross-sectional images were cut into images of the same size according to the spatial coordinate of the first rod, to achieve automatic registration of the serial cross-sectional images. Using Photoshop and Matlab, projective transformation can accurately accomplish image registration for the serial images with simpler calculation and easier computer processing.
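One step of the pipeline, computing a rod centroid and shifting an image so the centroid lands on a fiducial coordinate, can be sketched as below. The full method also removes projective distortion, which is omitted here, and the tiny binary masks are stand-ins for real image data:

```python
# Centroid-based alignment sketch: find the centroid of a segmented
# orientation-rod mask, then translate the image so that the centroid
# coincides with a fixed reference (fiducial) coordinate.

def centroid(mask):
    """Centroid (row, col) of nonzero pixels in a binary mask."""
    pts = [(r, c) for r, row in enumerate(mask) for c, v in enumerate(row) if v]
    n = len(pts)
    return (sum(r for r, _ in pts) / n, sum(c for _, c in pts) / n)

def translate(mask, dr, dc):
    """Shift a mask by (dr, dc), dropping pixels that leave the frame."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w:
                out[rr][cc] = mask[r][c]
    return out

mask = [[0, 0, 1, 1],
        [0, 0, 1, 1],
        [0, 0, 0, 0],
        [0, 0, 0, 0]]
ref = (1.5, 1.5)                     # fiducial coordinate to register onto
cr, cc = centroid(mask)
aligned = translate(mask, round(ref[0] - cr), round(ref[1] - cc))
print("centroid before:", (cr, cc), "after:", centroid(aligned))
```

The paper's projective step additionally warps each slice so all four rod centroids land on their fiducial positions, not just one.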

  10. Detection technology research on the one-way clutch of automatic brake adjuster

    Science.gov (United States)

    Jiang, Wensong; Luo, Zai; Lu, Yi

    2013-10-01

    In this article, we provide a new testing method to evaluate the acceptable quality of the one-way clutch of an automatic brake adjuster. To analyze the suitable adjusting brake moment that keeps the automatic brake adjuster out of failure, we build a mechanical model of the one-way clutch according to its structure and working principle. The ranges of the adjusting brake moment, both clockwise and anti-clockwise, can be calculated through the mechanical model, and the critical moments are taken as the ideal values of the adjusting brake moment for evaluating the acceptable quality of the one-way clutch. We calculate the ideal values of the critical moment for the different structures of the one-way clutch, based on its mechanical model, before the adjusting brake moment test begins. In addition, an experimental apparatus, whose measurement uncertainty is ±0.1 Nm, was specially designed to test the adjusting brake moment both clockwise and anti-clockwise. We can then judge the acceptable quality of the one-way clutch by comparing the test results with the ideal values instead of the EXP. In fact, the evaluation standard for the adjusting brake moment currently applied in China still uses the EXP provided by the manufacturer, but it becomes unavailable when the material of the one-way clutch changes. Five kinds of automatic brake adjusters were used in a verification experiment to check the accuracy of the test method. The experimental results show that the experimental values of the adjusting brake moment, both clockwise and anti-clockwise, are within the ranges of the theoretical results. The testing method provided by this article thus meets the requirements of the manufacturer's standard.
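The acceptance decision described here, comparing a measured adjusting brake moment against the theoretical range from the mechanical model, reduces to a bounds check that allows for the instrument's ±0.1 Nm uncertainty. The numeric limits below are placeholders, not the paper's values:

```python
# Acceptance-test sketch: a measurement passes if it lies within the
# theoretical moment range, widened by the instrument uncertainty.

UNCERTAINTY = 0.1  # Nm, measurement uncertainty quoted in the abstract

def accept(measured_nm, lower_nm, upper_nm, u=UNCERTAINTY):
    """True if measured_nm lies inside [lower_nm, upper_nm] once the
    +/- u instrument uncertainty is allowed at the boundaries."""
    return (lower_nm - u) <= measured_nm <= (upper_nm + u)

# Hypothetical theoretical ranges from the mechanical model:
cw_range  = (4.0, 6.5)    # clockwise, Nm
acw_range = (3.5, 5.8)    # anti-clockwise, Nm

print(accept(6.55, *cw_range))   # inside, once uncertainty is allowed
print(accept(7.0, *cw_range))    # clearly out of range -> reject
```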

  11. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    International Nuclear Information System (INIS)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to operate the necessary control equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance of this system was evaluated by comparing results obtained using the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship 'Mutsu'. As a result, it was shown that this system has sufficient performance to bring a reactor to a hot stand-by condition quickly and safely. (author)

  12. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    Energy Technology Data Exchange (ETDEWEB)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to operate the necessary control equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance of this system was evaluated by comparing results obtained using the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship 'Mutsu'. As a result, it was shown that this system has sufficient performance to bring a reactor to a hot stand-by condition quickly and safely. (author)

  13. Evaluation of CASP8 model quality predictions

    KAUST Repository

    Cozzetto, Domenico; Kryshtafovych, Andriy; Tramontano, Anna

    2009-01-01

    established a prediction category to evaluate their performance in 2006. In 2008 the experiment was repeated and its results are reported here. Participants were invited to infer the correctness of the protein models submitted by the registered automatic

  14. Deep Predictive Models in Interactive Music

    OpenAIRE

    Martin, Charles P.; Ellefsen, Kai Olav; Torresen, Jim

    2018-01-01

    Automatic music generation is a compelling task where much recent progress has been made with deep learning models. In this paper, we ask how these models can be integrated into interactive music systems; how can they encourage or enhance the music making of human users? Musical performance requires prediction to operate instruments, and perform in groups. We argue that predictive models could help interactive systems to understand their temporal context, and ensemble behaviour. Deep learning...

  15. Automatic exchange unit for control rod drive device

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

    Purpose: To enable automatic restart and continued operation, without an external power-interruption remedy device, when the interrupted power source is recovered during automatic positioning operation. Constitution: In an automatic exchange unit for a control rod drive device, of the control type that drives the deviation between the positioning target position and the present position of the device to zero, the position data for the positioning target of the drive device is automatically read, and an operation-inhibit interlock is applied to the control system until the data reading is completed and the conditions for starting or restarting automatic operation are sequentially confirmed. After the confirmation, the interlock is released to start the automatic operation or reoperation. Accordingly, the automatic operation can be safely restarted and continued. (Yoshihara, H.)
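The interlock logic summarized above can be sketched as a small state machine: operation is inhibited until the target data is loaded and the start conditions are confirmed, after which positioning drives the deviation to zero. The class and its one-unit bounded step are illustrative, not the actual plant logic:

```python
# Interlock-guarded positioning sketch: step() refuses to move until the
# target has been read and start conditions confirmed (interlock released).

class Positioner:
    def __init__(self, position=0):
        self.position = position
        self.target = None
        self.interlocked = True      # operation inhibited by default

    def load_target(self, target, start_conditions_ok):
        self.target = target
        if start_conditions_ok:      # release interlock only after confirmation
            self.interlocked = False

    def step(self):
        """One positioning step toward zero deviation; inhibited while interlocked."""
        if self.interlocked or self.target is None:
            return False
        deviation = self.target - self.position
        self.position += max(-1, min(1, deviation))  # bounded move toward target
        return True

p = Positioner()
moved = p.step()                     # False: interlock still applied
p.load_target(3, start_conditions_ok=True)
while p.position != p.target:
    p.step()
print("moved before release:", moved, "| final position:", p.position)
```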

  16. Automatic learning-based beam angle selection for thoracic IMRT

    International Nuclear Information System (INIS)

    Amit, Guy; Marshall, Andrea; Purdie, Thomas G.; Jaffray, David A.; Levinshtein, Alex; Hope, Andrew J.; Lindsay, Patricia; Pekar, Vladimir

    2015-01-01

    Purpose: The treatment of thoracic cancer using external beam radiation requires an optimal selection of the radiation beam directions to ensure effective coverage of the target volume and to avoid unnecessary treatment of normal healthy tissues. Intensity modulated radiation therapy (IMRT) planning is a lengthy process, which requires the planner to iterate between choosing beam angles, specifying dose–volume objectives and executing IMRT optimization. In thorax treatment planning, where there are no class solutions for beam placement, beam angle selection is performed manually, based on the planner’s clinical experience. The purpose of this work is to propose and study a computationally efficient framework that utilizes machine learning to automatically select treatment beam angles. Such a framework may be helpful for reducing the overall planning workload. Methods: The authors introduce an automated beam selection method, based on learning the relationships between beam angles and anatomical features. Using a large set of clinically approved IMRT plans, a random forest regression algorithm is trained to map a multitude of anatomical features into an individual beam score. An optimization scheme is then built to select and adjust the beam angles, considering the learned interbeam dependencies. The validity and quality of the automatically selected beams were evaluated using the manually selected beams from the corresponding clinical plans as the ground truth. Results: The analysis included 149 clinically approved thoracic IMRT plans. For a randomly selected test subset of 27 plans, IMRT plans were generated using automatically selected beams and compared to the clinical plans. The comparison of the predicted and the clinical beam angles demonstrated a good average correspondence between the two (angular distance 16.8° ± 10°, correlation 0.75 ± 0.2). The dose distributions of the semiautomatic and clinical plans were equivalent in terms of primary target volume
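The optimization step, selecting beams from learned per-angle scores while respecting interbeam relationships, can be approximated with a greedy sketch; the score table below is invented, and a simple minimum angular separation stands in for the learned interbeam dependencies:

```python
# Greedy beam-angle selection sketch: pick the highest-scoring angles
# while enforcing a minimum angular separation between chosen beams.

def angular_distance(a, b):
    d = abs(a - b) % 360
    return min(d, 360 - d)

def select_beams(scores, n_beams=5, min_sep=20):
    """scores: dict angle -> learned beam score (higher = better)."""
    chosen = []
    for angle in sorted(scores, key=scores.get, reverse=True):
        if all(angular_distance(angle, c) >= min_sep for c in chosen):
            chosen.append(angle)
        if len(chosen) == n_beams:
            break
    return sorted(chosen)

# Hypothetical learned scores on a 10-degree grid (peaked near 180):
scores = {a: 1.0 - abs(a - 180) / 360 + (0.2 if a % 40 == 0 else 0)
          for a in range(0, 360, 10)}
print(select_beams(scores))
```

The published method goes further, adjusting angles after selection; the greedy pass above only illustrates how learned scores and a separation constraint interact.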

  17. User's operating procedures. Volume 1: Scout project information programs

    Science.gov (United States)

    Harris, C. G.; Harris, D. K.

    1985-01-01

    A review of the user's operating procedures for the Scout Project Automatic Data System, called SPADS, is given. SPADS is the result of the past seven years of software development on a Prime minicomputer located at the Scout Project Office. SPADS was developed as a single-entry, multiple cross-reference data management and information retrieval system for the automation of Project Office tasks, including engineering, financial, managerial, and clerical support. Instructions for operating the Scout Project Information programs in data retrieval and file maintenance via the user-friendly menu drivers are presented.

  18. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book covers methods of position determination and their characteristics, control methods for position determination and design considerations, sensor choice for position detectors, position determination in digital control systems, application of the clutch brake in high-frequency position determination, automation techniques for position determination, position determination by electromagnetic clutch and brake, air cylinder, cam and solenoid, stop-position control of automatic guided vehicles, stacker cranes, and automatic transfer control.

  19. Automatic single- and multi-label enzymatic function prediction by machine learning

    Directory of Open Access Journals (Sweden)

    Shervine Amidi

    2017-03-01

    Full Text Available The number of protein structures in the PDB database has increased more than 15-fold since 1999. The creation of computational models predicting enzymatic function is of major importance since such models provide the means to better understand the behavior of newly discovered enzymes when catalyzing chemical reactions. Until now, single-label classification has been widely performed for predicting enzymatic function, limiting the application to enzymes performing unique reactions and introducing errors when multi-functional enzymes are examined. Indeed, some enzymes may perform different reactions and can hence be directly associated with multiple enzymatic functions. In the present work, we propose a multi-label enzymatic function classification scheme that combines structural and amino acid sequence information. We investigate two fusion approaches (at the feature level and at the decision level) and assess the methodology for general enzymatic function prediction indicated by the first digit of the Enzyme Commission (EC) code (six main classes) on 40,034 enzymes from the PDB database. The proposed single-label and multi-label models predict correctly the actual functional activities in 97.8% and 95.5% (based on Hamming loss) of the cases, respectively. Also, the multi-label model predicts all possible enzymatic reactions in 85.4% of the multi-labeled enzymes when the number of reactions is unknown. Code and datasets are available at https://figshare.com/s/a63e0bafa9b71fc7cbd7.
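The multi-label figure quoted above is scored with Hamming loss, i.e., the fraction of individual label slots predicted incorrectly, averaged over samples. A minimal sketch on toy multi-hot EC-class vectors (illustrative data, not the paper's dataset):

```python
import numpy as np

def hamming_loss(y_true, y_pred):
    """Fraction of label slots predicted incorrectly, averaged over all slots."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    return np.mean(y_true != y_pred)

# Toy example: 3 enzymes, 6 possible EC main classes (multi-hot label vectors).
y_true = [[1, 0, 0, 0, 0, 0],
          [1, 1, 0, 0, 0, 0],
          [0, 0, 0, 1, 0, 0]]
y_pred = [[1, 0, 0, 0, 0, 0],
          [1, 0, 0, 0, 0, 0],   # missed the second function of the second enzyme
          [0, 0, 0, 1, 0, 0]]
print(hamming_loss(y_true, y_pred))  # → 1/18 ≈ 0.056 (one wrong slot out of 18)
```

A Hamming-loss-based accuracy of 95.5%, as reported, therefore means that about 4.5% of all label slots are mispredicted, not that 4.5% of enzymes are entirely misclassified.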

  20. A neurocomputational model of automatic sequence production.

    Science.gov (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory

    2015-07-01

    Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the BG in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.

  1. Watershed-scale evaluation of the Water Erosion Prediction Project (WEPP) model in the Lake Tahoe basin

    Science.gov (United States)

    Erin S. Brooks; Mariana Dobre; William J. Elliot; Joan Q. Wu; Jan Boll

    2016-01-01

    Forest managers need methods to evaluate the impacts of management at the watershed scale. The Water Erosion Prediction Project (WEPP) has the ability to model disturbed forested hillslopes, but has difficulty addressing some of the critical processes that are important at a watershed scale, including baseflow and water yield. In order to apply WEPP to...

  2. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    Science.gov (United States)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and is the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been used as an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images, due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application dependent, while manual tracing methods are extremely time consuming and have a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to counterbalance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach to improve the boundary searching ability of the live wire method to reduce necessary user interaction while keeping the segmentation performance. Based on the results of segmentation of 50 2D breast lesions in US images, less user interaction is required to achieve desired accuracy, i.e. < 80%, when auto-enhancement is applied for live-wire segmentation.

  3. The ‘Continuing Misfortune’ of Automatism in Early Surrealism

    Directory of Open Access Journals (Sweden)

    Tessel M. Bauduin

    2015-09-01

    Full Text Available In the 1924 Manifesto of Surrealism surrealist leader André Breton (1896-1966 defined Surrealism as ‘psychic automatism in its pure state,’ positioning ‘psychic automatism’ as both a concept and a technique. This definition followed upon an intense period of experimentation with various forms of automatism among the proto-surrealist group; predominantly automatic writing, but also induced dream states. This article explores how surrealist ‘psychic automatism’ functioned as a mechanism for communication, or the expression of thought as directly as possible through the unconscious, in the first two decades of Surrealism. It touches upon automatic writing, hysteria as an automatic bodily performance of the unconscious, dreaming and the experimentation with induced dream states, and automatic drawing and other visual arts-techniques that could be executed more or less automatically as well. For all that the surrealists reinvented automatism for their own poetic, artistic and revolutionary aims, the automatic techniques were primarily drawn from contemporary Spiritualism, psychical research and experimentation with mediums, and the article teases out the connections to mediumistic automatism. It is demonstrated how the surrealists effectively and successfully divested automatism of all things spiritual. It furthermore becomes clear that despite various mishaps, automatism in many forms was a very successful creative technique within Surrealism.

  4. Automatic Data Logging and Quality Analysis System for Mobile Devices

    Directory of Open Access Journals (Sweden)

    Yong-Yi Fanjiang

    2017-01-01

    Full Text Available The testing phase of mobile device products includes two important test projects that must be completed before shipment: the field trial and the beta user trial. During the field trial, the product is certified based on its integration and stability with the local operator’s system, and, during the beta user trial, the product is certified by multiple users regarding its daily use, where the goal is to detect and solve early problems. In the traditional approach used to issue returns, testers must log into a web site, fill out a problem form, and then go through a browser or FTP to upload logs; however, this is inconvenient, and problems are reported slowly. Therefore, we propose an “automatic logging analysis system” (ALAS to construct a convenient test environment and, using a record analysis (log parser program, automate the parsing of log files and have questions automatically sent to the database by the system. Finally, the mean time between failures (MTBF is used to establish measurement indicators for the beta user trial.
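The MTBF indicator mentioned at the end can be computed directly once failure events have been parsed out of the logs; a minimal sketch (the function name and timestamps are hypothetical, not part of ALAS):

```python
from datetime import datetime

def mtbf_hours(failure_times):
    """Mean time between failures, in hours, from a list of failure timestamps."""
    ts = sorted(failure_times)
    if len(ts) < 2:
        raise ValueError("need at least two failures to compute MTBF")
    gaps = [(b - a).total_seconds() for a, b in zip(ts, ts[1:])]
    return sum(gaps) / len(gaps) / 3600.0

# Hypothetical failures extracted by a log parser during a beta user trial.
failures = [
    datetime(2017, 1, 1, 0, 0),
    datetime(2017, 1, 2, 12, 0),   # 36 h after the first failure
    datetime(2017, 1, 5, 0, 0),    # 60 h after the second failure
]
print(mtbf_hours(failures))  # → 48.0
```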

  5. Programmable automatic alpha--beta air sample counter

    International Nuclear Information System (INIS)

    Howell, W.P.

    1978-01-01

    A programmable automatic alpha-beta air sample counter was developed for routine sample counting by operational health physics personnel. The system is composed of an automatic sample changer utilizing a large silicon diode detector, an electronic counting system with energy analysis capability, an automatic data acquisition controller, an interface module, and a teletypewriter with paper tape punch and paper tape reader. The system is operated through the teletypewriter keyboard and the paper tape reader, which are used to instruct the automatic data acquisition controller. Paper tape programs are provided for background counting, the chi-square test, and sample counting. Output data are printed by the teletypewriter on standard continuous-roll or multifold paper. Data are automatically corrected for background and counter efficiency.

  6. Prediction of Protein Hotspots from Whole Protein Sequences by a Random Projection Ensemble System

    Directory of Open Access Journals (Sweden)

    Jinjian Jiang

    2017-07-01

    Full Text Available Hotspot residues are important in the determination of protein-protein interactions, and they always perform specific functions in biological processes. Hotspot residues are commonly determined by alanine scanning mutagenesis experiments, which are costly and time consuming. To address this issue, computational methods have been developed. Most of them are structure based, i.e., using the information of solved protein structures. However, the number of solved protein structures is far smaller than the number of sequences. Moreover, almost all of the predictors identify hotspots from the interfaces of protein complexes, seldom from whole protein sequences. Therefore, determining hotspots from whole protein sequences by sequence information alone is an urgent task. To address the issue of hotspot prediction from the whole sequences of proteins, we proposed an ensemble system with random projections using statistical physicochemical properties of amino acids. First, an encoding scheme involving sequence profiles of residues and physicochemical properties from the AAindex1 dataset is developed. Then, the random projection technique is adopted to project the encoding instances into a reduced space, and several better random projections are obtained by training an IBk classifier on the training dataset; these are then applied to the test dataset. An ensemble of random projection classifiers is thereby obtained. Experimental results showed that although the performance of our method is not yet good enough for real applications, it is very promising for the determination of hotspot residues from whole sequences.
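The pipeline described, random projections of encoded features followed by an instance-based (IBk, i.e., k-nearest-neighbour) classifier per projection and a majority vote over the ensemble, can be sketched as follows on synthetic data. This illustrates the general technique only; the encoding, data, and parameters are invented, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for encoded residues: two classes in a 100-D feature space.
n, d = 200, 100
X = np.vstack([rng.normal(0.0, 1.0, (n, d)), rng.normal(0.6, 1.0, (n, d))])
y = np.array([0] * n + [1] * n)
idx = rng.permutation(2 * n)
X, y = X[idx], y[idx]
X_tr, y_tr, X_te, y_te = X[:300], y[:300], X[300:], y[300:]

def knn_predict(Xtr, ytr, Xte, k=5):
    """IBk-style prediction: majority class among the k nearest neighbours."""
    d2 = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    nearest = np.argsort(d2, axis=1)[:, :k]
    return (ytr[nearest].mean(axis=1) > 0.5).astype(int)

# One kNN member per random projection to a reduced 20-D space; majority vote.
votes = []
for seed in range(9):
    R = np.random.default_rng(seed).normal(size=(d, 20)) / np.sqrt(20)
    votes.append(knn_predict(X_tr @ R, y_tr, X_te @ R))
y_pred = (np.mean(votes, axis=0) > 0.5).astype(int)
print("ensemble accuracy:", (y_pred == y_te).mean())
```

Each projection gives the instance-based learner a different low-dimensional view of the data, and voting across views smooths out the variance of any single projection.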

  7. Contaminants and nutrients in variable sea areas (Canvas). Application of automatic monitoring stations in the German marine environment

    International Nuclear Information System (INIS)

    Nies, H.; Bruegge, B.; Sterzenbach, D.; Knauth, H.D.; Schroeder, F.

    1999-01-01

    Permanent observation of parameters at sea stations can only be obtained by automatic sampling. The MERMAID technique developed in former projects provides a possibility to run automatic stations within the German MARNET measuring stations to obtain nutrient concentration data online and to collect organic micropollutants and the radionuclide ¹³⁷Cs by solid-phase extraction from seawater for subsequent analysis in the laboratory. The BSH MARNET consists of ten stations located in the German Bight sector of the North Sea and the western Baltic. First results from the time series of nutrient and organic micropollutant concentrations have been presented.

  8. Toward automatic regional analysis of pulmonary function using inspiration and expiration thoracic CT

    DEFF Research Database (Denmark)

    Murphy, Keelin; Pluim, Josien P. W.; Rikxoort, Eva M. van

    2012-01-01

    and its results; (b) verify that the quantitative, regional ventilation measurements acquired through CT are meaningful for pulmonary function analysis; (c) identify the most effective of the calculated measurements in predicting pulmonary function; and (d) demonstrate the potential of the system...... disorder). Lungs, fissures, airways, lobes, and vessels are automatically segmented in both scans and the expiration scan is registered with the inspiration scan using a fully automatic nonrigid registration algorithm. Segmentations and registrations are examined and scored by expert observers to analyze...... to have good correlation with spirometry results, with several having correlation coefficients, r, in the range of 0.85–0.90. The best performing kNN classifier succeeded in classifying 67% of subjects into the correct COPD GOLD stage, with a further 29% assigned to a class neighboring the correct one...

  9. Candidate Prediction Models and Methods

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Nielsen, Torben Skov; Madsen, Henrik

    2005-01-01

    This document lists candidate prediction models for Work Package 3 (WP3) of the PSO-project called ``Intelligent wind power prediction systems'' (FU4101). The main focus is on the models transforming numerical weather predictions into predictions of power production. The document also outlines...... the possibilities w.r.t. different numerical weather predictions actually available to the project....

  10. Wide-Field Imaging Telescope-0 (WIT0) with automatic observing system

    Science.gov (United States)

    Ji, Tae-Geun; Byeon, Seoyeon; Lee, Hye-In; Park, Woojin; Lee, Sang-Yun; Hwang, Sungyong; Choi, Changsu; Gibson, Coyne Andrew; Kuehne, John W.; Prochaska, Travis; Marshall, Jennifer L.; Im, Myungshin; Pak, Soojong

    2018-01-01

    We introduce the Wide-Field Imaging Telescope-0 (WIT0) with an automatic observing system. It is developed for monitoring the variability of many sources at a time, e.g., young stellar objects and active galactic nuclei. It can also find the locations of transient sources such as supernovae or gamma-ray bursts. In 2017 February, we installed the wide-field 10-inch telescope (Takahashi CCA-250) as a piggyback system on the 30-inch telescope at the McDonald Observatory in Texas, US. The 10-inch telescope has a 2.35 × 2.35 deg field of view with a 4k × 4k CCD camera (FLI ML16803). To improve the observational efficiency of the system, we developed new automatic observing software, KAOS30 (KHU Automatic Observing Software for McDonald 30-inch telescope), written in Visual C++ for the Windows operating system. The software consists of four control packages: the Telescope Control Package (TCP), the Data Acquisition Package (DAP), the Auto Focus Package (AFP), and the Script Mode Package (SMP). Since it also supports instruments that use the ASCOM driver, additional hardware installation is greatly simplified. We commissioned KAOS30 in 2017 August and are in the process of testing. Based on the WIT0 experience, we will extend KAOS30 to control multiple telescopes in future projects.

  11. Robot-assisted automatic ultrasound calibration.

    Science.gov (United States)

    Aalamifar, Fereshteh; Cheng, Alexis; Kim, Younsu; Hu, Xiao; Zhang, Haichong K; Guo, Xiaoyu; Boctor, Emad M

    2016-10-01

    Ultrasound (US) calibration is the process of determining the unknown transformation from a coordinate frame such as the robot's tooltip to the US image frame and is a necessary task for any robotic or tracked US system. US calibration requires submillimeter-range accuracy for most applications, but it is a time-consuming and repetitive task. We provide a new framework for automatic US calibration with robot assistance and without the need for temporal calibration. US calibration based on an active echo (AE) phantom was previously proposed, and its superiority over conventional cross-wire phantom-based calibration was shown. In this work, we use AE to guide the robotic arm motion through the process of data collection; we combine the capability of the AE point to localize itself in the frame of the US image with the automatic motion of the robotic arm to provide a framework for calibrating the arm to the US image automatically. We demonstrated the efficacy of the automated method compared to the manual method through experiments. To highlight the necessity of frequent ultrasound calibration, it is demonstrated that the calibration precision changed from 1.67 to 3.20 mm if the data collection is not repeated after a dismounting/mounting of the probe holder. In a large data set experiment, similar reconstruction precision of automatic and manual data collection was observed, while the time was reduced by 58%. In addition, we compared ten automatic calibrations with ten manual ones, each performed in 15 min, and showed that all the automatic ones could converge when the initial matrix was set to identity, while this was not achieved by the manual data sets. Given the same initial matrix, the repeatability of the automatic method was [0.46, 0.34, 0.80, 0.47] mm versus [0.42, 0.51, 0.98, 1.15] mm for the manual method at the four corners of the US image. The submillimeter accuracy requirement of US calibration makes frequent data collections unavoidable. We proposed an automated

  12. Identification of conductive hearing loss using air conduction tests alone: reliability and validity of an automatic test battery.

    Science.gov (United States)

    Convery, Elizabeth; Keidser, Gitte; Seeto, Mark; Freeston, Katrina; Zhou, Dan; Dillon, Harvey

    2014-01-01

    The primary objective of this study was to determine whether a combination of automatically administered pure-tone audiometry and a tone-in-noise detection task, both delivered via an air conduction (AC) pathway, could reliably and validly predict the presence of a conductive component to the hearing loss. The authors hypothesized that performance on the battery of tests would vary according to hearing loss type. A secondary objective was to evaluate the reliability and validity of a novel automatic audiometry algorithm to assess its suitability for inclusion in the test battery. Participants underwent a series of hearing assessments that were conducted in a randomized order: manual pure-tone air conduction audiometry and bone conduction audiometry; automatic pure-tone air conduction audiometry; and an automatic tone-in-noise detection task. The automatic tests were each administered twice. The ability of the automatic test battery to: (a) predict the presence of an air-bone gap (ABG); and (b) accurately measure AC hearing thresholds was assessed against the results of manual audiometry. Test-retest conditions were compared to determine the reliability of each component of the automatic test battery. Data were collected on 120 ears from normal-hearing and conductive, sensorineural, and mixed hearing-loss subgroups. Performance differences between different types of hearing loss were observed. Ears with a conductive component (conductive and mixed ears) tended to have normal signal to noise ratios (SNR) despite impaired thresholds in quiet, while ears without a conductive component (normal and sensorineural ears) demonstrated, on average, an increasing relationship between their thresholds in quiet and their achieved SNR. Using the relationship between these two measures among ears with no conductive component as a benchmark, the likelihood that an ear has a conductive component can be estimated based on the deviation from this benchmark. The sensitivity and

  13. Sensitivity analysis and design optimization through automatic differentiation

    International Nuclear Information System (INIS)

    Hovland, Paul D; Norris, Boyana; Strout, Michelle Mills; Bhowmick, Sanjukta; Utke, Jean

    2005-01-01

    Automatic differentiation is a technique for transforming a program or subprogram that computes a function, including arbitrarily complex simulation codes, into one that computes the derivatives of that function. We describe the implementation and application of automatic differentiation tools. We highlight recent advances in the combinatorial algorithms and compiler technology that underlie successful implementation of automatic differentiation tools. We discuss applications of automatic differentiation in design optimization and sensitivity analysis. We also describe ongoing research in the design of language-independent source transformation infrastructures for automatic differentiation algorithms
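The transformation described can be illustrated with the simplest form of forward-mode automatic differentiation, dual numbers: each value carries its derivative alongside it, and every operator propagates both exactly. This is a toy sketch of the principle, not one of the compiler-based source-transformation tools the record discusses:

```python
import math

class Dual:
    """Forward-mode AD value: a number paired with its derivative (the 'dot')."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule propagated automatically.
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def sin(x):
    # Chain rule for sin: d/dx sin(u) = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot) if isinstance(x, Dual) else math.sin(x)

def deriv(f, x):
    """Derivative of f at x, exact to machine precision (no finite differences)."""
    return f(Dual(x, 1.0)).dot

# d/dx [x*sin(x) + 3x] = sin(x) + x*cos(x) + 3, evaluated at x = 2.0:
print(deriv(lambda x: x * sin(x) + 3 * x, 2.0))  # → sin(2) + 2*cos(2) + 3 ≈ 3.077
```

Production AD tools apply the same propagation rules, but via source transformation or operator overloading over entire simulation codes, and typically in reverse mode when gradients of many-input functions are needed.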

  14. Sensible organizations: technology and methodology for automatically measuring organizational behavior.

    Science.gov (United States)

    Olguin Olguin, Daniel; Waber, Benjamin N; Kim, Taemie; Mohan, Akshay; Ara, Koji; Pentland, Alex

    2009-02-01

    We present the design, implementation, and deployment of a wearable computing platform for measuring and analyzing human behavior in organizational settings. We propose the use of wearable electronic badges capable of automatically measuring the amount of face-to-face interaction, conversational time, physical proximity to other people, and physical activity levels in order to capture individual and collective patterns of behavior. Our goal is to be able to understand how patterns of behavior shape individuals and organizations. By using on-body sensors in large groups of people for extended periods of time in naturalistic settings, we have been able to identify, measure, and quantify social interactions, group behavior, and organizational dynamics. We deployed this wearable computing platform in a group of 22 employees working in a real organization over a period of one month. Using these automatic measurements, we were able to predict employees' self-assessments of job satisfaction and their own perceptions of group interaction quality by combining data collected with our platform and e-mail communication data. In particular, the total amount of communication was predictive of both of these assessments, and betweenness in the social network exhibited a high negative correlation with group interaction satisfaction. We also found that physical proximity and e-mail exchange had a negative correlation of r = -0.55 (p < 0.01), which has far-reaching implications for past and future research on social networks.

  15. Automatic performance estimation of conceptual temperature control system design for rapid development of real system

    International Nuclear Information System (INIS)

    Jang, Yu Jin

    2013-01-01

    This paper presents an automatic performance estimation scheme for a conceptual temperature control system with a multi-heater configuration, applied prior to constructing the physical system in order to achieve rapid validation of the conceptual design. An appropriate low-order discrete-time model, which will be used in the controller design, is constructed after determining several basic factors, including the geometric shape of the controlled object and heaters, material properties, heater arrangement, etc. The proposed temperature controller, which adopts the multivariable GPC (generalized predictive control) scheme with scale factors, is then constructed automatically based on the above model. The performance of the conceptual temperature control system is evaluated using a FEM (finite element method) simulation combined with the controller.

  16. Automatic performance estimation of conceptual temperature control system design for rapid development of real system

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Yu Jin [Dongguk University, GyeongJu (Korea, Republic of)

    2013-07-15

    This paper presents an automatic performance estimation scheme for a conceptual temperature control system with a multi-heater configuration, applied prior to constructing the physical system in order to achieve rapid validation of the conceptual design. An appropriate low-order discrete-time model, which will be used in the controller design, is constructed after determining several basic factors, including the geometric shape of the controlled object and heaters, material properties, heater arrangement, etc. The proposed temperature controller, which adopts the multivariable GPC (generalized predictive control) scheme with scale factors, is then constructed automatically based on the above model. The performance of the conceptual temperature control system is evaluated using a FEM (finite element method) simulation combined with the controller.
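Both records above describe a multivariable GPC controller built automatically from a low-order discrete-time model. A minimal scalar, unconstrained receding-horizon sketch conveys the core idea: predict the output over a horizon, solve a regularized least-squares problem for the input sequence, apply only the first input, and repeat. The plant parameters, horizon, and weights below are invented for illustration; the paper's controller is multivariable and uses scale factors:

```python
import numpy as np

a, b = 0.5, 0.5          # toy first-order plant: y[k+1] = a*y[k] + b*u[k]
N, lam = 5, 1e-3         # prediction horizon and control-effort weight
r = 50.0                 # temperature setpoint

# Stacked prediction over the horizon: y_pred = F*y0 + G*u
F = np.array([a ** (i + 1) for i in range(N)])
G = np.array([[a ** (i - j) * b if j <= i else 0.0 for j in range(N)]
              for i in range(N)])

y = 20.0
for _ in range(100):
    # Unconstrained GPC step: minimize ||r - F*y - G*u||^2 + lam*||u||^2
    u = np.linalg.solve(G.T @ G + lam * np.eye(N), G.T @ (r - F * y))
    y = a * y + b * u[0]  # apply only the first input, then re-solve (receding horizon)
print(round(y, 2))  # settles close to the 50.0 setpoint
```

The same structure generalizes to the multivariable case by stacking the prediction matrices for all heater inputs and temperature outputs, which is what makes automatic construction of the controller from the identified model practical.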

  17. Automatically sweeping dual-channel boxcar integrator

    International Nuclear Information System (INIS)

    Keefe, D.J.; Patterson, D.R.

    1978-01-01

    An automatically sweeping dual-channel boxcar integrator has been developed to automate the search for a signal that repeatedly follows a trigger pulse by a constant or slowly varying time delay when that signal is completely hidden in random electrical noise and dc-offset drifts. The automatically sweeping dual-channel boxcar integrator improves the signal-to-noise ratio and eliminates dc-drift errors in the same way that a conventional dual-channel boxcar integrator does, but, in addition, automatically locates the hidden signal. When the signal is found, its time delay is displayed with 100-ns resolution, and its peak value is automatically measured and displayed. This relieves the operator of the tedious, time-consuming, and error-prone search for the signal whenever the time delay changes. The automatically sweeping boxcar integrator can also be used as a conventional dual-channel boxcar integrator. In either mode, it can repeatedly integrate a signal up to 990 times and thus make accurate measurements of the signal pulse height in the presence of random noise, dc offsets, and unsynchronized interfering signals
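The benefit of integrating a signal up to 990 times comes from averaging: the trigger-synchronized signal adds coherently across sweeps, while random noise averages down as 1/√N. A quick numerical illustration with a toy pulse and noise levels (invented numbers, not instrument data):

```python
import numpy as np

rng = np.random.default_rng(1)
n_sweeps, n_samples = 900, 256

# A small pulse hidden in noise, repeating at a fixed delay after each trigger.
t = np.arange(n_samples)
signal = 0.2 * np.exp(-0.5 * ((t - 100) / 5.0) ** 2)      # peak 0.2 at sample 100
sweeps = signal + rng.normal(0.0, 1.0, (n_sweeps, n_samples))

single_snr = signal.max() / 1.0                  # peak over noise sigma, one sweep
avg = sweeps.mean(axis=0)                        # boxcar-style averaging of all sweeps
avg_noise = 1.0 / np.sqrt(n_sweeps)              # expected residual noise sigma
print("single-sweep SNR ~", single_snr)          # 0.2: the pulse is invisible
print("averaged SNR ~", signal.max() / avg_noise)  # ~6: clearly detectable
print("recovered peak position:", int(np.argmax(avg)))
```

With 900 sweeps the noise standard deviation drops by a factor of 30, which is why the hidden signal's delay can then be located and its peak value measured.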

  18. Around the laboratories: Rutherford: Successful tests on bubble chamber target technique; Stanford (SLAC): New storage rings proposal; Berkeley: The HAPPE project to examine cosmic rays with superconducting magnets; The 60th birthday of Professor N.N. Bogolyubov; Argonne: Performance of the automatic film measuring system POLLY II

    CERN Multimedia

    1969-01-01

    Around the laboratories: Rutherford: Successful tests on bubble chamber target technique; Stanford (SLAC): New storage rings proposal; Berkeley: The HAPPE project to examine cosmic rays with superconducting magnets; The 60th birthday of Professor N.N. Bogolyubov; Argonne: Performance of the automatic film measuring system POLLY II

  19. 14 CFR 29.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    14 CFR 29.1329, Automatic pilot system. (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and positively...

  20. 14 CFR 27.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    14 CFR 27.1329, Automatic pilot system. (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and positively...

  1. General collaboration offer of Johnson Controls regarding the performance of air conditioning automatic control systems and other buildings' automatic control systems

    Energy Technology Data Exchange (ETDEWEB)

    Gniazdowski, J.

    1995-12-31

    JOHNSON CONTROLS manufactures measuring and control equipment (800 types) and is also a "turn-key" supplier of complete automatic control systems for the heating, air conditioning, ventilation and refrigeration engineering branches. The Company also supplies Buildings' Computer-Based Supervision and Monitoring Systems that may be applied in both small and large structures. Since 1990 the company has been performing full-range trade and contracting activities on the Polish market. We have our own well-trained technical staff and we collaborate with a series of designing and contracting enterprises that enable us to have our projects carried out all over Poland. The prices of our supplies and services correspond with the level of the Polish market.

  2. Automatic assessment of functional health decline in older adults based on smart home data.

    Science.gov (United States)

    Alberdi Aramendi, Ane; Weakley, Alyssa; Aztiria Goenaga, Asier; Schmitter-Edgecombe, Maureen; Cook, Diane J

    2018-05-01

    In the context of an aging population, tools that help the elderly live independently must be developed. The goal of this paper is to evaluate the possibility of using unobtrusively collected, activity-aware smart home behavioral data to automatically detect one of the most common consequences of aging: functional health decline. After gathering the longitudinal smart home data of 29 older adults for an average of >2 years, we automatically labeled the data with corresponding activity classes and extracted time-series statistics containing 10 behavioral features. Using this data, we created regression models to predict absolute and standardized functional health scores, as well as classification models to detect reliable absolute change and positive and negative fluctuations in everyday functioning. Functional health was assessed every six months by means of the Instrumental Activities of Daily Living-Compensation (IADL-C) scale. Results show that the total IADL-C score and subscores can be predicted by means of activity-aware smart home data, as well as reliable changes in these scores. Positive and negative fluctuations in everyday functioning are harder to detect using in-home behavioral data, yet changes in social skills have proven predictable. Future work must focus on improving the sensitivity of the presented models and performing an in-depth feature selection to improve overall accuracy. Copyright © 2018 Elsevier Inc. All rights reserved.
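The modeling step described, regressing a functional-health-style score on aggregated behavioral features, can be sketched with synthetic data and closed-form ridge regression. The feature semantics and all numbers below are invented; this does not reproduce the study's data or models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical behavioral features aggregated per assessment window
# (e.g., sleep duration, time out of home, cooking activity count, ...).
n, p = 120, 10
X = rng.normal(size=(n, p))
true_w = rng.normal(size=p)
iadl = X @ true_w + rng.normal(0.0, 0.5, n)   # synthetic IADL-C-like score

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam*I)^-1 X'y."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)

w = ridge_fit(X[:90], iadl[:90])              # train on the first 90 windows
pred = X[90:] @ w                             # predict held-out windows
r = np.corrcoef(pred, iadl[90:])[0, 1]
print("held-out correlation:", round(r, 2))
```

In practice the behavioral features would come from the activity-labeled smart home stream, and model quality would be judged against the six-monthly IADL-C assessments rather than a synthetic target.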

  3. Automatic differentiation algorithms in model analysis

    NARCIS (Netherlands)

    Huiskes, M.J.

    2002-01-01

    Title: Automatic differentiation algorithms in model analysis
    Author: M.J. Huiskes
    Date: 19 March, 2002

    In this thesis automatic differentiation algorithms and derivative-based methods

  4. Novel Automatic Filter-Class Feature Selection for Machine Learning Regression

    DEFF Research Database (Denmark)

    Wollsen, Morten Gill; Hallam, John; Jørgensen, Bo Nørregaard

    2017-01-01

    With the increased focus on application of Big Data in all sectors of society, the performance of machine learning becomes essential. Efficient machine learning depends on efficient feature selection algorithms. Filter feature selection algorithms are model-free and therefore very fast, but require...... model in the feature selection process. PCA is often used in machine learning literature and can be considered the default feature selection method. RDESF outperformed PCA in both experiments in both prediction error and computational speed. RDESF is a new step into filter-based automatic feature...

  5. Automatic Adviser on Mobile Objects Status Identification and Classification

    Science.gov (United States)

    Shabelnikov, A. N.; Liabakh, N. N.; Gibner, Ya M.; Saryan, A. S.

    2018-05-01

    A mobile object status identification task is defined within image discrimination theory. It is proposed to classify objects into three classes: the object is operational; maintenance is required; and the object should be removed from the production process. Two methods were developed to construct the separating boundaries between the designated classes: a) using statistical information on the executed movement of the research objects, b) based on regulatory documents and expert commentary. An Automatic Adviser operation simulation and operation results analysis complex was synthesized. Research results are illustrated with a specific example of cuts rolling from the hump yard. The work was supported by the Russian Fundamental Research Fund, project No. 17-20-01040.

  6. Comparative Human and Automatic Evaluation of Glass-Box and Black-Box Approaches to Interactive Translation Prediction

    Directory of Open Access Journals (Sweden)

    Torregrosa Daniel

    2017-06-01

    Full Text Available Interactive translation prediction (ITP is a modality of computer-aided translation that assists professional translators by offering context-based computer-generated continuation suggestions as they type. While most state-of-the-art ITP systems follow a glass-box approach, meaning that they are tightly coupled to an adapted machine translation system, a black-box approach which does not need access to the inner workings of the bilingual resources used to generate the suggestions has been recently proposed in the literature: this new approach allows new sources of bilingual information to be included almost seamlessly. In this paper, we compare for the first time the glass-box and the black-box approaches by means of an automatic evaluation of translation tasks between related languages such as English–Spanish and unrelated ones such as Arabic–English and English–Chinese, showing that, with our setup, 20%–50% of keystrokes could be saved using either method and that the black-box approach outperformed the glass-box one in five out of six scenarios operating under similar conditions. We also performed a preliminary human evaluation of English to Spanish translation for both approaches. On average, the evaluators saved 10% keystrokes and were 4% faster with the black-box approach, and saved 15% keystrokes and were 12% slower with the glass-box one; but they could have saved 51% and 69% keystrokes respectively if they had used all the compatible suggestions. Users felt the suggestions helped them to translate faster and easier. All the tools used to perform the evaluation are available as free/open–source software.

  7. Simulation and Prediction of Groundwater Pollution from Planned Feed Additive Project in Nanning City Based on GMS Model

    Science.gov (United States)

    Liang, Yimin; Lan, Junkang; Wen, Zhixiong

    2018-01-01

    In order to predict the pollution of underground aquifers and rivers by the proposed project, a specialized hydrogeological investigation was carried out. Through hydrogeological surveying and mapping, drilling, and groundwater level monitoring, the extent of the hydrogeological unit and the regional hydrogeological conditions were determined. The permeability coefficients of the aquifers were also obtained by borehole water injection tests. To predict the impact of the project on the groundwater environment, the GMS software was used for numerical simulation. The simulation results show that if an unexpected sewage leakage accident happens, the pollutants will be gradually diluted by groundwater, and the diluted contaminants will slowly spread southeast with the groundwater flow until they are discharged into the Gantang River. However, this discharge process is very slow, and the long-term dilution of the river water will keep the Gantang River from being polluted.

  8. Automatic registration method for multisensor datasets adopted for dimensional measurements on cutting tools

    International Nuclear Information System (INIS)

    Shaw, L; Mehari, F; Weckenmann, A; Ettl, S; Häusler, G

    2013-01-01

    Multisensor systems with optical 3D sensors are frequently employed to capture complete surface information by measuring workpieces from different views. During coarse and fine registration the resulting datasets are afterward transformed into one common coordinate system. Automatic fine registration methods are well established in dimensional metrology, whereas there is a deficit in automatic coarse registration methods. The advantage of a fully automatic registration procedure is twofold: it enables a fast and contact-free alignment and further a flexible application to datasets of any kind of optical 3D sensor. In this paper, an algorithm adapted for a robust automatic coarse registration is presented. The method was originally developed for the field of object reconstruction or localization. It is based on a segmentation of planes in the datasets to calculate the transformation parameters. The rotation is defined by the normals of three corresponding segmented planes of two overlapping datasets, while the translation is calculated via the intersection point of the segmented planes. First results have shown that the translation is strongly shape dependent: 3D data of objects with non-orthogonal planar flanks cannot be registered with the current method. In the novel supplement for the algorithm, the translation is additionally calculated via the distance between centroids of corresponding segmented planes, which results in more than one option for the transformation. A newly introduced measure considering the distance between the datasets after coarse registration evaluates the best possible transformation. Results of the robust automatic registration method are presented on the example of datasets taken from a cutting tool with a fringe-projection system and a focus-variation system. The successful application in dimensional metrology is proven with evaluations of shape parameters based on the registered datasets of a calibrated workpiece. (paper)

  9. Automatic intelligent cruise control

    OpenAIRE

    Stanton, NA; Young, MS

    2006-01-01

    This paper reports a study on the evaluation of automatic intelligent cruise control (AICC) from a psychological perspective. It was anticipated that AICC would have an effect upon the psychology of driving—namely, make drivers feel they have less control, reduce their level of trust in the vehicle, and make them less situationally aware, but might also reduce workload and make driving less stressful. Drivers were asked to drive in a driving simulator under manual and automatic inte...

  10. RESEARCH ON THE CONSTRUCTION OF REMOTE SENSING AUTOMATIC INTERPRETATION SYMBOL BIG DATA

    Directory of Open Access Journals (Sweden)

    Y. Gao

    2018-04-01

    Full Text Available Remote sensing automatic interpretation symbol (RSAIS) is an inexpensive and fast means of providing precise in-situ information for image interpretation and accuracy assessment. This study designed a scientific and precise RSAIS data characterization method, as well as a distributed, cloud-architecture massive data storage method. Additionally, it introduced an offline and online data update mode and a dynamic data evaluation mechanism, with the aim of creating an efficient approach to RSAIS big data construction. Finally, a national RSAIS database with more than 3 million samples covering 86 land types was constructed during 2013–2015 based on the National Geographic Conditions Monitoring Project of China, and it has been updated annually since 2016. The RSAIS big data has proven to be a good method for large-scale image interpretation and field validation. It is also notable that it has the potential to support automatic image interpretation with the assistance of deep learning technology in the remote sensing big data era.

  11. Research on the Construction of Remote Sensing Automatic Interpretation Symbol Big Data

    Science.gov (United States)

    Gao, Y.; Liu, R.; Liu, J.; Cheng, T.

    2018-04-01

    Remote sensing automatic interpretation symbol (RSAIS) is an inexpensive and fast means of providing precise in-situ information for image interpretation and accuracy assessment. This study designed a scientific and precise RSAIS data characterization method, as well as a distributed, cloud-architecture massive data storage method. Additionally, it introduced an offline and online data update mode and a dynamic data evaluation mechanism, with the aim of creating an efficient approach to RSAIS big data construction. Finally, a national RSAIS database with more than 3 million samples covering 86 land types was constructed during 2013-2015 based on the National Geographic Conditions Monitoring Project of China, and it has been updated annually since 2016. The RSAIS big data has proven to be a good method for large-scale image interpretation and field validation. It is also notable that it has the potential to support automatic image interpretation with the assistance of deep learning technology in the remote sensing big data era.

  12. Predicting severe injury using vehicle telemetry data.

    Science.gov (United States)

    Ayoung-Chee, Patricia; Mack, Christopher D; Kaufman, Robert; Bulger, Eileen

    2013-01-01

    In 2010, the National Highway Traffic Safety Administration standardized collision data collected by event data recorders, which may help determine appropriate emergency medical service (EMS) response. Previous models (e.g., General Motors) predict severe injury (Injury Severity Score [ISS] > 15) using occupant demographics and collision data. Occupant information is not automatically available, and 12% of calls from advanced automatic collision notification providers are unanswered. To better inform EMS triage, our goal was to create a predictive model using only vehicle collision data. Using the National Automotive Sampling System Crashworthiness Data System data set, we included front-seat occupants in late-model vehicles (2000 and later) in nonrollover and rollover crashes in years 2000 to 2010. Telematic (change in velocity, direction of force, seat belt use, vehicle type and curb weight, as well as multiple impact) and nontelematic variables (maximum intrusion, narrow impact, and passenger ejection) were included. Missing data were multiply imputed. The University of Washington model was tested to predict severe injury before application of guidelines (Step 0) and for occupants who did not meet Steps 1 and 2 criteria (Step 3) of the Centers for Disease Control and Prevention Field Triage Guidelines. A probability threshold of 20% was chosen in accordance with Centers for Disease Control and Prevention recommendations. There were 28,633 crashes, involving 33,956 vehicles and 52,033 occupants, of whom 9.9% had severe injury. At Step 0, the University of Washington model sensitivity was 40.0% and positive predictive value (PPV) was 20.7%. At Step 3, the sensitivity was 32.3% and PPV was 10.1%. Model analysis excluding nontelematic variables decreased sensitivity and PPV. The sensitivity of the re-created General Motors model was 38.5% at Step 0 and 28.1% at Step 3. We designed a model using only vehicle collision data that was predictive of severe injury at
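The two metrics reported above, sensitivity and positive predictive value at a 20% probability threshold, can be illustrated with a small sketch; the occupant data here are toy values, not the NASS-CDS sample:

```python
# Sensitivity and positive predictive value (PPV) of a severe-injury flag
# raised when the model's predicted probability meets the 20% threshold.
def sensitivity_ppv(y_true, y_prob, threshold=0.20):
    """y_true: 1 = severe injury (ISS > 15), 0 = not severe."""
    flagged = [p >= threshold for p in y_prob]
    tp = sum(t == 1 and f for t, f in zip(y_true, flagged))
    fn = sum(t == 1 and not f for t, f in zip(y_true, flagged))
    fp = sum(t == 0 and f for t, f in zip(y_true, flagged))
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    ppv = tp / (tp + fp) if tp + fp else 0.0
    return sensitivity, ppv

# 4 severe and 6 non-severe occupants with hypothetical model probabilities
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_prob = [0.90, 0.30, 0.10, 0.05, 0.25, 0.40, 0.10, 0.05, 0.02, 0.15]
sens, ppv = sensitivity_ppv(y_true, y_prob)   # sens = 0.5, ppv = 0.5
```

Lowering the threshold trades PPV for sensitivity, which is the tension behind the Step 0 vs. Step 3 figures quoted in the abstract.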

  13. A comparison of conscious and automatic memory processes for picture and word stimuli: a process dissociation analysis.

    Science.gov (United States)

    McBride, Dawn M; Anne Dosher, Barbara

    2002-09-01

    Four experiments were conducted to evaluate explanations of picture superiority effects previously found for several tasks. In a process dissociation procedure (Jacoby, 1991) with word stem completion, picture fragment completion, and category production tasks, conscious and automatic memory processes were compared for studied pictures and words with an independent retrieval model and a generate-source model. The predictions of a transfer appropriate processing account of picture superiority were tested and validated in "process pure" latent measures of conscious and unconscious, or automatic and source, memory processes. Results from both model fits verified that pictures had a conceptual (conscious/source) processing advantage over words for all tasks. The effects of perceptual (automatic/word generation) compatibility depended on task type, with pictorial tasks favoring pictures and linguistic tasks favoring words. Results show support for an explanation of the picture superiority effect that involves an interaction of encoding and retrieval processes.

  14. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from the US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that from nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m³ of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers
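The analyzer itself is hardware plus coincidence spectrometry; the sketch below only illustrates the decay arithmetic underlying the activity ratios, using the half-lives quoted in the abstract. The function and dictionary names are illustrative, not part of the PNNL system:

```python
import math

# Half-lives from the abstract, all expressed in days.
HALF_LIFE_DAYS = {
    "Xe-131m": 11.9,
    "Xe-133m": 2.19,
    "Xe-133": 5.24,
    "Xe-135": 9.10 / 24.0,   # 9.10 h converted to days
}

def decayed_activity(a0, nuclide, elapsed_days):
    """Activity remaining after elapsed_days, given initial activity a0."""
    lam = math.log(2.0) / HALF_LIFE_DAYS[nuclide]
    return a0 * math.exp(-lam * elapsed_days)

# After one 133Xe half-life (5.24 d) the activity has halved:
remaining = decayed_activity(100.0, "Xe-133", 5.24)
```

Because the four nuclides decay at very different rates, the ratio of any two activities shifts with time since release, which is what lets activity ratios discriminate a detonation signature from reactor or medical-isotope sources.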

  15. Understanding Applications of Project Planning and Scheduling in Construction Projects

    OpenAIRE

    AlNasseri, Hammad Abdullah

    2015-01-01

    Construction project life-cycle processes must be managed in a more effective and predictable way to meet project stakeholders’ needs. However, there is increasing concern about whether know-how effectively improves understanding of underlying theories of project management processes for construction organizations and their project managers. Project planning and scheduling are considered as key and challenging tools in controlling and monitoring project performance, but many worldwide constru...

  16. Automatic plasma control in magnetic traps

    International Nuclear Information System (INIS)

    Samojlenko, Y.; Chuyanov, V.

    1984-01-01

    Hot plasma is essentially in thermodynamic non-steady state. Automatic plasma control basically means monitoring deviations from steady state and producing a suitable magnetic or electric field which brings the plasma back to its original state. Briefly described are two systems of automatic plasma control: control with a magnetic field using a negative impedance circuit, and control using an electric field. It appears that systems of automatic plasma stabilization will be an indispensable component of the fusion reactor and its possibilities will in many ways determine the reactor economy. (Ha)

  17. A GLOBAL ASSESSMENT OF SOLAR ENERGY RESOURCES: NASA's Prediction of Worldwide Energy Resources (POWER) Project

    Science.gov (United States)

    Zhang, T.; Stackhouse, P. W., Jr.; Chandler, W.; Hoell, J. M.; Westberg, D.; Whitlock, C. H.

    2010-12-01

    NASA's POWER project, or the Prediction of Worldwide Energy Resources project, synthesizes and analyzes data on a global scale. The products of the project find valuable applications in the solar and wind energy sectors of the renewable energy industries. The primary source data for the POWER project are NASA's World Climate Research Programme (WCRP)/Global Energy and Water cycle Experiment (GEWEX) Surface Radiation Budget (SRB) project (Release 3.0) and the Global Modeling and Assimilation Office (GMAO) Goddard Earth Observing System (GEOS) assimilation model (V 4.0.3). Users of the POWER products access the data through NASA's Surface meteorology and Solar Energy (SSE, Version 6.0) website (http://power.larc.nasa.gov). Over 200 parameters are available to the users. The spatial resolution is currently 1 degree by 1 degree, with finer resolutions planned. The data cover July 1983 to December 2007, a time span of 24.5 years, and are provided as 3-hourly, daily and monthly means. To date, there have been over 18 million web hits and over 4 million data file downloads. The POWER products have been systematically validated against ground-based measurements, in particular data from the Baseline Surface Radiation Network (BSRN) archive, and also against the National Solar Radiation Data Base (NSRDB). Parameters such as minimum, maximum, and daily mean temperature and dew point, relative humidity and surface pressure are validated against National Climatic Data Center (NCDC) data. SSE feeds data directly into decision support systems, including the RETScreen International clean energy project analysis software, which is available in 36 languages and has more than 260,000 users worldwide.

  18. Automatic figure ranking and user interfacing for intelligent figure search.

    Directory of Open Access Journals (Sweden)

    Hong Yu

    2010-10-01

    Full Text Available Figures are important experimental results that are typically reported in full-text bioscience articles. Bioscience researchers need to access figures to validate research facts and to formulate or to test novel research hypotheses. On the other hand, the sheer volume of bioscience literature has made it difficult to access figures. Therefore, we are developing an intelligent figure search engine (http://figuresearch.askhermes.org). Existing research in figure search treats each figure equally, but we introduce a novel concept of "figure ranking": figures appearing in a full-text biomedical article can be ranked by their contribution to the knowledge discovery. We empirically validated the hypothesis of figure ranking with over 100 bioscience researchers, and then developed unsupervised natural language processing (NLP) approaches to automatically rank figures. Evaluating on a collection of 202 full-text articles in which authors had ranked the figures based on importance, our best system achieved a weighted error rate of 0.2, which is significantly better than several other baseline systems we explored. We further explored a user-interfacing application in which we built novel user interfaces (UIs) incorporating figure ranking, allowing bioscience researchers to efficiently access important figures. Our evaluation results show that 92% of the bioscience researchers preferred, as their top two choices, the user interfaces in which the most important figures are enlarged. With our automatic figure ranking NLP system, bioscience researchers preferred the UIs in which the most important figures were predicted by our NLP system over the UIs in which the most important figures were randomly assigned.
In addition, our results show that there was no statistical difference in bioscience researchers' preference between the UIs generated by automatic figure ranking and the UIs generated by human ranking annotation. The evaluation results conclude that automatic figure ranking and user

  19. An automatic granular structure generation and finite element analysis of heterogeneous semi-solid materials

    International Nuclear Information System (INIS)

    Sharifi, Hamid; Larouche, Daniel

    2015-01-01

    The quality of cast metal products depends on the capacity of the semi-solid metal to sustain the stresses generated during the casting. Predicting the evolution of these stresses with accuracy in the solidification interval should be highly helpful to avoid the formation of defects like hot tearing. This task is however very difficult because of the heterogeneous nature of the material. In this paper, we propose to evaluate the mechanical behaviour of a metal during solidification using a mesh generation technique of the heterogeneous semi-solid material for a finite element analysis at the microscopic level. This task is done on a two-dimensional (2D) domain in which the granular structure of the solid phase is generated surrounded by an intergranular and interdendritic liquid phase. Some basic solid grains are first constructed and projected in the 2D domain with random orientations and scale factors. Depending on their orientation, the basic grains are combined to produce larger grains or separated by a liquid film. Different basic grain shapes can produce different granular structures of the mushy zone. As a result, using this automatic grain generation procedure, we can investigate the effect of grain shapes and sizes on the thermo-mechanical behaviour of the semi-solid material. The granular models are automatically converted to finite element meshes. The solid grains and the liquid phase are meshed properly using quadrilateral elements. This method has been used to simulate the microstructure of a binary aluminium–copper alloy (Al–5.8 wt% Cu) when the fraction solid is 0.92. Using the finite element method and the Mie–Grüneisen equation of state for the liquid phase, the transient mechanical behaviour of the mushy zone under tensile loading has been investigated. The stress distribution and the bridges, which are formed during the tensile loading, have been detected. (paper)

  20. Word Processing in Dyslexics: An Automatic Decoding Deficit?

    Science.gov (United States)

    Yap, Regina; Van Der Leu, Aryan

    1993-01-01

    Compares dyslexic children with normal readers on measures of phonological decoding and automatic word processing. Finds that dyslexics have a deficit in automatic phonological decoding skills. Discusses results within the framework of the phonological deficit and the automatization deficit hypotheses. (RS)

  1. ONEMercury: Towards Automatic Annotation of Earth Science Metadata

    Science.gov (United States)

    Tuarob, S.; Pouchard, L. C.; Noy, N.; Horsburgh, J. S.; Palanisamy, G.

    2012-12-01

    Earth sciences have become more data-intensive, requiring access to heterogeneous data collected from multiple places, times, and thematic scales. For example, research on climate change may involve exploring and analyzing observational data such as the migration of animals and temperature shifts across the earth, as well as various model-observation inter-comparison studies. Recently, DataONE, a federated data network built to facilitate access to and preservation of environmental and ecological data, has come to exist. ONEMercury has recently been implemented as part of the DataONE project to serve as a portal for discovering and accessing environmental and observational data across the globe. ONEMercury harvests metadata from the data hosted by multiple data repositories and makes it searchable via a common search interface built upon cutting edge search engine technology, allowing users to interact with the system, intelligently filter the search results on the fly, and fetch the data from distributed data sources. Linking data from heterogeneous sources always has a cost. A problem that ONEMercury faces is the different levels of annotation in the harvested metadata records. Poorly annotated records tend to be missed during the search process as they lack meaningful keywords. Furthermore, such records would not be compatible with the advanced search functionality offered by ONEMercury as the interface requires a metadata record be semantically annotated. The explosion of the number of metadata records harvested from an increasing number of data repositories makes it impossible to annotate the harvested records manually, urging the need for a tool capable of automatically annotating poorly curated metadata records. In this paper, we propose a topic-model (TM) based approach for automatic metadata annotation. Our approach mines topics in the set of well annotated records and suggests keywords for poorly annotated records based on topic similarity. 
We utilize the

  2. Archaeological predictive model set.

    Science.gov (United States)

    2015-03-01

    This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...

  3. Semi-Automatic Modelling of Building Façades with Shape Grammars Using Historic Building Information Modelling

    Science.gov (United States)

    Dore, C.; Murphy, M.

    2013-02-01

    This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.

  4. Automatic demand response referred to electricity spot price. Demo description

    International Nuclear Information System (INIS)

    Grande, Ove S.; Livik, Klaus; Hals, Arne

    2006-05-01

    This report presents background, technical solution and results from a test project (Demo I) developed in the DRR Norway project. Software and technology from two different vendors, APAS and Powel ASA, are used to demonstrate a scheme for Automatic Demand Response (ADR) referred to spot price level and a system for documentation of demand response and cost savings. Periods with shortage of energy supply and hardly any investment in new production capacity have turned focus towards the need for increased price elasticity on the demand side in the Nordic power market. The new technology for Automatic Meter Reading (AMR) and Remote Load Control (RLC) provides an opportunity to improve direct market participation from the demand side by introducing automatic schemes that reduce the need for customer attention to hourly market prices. The low-prioritized appliances, and not the total load, are in this report defined as the Demand Response Objects, based on the assumption that there is a limit to what customers are willing to pay for different uses of electricity. Only disconnection of residential water heaters is included in the demo, due to practical limitations. The test was performed for a group of single-family houses over a period of 2 months. All the houses were equipped with a radio-controlled 'Ebox' unit attached to the water heater socket. The settlement and invoicing were based on hourly metered values (kWh/h), which means that the customer benefit is equivalent to the accumulated changes in the electricity cost per hour. The actual load reduction is documented by comparison between the real meter values for the period and a reference curve. The curves show significant response to the activated control in the morning hours. In the afternoon it is more difficult to register the response, probably due to 'disturbing' activities like cooking etc. Demo I shows that load reduction referred to spot price level can be done in a smooth way.
The experiences

  5. Bayesian Methods for Predicting the Shape of Chinese Yam in Terms of Key Diameters

    Directory of Open Access Journals (Sweden)

    Mitsunori Kayano

    2017-01-01

    Full Text Available This paper proposes Bayesian methods for the shape estimation of Chinese yam (Dioscorea opposita using a few key diameters of yam. Shape prediction of yam is applicable to determining optimal cutoff positions of a yam for producing seed yams. Our Bayesian method, which is a combination of Bayesian estimation model and predictive model, enables automatic, rapid, and low-cost processing of yam. After the construction of the proposed models using a sample data set in Japan, the models provide whole shape prediction of yam based on only a few key diameters. The Bayesian method performed well on the shape prediction in terms of minimizing the mean squared error between measured shape and the prediction. In particular, a multiple regression method with key diameters at two fixed positions attained the highest performance for shape prediction. We have developed automatic, rapid, and low-cost yam-processing machines based on the Bayesian estimation model and predictive model. Development of such shape prediction approaches, including our Bayesian method, can be a valuable aid in reducing the cost and time in food processing.
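The "multiple regression with key diameters at two fixed positions" idea can be sketched as follows: for every position along the yam, fit a linear model predicting the diameter there from the diameters measured at the two key positions. The synthetic profiles below are stand-ins for the Japanese sample data, and the key positions are arbitrary choices:

```python
import numpy as np

# Synthetic yam diameter profiles: a common shape scaled per yam, plus noise.
rng = np.random.default_rng(1)
n_yams, n_pos = 40, 20
shape = np.sin(np.linspace(0.0, np.pi, n_pos))            # yam-like taper profile
profiles = 30.0 + 5.0 * rng.random((n_yams, 1)) * shape \
           + rng.normal(scale=0.1, size=(n_yams, n_pos))  # diameters in mm

k1, k2 = 3, 14                                            # two fixed key positions
keys = np.hstack([profiles[:, [k1, k2]], np.ones((n_yams, 1))])
coefs, *_ = np.linalg.lstsq(keys, profiles, rcond=None)   # one model per position
mse = float(np.mean((keys @ coefs - profiles) ** 2))      # whole-shape fit error
```

Given a new yam, measuring only the two key diameters then yields a predicted whole profile via `keys_new @ coefs`, which is what makes the approach cheap enough for automated cutting.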

  6. CAnat: An algorithm for the automatic segmentation of anatomy of medical images

    International Nuclear Information System (INIS)

    Caon, M.; Gobert, L.; Mariusz, B.

    2011-01-01

    Full text: To develop a method to automatically categorise organs and tissues displayed in medical images. Dosimetry calculations using Monte Carlo methods require a mathematical representation of human anatomy, e.g. a voxel phantom. For a whole body, construction involves processing several hundred images to identify each organ and tissue; the process is very time-consuming. This project is developing a Computational Anatomy (CAnat) algorithm to automatically recognise and classify the different tissues in a tomographic image. Methods: The algorithm utilizes the Statistical Region Merging (SRM) technique. The SRM depends on one estimated parameter, which is a measure of the statistical complexity of the image and can be automatically adjusted to suit individual image features. This allows for automatic tuning of the coarseness of the overall segmentation as well as object-specific selection for further tasks. CAnat is tested on two CT images selected to represent different anatomical complexities. In the mid-thigh image, the tissues/regions of interest are air, fat, muscle, bone marrow and compact bone; in the pelvic image, fat, urinary bladder and anus/colon, muscle, cancellous bone, and compact bone. Segmentation results were evaluated using the Jaccard index, a measure of set agreement: an index of one indicates perfect agreement between CAnat and manual segmentation. The Jaccard indices for the mid-thigh CT were 0.99, 0.89, 0.97, 0.63 and 0.88, respectively, and for the pelvic CT were 0.99, 0.81, 0.77, 0.93, 0.53, 0.76, respectively. Conclusion: The high-accuracy preliminary segmentation results demonstrate the feasibility of the CAnat algorithm.
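The Jaccard index used above to score agreement between automatic and manual segmentations is |A ∩ B| / |A ∪ B| over the sets of pixels assigned to a tissue class; a minimal sketch with toy pixel sets:

```python
# Jaccard index (set agreement): 1.0 means identical segmentations.
def jaccard(a, b):
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0                      # two empty segmentations agree fully
    return len(a & b) / len(a | b)

auto_pixels = {(0, 0), (0, 1), (1, 0), (1, 1)}    # pixels the algorithm labels as one tissue
manual_pixels = {(0, 1), (1, 0), (1, 1), (2, 1)}  # manually labeled pixels
score = jaccard(auto_pixels, manual_pixels)       # 3 shared / 5 in union = 0.6
```

Computed per tissue class, this is exactly how one index per region (e.g. the 0.99, 0.89, ... values quoted above) would be produced.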

  7. Children’s Behavioral Pain Cues: Implicit Automaticity and Control Dimensions in Observational Measures

    Directory of Open Access Journals (Sweden)

    Kamal Kaur Sekhon

    2017-01-01

    Full Text Available Some pain behaviors appear to be automatic, reflexive manifestations of pain, whereas others present as voluntarily controlled. This project examined whether this distinction would characterize pain cues used in observational pain measures for children aged 4–12. To develop a comprehensive list of cues, a systematic literature search of studies describing development of children’s observational pain assessment tools was conducted using MEDLINE, PsycINFO, and Web of Science. Twenty-one articles satisfied the criteria. A total of 66 nonredundant pain behavior items were identified. To determine whether items would be perceived as automatic or controlled, 277 research participants rated each on multiple scales associated with the distinction. Factor analyses yielded three major factors: the “Automatic” factor included items related to facial expression, paralinguistics, and consolability; the “Controlled” factor included items related to intentional movements, verbalizations, and social actions; and the “Ambiguous” factor included items related to voluntary facial expressions. Pain behaviors in observational pain scales for children can be characterized as automatic, controlled, and ambiguous, supporting a dual-processing, neuroregulatory model of pain expression. These dimensions would be expected to influence judgments of the nature and severity of pain being experienced and the extent to which the child is attempting to control the social environment.

  8. AUTOMATIC EXTRACTION AND TOPOLOGY RECONSTRUCTION OF URBAN VIADUCTS FROM LIDAR DATA

    Directory of Open Access Journals (Sweden)

    Y. Wang

    2015-08-01

    Full Text Available Urban viaducts are important infrastructures in the transportation system of a city. In this paper, an original method is proposed to automatically extract urban viaducts and reconstruct the topology of the viaduct network using only airborne LiDAR point cloud data, greatly simplifying the labor-intensive procedure of viaduct extraction and reconstruction. In our method, the point cloud is first filtered to divide all the points into ground points and non-ground points. A region-growing algorithm is adopted to find the viaduct points among the non-ground points, using features derived from general viaduct design rules. The viaduct points are then projected into 2D images to extract the centerline of every viaduct, and cubic functions representing the viaduct passages are generated by least-squares fitting, from which the topology of the viaduct network can be rebuilt by combining the height information. Finally, a topological graph of the viaduct network is produced. The fully automatic method can potentially benefit urban navigation and city model reconstruction applications.
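
    The step of fitting a cubic function to projected centerline points is a standard least-squares polynomial fit. A sketch on synthetic points (the projection and centerline-extraction steps of the paper are not shown; all coefficients and noise levels are invented):

```python
import numpy as np

# Hypothetical 2D projections of viaduct centerline points, in metres.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 100.0, 200)
y_true = 0.0002 * x**3 - 0.03 * x**2 + 1.5 * x + 10.0
y = y_true + rng.normal(0.0, 0.5, x.size)        # measurement noise

# Least-squares fit of a cubic y = a3*x^3 + a2*x^2 + a1*x + a0
coeffs = np.polyfit(x, y, deg=3)
centerline = np.poly1d(coeffs)

rmse = np.sqrt(np.mean((centerline(x) - y_true) ** 2))
```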

  9. User's operating procedures. Volume 3: Projects directorate information programs

    Science.gov (United States)

    Haris, C. G.; Harris, D. K.

    1985-01-01

    A review of the user's operating procedures for the Scout Project Automatic Data System, called SPADS, is presented. SPADS is the result of the past seven years of software development on a Prime minicomputer. SPADS was developed as a single-entry, multiple cross-reference data management and information retrieval system for the automation of project office tasks, including engineering, financial, managerial, and clerical support. This volume, the third of three, provides the instructions to operate the projects directorate information programs in data retrieval and file maintenance via the user-friendly menu drivers.

  10. Deliberation versus automaticity in decision making: Which presentation format features facilitate automatic decision making?

    Directory of Open Access Journals (Sweden)

    Anke Soellner

    2013-05-01

    Full Text Available The idea of automatic decision making approximating normatively optimal decisions without necessitating much cognitive effort is intriguing. Whereas recent findings support the notion that such fast, automatic processes explain empirical data well, little is known about the conditions under which such processes are selected rather than more deliberate stepwise strategies. We investigate the role of the format of information presentation, focusing explicitly on the ease of information acquisition and its influence on information integration processes. In a probabilistic inference task, the standard matrix employed in prior research was contrasted with a newly created map presentation format and additional variations of both presentation formats. Across three experiments, a robust presentation format effect emerged: Automatic decision making was more prevalent in the matrix (with high information accessibility), whereas sequential decision strategies prevailed when the presentation format demanded more information acquisition effort. Further scrutiny of the effect showed that it is not driven by the presentation format as such, but rather by the extent of information search induced by a format. Thus, if information is accessible with minimal need for information search, information integration is likely to proceed in a perception-like, holistic manner. In turn, a moderate demand for information search decreases the likelihood of behavior consistent with the assumptions of automatic decision making.

  11. A fast fiducial marker tracking model for fully automatic alignment in electron tomography

    KAUST Repository

    Han, Renmin; Zhang, Fa; Gao, Xin

    2017-01-01

    Automatic alignment, especially fiducial marker-based alignment, has become increasingly important due to the high demand of subtomogram averaging and the rapid development of large-field electron microscopy. Among the alignment steps, fiducial marker tracking is a crucial one that determines the quality of the final alignment. Yet, it is still a challenging problem to track the fiducial markers accurately and effectively in a fully automatic manner. In this paper, we propose a robust and efficient scheme for fiducial marker tracking. Firstly, we theoretically prove the upper bound of the transformation deviation of aligning the positions of fiducial markers on two micrographs by affine transformation. Secondly, we design an automatic algorithm based on the Gaussian mixture model to accelerate the procedure of fiducial marker tracking. Thirdly, we propose a divide-and-conquer strategy against lens distortions to ensure the reliability of our scheme. To our knowledge, this is the first attempt that theoretically relates the projection model with the tracking model. The real-world experimental results further support our theoretical bound and demonstrate the effectiveness of our algorithm. This work facilitates the fully automatic tracking for datasets with a massive number of fiducial markers. The C/C++ source code that implements the fast fiducial marker tracking is available at https://github.com/icthrm/gmm-marker-tracking. Markerauto 1.6 version or later (also integrated in the AuTom platform at http://ear.ict.ac.cn/) offers a complete implementation for fast alignment, in which fast fiducial marker tracking is available by the
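
    The affine transformation between marker positions on two micrographs, once markers are matched, is a linear least-squares problem. A sketch on synthetic marker positions (the Gaussian-mixture tracking itself is not shown; the transform parameters and noise level are invented):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform A (2x2), t (2,) mapping src -> dst."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    # Augment with a column of ones so [A | t] is solved in one lstsq call.
    M = np.hstack([src, np.ones((src.shape[0], 1))])
    params, *_ = np.linalg.lstsq(M, dst, rcond=None)   # shape (3, 2)
    return params[:2].T, params[2]

# Synthetic matched marker positions on two micrographs (illustrative only).
rng = np.random.default_rng(2)
src = rng.uniform(0, 512, (30, 2))
A_true = np.array([[0.98, -0.05], [0.04, 1.01]])
t_true = np.array([12.0, -7.5])
dst = src @ A_true.T + t_true + rng.normal(0, 0.2, src.shape)

A_est, t_est = fit_affine(src, dst)
```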

  12. Evaluation of automatic image quality assessment in chest CT - A human cadaver study.

    Science.gov (United States)

    Franck, Caro; De Crop, An; De Roo, Bieke; Smeets, Peter; Vergauwen, Merel; Dewaele, Tom; Van Borsel, Mathias; Achten, Eric; Van Hoof, Tom; Bacher, Klaus

    2017-04-01

    The evaluation of clinical image quality (IQ) is important to optimize CT protocols and to keep patient doses as low as reasonably achievable. Considering the significant amount of effort needed for human observer studies, automatic IQ tools are a promising alternative. The purpose of this study was to evaluate automatic IQ assessment in chest CT using Thiel-embalmed cadavers. Chest CTs of Thiel-embalmed cadavers were acquired at different exposures. Clinical IQ was determined by performing a visual grading analysis. Physical-technical IQ (noise, contrast-to-noise and contrast-detail) was assessed in a Catphan phantom. Soft and sharp reconstructions were made with filtered back projection and two strengths of iterative reconstruction. In addition to the classical IQ metrics, an automatic algorithm was used to calculate image quality scores (IQs). To be able to compare datasets reconstructed with different kernels, the IQs values were normalized. Good correlations were found between IQs and the measured physical-technical image quality: noise (ρ=-1.00), contrast-to-noise (ρ=1.00) and contrast-detail (ρ=0.96). The correlation coefficients between IQs and the observed clinical image quality of soft and sharp reconstructions were 0.88 and 0.93, respectively. The automatic scoring algorithm is a promising tool for the evaluation of thoracic CT scans in daily clinical practice. It allows monitoring of the image quality of a chest protocol over time, without human intervention. Different reconstruction kernels can be compared after normalization of the IQs. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
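
    The ρ values reported above are rank correlations; Spearman's ρ is simply the Pearson correlation of the ranks, which is easy to compute directly. A minimal sketch with invented numbers (not the study's measurements):

```python
import numpy as np

def spearman_rho(a, b):
    """Spearman rank correlation: Pearson correlation of the ranks
    (no tie handling; fine for distinct values)."""
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra @ rb) / np.sqrt((ra @ ra) * (rb @ rb)))

# Illustrative values: automatic IQ scores vs. measured noise at 5 exposures.
iqs   = np.array([0.91, 0.78, 0.66, 0.52, 0.40])   # hypothetical scores
noise = np.array([8.1, 10.4, 13.0, 16.2, 21.5])    # hypothetical noise

rho = spearman_rho(iqs, noise)   # perfectly opposite rank order -> -1.0
```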

  13. A fast fiducial marker tracking model for fully automatic alignment in electron tomography

    KAUST Repository

    Han, Renmin

    2017-10-20

    Automatic alignment, especially fiducial marker-based alignment, has become increasingly important due to the high demand of subtomogram averaging and the rapid development of large-field electron microscopy. Among the alignment steps, fiducial marker tracking is a crucial one that determines the quality of the final alignment. Yet, it is still a challenging problem to track the fiducial markers accurately and effectively in a fully automatic manner. In this paper, we propose a robust and efficient scheme for fiducial marker tracking. Firstly, we theoretically prove the upper bound of the transformation deviation of aligning the positions of fiducial markers on two micrographs by affine transformation. Secondly, we design an automatic algorithm based on the Gaussian mixture model to accelerate the procedure of fiducial marker tracking. Thirdly, we propose a divide-and-conquer strategy against lens distortions to ensure the reliability of our scheme. To our knowledge, this is the first attempt that theoretically relates the projection model with the tracking model. The real-world experimental results further support our theoretical bound and demonstrate the effectiveness of our algorithm. This work facilitates the fully automatic tracking for datasets with a massive number of fiducial markers. The C/C++ source code that implements the fast fiducial marker tracking is available at https://github.com/icthrm/gmm-marker-tracking. Markerauto 1.6 version or later (also integrated in the AuTom platform at http://ear.ict.ac.cn/) offers a complete implementation for fast alignment, in which fast fiducial marker tracking is available by the

  14. Forensic Automatic Speaker Recognition Based on Likelihood Ratio Using Acoustic-phonetic Features Measured Automatically

    Directory of Open Access Journals (Sweden)

    Huapeng Wang

    2015-01-01

    Full Text Available Forensic speaker recognition is experiencing a remarkable paradigm shift in terms of the evaluation framework and presentation of voice evidence. This paper proposes a new method of forensic automatic speaker recognition using the likelihood ratio framework to quantify the strength of voice evidence. The proposed method uses a reference database to calculate the within- and between-speaker variability. Some acoustic-phonetic features are extracted automatically using the software VoiceSauce. The effectiveness of the approach was tested using two Mandarin databases: a mobile telephone database and a landline database. The experimental results indicate that these acoustic-phonetic features have some discriminating potential and are worth pursuing. The automatic acoustic-phonetic features show acceptable discriminative performance and can provide more reliable results in evidence analysis when fused with other kinds of voice features.
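
    The likelihood-ratio framework weighs the evidence under the same-speaker hypothesis against the different-speaker hypothesis. A minimal univariate Gaussian sketch (all distribution parameters below are invented; a real system estimates them from the reference database and typically works with multivariate scores):

```python
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(delta, within_mu, within_sd, between_mu, between_sd):
    """LR = p(evidence | same speaker) / p(evidence | different speakers).

    `delta` is a feature distance between the questioned and the suspect
    recording; the within/between parameters stand in for distributions
    estimated from a reference database (numbers here are illustrative)."""
    return gaussian_pdf(delta, within_mu, within_sd) / \
           gaussian_pdf(delta, between_mu, between_sd)

# Small measured distance: the evidence supports the same-speaker hypothesis.
lr = likelihood_ratio(0.4, within_mu=0.5, within_sd=0.3,
                      between_mu=2.0, between_sd=0.8)
```

    An LR above 1 supports the prosecution (same-speaker) hypothesis, below 1 the defence hypothesis; its magnitude expresses the strength of the evidence.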

  15. Seizure Prediction and its Applications

    Science.gov (United States)

    Iasemidis, Leon D.

    2011-01-01

    Epilepsy is characterized by intermittent, paroxysmal, hypersynchronous electrical activity that may remain localized and/or spread and severely disrupt the brain’s normal multi-task and multi-processing function. Epileptic seizures are the hallmarks of such activity and had long been considered unpredictable. It is only recently that research on the dynamics of seizure generation by analysis of the brain’s electrographic activity (EEG) has shed ample light on the predictability of seizures and illuminated the way to automatic, prospective, long-term prediction of seizures. The ability to issue warnings in real time of impending seizures (e.g., tens of minutes prior to seizure occurrence in the case of focal epilepsy) may lead to novel diagnostic tools and treatments for epilepsy. Applications may range from a simple warning to the patient, in order to avert seizure-associated injuries, to intervention by automatic timely administration of an appropriate stimulus: for example of a chemical nature, like an anti-epileptic drug (AED); of an electromagnetic nature, like vagus nerve stimulation (VNS), deep brain stimulation (DBS), transcranial direct current (TDC) or transcranial magnetic stimulation (TMS); and/or of another nature (e.g., ultrasonic, cryogenic, biofeedback operant conditioning). It is thus expected that seizure prediction could readily become an integral part of the treatment of epilepsy through neuromodulation, especially in the new generation of closed-loop seizure control systems. PMID:21939848

  16. Neurophysiology in preschool improves behavioral prediction of reading ability throughout primary school.

    Science.gov (United States)

    Maurer, Urs; Bucher, Kerstin; Brem, Silvia; Benz, Rosmarie; Kranz, Felicitas; Schulz, Enrico; van der Mark, Sanne; Steinhausen, Hans-Christoph; Brandeis, Daniel

    2009-08-15

    More struggling readers could profit from additional help at the beginning of reading acquisition if dyslexia prediction were more successful. Currently, prediction is based only on behavioral assessment of early phonological processing deficits associated with dyslexia, but it might be improved by adding brain-based measures. In a 5-year longitudinal study of children with (n = 21) and without (n = 23) familial risk for dyslexia, we tested whether neurophysiological measures of automatic phoneme and tone deviance processing obtained in kindergarten would improve prediction of reading over behavioral measures alone. Together, neurophysiological and behavioral measures obtained in kindergarten significantly predicted reading in school. Particularly the late mismatch negativity measure that indicated hemispheric lateralization of automatic phoneme processing improved prediction of reading ability over behavioral measures. It was also the only significant predictor for long-term reading success in fifth grade. Importantly, this result also held for the subgroup of children at familial risk. The results demonstrate that brain-based measures of processing deficits associated with dyslexia improve prediction of reading and thus may be further evaluated to complement clinical practice of dyslexia prediction, especially in targeted populations, such as children with a familial risk.

  17. 46 CFR 63.25-1 - Small automatic auxiliary boilers.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Small automatic auxiliary boilers. 63.25-1 Section 63.25... AUXILIARY BOILERS Requirements for Specific Types of Automatic Auxiliary Boilers § 63.25-1 Small automatic auxiliary boilers. Small automatic auxiliary boilers defined as having heat-input ratings of 400,000 Btu/hr...

  18. Automatic classification of liver scintigram patterns by computer

    International Nuclear Information System (INIS)

    Csernay, L.; Csirik, J.

    1976-01-01

    Pattern recognition of the projection is one of the problems in the automatic evaluation of scintigrams. An algorithm and a computerized programme with the ability to classify the shapes of liver scintigrams have been elaborated by the authors. The programme differentiates not only normal and pathologic basic forms, but also performs the identification of nine normal forms described in the literature. For pattern recognition, structural and local parameters of the picture were defined. A detailed mechanism of the programme is given in the report. The programme correctly classified 55 out of 60 actual liver scintigrams; in the remaining 5 cases the result differed from the subjective classification, and all of these were normal liver scan patterns. No misclassification occurred between normal and pathologic patterns.

  19. Automatic fault tree generation in the EPR PSA project

    International Nuclear Information System (INIS)

    Villatte, N.; Nonclercq, P.; Taupy, S.

    2012-01-01

    Tools (KB3 and Atelier EPS) have been developed at EDF to assist analysts in building fault trees for PSA (Probabilistic Safety Assessment) and importing them into RiskSpectrum (a Swedish code used at EDF for PSA). System modelling is performed using the KB3 software with a knowledge base describing generic classes of components with their behaviour and failure modes. Using these classes of components, the analyst can describe, with a graphical system editor: a simplified system diagram derived from the mechanical system drawings and functional descriptions, the missions of the studied system (in the form of high-level fault trees) and its different configurations for those missions. He can also add specific knowledge about the system. The analyst then chooses missions and configurations and launches fault tree generation. From the system description, KB3 produces detailed system fault trees by backward-chaining on rules. These fault trees are finally imported into RiskSpectrum (converted by Atelier EPS into a format readable by RiskSpectrum). KB3 and Atelier EPS were used to create the majority of the fault trees for the EDF EPR Probabilistic Safety Analysis conducted from November 2009 to March 2010: 25 systems were modelled, and 127 fault trees were automatically generated in a rather short time by different analysts with the help of these tools. Feedback shows many advantages of using KB3 and Atelier EPS: homogeneity and consistency between the different generated fault trees, traceability and control of modelling, and, last but not least, the automation of detailed fault tree creation, which relieves the human analyst of this tedious task so that he can focus his attention on the more important one of modelling the failure of a function. This industrial application has also provided interesting feedback from the analysts that should help us improve the handling of the tools. We propose in this paper indeed some
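
    The fault trees generated here combine basic-event probabilities through AND/OR gates. A minimal evaluation sketch, exact only for independent events (a real PSA code such as RiskSpectrum works on minimal cut sets instead; all events and probabilities below are invented):

```python
def or_gate(*p):
    """P(at least one of several independent events)."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):
    """P(all of several independent events)."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Hypothetical basic-event probabilities (per demand).
pump_a_fails = 1e-3
pump_b_fails = 1e-3
power_lost   = 5e-4

# Top event: "no injection" requires both redundant pumps to fail,
# OR loss of power.
no_injection = or_gate(and_gate(pump_a_fails, pump_b_fails), power_lost)
```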

  20. Opto-mechanical devices for the Antares automatic beam alignment system

    International Nuclear Information System (INIS)

    Swann, T.; Combs, C.; Witt, J.

    1981-01-01

    Antares is a 24-beam CO2 laser system for controlled fusion research, under construction at Los Alamos National Laboratory. Rapid automatic alignment of this system is required prior to each experimental shot. Unique opto-mechanical alignment devices, developed specifically for this automatic alignment system, are discussed. A variable-focus alignment telescope views point light sources. A beam expander/spatial filter processes both a visible krypton-ion and a 10.6 μm CO2 alignment laser. The periscope/carousel device provides the means by which the alignment telescope can sequentially view each of twelve optical trains in each power amplifier. The polyhedron alignment device projects a point light source for both centering and pointing alignment at the polyhedron mirror. The rotating wedge alignment device provides a sequencing point light source and also compensates for dispersion between visible and 10.6 μm radiation. The back-reflector flip-in device remotely positions point light sources at the back reflector mirrors. A light source box illuminates optic fibers with high-intensity white light, which is distributed to the various point light sources in the system.

  1. 49 CFR 236.825 - System, automatic train control.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false System, automatic train control. 236.825 Section..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Definitions § 236.825 System, automatic train control. A system so arranged that its operation will automatically...

  2. Selection of projects in the regional energy planning

    International Nuclear Information System (INIS)

    Ramirez P, R.; Navas M, F.

    1993-01-01

    The processes of regional energy planning have changed vastly in recent years and will continue to change in the future under the new norms of the State. This work shows the use of systematic tools in the selection of regional energy projects. It discusses a project selection methodology based on a multivariate technique. The methodology is applied in the Southwestern region of Colombia, and both selection and priority results are obtained. The methodology allows project selection to be carried out automatically, with software designed for this purpose. In the Southwestern case it yields a portfolio of projects for an energy plan, and the approach can be applied to other regions.

  3. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings...... differential equations, but in this thesis, we describe how to use the methods for enclosing iterates of discrete mappings, and then later use them for discretizing solutions of ordinary differential equations. The theory of automatic differentiation is introduced, and three methods for obtaining derivatives...... are described: The forward, the backward, and the Taylor expansion methods. The three methods have been implemented in the C++ program packages FADBAD/TADIFF. Some examples showing how to use the three methods are presented. A feature of FADBAD/TADIFF not present in other automatic differentiation packages
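
    The forward method of automatic differentiation mentioned above can be sketched with dual numbers in a few lines. This is a toy Python analogue of the idea, carrying a (value, derivative) pair through arithmetic; FADBAD/TADIFF's actual C++ interface differs.

```python
import math

class Dual:
    """Forward-mode automatic differentiation with dual numbers:
    each object carries a value and its derivative w.r.t. the input."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def sin(x):
    # Chain rule: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# d/dx [x * sin(x) + 3x] at x = 2, obtained without symbolic algebra.
x = Dual(2.0, 1.0)            # seed derivative 1 for the input variable
y = x * sin(x) + 3 * x        # y.der == sin(2) + 2*cos(2) + 3
```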

  4. Color Segmentation Approach of Infrared Thermography Camera Image for Automatic Fault Diagnosis

    International Nuclear Information System (INIS)

    Djoko Hari Nugroho; Ari Satmoko; Budhi Cynthia Dewi

    2007-01-01

    Predictive maintenance based on fault diagnosis has become very important nowadays to assure the availability and reliability of a system. The main purpose of this research is to develop computer software for automatic fault diagnosis, based on an image model acquired from an infrared thermography camera, using a color segmentation approach. This technique detects hot spots in plant equipment. The image acquired from the camera is first converted to the RGB (Red, Green, Blue) image model and then to the CMYK (Cyan, Magenta, Yellow, Key for Black) image model. Assuming that yellow in the image represents hot spots in the equipment, the CMYK image model is then diagnosed using a color segmentation model to estimate the fault. The software was implemented in the Borland Delphi 7.0 programming language. Its performance was tested on 10 input infrared thermography images. The experimental results show that the software is capable of detecting faults automatically, with a success rate of 80% on the 10 input images. (author)
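
    The RGB-to-CMYK conversion and yellow-channel thresholding can be sketched as follows (the original work was in Delphi; the threshold value and toy image here are invented for illustration):

```python
import numpy as np

def rgb_to_cmyk(rgb):
    """Convert an RGB image (H, W, 3, values 0-255) to CMYK channels in [0, 1]."""
    rgb = rgb.astype(float) / 255.0
    k = 1.0 - rgb.max(axis=2)
    denom = np.where(k < 1.0, 1.0 - k, 1.0)       # avoid 0/0 on pure black
    c = (1.0 - rgb[..., 0] - k) / denom
    m = (1.0 - rgb[..., 1] - k) / denom
    y = (1.0 - rgb[..., 2] - k) / denom
    return c, m, y, k

def hot_spot_fraction(rgb, y_threshold=0.6):
    """Fraction of pixels whose yellow channel exceeds a threshold
    (the threshold value is illustrative, not taken from the paper)."""
    _, _, y, _ = rgb_to_cmyk(rgb)
    return float((y > y_threshold).mean())

# Toy 2x2 image: one pure-yellow "hot" pixel among dark pixels.
img = np.array([[[255, 255, 0], [20, 20, 20]],
                [[30, 30, 30], [10, 10, 10]]], dtype=np.uint8)
frac = hot_spot_fraction(img)   # 1 of 4 pixels -> 0.25
```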

  5. Automaticity of walking: functional significance, mechanisms, measurement and rehabilitation strategies

    Directory of Open Access Journals (Sweden)

    David J Clark

    2015-05-01

    Full Text Available Automaticity is a hallmark feature of walking in adults who are healthy and well-functioning. In the context of walking, ‘automaticity’ refers to the ability of the nervous system to successfully control typical steady state walking with minimal use of attention-demanding executive control resources. Converging lines of evidence indicate that walking deficits and disorders are characterized in part by a shift in the locomotor control strategy from healthy automaticity to compensatory executive control. This is potentially detrimental to walking performance, as an executive control strategy is not optimized for locomotor control. Furthermore, it places excessive demands on a limited pool of executive reserves. The result is compromised ability to perform basic and complex walking tasks and heightened risk for adverse mobility outcomes including falls. Strategies for rehabilitation of automaticity are not well defined, which is due to both a lack of systematic research into the causes of impaired automaticity and to a lack of robust neurophysiological assessments by which to gauge automaticity. These gaps in knowledge are concerning given the serious functional implications of compromised automaticity. Therefore, the objective of this article is to advance the science of automaticity of walking by consolidating evidence and identifying gaps in knowledge regarding: (a) the functional significance of automaticity; (b) the neurophysiology of automaticity; (c) measurement of automaticity; (d) mechanistic factors that compromise automaticity; and (e) strategies for rehabilitation of automaticity.

  6. The Use of Automatic Indexing for Authority Control.

    Science.gov (United States)

    Dillon, Martin; And Others

    1981-01-01

    Uses an experimental system for authority control on a collection of bibliographic records to demonstrate the resemblance between thesaurus-based automatic indexing and automatic authority control. Details of the automatic indexing system are given, results discussed, and the benefits of the resemblance examined. Included are a rules appendix and…

  7. 30 CFR 77.1401 - Automatic controls and brakes.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic controls and brakes. 77.1401 Section... MINES Personnel Hoisting § 77.1401 Automatic controls and brakes. Hoists and elevators shall be equipped with overspeed, overwind, and automatic stop controls and with brakes capable of stopping the elevator...

  8. 30 CFR 57.19006 - Automatic hoist braking devices.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic hoist braking devices. 57.19006 Section 57.19006 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND... Hoisting Hoists § 57.19006 Automatic hoist braking devices. Automatic hoists shall be provided with devices...

  9. 30 CFR 56.19006 - Automatic hoist braking devices.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic hoist braking devices. 56.19006 Section 56.19006 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND... Hoisting Hoists § 56.19006 Automatic hoist braking devices. Automatic hoists shall be provided with devices...

  10. Aircraft noise effects on sleep: a systematic comparison of EEG awakenings and automatically detected cardiac activations

    International Nuclear Information System (INIS)

    Basner, Mathias; Müller, Uwe; Elmenhorst, Eva-Maria; Kluge, Götz; Griefahn, Barbara

    2008-01-01

    Polysomnography is the gold standard for investigating noise effects on sleep, but data collection and analysis are laborious and expensive. We recently developed an algorithm for the automatic identification of cardiac activations associated with cortical arousals, which uses heart rate information derived from a single electrocardiogram (ECG) channel. We hypothesized that cardiac activations can be used as estimates for EEG awakenings. Polysomnographic EEG awakenings and automatically detected cardiac activations were systematically compared using laboratory data of 112 subjects (47 male, mean ± SD age 37.9 ± 13 years), 985 nights and 23 855 aircraft noise events (ANEs). The probability of automatically detected cardiac activations increased monotonically with increasing maximum sound pressure levels of ANEs, exceeding the probability of EEG awakenings by up to 18.1%. If spontaneous reactions were taken into account, exposure–response curves were practically identical for EEG awakenings and cardiac activations. Automatically detected cardiac activations may thus be used as estimates for EEG awakenings. More investigation is needed to further validate the ECG algorithm in the field and to examine inter-individual differences in its ability to predict EEG awakenings. This inexpensive, objective and non-invasive method facilitates large-scale field studies on the effects of traffic noise on sleep.

  11. New concepts in automatic enforcement. The "Escape" Project, Deliverable 6. Project funded by the European Commission under the Transport RTD Programme of the 4th Framework Programme.

    NARCIS (Netherlands)

    Heidstra, J.; Goldenbeld, C.; Mäkinen, T.; Nilsson, G. & Sagberg, F.

    2010-01-01

    One main reason for automatic enforcement, apart from the safety situation itself, is that in some environments the police cannot take direct action against each detected violator during normal enforcement activities. By using detectors and camera technology the violators can be identified and

  12. Sentence-Level Attachment Prediction

    Science.gov (United States)

    Albakour, M.-Dyaa; Kruschwitz, Udo; Lucas, Simon

    Attachment prediction is the task of automatically identifying email messages that should contain an attachment. This can be useful to tackle the problem of sending out emails but forgetting to include the relevant attachment (something that happens all too often). A common Information Retrieval (IR) approach in analyzing documents such as emails is to treat the entire document as a bag of words. Here we propose a finer-grained analysis to address the problem. We aim at identifying individual sentences within an email that refer to an attachment. If we detect any such sentence, we predict that the email should have an attachment. Using part of the Enron corpus for evaluation we find that our finer-grained approach outperforms previously reported document-level attachment prediction in similar evaluation settings.
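
    The sentence-level idea can be sketched with a simple cue-based heuristic: split the email into sentences, flag any sentence that mentions an attachment, and predict at the email level. The cue list and sentence splitter below are illustrative stand-ins, not the feature set or classifier used in the paper.

```python
import re

# Hypothetical attachment cues; a real system would learn these features.
ATTACHMENT_CUES = re.compile(
    r"\b(attach(ed|ment|ing)?|enclosed|see the (attached|file)|"
    r"find the (attached|file))\b", re.IGNORECASE)

def split_sentences(text):
    """Naive sentence splitter on terminal punctuation."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def predict_attachment(email_body):
    """Predict that the email needs an attachment if any sentence matches a cue."""
    return any(ATTACHMENT_CUES.search(s) for s in split_sentences(email_body))

predict_attachment("Please find the attached report. Thanks!")   # True
predict_attachment("Let's meet tomorrow at 10.")                 # False
```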

  13. AUTOMATIC SEGMENTATION OF BROADCAST AUDIO SIGNALS USING AUTO ASSOCIATIVE NEURAL NETWORKS

    Directory of Open Access Journals (Sweden)

    P. Dhanalakshmi

    2010-12-01

    Full Text Available In this paper, we describe automatic segmentation methods for audio broadcast data. Today, digital audio applications are part of our everyday lives. Since there are more and more digital audio databases in place these days, the importance of effective management of audio databases has become prominent. Broadcast audio data is recorded from television and comprises various categories of audio signals. Efficient algorithms for segmenting the audio broadcast data into predefined categories are proposed. Audio features, namely linear prediction coefficients (LPC), linear prediction cepstral coefficients (LPCC), and Mel-frequency cepstral coefficients (MFCC), are extracted to characterize the audio data. Auto-associative neural networks are used to segment the audio data into predefined categories using the extracted features. Experimental results indicate that the proposed algorithms can produce satisfactory results.
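
    The LPC features mentioned above can be computed with the standard autocorrelation method and the Levinson-Durbin recursion. This is a textbook sketch, not the paper's implementation; the demo signal is synthetic.

```python
import numpy as np

def lpc(signal, order):
    """Linear prediction coefficients via autocorrelation + Levinson-Durbin."""
    x = np.asarray(signal, dtype=float)
    n = x.size
    # Autocorrelation lags r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) for k in range(order + 1)])
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + a[1:i] @ r[i - 1:0:-1]
        k = -acc / err
        a[1:i] = a[1:i] + k * a[i - 1:0:-1]   # update previous coefficients
        a[i] = k                               # new reflection coefficient
        err *= (1.0 - k * k)                   # prediction error shrinks
    return a, err

# Demo: an exponentially decaying signal x[t] = 0.9**t is well predicted
# by one past sample, so the order-1 coefficient comes out near -0.9.
sig = 0.9 ** np.arange(200)
a, err = lpc(sig, order=1)
```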

  14. Automatic selection of reference taxa for protein-protein interaction prediction with phylogenetic profiling

    DEFF Research Database (Denmark)

    Simonsen, Martin; Maetschke, S.R.; Ragan, M.A.

    2012-01-01

    Motivation: Phylogenetic profiling methods can achieve good accuracy in predicting protein–protein interactions, especially in prokaryotes. Recent studies have shown that the choice of reference taxa (RT) is critical for accurate prediction, but with more than 2500 fully sequenced taxa publicly available ... We present three novel methods for automating the selection of RT, using machine learning based on known protein–protein interaction networks. One of these methods in particular, Tree-Based Search, yields greatly improved prediction accuracies. We further show that different methods for constituting phylogenetic profiles often require very different RT sets to support high prediction accuracy.

  15. Genome3D: a UK collaborative project to annotate genomic sequences with predicted 3D structures based on SCOP and CATH domains.

    Science.gov (United States)

    Lewis, Tony E; Sillitoe, Ian; Andreeva, Antonina; Blundell, Tom L; Buchan, Daniel W A; Chothia, Cyrus; Cuff, Alison; Dana, Jose M; Filippis, Ioannis; Gough, Julian; Hunter, Sarah; Jones, David T; Kelley, Lawrence A; Kleywegt, Gerard J; Minneci, Federico; Mitchell, Alex; Murzin, Alexey G; Ochoa-Montaño, Bernardo; Rackham, Owen J L; Smith, James; Sternberg, Michael J E; Velankar, Sameer; Yeats, Corin; Orengo, Christine

    2013-01-01

    Genome3D, available at http://www.genome3d.eu, is a new collaborative project that integrates UK-based structural resources to provide a unique perspective on sequence-structure-function relationships. Leading structure prediction resources (DomSerf, FUGUE, Gene3D, pDomTHREADER, Phyre and SUPERFAMILY) provide annotations for UniProt sequences to indicate the locations of structural domains (structural annotations) and their 3D structures (structural models). Structural annotations and 3D model predictions are currently available for three model genomes (Homo sapiens, E. coli and baker's yeast), and the project will extend to other genomes in the near future. As these resources exploit different strategies for predicting structures, the main aim of Genome3D is to enable comparisons between all the resources so that biologists can see where predictions agree and are therefore more trusted. Furthermore, as these methods differ in whether they build their predictions using CATH or SCOP, Genome3D also contains the first official mapping between these two databases. This has identified pairs of similar superfamilies from the two resources at various degrees of consensus (532 bronze pairs, 527 silver pairs and 370 gold pairs).

  16. An integrated automatic system for the eddy-current testing of the steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Woo, Hee Gon; Choi, Seong Su [Korea Electric Power Corp. (KEPCO), Taejon (Korea, Republic of). Research Center

    1995-12-31

    This research project focused on the automation of steam generator tube inspection for nuclear power plants. The ECT (eddy current testing) inspection process in nuclear power plants is divided into three subprocesses: signal acquisition, signal evaluation, and inspection planning and data management. Having been automated individually, these processes were effectively integrated into an automatic inspection system, implemented on an HP workstation together with the expert system developed (author). 25 refs., 80 figs.

  18. Automatic airline baggage counting using 3D image segmentation

    Science.gov (United States)

    Yin, Deyu; Gao, Qingji; Luo, Qijun

    2017-06-01

    The number of bags needs to be checked automatically during baggage self-check-in. This paper proposes a fast airline baggage counting method using image segmentation of a height map projected from the scanned 3D point cloud of the baggage. Because there is a height drop at the actual edge of each bag, the edges can be detected with an edge detection operator. Closed edge chains are then formed by linking the edge lines through morphological processing. Finally, the number of connected regions segmented by the closed chains is taken as the baggage count. Multi-bag experiments performed under different placement modes demonstrate the validity of the method.
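
    The counting step can be sketched as connected-region labeling on a thresholded height map. The flood-fill labeling and all thresholds below are illustrative; the paper segments via closed edge chains rather than a simple height threshold:

```python
import numpy as np

def count_bags(height_map, min_drop=5.0, min_area=20):
    """Count distinct bags in a top-down height map (illustrative sketch)."""
    mask = height_map > min_drop          # pixels standing above the belt
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    bags = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                # Flood-fill one connected region (4-connectivity).
                stack, area = [(i, j)], 0
                seen[i, j] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if area >= min_area:      # discard small noise blobs
                    bags += 1
    return bags

hm = np.zeros((100, 100))
hm[10:40, 10:40] = 25.0   # synthetic bag 1, 25 cm tall
hm[60:90, 55:85] = 40.0   # synthetic bag 2, 40 cm tall
print(count_bags(hm))     # 2
```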

  19. Precision about the automatic emotional brain.

    Science.gov (United States)

    Vuilleumier, Patrik

    2015-01-01

    The question of automaticity in emotion processing has been debated under different perspectives in recent years. Satisfying answers to this issue will require a better definition of automaticity in terms of relevant behavioral phenomena, ecological conditions of occurrence, and a more precise mechanistic account of the underlying neural circuits.

  20. Automatic control of nuclear power plants

    International Nuclear Information System (INIS)

    Jover, P.

    1976-01-01

    The fundamental concepts of automatic control are surveyed, and the purpose of the automatic control of pressurized water reactors is given. The response characteristics of the main components are then studied, and block diagrams are given for the main control loops (turbine, steam generator, and nuclear reactor). [fr]

  1. Automatic Transmission Of Liquid Nitrogen

    Directory of Open Access Journals (Sweden)

    Sumedh Mhatre

    2015-08-01

    Full Text Available Liquid nitrogen (LN2) is a major chilling substance in industry, for example in ice cream factories, milk dairies, blood-sample storage, and blood banks. It helps maintain products at a low temperature for preservation. LN2 cannot be fully utilised: in practice, if 3.75 litres of LN2 are used in a single day, around 12% of it (450 ml) is wasted through vaporisation. A pressure relief valve is provided to create a pressure difference; if there is no pressure difference between the cylinder carrying the LN2 and its surroundings, the result is damage to the container as well as wastage of LN2. Transmission of LN2 from TA55 to BA3 is currently carried out manually, so care must be taken during transmission to avoid wastage. With the concept presented in this project, the transmission of LN2 is carried out automatically, reducing the wastage incurred by manual operation.

  2. Automatic recloser circuit breaker integrated with GSM technology for power system notification

    Science.gov (United States)

    Lada, M. Y.; Khiar, M. S. A.; Ghani, S. A.; Nawawi, M. R. M.; Rahim, N. H.; Sinar, L. O. M.

    2015-05-01

    Lightning is one type of transient fault that commonly causes the circuit breaker in a distribution board to trip on overcurrent detection. The instant tripping of the circuit breaker clears the fault in the system, but most circuit breaker systems are manually operated, so the power line is only re-energized after the fault-clearing process is finished. Auto-reclose circuits are used on transmission lines to help supply quality electrical power to customers. In this project, an automatic reclose circuit breaker for low-voltage use is designed. The Auto Reclose Circuit Breaker (ARCB) trips when the current sensor detects a current exceeding the rating of the miniature circuit breaker (MCB) used; the fault condition is then cleared automatically and the power line returned to normal. A Global System for Mobile Communications (GSM) module sends an SMS to the person in charge whenever tripping occurs. If the overcurrent occurs three times, the system trips fully (open circuit) and at the same time sends an SMS to the person in charge. In this project, 1 A is set as the rated current, and any current exceeding 1 A causes the system to trip. The system also provides additional user notifications such as an emergency light and a warning alarm.
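
    The trip/reclose/lockout behaviour described above can be sketched as a small state machine. Function and action names are hypothetical; only the three-strike rule and SMS notification follow the abstract:

```python
def recloser_step(state, current, rated=1.0, max_trips=3):
    """One evaluation step of a hypothetical auto-recloser controller.

    `state` holds the running trip count and lockout flag; thresholds and
    action names are illustrative.
    """
    actions = []
    if state["locked"]:
        return actions                    # fully tripped: needs manual reset
    if current > rated:
        state["trips"] += 1
        actions += ["trip", "send_sms"]   # interrupt supply, notify via GSM
        if state["trips"] >= max_trips:
            state["locked"] = True
            actions.append("lockout")     # permanent open circuit
        else:
            actions.append("reclose")     # fault cleared, re-energize line
    return actions

s = {"trips": 0, "locked": False}
for amps in (2.5, 2.5, 2.5, 2.5):
    print(recloser_step(s, amps))
# The first two overcurrents trip and reclose, the third locks the
# breaker out, and the fourth step does nothing.
```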

  3. Architecture for large-scale automatic web accessibility evaluation based on the UWEM methodology

    DEFF Research Database (Denmark)

    Ulltveit-Moe, Nils; Olsen, Morten Goodwin; Pillai, Anand B.

    2008-01-01

    The European Internet Accessibility Observatory (EIAO) project has developed an Observatory for performing large-scale automatic web accessibility evaluations of public sector web sites in Europe. The architecture includes a distributed web crawler that crawls web sites for links until either a given budget of web pages has been identified or the web site has been crawled exhaustively. Subsequently, a uniform random subset of the crawled web pages is sampled and sent for accessibility evaluation, and the evaluation results are stored in a Resource Description Framework (RDF) database that is later loaded ... The paper also describes challenges that the project faced and the solutions developed towards building a system capable of regular large-scale accessibility evaluations with sufficient capacity and stability. It also outlines some possible future architectural improvements.

  4. Are automatic systems the future of motorcycle safety? A novel methodology to prioritize potential safety solutions based on their projected effectiveness.

    Science.gov (United States)

    Gil, Gustavo; Savino, Giovanni; Piantini, Simone; Baldanzini, Niccolò; Happee, Riender; Pierini, Marco

    2017-11-17

    Motorcycle riders are involved in significantly more crashes per kilometer driven than passenger car drivers. Nonetheless, the development and implementation of motorcycle safety systems lags far behind that of passenger cars. This research addresses the identification of the most effective motorcycle safety solutions in the context of different countries. A knowledge-based system of motorcycle safety (KBMS) was developed to assess the potential for various safety solutions to mitigate or avoid motorcycle crashes. First, a set of 26 common crash scenarios was identified from the analysis of multiple crash databases. Second, the relative effectiveness of 10 safety solutions was assessed for the 26 crash scenarios by a panel of experts. Third, relevant information about crashes was used to weigh the importance of each crash scenario in the region studied. The KBMS method was applied with an Italian database, with a total of more than 1 million motorcycle crashes in the period 2000-2012. When applied to the Italian context, the KBMS suggested that automatic systems designed to compensate for riders' or drivers' errors of commission or omission are the potentially most effective safety solution. The KBMS method showed an effective way to compare the potential of various safety solutions, through a scored list with the expected effectiveness of each safety solution for the region to which the crash data belong. A comparison of our results with a previous study that attempted a systematic prioritization of safety systems for motorcycles (PISa project) showed an encouraging agreement. Current results revealed that automatic systems have the greatest potential to improve motorcycle safety. Accumulating and encoding expertise in crash analysis from a range of disciplines into a scalable and reusable analytical tool, as proposed with the use of KBMS, has the potential to guide research and development of effective safety systems. As the expert assessment of the crash

  5. Monitoring caustic injuries from emergency department databases using automatic keyword recognition software.

    Science.gov (United States)

    Vignally, P; Fondi, G; Taggi, F; Pitidis, A

    2011-03-31

    In Italy the European Union Injury Database reports the involvement of chemical products in 0.9% of home and leisure accidents. The Emergency Department registry on domestic accidents in Italy and the Poison Control Centres record that 90% of cases of exposure to toxic substances occur in the home. It is not rare for the effects of chemical agents to be observed in hospitals, with a high potential risk of damage - the rate of this cause of hospital admission is double the domestic injury average. The aim of this study was to monitor the effects of injuries caused by caustic agents in Italy using automatic free-text recognition in Emergency Department medical databases. We created a Stata software program to automatically identify caustic or corrosive injury cases using an agent-specific list of keywords. We focused attention on the procedure's sensitivity and specificity. Ten hospitals in six regions of Italy participated in the study. The program identified 112 cases of injury by caustic or corrosive agents. Checking the cases by quality controls (based on manual reading of ED reports), we assessed 99 cases as true positive, i.e. 88.4% of the patients were automatically recognized by the software as being affected by caustic substances (99% CI: 80.6%- 96.2%), that is to say 0.59% (99% CI: 0.45%-0.76%) of the whole sample of home injuries, a value almost three times as high as that expected (p < 0.0001) from European codified information. False positives were 11.6% of the recognized cases (99% CI: 5.1%- 21.5%). Our automatic procedure for caustic agent identification proved to have excellent product recognition capacity with an acceptable level of excess sensitivity. Contrary to our a priori hypothesis, the automatic recognition system provided a level of identification of agents possessing caustic effects that was significantly much greater than was predictable on the basis of the values from current codifications reported in the European Database.
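
    A minimal sketch of agent-specific keyword recognition over free-text ED reports, assuming an English cue list; the study used an Italian, agent-specific list and implemented the procedure in Stata rather than Python:

```python
import re

# Hypothetical keyword list standing in for the study's agent-specific list.
KEYWORDS = ["caustic", "corrosive", "lye", "bleach", "acid burn"]

def flag_caustic(report: str) -> bool:
    """Flag an ED free-text report that mentions a caustic/corrosive agent."""
    text = report.lower()
    return any(re.search(r"\b" + re.escape(k) + r"\b", text) for k in KEYWORDS)

reports = [
    "Patient ingested corrosive drain cleaner, oral burns noted.",
    "Left ankle sprain after fall from bicycle.",
]
print([flag_caustic(r) for r in reports])  # [True, False]
```

    In the study, flagged cases were then manually reviewed, which is how the 88.4% true-positive rate (99 of 112 recognized cases) was established.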

  6. Interactivity in automatic control: foundations and experiences

    OpenAIRE

    Dormido Bencomo, Sebastián; Guzmán Sánchez, José Luis; Costa Castelló, Ramon; Berenguel, M

    2012-01-01

    The first part of this paper presents the concepts of interactivity and visualization and their essential role in learning the fundamentals and techniques of automatic control. The authors' more than 10 years of experience in the development and design of interactive tools dedicated to the study of automatic control concepts is also presented. The second part of the paper summarizes the main features of the “Automatic Control with Interactive Tools” text that has been recently published by Pea...

  7. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

    Inheritance allows a class to be specialized and its attributes refined, but implementation specialization can only take place by overriding with manually implemented methods. Automatic program specialization can generate a specialized, efficient implementation. However, specialization of programs ... with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented ...

  8. Bite weight prediction from acoustic recognition of chewing

    NARCIS (Netherlands)

    Amft, O.D.; Kusserow, M.; Tröster, G.

    2009-01-01

    Automatic dietary monitoring (ADM) offers new perspectives to reduce the self-reporting burden for participants in diet coaching programs. This paper presents an approach to predict weight of individual bites taken. We utilize a pattern recognition procedure to spot chewing cycles and food type in

  9. Fault Locating, Prediction and Protection (FLPPS)

    Energy Technology Data Exchange (ETDEWEB)

    Yinger, Robert, J.; Venkata, S., S.; Centeno, Virgilio

    2010-09-30

    One of the main objectives of this DOE-sponsored project was to reduce customer outage time. Fault location, prediction, and protection are the most important aspects of fault management for the reduction of outage time. In the past most of the research and development on power system faults in these areas has focused on transmission systems, and it is not until recently with deregulation and competition that research on power system faults has begun to focus on the unique aspects of distribution systems. This project was planned with three Phases, approximately one year per phase. The first phase of the project involved an assessment of the state-of-the-art in fault location, prediction, and detection as well as the design, lab testing, and field installation of the advanced protection system on the SCE Circuit of the Future located north of San Bernardino, CA. The new feeder automation scheme, with vacuum fault interrupters, will limit the number of customers affected by the fault. Depending on the fault location, the substation breaker might not even trip. Through the use of fast communications (fiber) the fault locations can be determined and the proper fault interrupting switches opened automatically. With knowledge of circuit loadings at the time of the fault, ties to other circuits can be closed automatically to restore all customers except the faulted section. This new automation scheme limits outage time and increases reliability for customers. The second phase of the project involved the selection, modeling, testing and installation of a fault current limiter on the Circuit of the Future. While this project did not pay for the installation and testing of the fault current limiter, it did perform the evaluation of the fault current limiter and its impacts on the protection system of the Circuit of the Future. After investigation of several fault current limiters, the Zenergy superconducting, saturable core fault current limiter was selected for

  10. Automatic NAA. Saturation activities

    International Nuclear Information System (INIS)

    Westphal, G.P.; Grass, F.; Kuhnert, M.

    2008-01-01

    A system for Automatic NAA is based on a list of specific saturation activities determined for one irradiation position at a given neutron flux and a single detector geometry. Originally compiled from measurements of standard reference materials, the list may also be extended by calculating saturation activities from k0 and Q0 factors and the f and α values of the irradiation position. A systematic improvement of the SRM approach is currently being performed by pseudo-cyclic activation analysis, to reduce counting errors. From these measurements, the list of saturation activities is recalculated in an automatic procedure. (author)

  11. Automatic bad channel detection in intracranial electroencephalographic recordings using ensemble machine learning.

    Science.gov (United States)

    Tuyisenge, Viateur; Trebaul, Lena; Bhattacharjee, Manik; Chanteloup-Forêt, Blandine; Saubat-Guigui, Carole; Mîndruţă, Ioana; Rheims, Sylvain; Maillard, Louis; Kahane, Philippe; Taussig, Delphine; David, Olivier

    2018-03-01

    Intracranial electroencephalographic (iEEG) recordings contain "bad channels", which show non-neuronal signals. Here, we developed a new method that automatically detects iEEG bad channels using machine learning of seven signal features. The features quantified signals' variance, spatial-temporal correlation and nonlinear properties. Because the number of bad channels is usually much lower than the number of good channels, we implemented an ensemble bagging classifier, known to be optimal in terms of stability and predictive accuracy for datasets with imbalanced class distributions. This method was applied to stereo-electroencephalographic (SEEG) signals recorded during low-frequency stimulations performed in 206 patients from 5 clinical centers. We found that the classification accuracy was extremely good: it increased with the number of subjects used to train the classifier and reached a plateau at 99.77% for 110 subjects. The classification performance was thus not impacted by the multicentric nature of the data. The proposed method to automatically detect bad channels demonstrated convincing results and could be used on larger datasets for automatic quality control of iEEG data. This is the first method proposed to classify bad channels in iEEG and should help improve data selection when reviewing iEEG signals. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
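
    The class-imbalance-aware bagging idea can be sketched as follows, using simple decision stumps over synthetic seven-feature channel data as a stand-in for the paper's feature set and classifier:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_stump(X, y):
    """Best single-feature threshold classifier (a tiny stand-in for a tree)."""
    best = (0, 0.0, 1, 0.0)   # feature, threshold, polarity, accuracy
    for f in range(X.shape[1]):
        for t in np.quantile(X[:, f], [0.25, 0.5, 0.75]):
            for pol in (1, -1):
                pred = (pol * (X[:, f] - t) > 0).astype(int)
                acc = (pred == y).mean()
                if acc > best[3]:
                    best = (f, t, pol, acc)
    return best[:3]

def bagged_predict(X, y, Xnew, n_estimators=25):
    """Bagging with class-balanced resampling: each estimator sees all bad
    channels plus an equal-sized random draw of good channels, countering
    the class imbalance."""
    bad, good = np.where(y == 1)[0], np.where(y == 0)[0]
    votes = np.zeros(len(Xnew))
    for _ in range(n_estimators):
        idx = np.concatenate([bad, rng.choice(good, size=len(bad))])
        f, t, pol = fit_stump(X[idx], y[idx])
        votes += (pol * (Xnew[:, f] - t) > 0)
    return (votes / n_estimators > 0.5).astype(int)

# Synthetic channels: feature 0 plays the role of signal variance, and the
# 10 "bad" channels have clearly elevated values of it.
Xg = rng.normal(0, 1, (90, 7)); yg = np.zeros(90, dtype=int)
Xb = rng.normal(0, 1, (10, 7)); Xb[:, 0] += 5; yb = np.ones(10, dtype=int)
X, y = np.vstack([Xg, Xb]), np.concatenate([yg, yb])
pred = bagged_predict(X, y, X)
print((pred == y).mean() > 0.9)  # the ensemble separates the toy classes well
```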

  12. Prediction of Metabolic Pathway Involvement in Prokaryotic UniProtKB Data by Association Rule Mining

    KAUST Repository

    Boudellioua, Imene; Saidi, Rabie; Hoehndorf, Robert; Martin, Maria J.; Solovyev, Victor

    2016-01-01

    The widening gap between known proteins and their functions has encouraged the development of methods to automatically infer annotations. Automatic functional annotation of proteins is expected to meet the conflicting requirements of maximizing annotation coverage, while minimizing erroneous functional assignments. This trade-off imposes a great challenge in designing intelligent systems to tackle the problem of automatic protein annotation. In this work, we present a system that utilizes rule mining techniques to predict metabolic pathways in prokaryotes. The resulting knowledge represents predictive models that assign pathway involvement to UniProtKB entries. We carried out an evaluation study of our system performance using cross-validation technique. We found that it achieved very promising results in pathway identification with an F1-measure of 0.982 and an AUC of 0.987. Our prediction models were then successfully applied to 6.2 million UniProtKB/TrEMBL reference proteome entries of prokaryotes. As a result, 663,724 entries were covered, where 436,510 of them lacked any previous pathway annotations.
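
    The rule-mining approach can be illustrated with a toy support/confidence miner over made-up protein annotations; the actual system operates on UniProtKB features and uses more elaborate pruning:

```python
from itertools import combinations

# Toy transactions: each protein's annotations (features) plus its pathway.
# Feature and pathway names are invented for illustration.
proteins = [
    ({"domain:kinase", "ec:2.7"}, "pathway:glycolysis"),
    ({"domain:kinase", "ec:2.7"}, "pathway:glycolysis"),
    ({"domain:kinase"}, "pathway:signaling"),
    ({"ec:1.1", "domain:dehydrogenase"}, "pathway:glycolysis"),
]

def mine_rules(data, min_support=2, min_conf=0.8):
    """Mine rules {features} -> pathway with simple support/confidence filters."""
    counts = {}
    for features, pathway in data:
        for r in range(1, len(features) + 1):
            for antecedent in combinations(sorted(features), r):
                key = (antecedent, pathway)
                counts[key] = counts.get(key, 0) + 1
    out = []
    for (antecedent, pathway), support in counts.items():
        covered = sum(1 for f, _ in data if set(antecedent) <= f)
        conf = support / covered
        if support >= min_support and conf >= min_conf:
            out.append((antecedent, pathway, support, round(conf, 2)))
    return out

for rule in mine_rules(proteins):
    print(rule)
```

    Rules that pass the filters assign pathway involvement to any new entry whose feature set contains the antecedent, which is how the learned models were rolled out over the 6.2 million TrEMBL entries.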

  14. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    clinical datasets produced at the University Hospital of Catanzaro, Italy. DMET Analyzer is a novel tool able to automatically analyse data produced by the DMET platform in case-control association studies. Using this tool, users avoid wasting time on the manual execution of multiple statistical tests, preventing possible errors and reducing the time needed for a whole experiment. Moreover, annotations and direct links to external databases may increase the biological knowledge extracted. The system is freely available for academic purposes at: https://sourceforge.net/projects/dmetanalyzer/files/

  15. Distributed Research Project Scheduling Based on Multi-Agent Methods

    Directory of Open Access Journals (Sweden)

    Constanta Nicoleta Bodea

    2011-01-01

    Full Text Available Different project planning and scheduling approaches have been developed. Operational Research (OR) provides two major planning techniques: CPM (Critical Path Method) and PERT (Program Evaluation and Review Technique). Owing to project complexity and the difficulty of using classical methods, new approaches were developed. Artificial Intelligence (AI) initially promoted the automatic planner concept, but model-based planning and scheduling methods emerged later on. The paper addresses the project scheduling optimization problem when projects are seen as Complex Adaptive Systems (CAS). Taking into consideration two different approaches to project scheduling optimization, TCPSP (Time-Constrained Project Scheduling) and RCPSP (Resource-Constrained Project Scheduling), the paper focuses on a multi-agent implementation in MATLAB for TCPSP. Using a research project as a case study, the paper includes a comparison between two multi-agent methods: the Genetic Algorithm (GA) and the Ant Colony Optimization algorithm (ACO).
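
    As a point of reference for the classical techniques mentioned above, a minimal CPM forward/backward pass over a toy activity network (all task names and durations are invented):

```python
# Minimal CPM sketch: task name -> (duration, list of predecessors).
tasks = {
    "spec":   (3, []),
    "design": (5, ["spec"]),
    "build":  (8, ["design"]),
    "docs":   (4, ["spec"]),
    "test":   (2, ["build", "docs"]),
}

def critical_path(tasks):
    order = list(tasks)  # assumes tasks are listed in topological order
    # Forward pass: earliest start times.
    earliest = {}
    for t in order:
        dur, preds = tasks[t]
        earliest[t] = max((earliest[p] + tasks[p][0] for p in preds), default=0)
    finish = max(earliest[t] + tasks[t][0] for t in tasks)
    # Backward pass: latest start times without delaying the project.
    latest = {}
    for t in reversed(order):
        succs = [s for s in tasks if t in tasks[s][1]]
        latest[t] = min((latest[s] for s in succs), default=finish) - tasks[t][0]
    # Zero-slack activities form the critical path.
    critical = [t for t in order if earliest[t] == latest[t]]
    return finish, critical

print(critical_path(tasks))  # (18, ['spec', 'design', 'build', 'test'])
```

    GA and ACO approaches search the same schedule space heuristically, which pays off when resource constraints make the exact CPM/PERT formulation intractable.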

  16. Automatic measurement for solid state track detectors

    International Nuclear Information System (INIS)

    Ogura, Koichi

    1982-01-01

    Since tracks in solid state track detectors are measured with a microscope, observers are forced into hard work that consumes time and labour, leading to poor statistical accuracy or personal error. Therefore, much research has aimed at simplifying and automating track measurement. There are two categories of automated measurement: simple counting of the number of tracks, and measurements that also require geometrical elements such as the size of the tracks or their coordinates. The former is called automatic counting and the latter automatic analysis. The general method of evaluating the number of tracks in automatic counting is estimation of the total number of tracks over the whole detector area or in a microscope field of view; it is suitable when the track density is high. Methods that count tracks one by one include spark counting and the scanning microdensitometer. Automatic analysis includes video image analysis, in which high-quality images obtained with a high-resolution video camera are processed with a micro-computer, and the tracks are automatically recognized and measured by feature extraction. This method is described in detail. Among the many kinds of automatic measurement reported so far, the most frequently used are ''spark counting'' and ''video image analysis''. (Wakatsuki, Y.)

  17. Automatic coronary calcium scoring using noncontrast and contrast CT images

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Guanyu, E-mail: yang.list@seu.edu.cn; Chen, Yang; Shu, Huazhong [Laboratory of Image Science and Technology, School of Computer Science and Engineering, Southeast University, No. 2, Si Pai Lou, Nanjing 210096 (China); Centre de Recherche en Information Biomédicale Sino-Français (LIA CRIBs), Nanjing 210096 (China); Key Laboratory of Computer Network and Information Integration, Southeast University, Ministry of Education, Nanjing 210096 (China); Ning, Xiufang; Sun, Qiaoyu [Laboratory of Image Science and Technology, School of Computer Science and Engineering, Southeast University, No. 2, Si Pai Lou, Nanjing 210096 (China); Key Laboratory of Computer Network and Information Integration, Southeast University, Ministry of Education, Nanjing 210096 (China); Coatrieux, Jean-Louis [INSERM-U1099, Rennes F-35000 (France); Labotatoire Traitement du Signal et de l’Image (LTSI), Université de Rennes 1, Campus de Beaulieu, Bat. 22, Rennes 35042 Cedex (France); Centre de Recherche en Information Biomédicale Sino-Français (LIA CRIBs), Nanjing 210096 (China)

    2016-05-15

    Purpose: Calcium scoring is widely used to assess the risk of coronary heart disease (CHD). Accurate coronary artery calcification detection in noncontrast CT images is a prerequisite step for coronary calcium scoring. Currently, calcified lesions in the coronary arteries are manually identified by radiologists in clinical practice. Thus, in this paper, a fully automatic calcium scoring method was developed to alleviate the workload of radiologists and cardiologists. Methods: The challenge of automatic coronary calcification detection is to discriminate calcification in the coronary arteries from calcification in other tissues. Since the anatomy of the coronary arteries is difficult to observe in noncontrast CT images, the contrast CT image of the same patient is used to extract the regions of the aorta, heart, and coronary arteries. A patient-specific region of interest (ROI) is then generated in the noncontrast CT image according to the segmentation results from the contrast CT image. This patient-specific ROI focuses calcification detection on the neighborhood of the coronary arteries, eliminating calcifications in the surrounding tissues. A support vector machine classifier is finally applied to refine the results by removing possible image noise. Furthermore, the calcified lesions in the noncontrast images belonging to the different main coronary arteries are identified automatically using the labeling results of the extracted coronary arteries. Results: Forty datasets from four different CT machine vendors, provided by the MICCAI 2014 Coronary Calcium Scoring (orCaScore) Challenge, were used to evaluate the algorithm. The sensitivity and positive predictive value for the volume of detected calcifications are 0.989 and 0.948. Only one patient out of 40 had been assigned to the wrong risk category, defined according to Agatston scores (0, 1–100, 101–300, >300), by comparing with the ground
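
    The risk categorization used in the evaluation can be sketched as a simple mapping; the category labels here are illustrative, only the score bands (0, 1–100, 101–300, >300) come from the abstract:

```python
def agatston_category(score: float) -> str:
    """Map an Agatston calcium score to the risk bands used in the
    orCaScore challenge evaluation (labels are illustrative)."""
    if score == 0:
        return "no calcium"
    if score <= 100:
        return "mild"
    if score <= 300:
        return "moderate"
    return "severe"

print([agatston_category(s) for s in (0, 42, 250, 600)])
# ['no calcium', 'mild', 'moderate', 'severe']
```

    Category agreement is a stricter test than raw volume overlap: the reported result is that only 1 of 40 patients crossed a band boundary.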

  18. Automatic cloud coverage assessment of Formosat-2 image

    Science.gov (United States)

    Hsu, Kuo-Hsien

    2011-11-01

    The Formosat-2 satellite is equipped with a high-spatial-resolution (2 m ground sampling distance) remote sensing instrument. It has been operated on a daily-revisiting mission orbit by the National Space Organization (NSPO) of Taiwan since May 21, 2004. NSPO also serves as one of the ground receiving stations, processing the received Formosat-2 images daily. The current cloud coverage assessment of Formosat-2 images in the NSPO Image Processing System consists of two major steps: first, an unsupervised K-means method automatically estimates the cloud statistics of a Formosat-2 image; second, cloud coverage is estimated by manual examination. A more accurate Automatic Cloud Coverage Assessment (ACCA) method with a good prediction of cloud statistics would clearly increase the efficiency of the second step. In this paper, based mainly on the research results of Chang et al., Irish, and Gotoh, we propose a modified Formosat-2 ACCA method comprising pre-processing and post-processing analysis. In the pre-processing analysis, cloud statistics are determined using unsupervised K-means classification, Sobel's method, Otsu's method, non-cloudy pixel reexamination, and a cross-band filter method. A box-counting fractal method is used as a post-processing tool to double-check the pre-processing results and increase the efficiency of manual examination.
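
    Otsu's method, one of the pre-processing steps listed above, can be sketched in a few lines on synthetic pixel data; this is not the actual Formosat-2 processing chain:

```python
import numpy as np

def otsu_threshold(pixels):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist, edges = np.histogram(pixels, bins=64)
    centers = (edges[:-1] + edges[1:]) / 2
    total = hist.sum()
    best_t, best_var = centers[0], -1.0
    for i in range(1, len(hist)):
        w0, w1 = hist[:i].sum(), hist[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (hist[:i] * centers[:i]).sum() / w0   # mean of the dark class
        m1 = (hist[i:] * centers[i:]).sum() / w1   # mean of the bright class
        var = (w0 / total) * (w1 / total) * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, centers[i - 1]
    return best_t

rng = np.random.default_rng(1)
ground = rng.normal(60, 10, 4000)   # darker land/sea pixels (synthetic)
cloud = rng.normal(200, 15, 1000)   # bright cloud pixels (synthetic)
t = otsu_threshold(np.concatenate([ground, cloud]))
print(round(float(t), 1))           # lands between the two brightness modes
```

    Pixels above the threshold are candidate cloud; the cross-band and reexamination steps then prune bright non-cloud surfaces.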

  19. Automatization of welding

    International Nuclear Information System (INIS)

    Iwabuchi, Masashi; Tomita, Jinji; Nishihara, Katsunori.

    1978-01-01

    Automatization of welding is one of the effective measures for securing a high degree of quality in nuclear power equipment, as well as for adapting to the working environment at the plant site. As the latest automatic welders in practical use for nuclear power apparatus at the factories of Toshiba and IHI, those for pipes and lining tanks are described here. The pipe welder performs buttering welding on the inside of the pipe end as the so-called IGSCC countermeasure, followed by butt welding through the same controller. The lining-tank welder can weld two parallel weld lines simultaneously on a large thin-plate lining tank. Both types of welders have demonstrated excellent performance in the shops as well as at the plant site. (author)

  20. Evaluation of the Patient Effective Dose in Whole Spine Scanography Based on the Automatic Image Pasting Method for Digital Radiography

    International Nuclear Information System (INIS)

    Kim, Jung-Su; Yoon, Sang-Wook; Seo, Deok-Nam; Nam, So-Ra; Kim, Jung-Min

    2016-01-01

    Whole spine scanography (WSS) is a radiologic examination that requires whole-body X-ray exposure. Consequently, patient radiation exposure is higher than the dose from routine X-ray examinations. Several studies have evaluated the patient effective dose (ED) following single-exposure film-screen WSS. The objective of this study was to evaluate patient ED during WSS based on the automatic image pasting method for multiple-exposure digital radiography (APMDR), and to compare the calculated EDs with the results of previous studies involving single-exposure film-screen WSS. We evaluated the ED of 50 consecutive patients (M:F = 28:22) who underwent WSS using APMDR. The anterior-posterior (AP) and lateral (LAT) projection EDs were evaluated based on Monte Carlo simulation. Using APMDR, the mean number of exposures was 6.1 for AP and 6.5 for LAT projections; LAT projections required 6.55% more exposures than AP projections. The mean ED was 0.6276 mSv (AP) and 0.6716 mSv (LAT). The mean ED for LAT projections was 0.6061 mSv with automatic exposure control (AEC) and 0.7694 mSv in manual mode. The relationship between dose-area product (DAP) and ED revealed a proportional correlation (AP, R² = 0.943; LAT, R² = 0.773). Compared with prior research involving single-exposure screen-film WSS, the patient ED following WSS using APMDR was lower for AP than for LAT projections. Despite the multiple exposures, ED control is more effective if WSS is performed using APMDR in the AEC mode.
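    The reported DAP–ED correlations (R² = 0.943 for AP, 0.773 for LAT) come from fitting a least-squares line and computing the coefficient of determination. A small sketch of that computation with hypothetical DAP/ED pairs, not the study's data:

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a least-squares line y ~ a*x + b."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a, b = np.polyfit(x, y, 1)                 # slope and intercept
    ss_res = ((y - (a * x + b)) ** 2).sum()    # residual sum of squares
    ss_tot = ((y - y.mean()) ** 2).sum()       # total sum of squares
    return 1.0 - ss_res / ss_tot

# Hypothetical DAP (dGy*cm^2) and ED (mSv) pairs for illustration only.
dap = [1.0, 2.0, 3.0, 4.0, 5.0]
ed = [0.13, 0.27, 0.40, 0.52, 0.66]
r2 = r_squared(dap, ed)
```

    A nearly proportional relationship like the toy data above yields an R² close to 1, which is how the study justifies using DAP as a practical proxy for ED.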

  1. Automatic learning of structural knowledge from geographic information for updating land cover maps

    OpenAIRE

    Bayoudh, Meriam; Roux, Emmanuel; Nock, Richard; Richard, G.

    2012-01-01

    International audience; The number of satellites and remote sensing sensors devoted to Earth observation is increasing steadily, providing more and more data, especially images. At the same time, access to such data and to the tools to process them has improved considerably. Given this data flow, and the need to monitor and predict environmental and societal changes in highly dynamic socio-environmental contexts, we need automatic image interpre...

  2. Speed and automaticity of word recognition - inseparable twins?

    DEFF Research Database (Denmark)

    Poulsen, Mads; Asmussen, Vibeke; Elbro, Carsten

    'Speed and automaticity' of word recognition is a standard collocation. However, it is not clear whether speed and automaticity (i.e., effortlessness) make independent contributions to reading comprehension. In theory, both speed and automaticity may save cognitive resources for comprehension...... processes. Hence, the aim of the present study was to assess the unique contributions of word recognition speed and automaticity to reading comprehension while controlling for decoding speed and accuracy. Method: 139 Grade 5 students completed tests of reading comprehension and computer-based tests of speed...... of decoding and word recognition together with a test of effortlessness (automaticity) of word recognition. Effortlessness was measured in a dual task in which participants were presented with a word enclosed in an unrelated figure. The task was to read the word and decide whether the figure was a triangle...

  3. Towards automatic exchange of information

    OpenAIRE

    Oberson, Xavier

    2015-01-01

    This article describes the various steps that led towards automatic exchange of information as the global standard and the issues that remain to be solved. First, the various competing models of information exchange, such as Double Tax Treaties (DTTs), TIEAs, FATCA and EU Directives, are described with a view to showing how they interact with one another. Second, the so-called Rubik strategy is summarized and compared with automatic exchange of information (AEOI). The third part then describes ...

  4. Rule Mining Techniques to Predict Prokaryotic Metabolic Pathways

    KAUST Repository

    Saidi, Rabie

    2017-08-28

    It is becoming more evident that computational methods are needed for the identification and mapping of pathways in new genomes. We introduce an automatic annotation system (ARBA4Path, Association Rule-Based Annotator for Pathways) that utilizes rule mining techniques to predict metabolic pathways across a wide range of prokaryotes. We demonstrate that specific combinations of protein domains (recorded in our rules) strongly determine the pathways in which proteins are involved and thus provide information that lets us assign pathway membership very accurately (with a precision of 0.999 and a recall of 0.966) to proteins of a given prokaryotic taxon. Our system can be used to enhance the quality of automatically generated annotations as well as to annotate proteins of unknown function. The prediction models are represented as human-readable rules, and they can be used effectively to add absent pathway information to many proteins in the UniProtKB/TrEMBL database.
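    At its core, the rule-based prediction described above amounts to matching a protein's domain set against stored domain combinations. A minimal sketch with made-up Pfam-style identifiers and pathway names, not the actual ARBA4Path rules:

```python
# Each rule maps a frozen set of protein domains to a pathway label.
# The identifiers and pathways below are illustrative, not real rules.
RULES = {
    frozenset({"PF00171", "PF00465"}): "Glycolysis",
    frozenset({"PF00106", "PF08659"}): "Fatty acid biosynthesis",
}

def predict_pathways(domains):
    """Return every pathway whose rule (a domain combination) is fully
    contained in the protein's domain set."""
    domains = set(domains)
    return {pathway for combo, pathway in RULES.items() if combo <= domains}

def precision_recall(predicted, actual):
    """Set-based precision and recall of predicted pathway labels."""
    tp = len(predicted & actual)
    precision = tp / len(predicted) if predicted else 1.0
    recall = tp / len(actual) if actual else 1.0
    return precision, recall
```

    A protein annotated with domains {"PF00171", "PF00465", "PF00005"} matches the first rule and is assigned "Glycolysis"; aggregating such set-based precision/recall over a taxon gives figures of the kind the abstract reports.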

  5. Rule Mining Techniques to Predict Prokaryotic Metabolic Pathways

    KAUST Repository

    Saidi, Rabie; Boudellioua, Imene; Martin, Maria J.; Solovyev, Victor

    2017-01-01

    It is becoming more evident that computational methods are needed for the identification and mapping of pathways in new genomes. We introduce an automatic annotation system (ARBA4Path, Association Rule-Based Annotator for Pathways) that utilizes rule mining techniques to predict metabolic pathways across a wide range of prokaryotes. We demonstrate that specific combinations of protein domains (recorded in our rules) strongly determine the pathways in which proteins are involved and thus provide information that lets us assign pathway membership very accurately (with a precision of 0.999 and a recall of 0.966) to proteins of a given prokaryotic taxon. Our system can be used to enhance the quality of automatically generated annotations as well as to annotate proteins of unknown function. The prediction models are represented as human-readable rules, and they can be used effectively to add absent pathway information to many proteins in the UniProtKB/TrEMBL database.

  6. Automatically processed alpha-track radon monitor

    International Nuclear Information System (INIS)

    Langner, G.H. Jr.

    1993-01-01

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided.

  7. Application of an Elman neural network to the problem of predicting the throughput of a petroleum collecting station; Previsao da vazao de uma estacao coletora de petroleo utilizando redes neurais de Elman

    Energy Technology Data Exchange (ETDEWEB)

    Paula, Wesley R. de [Universidade Federal de Campina Grande (UFCG), PB (Brazil). Curso de Pos-Graduacao em Informatica; Sousa, Andre G. de [Universidade Federal de Campina Grande (UFCG), PB (Brazil). Curso de Ciencia da Computacao; Gomes, Herman M.; Galvao, Carlos de O. [Universidade Federal de Campina Grande (UFCG), PB (Brazil)

    2004-07-01

    The objective of this paper is to present an initial study on the application of an Elman neural network to the problem of predicting the throughput of a petroleum collecting station. This study is part of a wider project aimed at producing an automatic real-time system to remotely control a petroleum distribution pipeline, in such a way that optimum efficiency can be assured in terms of (I) maximizing the volume of oil transported and (II) minimizing energy consumption, risks of failure, and damage to the environment. Experiments were carried out to determine the neural network parameters and to examine its performance over varying prediction horizons. Promising results (with low MSE) were obtained for predictions up to 10 minutes into the future. (author)
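    An Elman network feeds its previous hidden state back as a "context" input at each step, which is what makes it suitable for time-series prediction such as throughput forecasting. A minimal untrained forward-pass sketch in NumPy (training, e.g. by backpropagation through time, is omitted, and the layer sizes are illustrative):

```python
import numpy as np

class ElmanNetwork:
    """Minimal Elman (simple recurrent) network: the hidden state at t-1
    is fed back as a context input at step t."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0, 0.1, (n_hidden, n_in))      # input weights
        self.W_ctx = rng.normal(0, 0.1, (n_hidden, n_hidden)) # context weights
        self.W_out = rng.normal(0, 0.1, (n_out, n_hidden))    # output weights
        self.h = np.zeros(n_hidden)

    def step(self, x):
        # New hidden state mixes the current input with the previous state.
        self.h = np.tanh(self.W_in @ x + self.W_ctx @ self.h)
        return self.W_out @ self.h

    def predict_series(self, xs):
        self.h = np.zeros_like(self.h)  # reset context for a fresh series
        return np.array([self.step(x) for x in xs])

net = ElmanNetwork(n_in=1, n_hidden=8, n_out=1)
series = np.sin(np.linspace(0, 3, 20)).reshape(-1, 1)  # stand-in for flow data
preds = net.predict_series(series)
```

    The recurrent context connection is the only structural difference from a plain feed-forward network, but it lets the model condition each prediction on the recent history of the flow signal.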

  8. Automatic radiation dose monitoring for CT of trauma patients with different protocols: feasibility and accuracy

    International Nuclear Information System (INIS)

    Higashigaito, K.; Becker, A.S.; Sprengel, K.; Simmen, H.-P.; Wanner, G.; Alkadhi, H.

    2016-01-01

    Aim: To demonstrate the feasibility and accuracy of automatic radiation dose monitoring software for computed tomography (CT) of trauma patients in a clinical setting over time, and to evaluate the potential for radiation dose reduction using iterative reconstruction (IR). Materials and methods: Over a period of 18 months, data from 378 consecutive thoraco-abdominal CT examinations of trauma patients were extracted using automatic radiation dose monitoring software, and patients were split into three cohorts: cohort 1, 64-section CT with filtered back projection, 200 mAs tube current-time product; cohort 2, 128-section CT with IR and an identical imaging protocol; cohort 3, 128-section CT with IR, 150 mAs tube current-time product. Radiation dose parameters from the software were compared with the individual patient protocols. Image noise was measured and image quality was determined semi-quantitatively. Results: Automatic extraction of radiation dose metrics was feasible and accurate in all (100%) patients. All CT examinations were of diagnostic quality. There were no differences between cohorts 1 and 2 regarding volume CT dose index (CTDIvol; p=0.62), dose-length product (DLP), and effective dose (ED, both p=0.95), while noise was significantly lower (chest and abdomen, both -38%, p<0.017). Compared to cohort 1, CTDIvol, DLP, and ED in cohort 3 were significantly lower (all -25%, p<0.017), as was noise in the chest (-32%) and abdomen (-27%, both p<0.017). Compared to cohort 2, CTDIvol (-28%), DLP, and ED (both -26%) in cohort 3 were significantly lower (all p<0.017), while noise in the chest (+9%) and abdomen (+18%) was significantly higher (all p<0.017). Conclusion: Automatic radiation dose monitoring software is feasible and accurate, and can be implemented in a clinical setting for evaluating the effects of lowering the radiation dose of CT protocols over time. - Highlights: • Automatic dose monitoring software can be

  9. Fully automatic characterization and data collection from crystals of biological macromolecules

    International Nuclear Information System (INIS)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier; Bowler, Matthew W.

    2015-01-01

    A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention

  10. Fully automatic characterization and data collection from crystals of biological macromolecules

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier, E-mail: nurizzo@esrf.fr [European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France); Bowler, Matthew W., E-mail: nurizzo@esrf.fr [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France)

    2015-07-31

    A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.

  11. I Hear You Eat and Speak: Automatic Recognition of Eating Condition and Food Type, Use-Cases, and Impact on ASR Performance.

    Science.gov (United States)

    Hantke, Simone; Weninger, Felix; Kurle, Richard; Ringeval, Fabien; Batliner, Anton; Mousa, Amr El-Desoky; Schuller, Björn

    2016-01-01

    We propose a new recognition task in the area of computational paralinguistics: automatic recognition of eating conditions in speech, i.e., whether people are eating while speaking, and what they are eating. To this end, we introduce the audio-visual iHEARu-EAT database featuring 1.6k utterances of 30 subjects (mean age: 26.1 years, standard deviation: 2.66 years, gender balanced, German speakers), six types of food (Apple, Nectarine, Banana, Haribo Smurfs, Biscuit, and Crisps), and read as well as spontaneous speech, which is made publicly available for research purposes. We first demonstrate that for automatic speech recognition (ASR), it pays off to know whether speakers are eating or not. We also propose automatic classification based both on brute-forced low-level acoustic features and on higher-level features related to intelligibility, obtained from an automatic speech recogniser. Prediction of the eating condition was performed with a Support Vector Machine (SVM) classifier in a leave-one-speaker-out evaluation framework. Results show that binary prediction of the eating condition (i.e., eating or not eating) can be solved easily, independently of the speaking condition; the obtained average recalls are all above 90%. Low-level acoustic features provide the best performance on spontaneous speech, reaching up to 62.3% average recall for multi-way classification of the eating condition, i.e., discriminating the six types of food as well as not eating. Early fusion of the intelligibility-related features with the brute-forced acoustic feature set improves performance on read speech, reaching 66.4% average recall for the multi-way classification task. Analysing features and classifier errors leads to a suitable ordinal scale for eating conditions, on which automatic regression can be performed with a determination coefficient of up to 56.2%.
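    Leave-one-speaker-out evaluation holds out all utterances of one speaker per fold, so the classifier is never tested on a speaker it was trained on. A minimal sketch of just the splitting logic, with toy data (the actual study trains an SVM on acoustic features within each fold):

```python
from collections import defaultdict

def leave_one_speaker_out(samples):
    """Yield (speaker, train_indices, test_indices), holding out one
    speaker's samples per fold.

    `samples` is a list of (speaker_id, features, label) tuples.
    """
    by_speaker = defaultdict(list)
    for i, (spk, _, _) in enumerate(samples):
        by_speaker[spk].append(i)
    for spk, test_idx in sorted(by_speaker.items()):
        held_out = set(test_idx)
        train_idx = [i for i in range(len(samples)) if i not in held_out]
        yield spk, train_idx, test_idx

# Toy data: (speaker, feature vector, label) - three speakers, four utterances.
data = [("s1", [0.1], "eating"), ("s1", [0.9], "not"),
        ("s2", [0.2], "eating"), ("s3", [0.8], "not")]
folds = list(leave_one_speaker_out(data))
```

    Averaging per-fold recall over these splits gives a speaker-independent estimate, which is why the reported recalls are meaningful for unseen speakers.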

  12. Interim evaluation report on research and development of automatic sewing systems; Jido hosei system no kenkyu kaihatsu chukan hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1988-03-01

    The research and development project for automatic sewing systems aims to establish the techniques necessary for developing automatic sewing systems that efficiently produce diversified types of products in small quantities, to cope with requirements such as diversified consumer needs and shortened product cycles. The project covers R and D of sewing preparation/processing, sewing/assembling, cloth handling, and system management/control techniques. The program for developing the total system and its elementary techniques draws up conceptual designs of their functions, performance, and forms to outline the overall R and D project. The programs for the individual elementary techniques include studies on their basic functions and performance; design work ranging from the basic designs that determine the specifications to the detailed designs of the devices to be developed; trial construction of test units; and function-confirming tests to verify the operability of the unit components and the performance of the devices, with these tasks carried out in parallel. This paper describes the interim evaluation of the elements developed for the elementary techniques, summarizing the results obtained so far. It is concluded that most of the targets of the R and D themes were sufficiently achieved by the end of FY 1987, and that the project can now advance to the next phase, the construction of test plants. (NEDO)

  13. Automatic Lamp and Fan Control Based on Microcontroller

    Science.gov (United States)

    Widyaningrum, V. T.; Pramudita, Y. D.

    2018-01-01

    In general, automation can be described as a process that follows pre-determined sequential steps with little or no human exertion. Automation is achieved with various sensors suited to observing the process, actuators, and different techniques and devices. In this research, the automation system developed consists of an automatic lamp and an automatic fan for a smart home. Both systems are controlled by an Arduino Mega 2560 microcontroller, which obtains values of physical conditions through the sensors connected to it. The automatic lamp system uses an LDR (light-dependent resistor) sensor to detect light, while the automatic fan system uses a DHT11 sensor to detect temperature. In the tests performed, the lamp and fan worked properly. The lamp turns on automatically when the light begins to darken and turns off automatically when the light becomes bright again. The tests also show that the readings of an LDR sensor placed outside the room differ from those of a sensor placed inside, because the light reaching the indoor sensor is blocked by the walls of the house or by other objects. The fan turns on automatically when the temperature exceeds 25°C, its speed can be adjusted, and it turns off automatically when the temperature is at or below 25°C.
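    The threshold behaviour described above can be sketched as plain decision logic. The light threshold and the speed mapping below are hypothetical (the paper only states the 25°C setpoint); on the actual Arduino this logic would run in the device's loop, reading the LDR and DHT11:

```python
def lamp_state(light_level, threshold=300):
    """Turn the lamp on when the ambient light reading (e.g. from an LDR)
    drops below the threshold. The threshold value is illustrative."""
    return "ON" if light_level < threshold else "OFF"

def fan_state(temperature_c, setpoint=25.0):
    """Turn the fan on above the setpoint; speed grows with the excess.

    The 25 degC setpoint matches the abstract; the speed mapping is a
    made-up example of adjustable speed.
    """
    if temperature_c <= setpoint:
        return "OFF", 0
    # Map each degree above the setpoint to +20% speed, capped at 100%.
    speed = min(100, int((temperature_c - setpoint) * 20))
    return "ON", speed
```

    For example, `fan_state(27.5)` yields an "ON" state at 50% speed under these assumed mappings, mirroring the adjustable fan speed the abstract mentions.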

  14. Historical maintenance relevant information road-map for a self-learning maintenance prediction procedural approach

    Science.gov (United States)

    Morales, Francisco J.; Reyes, Antonio; Cáceres, Noelia; Romero, Luis M.; Benitez, Francisco G.; Morgado, Joao; Duarte, Emanuel; Martins, Teresa

    2017-09-01

    A large percentage of transport infrastructures are composed of linear assets, such as roads and rail tracks. The large social and economic relevance of these constructions forces stakeholders to ensure their prolonged health and durability. Even so, malfunctions, breakdowns, and out-of-service periods inevitably arise at random during the life cycle of the infrastructure. Predictive maintenance techniques reduce the occurrence of unpredicted failures and the corrective interventions they require by anticipating the appropriate interventions to be conducted before failures show up. This communication presents: i) a procedural approach for collecting the relevant information on the evolving condition of the assets involved in all maintenance interventions; this reported and stored information constitutes a rich historical database for training machine learning algorithms to generate reliable predictions of the interventions to be carried out in future time scenarios; ii) a schematic flow chart of the automatic learning procedure; iii) self-learning rules derived automatically from false positives/negatives. The description, testing, and automatic learning approach of a pilot case and its outcomes are presented; finally, some conclusions are outlined regarding the proposed methodology for improving the self-learning predictive capability.

  15. A predictive pilot model for STOL aircraft landing

    Science.gov (United States)

    Kleinman, D. L.; Killingsworth, W. R.

    1974-01-01

    An optimal control approach has been used to model pilot performance during STOL flare and landing. The model is used to predict pilot landing performance for three STOL configurations, each having a different level of automatic control augmentation. Model predictions are compared with flight simulator data. It is concluded that the model can be an effective design tool for analytically studying the effects of display modifications, different stability augmentation systems, and proposed changes in the landing area geometry.

  16. Progress report on a fully automatic Gas Tungsten Arc Welding (GTAW) system development

    Energy Technology Data Exchange (ETDEWEB)

    Daumeyer, G.J. III

    1994-12-01

    A plan to develop a fully automatic gas tungsten arc welding (GTAW) system that will utilize a vision-sensing computer (providing in-process feedback control) is presently under way. Evaluations of different technological aspects and system design requirements continue. This report summarizes the major activities in the plan's progress. The technological feasibility of producing the fully automated GTAW system has been proven. The goal of this process development project is to provide a production-ready system within the shortest reasonable time frame.

  17. Automatic control system at the ''Loviisa'' NPP

    International Nuclear Information System (INIS)

    Kukhtevich, I.V.; Mal'tsev, B.K.; Sergievskaya, E.N.

    1980-01-01

    The automatic control system of the Loviisa-1 NPP (Finland) is described. Under the operating conditions of the Finnish power system, the Loviisa-1 NPP must operate in a mode of weekly and daily load-schedule control and participate in the ongoing control of system frequency and capacity. To meet these requirements, the NPP is equipped with an all-regime system for automatic control that functions during reactor start-up and shutdown, in normal and transient regimes, and in emergency situations. The automatic control system includes: a data subsystem, an automatic control subsystem, a discrete control subsystem (including remote control), a subsystem for reactor control and protection, and the overall station protection system, with in-reactor control and dosimetry. The structures of the data-computer complex, the discrete control subsystems, the reactor control and protection systems, the neutron flux control system, the in-reactor control system, the station protection system, and the system for monitoring fuel element tightness are presented briefly. Two years of operating experience have confirmed the soundness of the chosen degree of automation. The Loviisa-1 NPP operates successfully in the mode of weekly and daily dispatch-schedule control and of ongoing frequency control (short-term control)

  18. Homogenisation in project management for large German research projects in the Earth system sciences: overcoming the institutional coordination bias

    Science.gov (United States)

    Rauser, Florian; Vamborg, Freja

    2016-04-01

    The interdisciplinary project on High Definition Clouds and Precipitation for advancing climate prediction, HD(CP)2 (hdcp2.eu), is an example of the trend in fundamental research in Europe to focus increasingly on large national and international research programs that require strong scientific coordination. The current system has traditionally been host-based: project coordination activities and funding are placed at the host institute of the project's central lead PI. This approach is simple and has the advantage of close collaboration between the project coordinator and the lead PI, but it exhibits strong inherent disadvantages that are also mentioned in this session's description: no development of community best practice, lack of integration between similar projects, inefficient development and use of methodology, and poor career development opportunities for the coordinators. Project coordinators often leave the project before it is finalized, leaving some of the fundamentally important closing processes to the PIs. This systematically prevents the creation of professional science management expertise within academia, creating an imbalance that keeps the outcomes of large research programs from informing future funding decisions. Project coordinators in academia often do not work in a professional project office environment that could distribute activities and share professional tools and methods between projects. Instead, every new project manager has to take up methodological work anew (communication infrastructure, meetings, reporting), even though the technological needs of large research projects are similar. This decreases the efficiency of the coordination and leads to funding that is effectively misallocated. We propose to challenge this system by creating a permanent, virtual "Centre for Earth System Science Management CESSMA" (cessma.com), changing the approach from host-based to centre-based. This should

  19. 21 CFR 211.68 - Automatic, mechanical, and electronic equipment.

    Science.gov (United States)

    2010-04-01

    ... SERVICES (CONTINUED) DRUGS: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR FINISHED PHARMACEUTICALS Equipment § 211.68 Automatic, mechanical, and electronic equipment. (a) Automatic, mechanical, or electronic... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Automatic, mechanical, and electronic equipment...

  20. Automatic Clustering Using FSDE-Forced Strategy Differential Evolution

    Science.gov (United States)

    Yasid, A.

    2018-01-01

    Cluster analysis is important in data mining for unsupervised data, because no adequate prior knowledge is available. One important task is defining the number of clusters without user involvement, known as automatic clustering. This study aims to determine the number of clusters automatically utilizing forced strategy differential evolution (AC-FSDE). Two mutation parameters, a constant parameter and a variable parameter, are employed to boost differential evolution performance. Four well-known benchmark datasets were used to evaluate the algorithm, and the results were compared with other state-of-the-art automatic clustering methods. The experimental results show that AC-FSDE is better than, or competitive with, other existing automatic clustering algorithms.
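    Differential evolution maintains a population of candidate solutions and improves them by mutation (a scaled difference of two individuals added to a third) and crossover. A minimal DE/rand/1/bin sketch minimising a test function; in AC-FSDE each individual would instead encode candidate cluster centroids plus activation flags, and the fitness would be a cluster validity index (the parameters below are generic, not the paper's):

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=100, seed=0):
    """Minimal DE/rand/1/bin minimiser over box-bounded real vectors."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds], float)
    hi = np.array([b[1] for b in bounds], float)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fitness = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: scaled difference of two individuals added to a third.
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover with at least one gene from the mutant.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection: keep the trial if it is no worse.
            ft = f(trial)
            if ft <= fitness[i]:
                pop[i], fitness[i] = trial, ft
    best = fitness.argmin()
    return pop[best], fitness[best]

# Sphere function: minimum 0 at the origin.
x_best, f_best = differential_evolution(lambda x: float((x ** 2).sum()),
                                        bounds=[(-5, 5)] * 3)
```

    The "forced strategy" of AC-FSDE modifies the mutation step (constant vs. variable parameters); the selection and crossover skeleton stays the same as in this sketch.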

  1. On a Use Case Points Measurement Tool for Effective Project Management

    OpenAIRE

    Inoue, Katsuro; Kusumoto, Shinji; Tsuda, Michio

    2007-01-01

    The use case point (UCP) method has been proposed to estimate software development effort in the early phases of a software project and is used in many software organizations. This paper briefly describes an automatic use case measurement tool, called U-EST.
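    The UCP estimate combines actor and use case weights with technical and environmental factors; the textbook form (Karner's) is UCP = (UAW + UUCW) × TCF × ECF, with effort often taken as roughly 20 hours per UCP. A sketch under those common conventions; whether U-EST uses exactly this formula is not stated in the abstract:

```python
def use_case_points(uaw, uucw, tcf, ecf, hours_per_ucp=20.0):
    """Karner-style Use Case Points estimate.

    uaw  - unadjusted actor weight (sum of actor weights)
    uucw - unadjusted use case weight (sum of use case weights)
    tcf  - technical complexity factor
    ecf  - environmental complexity factor
    hours_per_ucp - productivity factor; 20 h/UCP is a common default
    """
    uucp = uaw + uucw              # unadjusted use case points
    ucp = uucp * tcf * ecf         # adjusted use case points
    return ucp, ucp * hours_per_ucp

# Illustrative weights, not taken from the paper.
ucp, effort_hours = use_case_points(uaw=12, uucw=90, tcf=1.0, ecf=0.85)
```

    With these illustrative inputs the estimate is 86.7 UCP, i.e. about 1734 hours of effort, which is the kind of early-phase figure such a tool automates.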

  2. User's operating procedures. Volume 2: Scout project financial analysis program

    Science.gov (United States)

    Harris, C. G.; Haris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data System, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single-entry, multiple cross-reference data management and information retrieval system for the automation of project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user-friendly menu drivers.

  3. Evaluation of automatic face recognition for automatic border control on actual data recorded of travellers at Schiphol Airport

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan; Hendrikse, A.J.; Gerritsen, K.J.; Brömme, A.; Busch, C.

    2012-01-01

    Automatic border control at airports using automated facial recognition to check passports is becoming more and more common. A problem is that it is not clear how reliable these automatic gates are. Very few independent studies exist that assess the reliability of automated facial recognition

  4. Automatic scanning of NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At the European Laboratory for Particle Physics CERN, personal neutron monitoring for over 4000 collaborators is performed with Kodak NTA film, one of the few suitable dosemeters in the stray radiation environment of a high energy accelerator. After development, films are scanned with a projection microscope. To overcome this lengthy and strenuous procedure, an automated analysis system for the dosemeters has been developed. General purpose image recognition software, tailored to the specific needs with a macro language, analyses the digitised microscope image. This paper reports on the successful automatic scanning of NTA films irradiated with neutrons from a 238Pu-Be source (E ≈ 4 MeV), as well as on the extension of the method to neutrons of higher energies. The question of detection limits is discussed in the light of an application of the method in routine personal neutron monitoring. (9 refs).

  5. An introduction to automatic radioactive sample counters

    International Nuclear Information System (INIS)

    1980-01-01

    The subject is covered in chapters, entitled; the detection of radiation in sample counters; nucleonic equipment; liquid scintillation counting; basic features of automatic sample counters; statistics of counting; data analysis; purchase, installation, calibration and maintenance of automatic sample counters. (U.K.)

  6. Automatic Cobb Angle Determination From Radiographic Images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H. F.; Veldhuizen, Albert G.; van Ooijen, Peter M. A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.

    2013-01-01

    Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.

  7. Observed use of automatic seat belts in 1987 cars.

    Science.gov (United States)

    Williams, A F; Wells, J K; Lund, A K; Teed, N

    1989-10-01

    Usage of the automatic belt systems supplied by six large-volume automobile manufacturers to meet the federal requirements for automatic restraints was observed in suburban Washington, D.C., Chicago, Los Angeles, and Philadelphia. The different belt systems studied were: Ford and Toyota (motorized, nondetachable automatic shoulder belt), Nissan (motorized, detachable shoulder belt), VW and Chrysler (nonmotorized, detachable shoulder belt), and GM (nonmotorized detachable lap and shoulder belt). Use of automatic belts was significantly greater than manual belt use in otherwise comparable late-model cars for all manufacturers except Chrysler; in Chrysler cars, automatic belt use was significantly lower than manual belt use. The automatic shoulder belts provided by Ford, Nissan, Toyota, and VW increased use rates to about 90%. Because use rates were lower in Ford cars with manual belts, their increase was greater. GM cars had the smallest increase in use rates; however, lap belt use was highest in GM cars. The other manufacturers supply knee bolsters to supplement shoulder belt protection; all--except VW--also provide manual lap belts, which were used by about half of those who used the automatic shoulder belt. The results indicate that some manufacturers have been more successful than others in providing automatic belt systems that result in high use that, in turn, will mean fewer deaths and injuries in those cars.

  8. Oxide fuel fabrication technology development of the FaCT project (1). Overall review of fuel technology development of the FaCT project

    International Nuclear Information System (INIS)

    Abe, Tomoyuki; Namekawa, Takashi; Tanaka, Kenya

    2011-01-01

    The FaCT project is in progress in Japan for the commercialization of the fast reactor cycle system. The development goal of the fuel in the FaCT project is low-decontaminated TRU homo-recycling in a closed cycle and extension of the average discharge burn-up to 150 GWd/t. Research and development on innovative technologies concerning the short process, remote maintenance and the cooling system of automatic fuel production equipment, long-life cladding material, and control of oxygen potential have been conducted in phase I of the FaCT project. As a result of various tests, including 600 g batch MOX tests, it is concluded that the short process is applicable to fuel pellet fabrication in the FaCT project. Although cold mock-up tests on test models of some typical process equipment suggest possibilities of remote maintenance of automatic fuel fabrication equipment, it is concluded that further efforts are still needed to judge the operability of completely remote fabrication of low-decontaminated TRU fuel. A cold mock-up test on fuel pin assembling equipment shows that the influence of the decay heat of MA can be managed by the cooling system. Irradiation tests in BOR-60 indicate that 9Cr-ODS possesses satisfactory in-reactor performance as the long-life cladding material if the homogeneity of alloy elements is adequately controlled. Modification of the cladding tube fabrication process to ensure homogeneity and further development of measures to control the oxygen potential inside the fuel pin are necessary to reach the burn-up target of the FaCT project. (author)

  9. Automatic Assessment of Craniofacial Growth in a Mouse Model of Crouzon Syndrome

    DEFF Research Database (Denmark)

    Thorup, Signe Strann; Larsen, Rasmus; Darvann, Tron Andre

    2009-01-01

    Non-rigid volumetric image registration was applied to micro-CT scans of ten 4-week and twenty 6-week euthanized mice for growth modeling. Each age group consisted of 50% normal and 50% Crouzon mice. Four 3D mean shapes, one for each mouse-type and age group, were created. Extracting a dense field of growth vectors for each mouse-type, growth models were created using linear interpolation and visualized as 3D animations. Spatial regions of significantly different growth were identified using the local False Discovery Rate method, estimating the expected percentage of false predictions in a set of predictions. For all [...] a tool for spatially detailed automatic phenotyping. MAIN OBJECTIVES OF PRESENTATION: We will present a 3D growth model of normal and Crouzon mice, and differences will be statistically and visually compared.

  10. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics on the automatic beta counting system, developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier, then is coupled to an amplifier. Amplifier output is discriminated and is the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data plus other information is punched out on a data card. The next sample to be counted is automatically selected. The beta counter uses the same electronics as the prior count did, the only difference being the sample identification number and sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card

  11. Automatic, semi-automatic and manual validation of urban drainage data.

    Science.gov (United States)

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

    Advances in sensor technology and the possibility of automated long distance data transmission have made continuous measurements the preferable way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect and mark the wrong data, remove them and replace them with interpolated data. In general, this first step, detecting the wrong, anomalous data, is called data quality assessment or data validation. Data validation consists of three parts: data preparation, validation scores generation and scores interpretation. This paper presents the overall framework for the data quality improvement system, suitable for automatic, semi-automatic or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, the scores interpretation, needs to be further investigated on the developed system.

  12. The DWARF project

    Science.gov (United States)

    Christopoulou, P. E.

    2013-09-01

    In the era of staggering Kepler data and sophisticated approach of the automatic analysis, how obsolete are the traditional object-by-object multiwavelength photometric observations? Can we apply the new tools of classification, light curve modeling and timing analysis to study the newly detected or/and most interesting Eclipsing Binaries or to detect circumbinary bodies? In this talk, I will discuss developments in this area in the light of the recent DWARF project that promises additional useful science of binary stars within an extensive network of relatively small to medium-size telescopes with apertures of ~20-200 cm.

  13. The function and failure of sensory predictions.

    Science.gov (United States)

    Bansal, Sonia; Ford, Judith M; Spering, Miriam

    2018-04-23

    Humans and other primates are equipped with neural mechanisms that allow them to automatically make predictions about future events, facilitating processing of expected sensations and actions. Prediction-driven control and monitoring of perceptual and motor acts are vital to normal cognitive functioning. This review provides an overview of corollary discharge mechanisms involved in predictions across sensory modalities and discusses consequences of predictive coding for cognition and behavior. Converging evidence now links impairments in corollary discharge mechanisms to neuropsychiatric symptoms such as hallucinations and delusions. We review studies supporting a prediction-failure hypothesis of perceptual and cognitive disturbances. We also outline neural correlates underlying prediction function and failure, highlighting similarities across the visual, auditory, and somatosensory systems. In linking basic psychophysical and psychophysiological evidence of visual, auditory, and somatosensory prediction failures to neuropsychiatric symptoms, our review furthers our understanding of disease mechanisms. © 2018 New York Academy of Sciences.

  14. Automatic Encoding and Language Detection in the GSDL

    Directory of Open Access Journals (Sweden)

    Otakar Pinkas

    2014-10-01

    Full Text Available Automatic detection of the encoding and language of a text is part of the Greenstone Digital Library Software (GSDL) for building and distributing digital collections. It is developed by the University of Waikato (New Zealand) in cooperation with UNESCO. Automatic encoding and language detection in Slavic languages is difficult and sometimes fails; the aim is to detect cases of failure. The automatic detection in the GSDL is based on the n-grams method. The most frequent n-grams for Czech are presented. The whole process of automatic detection in the GSDL is described. The input documents for the test collections are plain texts encoded in ISO-8859-1, ISO-8859-2 and Windows-1250. We manually evaluated the quality of automatic detection. The causes of errors include improper language model predominance and incorrect switches to Windows-1250. We carried out further tests on documents that were more complex.
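As an illustration of the n-grams method such detection is based on, here is a minimal Cavnar-Trenkle-style rank-order profile classifier in Python (a sketch of the general technique, not the GSDL code; the sample texts are arbitrary):

```python
from collections import Counter

def ngram_profile(text, n=3, top=300):
    """Rank the most frequent character n-grams of a text."""
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    return [g for g, _ in grams.most_common(top)]

def out_of_place(profile, other):
    """Distance: sum of rank displacements; unseen n-grams get the
    maximum penalty (the length of the reference profile)."""
    return sum(abs(i - other.index(g)) if g in other else len(other)
               for i, g in enumerate(profile))

# Tiny reference profiles from Czech and English pangram-like text.
cs = ngram_profile("příliš žluťoučký kůň úpěl ďábelské ódy " * 5)
en = ngram_profile("the quick brown fox jumps over the lazy dog " * 5)
sample = ngram_profile("žluťoučký kůň")
print(out_of_place(sample, cs) < out_of_place(sample, en))  # True
```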

  15. A proposal for a course of Operations Management for the Degree in Electronics and Automatic

    Directory of Open Access Journals (Sweden)

    Pilar I. Vidal-Carreras

    2017-06-01

    Full Text Available In this work, a methodology is proposed for a course in the discipline of Operations Management, with a focus on active methodologies, in the degree in Electronics and Automatic. The course combines lectures, group work, problem-based learning, project-based learning and presentation of group work. Previous experiences in the same course lead us to conclude that the lecture remains important in this environment, as this is the only course of the discipline in the whole degree. Providing feedback in project-based learning is not easy for large groups such as the one studied, which suggests the presentation of group work as a good solution to the problem.

  16. Automatic Quantification of Radiographic Wrist Joint Space Width of Patients With Rheumatoid Arthritis.

    Science.gov (United States)

    Huo, Yinghe; Vincken, Koen L; van der Heijde, Desiree; de Hair, Maria J H; Lafeber, Floris P; Viergever, Max A

    2017-11-01

    Objective: Wrist joint space narrowing is a main radiographic outcome of rheumatoid arthritis (RA). Yet, automatic radiographic wrist joint space width (JSW) quantification for RA patients has not been widely investigated. The aim of this paper is to present an automatic method to quantify the JSW of three wrist joints that are least affected by bone overlapping and are frequently involved in RA. These joints are located around the scaphoid bone, viz. the multangular-navicular, capitate-navicular-lunate, and radiocarpal joints. Methods: The joint space around the scaphoid bone is detected by using consecutive searches of separate path segments, where each segment location aids in constraining the subsequent one. For joint margin delineation, first the boundary not affected by X-ray projection is extracted, followed by a backtrace process to obtain the actual joint margin. The accuracy of the quantified JSW is evaluated by comparison with the manually obtained ground truth. Results: Two of the 50 radiographs used for evaluation of the method did not yield a correct path through all three wrist joints. The delineated joint margins of the remaining 48 radiographs were used for JSW quantification. It was found that 90% of the joints had a JSW deviating less than 20% from the mean JSW of manual indications, with the mean JSW error less than 10%. Conclusion: The proposed method is able to automatically quantify the JSW of radiographic wrist joints reliably. The proposed method may aid clinical researchers to study the progression of wrist joint damage in RA studies.

  17. Simple Automatic File Exchange (SAFE) to Support Low-Cost Spacecraft Operation via the Internet

    Science.gov (United States)

    Baker, Paul; Repaci, Max; Sames, David

    1998-01-01

    Various issues associated with Simple Automatic File Exchange (SAFE) are presented in viewgraph form. Specific topics include: 1) Packet telemetry, Internet IP networks and cost reduction; 2) Basic functions and technical features of SAFE; 3) Project goals, including low-cost satellite transmission to data centers to be distributed via an Internet; 4) Operations with a replicated file protocol; 5) File exchange operation; 6) Ground stations as gateways; 7) Lessons learned from demonstrations and tests with SAFE; and 8) Feedback and future initiatives.

  18. Clustering ERP implementation project activities: a foundation for project size definition

    NARCIS (Netherlands)

    Janssens, G.; Kusters, R.J.; Heemstra, F.J.; Sadiq, A.; Reichert, M.; Schultz, K.; Trienekens, J.J.M.; Moller, C.; Kusters, R.J.

    2007-01-01

    The size of an ERP project can be a useful measurement for predicting the effort needed to complete an ERP implementation project. Because this measurement does not exist, research is needed to find a set of variables which can define the size of an ERP implementation project. This paper shows 21

  19. Some experimental results for an automatic helium liquefier

    International Nuclear Information System (INIS)

    Watanabe, T.; Kudo, T.; Kuraoka, Y.; Sakura, K.; Tsuruga, H.; Watanabe, T.

    1984-01-01

    This chapter describes the testing of an automatic cooldown system. The liquefying machine examined is a CTi Model 1400. The automatic helium gas liquefying system is operated using sequence control with a programmable controller. The automatic mode is carried out by operation of two compressors. The monitoring system consists of 41 remote sensors. Liquid level is measured by a superconducting level meter. The J-T valve and return valve, which require precise control, are operated by pulse motors. The advantages of the automatic cooldown system are reduced operator manpower; smooth changes of temperatures and pressures, so that the flow chart of automation is simple; and the possibility of continuous liquefier operation.

  20. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for the automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.
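As a sketch of the idea (not the authors' tool), finite model generation for a single binary operation can be done by brute-force enumeration of operation tables over a small carrier set, keeping the tables that satisfy the given equations, here associativity:

```python
from itertools import product

def find_models(size, law):
    """Enumerate all binary operation tables on {0..size-1} and keep
    those satisfying `law`, a predicate over the operation."""
    models = []
    for flat in product(range(size), repeat=size * size):
        op = lambda x, y, t=flat: t[x * size + y]  # table lookup
        if law(op, size):
            models.append(flat)
    return models

def associative(op, n):
    # (a*b)*c == a*(b*c) for all triples
    return all(op(op(a, b), c) == op(a, op(b, c))
               for a in range(n) for b in range(n) for c in range(n))

# All associative binary operations (semigroup tables) on a 2-element set:
print(len(find_models(2, associative)))  # 8
```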

  1. Grinding Parts For Automatic Welding

    Science.gov (United States)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  2. 46 CFR 171.118 - Automatic ventilators and side ports.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Automatic ventilators and side ports. 171.118 Section 171.118 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SUBDIVISION AND STABILITY... Bulkhead or Weather Deck § 171.118 Automatic ventilators and side ports. (a) An automatic ventilator must...

  3. 30 CFR 75.1404 - Automatic brakes; speed reduction gear.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic brakes; speed reduction gear. 75.1404... Automatic brakes; speed reduction gear. [Statutory Provisions] Each locomotive and haulage car used in an... permit automatic brakes, locomotives and haulage cars shall be subject to speed reduction gear, or other...

  4. Scalable Predictive Analysis in Critically Ill Patients Using a Visual Open Data Analysis Platform.

    Directory of Open Access Journals (Sweden)

    Sven Van Poucke

    Full Text Available With the accumulation of large amounts of health related data, predictive analytics could stimulate the transformation of reactive medicine towards Predictive, Preventive and Personalized (PPPM) Medicine, ultimately affecting both cost and quality of care. However, high-dimensionality and high-complexity of the data involved, prevents data-driven methods from easy translation into clinically relevant models. Additionally, the application of cutting edge predictive methods and data manipulation require substantial programming skills, limiting its direct exploitation by medical domain experts. This leaves a gap between potential and actual data usage. In this study, the authors address this problem by focusing on open, visual environments, suited to be applied by the medical community. Moreover, we review code free applications of big data technologies. As a showcase, a framework was developed for the meaningful use of data from critical care patients by integrating the MIMIC-II database in a data mining environment (RapidMiner) supporting scalable predictive analytics using visual tools (RapidMiner's Radoop extension). Guided by the CRoss-Industry Standard Process for Data Mining (CRISP-DM), the ETL process (Extract, Transform, Load) was initiated by retrieving data from the MIMIC-II tables of interest. As use case, correlation of platelet count and ICU survival was quantitatively assessed. Using visual tools for ETL on Hadoop and predictive modeling in RapidMiner, we developed robust processes for automatic building, parameter optimization and evaluation of various predictive models, under different feature selection schemes. Because these processes can be easily adopted in other projects, this environment is attractive for scalable predictive analytics in health research.

  5. Scalable Predictive Analysis in Critically Ill Patients Using a Visual Open Data Analysis Platform.

    Science.gov (United States)

    Van Poucke, Sven; Zhang, Zhongheng; Schmitz, Martin; Vukicevic, Milan; Laenen, Margot Vander; Celi, Leo Anthony; De Deyne, Cathy

    2016-01-01

    With the accumulation of large amounts of health related data, predictive analytics could stimulate the transformation of reactive medicine towards Predictive, Preventive and Personalized (PPPM) Medicine, ultimately affecting both cost and quality of care. However, high-dimensionality and high-complexity of the data involved, prevents data-driven methods from easy translation into clinically relevant models. Additionally, the application of cutting edge predictive methods and data manipulation require substantial programming skills, limiting its direct exploitation by medical domain experts. This leaves a gap between potential and actual data usage. In this study, the authors address this problem by focusing on open, visual environments, suited to be applied by the medical community. Moreover, we review code free applications of big data technologies. As a showcase, a framework was developed for the meaningful use of data from critical care patients by integrating the MIMIC-II database in a data mining environment (RapidMiner) supporting scalable predictive analytics using visual tools (RapidMiner's Radoop extension). Guided by the CRoss-Industry Standard Process for Data Mining (CRISP-DM), the ETL process (Extract, Transform, Load) was initiated by retrieving data from the MIMIC-II tables of interest. As use case, correlation of platelet count and ICU survival was quantitatively assessed. Using visual tools for ETL on Hadoop and predictive modeling in RapidMiner, we developed robust processes for automatic building, parameter optimization and evaluation of various predictive models, under different feature selection schemes. Because these processes can be easily adopted in other projects, this environment is attractive for scalable predictive analytics in health research.

  6. Automatic Error Recovery in Robot Assembly Operations Using Reverse Execution

    DEFF Research Database (Denmark)

    Laursen, Johan Sund; Schultz, Ulrik Pagh; Ellekilde, Lars-Peter

    2015-01-01

    ...in particular for small-batch productions. As an alternative, we propose a system for automatically handling certain classes of errors instead of preventing them. Specifically, we show that many operations can be automatically reversed. Errors can be handled through automatic reverse execution of the control program to a safe point, from which forward execution can be resumed. This paper describes the principles behind automatic reversal of robotic assembly operations, and experimentally demonstrates the use of a domain-specific language that supports automatic error handling through reverse execution. Our...

  7. Development of a new model to evaluate the probability of automatic plant trips for pressurized water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Shimada, Yoshio [Institute of Nuclear Safety System Inc., Mihama, Fukui (Japan); Kawai, Katsunori; Suzuki, Hiroshi [Mitsubishi Heavy Industries Ltd., Tokyo (Japan)

    2001-09-01

    In order to improve the reliability of plant operations for pressurized water reactors, a new fault tree model was developed to evaluate the probability of automatic plant trips. This model consists of fault trees for sixteen systems. It has the following features: (1) human errors and transmission line incidents are modeled using existing data, (2) the repair of failed components is considered when calculating component failure probabilities, and (3) uncertainty analysis is performed by an exact method. From the present results, it is confirmed that the obtained upper and lower bound values of the automatic plant trip probability are within the bounds of existing data in Japan. Therefore, this model is applicable to the prediction of plant performance and reliability. (author)
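As a toy illustration of how basic-event probabilities combine in such a fault tree, an OR gate over independent events gives the top-event probability below; the subsystem values are hypothetical, not taken from the paper:

```python
def or_gate(probs):
    """P(at least one event) for independent basic events: 1 - prod(1 - p)."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

# Hypothetical per-demand trip probabilities for a few subsystems; the
# top event (automatic plant trip) fires if any subsystem trips.
subsystems = [1e-3, 5e-4, 2e-3]
print(round(or_gate(subsystems), 6))  # 0.003497
```

Note that for small probabilities the OR-gate result is close to, but slightly below, the simple sum of the inputs (here 0.0035), which is why the rare-event approximation is often used in hand calculations.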

  8. Shaping electromagnetic waves using software-automatically-designed metasurfaces.

    Science.gov (United States)

    Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie

    2017-06-15

    We present a fully digital procedure for designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g. two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of macro coding unit is formed by a discretely random arrangement of micro coding units. By combining an optimization algorithm and commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess constant phase differences for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units in a certain coding sequence. To verify the performance experimentally, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in an array to constitute a periodic coding metasurface that generates the required four-beam radiation with specific directions. Two complicated functional metasurfaces with circularly- and elliptically-shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing the excellent performance of automatic design by software. The proposed method provides a smart tool to realize various functional devices and systems automatically.

  9. Development of Procedures for Assessing the Impact of Vocational Education Research and Development on Vocational Education (Project IMPACT). Volume 8--A Field Study of Predicting Impact of Research and Development Projects in Vocational and Technical Education.

    Science.gov (United States)

    Malhorta, Man Mohanlal

    As part of Project IMPACT's effort to identify and develop procedures for complying with the impact requirements of Public Law 94-482, a field study was conducted to identify and validate variables and their order of importance in predicting and evaluating impact of research and development (R&D) projects in vocational and technical education.…

  10. EU Framework 6 Project: Predictive Toxicology (PredTox)-overview and outcome

    International Nuclear Information System (INIS)

    Suter, Laura; Schroeder, Susanne; Meyer, Kirstin; Gautier, Jean-Charles; Amberg, Alexander; Wendt, Maria; Gmuender, Hans; Mally, Angela; Boitier, Eric; Ellinger-Ziegelbauer, Heidrun; Matheis, Katja; Pfannkuch, Friedlieb

    2011-01-01

    In this publication, we report the outcome of the integrated EU Framework 6 Project: Predictive Toxicology (PredTox), including methodological aspects and overall conclusions. Specific details including data analysis and interpretation are reported in separate articles in this issue. The project, partly funded by the EU, was carried out by a consortium of 15 pharmaceutical companies, 2 SMEs, and 3 universities. The effects of 16 test compounds were characterized using conventional toxicological parameters and 'omics' technologies. The three major observed toxicities, liver hypertrophy, bile duct necrosis and/or cholestasis, and kidney proximal tubular damage were analyzed in detail. The combined approach of 'omics' and conventional toxicology proved a useful tool for mechanistic investigations and the identification of putative biomarkers. In our hands and in combination with histopathological assessment, target organ transcriptomics was the most prolific approach for the generation of mechanistic hypotheses. Proteomics approaches were relatively time-consuming and required careful standardization. NMR-based metabolomics detected metabolite changes accompanying histopathological findings, providing limited additional mechanistic information. Conversely, targeted metabolite profiling with LC/GC-MS was very useful for the investigation of bile duct necrosis/cholestasis. In general, both proteomics and metabolomics were supportive of other findings. Thus, the outcome of this program indicates that 'omics' technologies can help toxicologists to make better informed decisions during exploratory toxicological studies. The data support that hypothesis on mode of action and discovery of putative biomarkers are tangible outcomes of integrated 'omics' analysis. Qualification of biomarkers remains challenging, in particular in terms of identification, mechanistic anchoring, appropriate specificity, and sensitivity.

  11. SU-F-T-450: The Investigation of Radiotherapy Quality Assurance and Automatic Treatment Planning Based On the Kernel Density Estimation Method

    Energy Technology Data Exchange (ETDEWEB)

    Fan, J; Fan, J; Hu, W; Wang, J [Fudan University Shanghai Cancer Center, Shanghai, Shanghai (China)

    2016-06-15

    Purpose: To develop a fast automatic algorithm based on two-dimensional kernel density estimation (2D KDE) to predict the dose-volume histogram (DVH), which can be employed for the investigation of radiotherapy quality assurance and automatic treatment planning. Methods: We propose a machine learning method that uses previous treatment plans to predict the DVH. The key to the approach is the framing of the DVH in a probabilistic setting. The training consists of estimating, from the patients in the training set, the joint probability distribution of the dose and the predictive features. The joint distribution provides an estimation of the conditional probability of the dose given the values of the predictive features. For a new patient, the prediction consists of estimating the distribution of the predictive features and marginalizing the conditional probability from the training over this. Integrating the resulting probability distribution for the dose yields an estimation of the DVH. The 2D KDE is implemented to predict the joint probability distribution of the training set and the distribution of the predictive features for the new patient. Two variables, the signed minimal distance from each OAR (organ at risk) voxel to the target boundary and its opening angle with respect to the origin of the voxel coordinate system, are considered as the predictive features representing the OAR-target spatial relationship. The feasibility of our method has been demonstrated with rectum, breast and head-and-neck cancer cases by comparing the predicted DVHs with the planned ones. Results: Consistent results were found between these two DVHs for each cancer, and the average of the relative point-wise differences is about 5%, within the clinically acceptable extent. Conclusion: According to the results of this study, our method can be used to predict clinically acceptable DVHs and has the ability to evaluate the quality and consistency of treatment planning.
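A minimal sketch of the 2D KDE step, with a synthetic distance-dose data set standing in for real treatment plans (all names and numbers are illustrative, and a product Gaussian kernel with a single shared bandwidth is assumed):

```python
import numpy as np

def kde2d(samples, pts, bw=1.0):
    """Product Gaussian kernel estimate of a 2-D joint density."""
    d = (pts[:, None, :] - samples[None, :, :]) / bw
    k = np.exp(-0.5 * (d ** 2).sum(axis=2))
    return k.sum(axis=1) / (len(samples) * 2 * np.pi * bw ** 2)

rng = np.random.default_rng(0)
# Toy training pairs (feature: distance to target, dose): dose falls
# off with distance, plus noise -- a stand-in for plan data.
dist = rng.uniform(0, 5, 400)
dose = 60 * np.exp(-0.5 * dist) + rng.normal(0, 2, 400)
train = np.column_stack([dist, dose])

# Conditional mean dose at dist = 1, read off the joint density along a
# dose grid (the marginalization step described in the abstract).
grid = np.linspace(dose.min(), dose.max(), 200)
pts = np.column_stack([np.full_like(grid, 1.0), grid])
p = kde2d(train, pts, bw=1.5)
val = float((grid * p).sum() / p.sum())
# Lands somewhat below the noiseless 60*exp(-0.5) ≈ 36.4, since the
# kernel averages in farther, lower-dose neighbours.
print(round(val, 1))
```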

  12. SU-F-T-450: The Investigation of Radiotherapy Quality Assurance and Automatic Treatment Planning Based On the Kernel Density Estimation Method

    International Nuclear Information System (INIS)

    Fan, J; Fan, J; Hu, W; Wang, J

    2016-01-01


  13. Automatic trend estimation

    CERN Document Server

    Vamoș, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statements of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.
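
    The Monte Carlo evaluation idea can be illustrated with a minimal sketch (illustrative only, not one of the book's algorithms): generate artificial series with a known trend, run a trend estimator, and measure its accuracy over many realizations.

```python
import random
import statistics

def moving_average(x, w):
    # Centred moving average; the window shrinks near the boundaries
    n, h = len(x), w // 2
    return [statistics.fmean(x[max(0, i - h):min(n, i + h + 1)]) for i in range(n)]

def mc_trend_accuracy(n=200, runs=50, noise=1.0, w=21, seed=0):
    """RMSE of the estimated trend against the known trend, averaged over
    Monte Carlo realizations of an artificial series (linear trend + white noise)."""
    rng = random.Random(seed)
    rmses = []
    for _ in range(runs):
        trend = [0.05 * t for t in range(n)]
        series = [tr + rng.gauss(0, noise) for tr in trend]
        est = moving_average(series, w)
        rmses.append((sum((e - tr) ** 2 for e, tr in zip(est, trend)) / n) ** 0.5)
    return statistics.fmean(rmses)
```

    Averaging over realizations separates the estimator's systematic error (e.g. boundary bias of the moving average) from the random scatter of a single series.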

  14. Design of Low Power Algorithms for Automatic Embedded Analysis of Patch ECG Signals

    DEFF Research Database (Denmark)

    Saadi, Dorthe Bodholt

    , several different cable-free wireless patch-type ECG recorders have recently reached the market. One of these recorders is the ePatch, designed by the Danish company DELTA. The extended monitoring period available with the patch recorders has been demonstrated to increase the diagnostic yield of outpatient ECG....... Such algorithms could allow the real-time transmission of clinically relevant information to a central monitoring station. The first step in embedded ECG interpretation is the automatic detection of each individual heartbeat. An important part of this project was therefore to design a novel algorithm...
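
    The abstract above is truncated, but the first step it names, automatic heartbeat detection, can be sketched generically. The following is a crude amplitude-threshold detector with a refractory period, purely illustrative and not the ePatch algorithm, which is designed for low power embedded use.

```python
def detect_beats(sig, fs, thresh=0.6, refractory=0.25):
    """Return indices of local maxima above `thresh` that are separated by
    at least `refractory` seconds (a crude stand-in for a QRS detector).
    sig: list of samples, fs: sampling rate in Hz."""
    gap = int(refractory * fs)
    beats, last = [], -gap
    for i in range(1, len(sig) - 1):
        if (sig[i] >= thresh and sig[i] >= sig[i - 1]
                and sig[i] > sig[i + 1] and i - last >= gap):
            beats.append(i)
            last = i
    return beats
```

    Real detectors add band-pass filtering and adaptive thresholds, but the refractory-period logic shown here is common to most of them.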

  15. Automatic control of commercial computer programs

    International Nuclear Information System (INIS)

    Rezvov, B.A.; Artem'ev, A.N.; Maevskij, A.G.; Demkiv, A.A.; Kirillov, B.F.; Belyaev, A.D.; Artem'ev, N.A.

    2010-01-01

    A method for the automatic control of commercial computer programs is presented. A connection was developed between the automation system of the EXAFS spectrometer (managed by a PC running DOS) and the commercial program controlling the CCD detector (managed by a PC running Windows). The combined system is used to automate the processing of intermediate amplitude spectra in EXAFS spectrum measurements at the Kurchatov SR source.

  16. Automatic shadowing device for electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, F W; Bogitch, S

    1960-01-01

    For the past ten years, in the laboratory of the Department of Nuclear Medicine and Radiation Biology at the University of California, and before that at Rochester, New York, every evaporation was done with the aid of an automatic shadowing device. For several months, the automatic shadowing device, with the modifications described, has been available at the Atomic Bomb Casualty Commission (ABCC), Hiroshima, Japan. 1 reference.

  17. On the monitoring and prediction of flash floods in small and medium-sized catchments - the EXTRUSO project

    Science.gov (United States)

    Wiemann, Stefan; Eltner, Anette; Sardemann, Hannes; Spieler, Diana; Singer, Thomas; Thanh Luong, Thi; Janabi, Firas Al; Schütze, Niels; Bernard, Lars; Bernhofer, Christian; Maas, Hans-Gerd

    2017-04-01

    Flash floods regularly cause severe socio-economic damage worldwide. In parallel, climate change is very likely to increase the number of such events, due to an increasing frequency of extreme precipitation events (EASAC 2013). Whereas recent work primarily addresses the resilience of large catchment areas, the major impact of hydro-meteorological extremes caused by heavy precipitation is on small areas, which are very difficult to observe and predict due to sparse monitoring networks and few means for hydro-meteorological modelling, especially in small catchment areas. The objective of the EXTRUSO project is to identify and implement appropriate means to close this gap through an interdisciplinary approach, combining research expertise from meteorology, hydrology, photogrammetry and geoinformatics. The project targets innovative techniques for spatio-temporally densified monitoring and simulation for the analysis, prediction and warning of local hydro-meteorological extreme events. The following four aspects are of particular interest: 1. The monitoring, analysis and combination of relevant hydro-meteorological parameters from various sources, including existing monitoring networks, ground radar, specific low-cost sensors and crowdsourcing. 2. The determination of relevant hydro-morphological parameters from different photogrammetric sensors (e.g. camera, laser scanner) and sensor platforms (e.g. UAV (unmanned aerial vehicle) and UWV (unmanned water vehicle)). 3. The continuous hydro-meteorological modelling of precipitation, soil moisture and water flows by means of conceptual and data-driven modelling. 4. The development of a collaborative, web-based service infrastructure as an information and communication point, especially in the case of an extreme event.
There are three major applications for the planned information system: First, the warning of local extreme events for the population in potentially affected areas, second, the support

  18. Nuclear Fuel Assembly Assessment Project and Image Categorization

    Energy Technology Data Exchange (ETDEWEB)

    Lindsey, C.S.; Lindblad, T.; Waldemark, K. [Royal Inst. of Tech., Stockholm (Sweden); Hildingsson, Lars [Swedish Nuclear Power Inspectorate, Stockholm (Sweden)

    1998-07-01

    A project has been underway to add digital imaging and processing to the inspection of nuclear fuel by the International Atomic Energy Agency. The ultimate goal is to provide the inspector not only with the advantages of CCD imaging, such as high sensitivity and digital image enhancement, but also with an intelligent agent that can analyze the images and provide useful information about the fuel assemblies in real time. The project is still in its early stages and has inspired several interesting sub-projects. Here we first review the work on fuel assembly image analysis and then give a brief status report on one of these sub-projects, which concerns automatic categorization of fuel assembly images. The technique could benefit the general challenge of image categorization.

  19. Project Photofly: New 3d Modeling Online Web Service (case Studies and Assessments)

    Science.gov (United States)

    Abate, D.; Furini, G.; Migliori, S.; Pierattini, S.

    2011-09-01

    During summer 2010, Autodesk released a still-ongoing project called Project Photofly, freely downloadable from the AutodeskLab web site until August 1, 2011. Project Photofly, based on computer-vision and photogrammetric principles and exploiting the power of cloud computing, is a web service able to convert collections of photographs into 3D models. The aim of our research was to evaluate Project Photofly, through different case studies, for 3D modeling of cultural heritage monuments and objects, mostly to identify the goals and objects for which it is suitable. The automatic approach is mainly analyzed.

  20. Development of an automatic scaler

    International Nuclear Information System (INIS)

    He Yuehong

    2009-04-01

    A self-designed automatic scaler is introduced. A microcontroller, the LPC936, is used as the master chip of the scaler. A counter integrated into the microcontroller is configured to operate as an external pulse counter. The software employed in the scaler is based on an embedded real-time operating system kernel named Small RTOS. Data storage, calculation and some other functions are also provided. The scaler is designed as a low-cost, low-power solution. By now, the automatic scaler has been applied in a surface contamination instrument. (authors)

  1. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  2. The One to Multiple Automatic High Accuracy Registration of Terrestrial LIDAR and Optical Images

    Science.gov (United States)

    Wang, Y.; Hu, C.; Xia, G.; Xue, H.

    2018-04-01

    The registration of terrestrial laser point clouds and close-range images is the key step in high-precision 3D reconstruction of cultural relic objects. In view of the current requirement for high texture resolution in the cultural relic field, the registration of point cloud and image data in object reconstruction leads to the problem of one point cloud corresponding to multiple images. In current commercial software, pairwise registration of the two kinds of data is realized by manually segmenting the point cloud data, manually matching point cloud and image data, and manually selecting corresponding two-dimensional points in the image and the point cloud; this process not only greatly reduces working efficiency but also limits the precision of the registration and causes seams in the colored point cloud texture. To solve these problems, this paper takes the whole object image as intermediate data and uses matching techniques to realize an automatic one-to-one correspondence between the point cloud and multiple images. Matching between the central-projection reflectance-intensity image of the point cloud and the optical image is applied to automatically match corresponding feature points, and the Rodrigues matrix spatial similarity transformation model with iterative weight selection is used to realize automatic, high-accuracy registration of the two kinds of data. This method is expected to serve high-precision, high-efficiency automatic 3D reconstruction of cultural relic objects, and has scientific research value and practical significance.

  3. Memory biases in remitted depression: the role of negative cognitions at explicit and automatic processing levels.

    Science.gov (United States)

    Romero, Nuria; Sanchez, Alvaro; Vazquez, Carmelo

    2014-03-01

    Cognitive models propose that depression is caused by dysfunctional schemas that endure beyond the depressive episode, representing vulnerability factors for recurrence. However, research testing negative cognitions linked to dysfunctional schemas in formerly depressed individuals is still scarce. Furthermore, negative cognitions are presumed to be linked to biases in recalling negative self-referent information in formerly depressed individuals, but no studies have directly tested this association. In the present study, we evaluated differences between formerly and never-depressed individuals in several experimental indices of negative cognitions and their associations with the recall of emotional self-referent material. Formerly depressed (n = 30) and never-depressed (n = 40) individuals completed measures of explicit (i.e., scrambled sentence test) and automatic (i.e., lexical decision task) processing to evaluate negative cognitions. Furthermore, participants completed a self-referent incidental recall task to evaluate memory biases. Formerly depressed individuals showed greater negative cognitions than never-depressed individuals at both explicit and automatic levels of processing. Results also showed greater recall of negative self-referent information in formerly depressed compared to never-depressed individuals. Finally, individual differences in negative cognitions at both explicit and automatic levels of processing predicted greater recall of negative self-referent material in formerly depressed individuals. Analyses of the relationship between explicit and automatic processing indices and memory biases were correlational, and the majority of participants in both groups were women. Our findings provide evidence of negative cognitions in formerly depressed individuals at both automatic and explicit levels of processing that may confer a cognitive vulnerability to depression. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Are traditional cognitive tests useful in predicting clinical success?

    Science.gov (United States)

    Gray, Sarah A; Deem, Lisa P; Straja, Sorin R

    2002-11-01

    The purpose of this research was to determine the predictive value of the Dental Admission Test (DAT) for clinical success using Ackerman's theory of ability determinants of skilled performance. The Ackerman theory is a valid, reliable schema in the applied psychology literature used to predict complex skill acquisition. Inconsistent stimulus-response skill acquisition depends primarily on determinants of cognitive ability. Consistent information-processing tasks have been described as "automatic," in which stimuli and responses are mapped in a manner that allows for complete certainty once the relationships have been learned. It is theorized that the skills necessary for success in the clinical component of dental schools involve a significant amount of automatic processing demands and, as such, student performance in the clinics should begin to converge as task practice is realized and tasks become more consistent. Subtest scores of the DAT of four classes were correlated with final grades in nine clinical courses. Results showed that the DAT subtest scores played virtually no role with regard to the final clinical grades. Based on this information, the DAT scores were determined to be of no predictive value in clinical achievement.

  5. A Unification of Inheritance and Automatic Program Specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2004-01-01

    The object-oriented style of programming facilitates program adaptation and enhances program genericness, but at the expense of efficiency. Automatic program specialization can be used to generate specialized, efficient implementations for specific scenarios, but requires the program to be structured appropriately for specialization and is yet another new concept for the programmer to understand and apply. We have unified automatic program specialization and inheritance into a single concept, and implemented this approach in a modified version of Java named JUST. When programming in JUST, inheritance is used to control the automatic application of program specialization to class members during compilation to obtain an efficient implementation. This paper presents the language JUST, which integrates object-oriented concepts, block structure, and techniques from automatic program specialization.
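
    The kind of transformation a specializer applies automatically can be shown with a hand-rolled Python sketch: given a known exponent, generate a dedicated implementation of `x**n` with the loop unrolled. This is purely illustrative; JUST performs the analogous step at compile time, driven by inheritance.

```python
def specialize_power(n):
    """Generate a function computing x**n for the statically known n,
    with the multiplication loop unrolled into straight-line code."""
    body = " * ".join(["x"] * n) if n > 0 else "1"
    src = f"def power_{n}(x):\n    return {body}\n"
    ns = {}
    exec(src, ns)  # compile the specialized residual program
    return ns[f"power_{n}"]
```

    The generic `power(x, n)` with a loop is the flexible program; `specialize_power(3)` is its efficient residual once `n = 3` is fixed.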

  6. Equipment for fully automatic radiographic pipe inspection

    International Nuclear Information System (INIS)

    Basler, G.; Sperl, H.; Weinschenk, K.

    1977-01-01

    The patent describes a device for fully automatic radiographic testing of large pipes with longitudinal welds. Furthermore, the invention enables automatic marking of the films used in radiographic inspection with information identifying the test piece and the part of it where testing took place. (RW) [de

  7. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    case studies regarding the analysis of clinical datasets produced at the University Hospital of Catanzaro, Italy. Conclusion DMET-Analyzer is a novel tool able to automatically analyse data produced by the DMET platform in case-control association studies. Using this tool, users avoid wasting time on the manual execution of multiple statistical tests, avoiding possible errors and reducing the amount of time needed for a whole experiment. Moreover, the annotations and the direct links to external databases may increase the biological knowledge extracted. The system is freely available for academic purposes at: https://sourceforge.net/projects/dmetanalyzer/files/
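
    The statistical core that such a tool automates, a Fisher exact test per marker followed by a multiple-testing correction, can be sketched as follows. This is a generic illustration of case-control association testing, not DMET-Analyzer's actual implementation (which, per its documentation, also supports other corrections).

```python
import math

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]],
    summing hypergeometric probabilities no larger than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def prob(x):
        return (math.comb(col1, x) * math.comb(n - col1, row1 - x)
                / math.comb(n, row1))
    p_obs = prob(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

def bonferroni(pvals, alpha=0.05):
    """Significance flags after Bonferroni correction for len(pvals) tests."""
    m = len(pvals)
    return [p <= alpha / m for p in pvals]
```

    Running the exact test on every marker and then flagging only the Bonferroni-surviving ones is exactly the repetitive, error-prone work the abstract says the tool removes.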

  8. Metabolic changes in occipital lobe epilepsy with automatisms.

    Science.gov (United States)

    Wong, Chong H; Mohamed, Armin; Wen, Lingfeng; Eberl, Stefan; Somerville, Ernest; Fulham, Michael; Bleasel, Andrew F

    2014-01-01

    Some studies suggest that the pattern of glucose hypometabolism relates not only to the ictal-onset zone but also reflects seizure propagation. We investigated metabolic changes in patients with occipital lobe epilepsy (OLE) that may reflect propagation of ictal discharge during seizures with automatisms. Fifteen patients who had undergone epilepsy surgery for intractable OLE and had undergone interictal Fluorine-18-fluorodeoxyglucose positron-emission tomography ((18)F-FDG-PET) between 1994 and 2004 were divided into two groups (with and without automatisms during seizure). Significant regions of hypometabolism were identified by comparing (18)F-FDG-PET results from each group with 16 healthy controls by using statistical parametric mapping. Significant hypometabolism was confined largely to the epileptogenic occipital lobe in the patient group without automatisms. In patients with automatisms, glucose hypometabolism extended from the epileptogenic occipital lobe into the ipsilateral temporal lobe. We identified a distinctive hypometabolic pattern that was specific for OLE patients with automatisms during a seizure. This finding supports the postulate that seizure propagation is a cause of glucose hypometabolism beyond the region of seizure onset.

  9. Optoacoustic temperature determination and automatic coagulation control in rabbits

    Science.gov (United States)

    Schlott, Kerstin; Koinzer, Stefan; Ptaszynski, Lars; Luft, Susanne; Baade, Alex; Bever, Marco; Roider, Johann; Birngruber, Reginald; Brinkmann, Ralf

    2011-03-01

    Retinal laser photocoagulation is an established treatment for many retinal diseases such as macular edema or diabetic retinopathy. The selection of the laser parameters has so far been based on post-treatment evaluation of the lesion size and strength. Due to local pigment variations in the fundus and individual transmission, the same laser parameters often lead to overtreatment. Optoacoustics allows non-invasive monitoring of the retinal temperature increase during laser irradiation by measuring the temperature-dependent pressure amplitudes induced by short probe laser pulses. A 75 ns / 523 nm Nd:YLF laser was used as the probe at a repetition rate of 1 kHz, and a cw / 532 nm treatment laser for heating. A contact lens was modified with a ring-shaped ultrasonic transducer to detect the pressure waves at the cornea. Temperatures were collected for irradiations leading to soft or invisible lesions, and from these data the threshold for denaturation was found. By analyzing the initial temperature increase, the further temperature development during irradiation could be predicted. An algorithm was devised to calculate, from the temperature curve, the irradiation time needed to form a soft lesion. This made it possible to provide real-time dosimetry by automatically switching off the treatment laser after the calculated irradiation time. Automatically controlled coagulations appear softer and more uniform.
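
    The control idea, fit the early temperature samples and extrapolate the switch-off time, can be sketched under a simplifying assumption. The square-root heating model below is a common heat-diffusion-style approximation chosen here for illustration only; the paper derives its own prediction algorithm.

```python
def exposure_time_for_threshold(times, temps, t_threshold):
    """Fit dT(t) = A*sqrt(t) to early optoacoustic temperature samples
    (least squares through the origin) and extrapolate the time at which
    the threshold temperature rise for coagulation is reached."""
    num = sum(T * t ** 0.5 for t, T in zip(times, temps))
    den = sum(times)                 # sum of (sqrt(t))^2
    A = num / den
    return (t_threshold / A) ** 2    # solve A*sqrt(t_off) = t_threshold
```

    In a real controller this extrapolation would be refreshed with every probe pulse (1 kHz here), and the treatment laser gated off once the predicted exposure time elapses.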

  10. Implementation of short-term prediction

    Energy Technology Data Exchange (ETDEWEB)

    Landberg, L; Joensen, A; Giebel, G [and others]

    1999-03-01

    This paper gives a general overview of the results from an EU JOULE funded project ('Implementing short-term prediction at utilities', JOR3-CT95-0008). Reference is given to specialised papers where applicable. The goal of the project was to implement wind farm power output prediction systems in operational environments at a number of utilities in Europe. Two models were developed, one by Risoe and one by the Technical University of Denmark (DTU). Both prediction models used HIRLAM predictions from the Danish Meteorological Institute (DMI). (au) EFP-94; EU-JOULE. 11 refs.

  11. 30 CFR 75.1403-4 - Criteria-Automatic elevators.

    Science.gov (United States)

    2010-07-01

    ... appropriate on automatic elevators which will automatically shut-off the power and apply the brakes in the... telephone or other effective communication system by which aid or assistance can be obtained promptly. ...

  12. Automatic positioning control device for automatic control rod exchanger

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

    Purpose: To attain accurate positioning for a control rod exchanger. Constitution: The present position of the automatic control rod exchanger is detected by a synchro generator. An aimed stopping position for the exchanger, a stop-instruction range accounting for the operational delay of the control system and the coasting distance of the mechanical system, and a coincidence-confirmation range reflecting the required positioning accuracy are set in advance. If there is a difference between the present position and the aimed stopping position, the automatic exchanger runs toward the aimed stopping position. A stop instruction is generated upon arrival within the stop-instruction range, and a coincidence-confirmation signal is generated upon arrival within the coincidence-confirmation range. Since the uncertain factors that influence positioning accuracy, such as the operational delay of the control system and the coasting distance of the mechanical system, are made definite by actual measurement, and the stop-instruction and coincidence-confirmation ranges are set based on the measured data, positioning accuracy can be improved. (Ikeda, J.)
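
    The two-range logic described above reduces to a few comparisons per control cycle. The sketch below is a generic rendering of that logic (names and units are illustrative, not from the patent).

```python
def positioning_signals(present, target, stop_range, confirm_range):
    """Signals for the exchanger drive: run toward the target until within
    the stop-instruction range (which covers control delay plus coasting),
    and report coincidence once within the required positioning accuracy."""
    err = target - present
    return {
        "run": +1 if err > stop_range else (-1 if err < -stop_range else 0),
        "stop": abs(err) <= stop_range,
        "coincide": abs(err) <= confirm_range,
    }
```

    The stop instruction fires early by `stop_range` so that, after the measured delay and coasting, the mechanism settles inside the tighter `confirm_range`.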

  13. Comparison of Automatic Classifiers’ Performances using Word-based Feature Extraction Techniques in an E-government setting

    OpenAIRE

    Marin Rodenas, Alfonso

    2011-01-01

    Project carried out through a mobility programme. KUNGLIGA TEKNISKA HÖGSKOLAN, STOCKHOLM. Nowadays, email is commonly used by citizens to establish communication with their government. Among the received emails, governments deal with some common queries and subjects, which handling officers have to answer manually. Automatic classification of the incoming emails makes it possible to increase communication efficiency by decreasing the delay between a query and its response. This thesis t...
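
    A minimal example of word-based feature extraction feeding an automatic classifier, here a multinomial Naive Bayes over a bag-of-words representation. The thesis compares several classifiers and feature techniques; this sketch and its toy e-government corpus are illustrative only.

```python
import math
from collections import Counter

def train_nb(docs):
    """docs: list of (text, label). Builds per-class word counts, class
    priors, and the vocabulary for add-one (Laplace) smoothing."""
    counts, priors, vocab = {}, Counter(), set()
    for text, label in docs:
        priors[label] += 1
        c = counts.setdefault(label, Counter())
        for w in text.lower().split():
            c[w] += 1
            vocab.add(w)
    return counts, priors, vocab

def classify(model, text):
    """Pick the label maximizing log P(label) + sum(log P(word | label))."""
    counts, priors, vocab = model
    total = sum(priors.values())
    best, best_lp = None, -math.inf
    for label, c in counts.items():
        lp = math.log(priors[label] / total)
        denom = sum(c.values()) + len(vocab)   # add-one smoothing
        for w in text.lower().split():
            lp += math.log((c[w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

    Routing each incoming email to the handling desk predicted by such a model is what shortens the query-to-response delay the abstract mentions.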

  14. Using sensor data patterns from an automatic milking system to develop predictive variables for classifying clinical mastitis and abnormal milk

    NARCIS (Netherlands)

    Kamphuis, A.; Pietersma, D.; Tol, van der R.; Wiedermann, M.; Hogeveen, H.

    2008-01-01

    Dairy farmers using automatic milking are able to manage mastitis successfully with the help of mastitis attention lists. These attention lists are generated with mastitis detection models that make use of sensor data obtained throughout each quarter milking. The models tend to be limited to using

  15. Automatic Operation For A Robot Lawn Mower

    Science.gov (United States)

    Huang, Y. Y.; Cao, Z. L.; Oh, S. J.; Kattan, E. U.; Hall, E. L.

    1987-02-01

    A domestic mobile robot lawn mower, which performs in an automatic operation mode, has been built at the Center of Robotics Research, University of Cincinnati. The robot lawn mower automatically completes its work using the region-filling operation, a new kind of path planning for mobile robots. Several strategies for region-filling path planning have been developed for a partly known or an unknown environment. An advanced omnidirectional navigation system and a multisensor-based control system are also used in the automatic operation. Research on the robot lawn mower, especially on region-filling path planning, is significant for industrial and agricultural applications.
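
    One standard way to realize a region-filling operation on a known rectangular region is a boustrophedon (back-and-forth) sweep; the sketch below illustrates the idea on a grid and is not the mower's actual strategy for partly known or unknown environments.

```python
def region_fill_path(rows, cols):
    """Boustrophedon coverage path over a rows x cols grid: sweep each row,
    alternating direction so consecutive cells stay adjacent."""
    path = []
    for r in range(rows):
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        path.extend((r, c) for c in cs)
    return path
```

    Every cell is visited exactly once and each move is to a neighbouring cell, which is what makes the pattern practical for a mower's kinematics.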

  16. Automatic face morphing for transferring facial animation

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Bui, T.D.; Poel, Mannes; Heylen, Dirk K.J.; Nijholt, Antinus; Hamza, H.M.

    2003-01-01

    In this paper, we introduce a novel method of automatically finding the training set of RBF networks for morphing a prototype face to represent a new face. This is done by automatically specifying and adjusting corresponding feature points on a target face. The RBF networks are then used to transfer
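
    The RBF-network mapping mentioned above can be sketched as a Gaussian-RBF interpolant fitted so that each prototype feature point lands exactly on its target. This is a generic interpolant for illustration, not the paper's network or its automatic feature-point selection.

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def rbf_warp(src, dst, sigma=1.0):
    """Fit one Gaussian-RBF interpolant per coordinate so every prototype
    feature point in `src` maps onto its target in `dst`; returns a
    function warping any 2D point."""
    phi = lambda p, q: math.exp(-((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)
                                / (2 * sigma ** 2))
    G = [[phi(p, q) for q in src] for p in src]
    wx = solve(G, [d[0] for d in dst])
    wy = solve(G, [d[1] for d in dst])
    def warp(p):
        ks = [phi(p, q) for q in src]
        return (sum(w * k for w, k in zip(wx, ks)),
                sum(w * k for w, k in zip(wy, ks)))
    return warp
```

    Points between the feature points are carried along smoothly by the same kernels, which is what makes RBF networks suitable for transferring a deformation from a prototype face to a new one.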

  17. Automatic needle insertion diminishes pain during growth hormone injection

    DEFF Research Database (Denmark)

    Main, K M; Jørgensen, J T; Hertel, N T

    1995-01-01

    prototype pens for GH administration, providing either manual or automatic sc needle insertion, using a combined visual analogue/facial scale and a five-item scale in 18 children. With the automatic pen there was a significantly lower maximum pain score compared with the manual pen (median 28.5 versus 52.0 mm) as well as a lower mean pain score (mean 13.7 versus 23.5 mm). The five-item scale revealed that automatic needle insertion was significantly less painful than manual insertion and 13 patients chose to continue treatment with the automatic pen. In conclusion, pain during GH injection can...

  18. Two-spinor description of massive particles and relativistic spin projection operators

    Science.gov (United States)

    Isaev, A. P.; Podoinitsyn, M. A.

    2018-04-01

    On the basis of the Wigner unitary representations of the covering group ISL(2, C) of the Poincaré group, we obtain spin-tensor wave functions of free massive particles with arbitrary spin. The wave functions automatically satisfy the Dirac-Pauli-Fierz equations. In the framework of the two-spinor formalism we construct spin-vectors of polarizations and obtain conditions that fix the corresponding relativistic spin projection operators (Behrends-Fronsdal projection operators). With the help of these conditions we find explicit expressions for relativistic spin projection operators for integer spins (Behrends-Fronsdal projection operators) and then find relativistic spin projection operators for half-integer spins. These projection operators determine the numerators in the propagators of fields of relativistic particles. We deduce generalizations of the Behrends-Fronsdal projection operators for arbitrary space-time dimensions D > 2.
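
    For reference, the lowest-spin cases of the projectors discussed above have compact closed forms (standard textbook expressions; conventions such as metric signature and normalization may differ from the paper's):

```latex
% Transverse projector for an on-shell massive momentum p (p^2 = m^2):
\theta_{\mu\nu} \;=\; \eta_{\mu\nu} - \frac{p_\mu p_\nu}{m^2},
\qquad p^{\mu}\,\theta_{\mu\nu} = 0 .

% Spin 1: P^{(1)}_{\mu\nu} = \theta_{\mu\nu}.
% Behrends-Fronsdal spin-2 projector in D space-time dimensions:
P^{(2)}_{\mu\nu,\rho\sigma}
 \;=\; \tfrac{1}{2}\bigl(\theta_{\mu\rho}\theta_{\nu\sigma}
        + \theta_{\mu\sigma}\theta_{\nu\rho}\bigr)
   \;-\; \frac{1}{D-1}\,\theta_{\mu\nu}\theta_{\rho\sigma}.
```

    The spin-2 operator is transverse in every index, symmetric and traceless in each index pair (using tr θ = D − 1), which is exactly the set of conditions the paper uses to fix the higher-spin generalizations.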

  19. Development of automatic ultrasonic testing system and its application

    International Nuclear Information System (INIS)

    Oh, Sang Hong; Matsuura, Toshihiko; Iwata, Ryusuke; Nakagawa, Michio; Horikawa, Kohsuke; Kim, You Chul

    1997-01-01

    Radiographic testing (RT) is usually applied as a nondestructive test carried out to detect internal defects in the welded joints of a penstock. In cases where RT could not be applied, ultrasonic testing (UT) was performed. UT was generally carried out by manual scanning, with the inspection data recorded by the inspector on site, so, as a weak point, there were no objective inspection records corresponding to the films of RT. An automatic ultrasonic testing system capable of automatic scanning and automatic recording was therefore expected to be developed, and such a system was developed in this work. Test results obtained by applying the newly developed automatic ultrasonic testing system to the circumferential welded joints of a penstock at a site are shown in this paper.

  20. Automatic diagnostic system for measuring ocular refractive errors

    Science.gov (United States)

    Ventura, Liliane; Chiaradia, Caio; de Sousa, Sidney J. F.; de Castro, Jarbas C.

    1996-05-01

    Ocular refractive errors (myopia, hyperopia and astigmatism) are automatically and objectively determined by projecting a light target onto the retina using an infra-red (850 nm) diode laser. The light vergence which emerges from the eye (light scattered from the retina) is evaluated in order to determine the corresponding ametropia. The system basically consists of projecting a target (ring) onto the retina and analyzing the scattered light with a CCD camera. The light scattered by the eye is divided into six portions (3 meridians) by using a mask and a set of six prisms. The distance between the two images provided by each meridian leads to the refractive error of that meridian. Hence, it is possible to determine the refractive error at three different meridians, which gives the exact solution for the eye's refractive error (spherical and cylindrical components and the axis of the astigmatism). The computational basis used for the image analysis is a heuristic search, which provides satisfactory calculation times for our purposes. The peculiar shape of the target, a ring, provides a wider range of measurement and also saves parts of the retina from unnecessary laser irradiation. Measurements were made on artificial eyes and in vivo (using cycloplegics) and the results were in good agreement with retinoscopic measurements.
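
    Recovering sphere, cylinder and axis from three meridional powers is a small closed-form computation. The sketch below assumes the standard thin-lens meridional power model P(θ) = S + C·sin²(θ − A) and meridians at 0°, 60° and 120°; the actual meridian angles and image-distance-to-power calibration of the instrument are not specified here.

```python
import math

def spherocylinder(p0, p60, p120):
    """Recover sphere S, cylinder C (plus-cylinder convention) and axis A
    (degrees) from powers measured along the 0, 60 and 120 degree meridians,
    using P(theta) = S + C*sin^2(theta - A) = a0 + a1*cos(2t) + a2*sin(2t)."""
    a0 = (p0 + p60 + p120) / 3.0          # mean power: S + C/2
    a1 = p0 - a0                          # -(C/2) cos 2A
    a2 = (p60 - p120) / math.sqrt(3)      # -(C/2) sin 2A
    half_c = math.hypot(a1, a2)
    S = a0 - half_c
    C = 2.0 * half_c
    A = math.degrees(0.5 * math.atan2(-a2, -a1)) % 180.0
    return S, C, A
```

    Three meridians are the minimum that determines the three unknowns, which is why the ring image is split into exactly three meridian pairs.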