WorldWideScience

Sample records for project automatic prediction

  1. Planning Complex Projects Automatically

    Science.gov (United States)

    Henke, Andrea L.; Stottler, Richard H.; Maher, Timothy P.

    1995-01-01

    The Automated Manifest Planner (AMP) computer program applies a combination of artificial-intelligence techniques to assist both expert and novice planners, reducing planning time by orders of magnitude. It gives planners the flexibility to modify plans and constraints easily, without the need for programming expertise. Developed specifically for planning space shuttle missions 5 to 10 years ahead, it is, with modifications, applicable in general to planning other complex projects that require scheduling of activities depending on other activities and/or timely allocation of resources. It is adaptable to a variety of complex scheduling problems in manufacturing, transportation, business, architecture, and construction.

  2. Do Judgments of Learning Predict Automatic Influences of Memory?

    Science.gov (United States)

    Undorf, Monika; Böhm, Simon; Cüpper, Lutz

    2016-01-01

    Current memory theories generally assume that memory performance reflects both recollection and automatic influences of memory. Research on people's predictions about the likelihood of remembering recently studied information on a memory test, that is, on judgments of learning (JOLs), suggests that both magnitude and resolution of JOLs are linked…

  3. Automatic Train Operation Using Autonomic Prediction of Train Runs

    Science.gov (United States)

    Asuka, Masashi; Kataoka, Kenji; Komaya, Kiyotoshi; Nishida, Syogo

    In this paper, we present an automatic train control method adaptable to disturbed train traffic conditions. The proposed method presumes transmission of the detected time of a home track clearance to trains approaching the station, by employing Digital ATC (Automatic Train Control) equipment. Using this information, each train controls its acceleration by a method that consists of two approaches. First, by setting a designated restricted speed, the train controls its running time so as to arrive at the next station in accordance with the predicted delay. Second, the train predicts the time at which it will reach the current braking pattern generated by Digital ATC, along with the time at which that braking pattern will advance. By comparing the two, the train correctly chooses the coasting drive mode in advance, avoiding deceleration due to the current braking pattern. We evaluated the effectiveness of the proposed method by simulation, with respect to driving conditions, energy consumption, and delay reduction.

  4. CERAPP: Collaborative Estrogen Receptor Activity Prediction Project

    Data.gov (United States)

    U.S. Environmental Protection Agency — Data from a large-scale modeling project called CERAPP (Collaborative Estrogen Receptor Activity Prediction Project) demonstrating using predictive computational...

  5. The PredictAD project

    DEFF Research Database (Denmark)

    Antila, Kari; Lötjönen, Jyrki; Thurfjell, Lennart

    2013-01-01

    Alzheimer's disease (AD) is the most common cause of dementia, affecting 36 million people worldwide. As the demographic transition in the developed countries progresses towards an older population, the worsening ratio of workers per retirees and the growing number of patients with age-related illnes... The objective of the PredictAD project was to find and integrate efficient biomarkers from heterogeneous patient data to make early diagnosis and to monitor the progress of AD in a more efficient, reliable and objective manner. The project focused on discovering biomarkers from biomolecular data... candidates and implement the framework in software. The results are currently used in several research projects, licensed to commercial use and being tested for clinical use in several trials.

  6. Automatic tools for enhancing the collaborative experience in large projects

    International Nuclear Information System (INIS)

    Bourilkov, D; Rodriquez, J L

    2014-01-01

    With the explosion of big data in many fields, the efficient management of knowledge about all aspects of data analysis gains in importance. A key feature of collaboration in large-scale projects is keeping a log of what is being done and how, for private use and reuse, and for sharing selected parts with collaborators and peers, who are often distributed geographically on an increasingly global scale. Better still, the log can be created automatically, on the fly, while the scientist or software developer works in a habitual way, without extra effort; this saves time and enables a team to do more with the same resources. The CODESH (COllaborative DEvelopment SHell) and CAVES (Collaborative Analysis Versioning Environment System) projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach to any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building scalable working systems for the analysis of petascale volumes of data, as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.
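    The automatic-logbook idea above can be sketched in a few lines. The following toy Python wrapper (not the actual CODESH implementation, which works at the shell level) records each command, its timestamp, and its exit status in a session log as a side effect of running it:

```python
import datetime
import shlex
import subprocess

LOG = []  # in-memory stand-in for the persistent virtual logbook

def logged_run(command: str) -> int:
    """Run a shell command and log it on the fly, as a side effect."""
    result = subprocess.run(shlex.split(command),
                            capture_output=True, text=True)
    LOG.append((datetime.datetime.now().isoformat(),
                command, result.returncode))
    return result.returncode
```

    A real system would additionally persist the log to a versioned repository and capture stdout/stderr alongside the command.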

  7. Automatically explaining machine learning prediction results: a demonstration on type 2 diabetes risk prediction.

    Science.gov (United States)

    Luo, Gang

    2016-01-01

    Predictive modeling is a key component of solutions to many healthcare problems. Among predictive modeling approaches, machine learning methods often achieve the highest prediction accuracy, but suffer from a long-standing open problem precluding their widespread use in healthcare: most machine learning models give no explanation for their prediction results, whereas interpretability is essential for a predictive model to be adopted in typical healthcare settings. This paper presents the first complete method for automatically explaining results for any machine learning predictive model without degrading accuracy. We implemented the method in computer code. Using the electronic medical record data set from the Practice Fusion diabetes classification competition, containing patient records from all 50 states in the United States, we demonstrated the method on predicting type 2 diabetes diagnosis within the next year. For the champion machine learning model of the competition, our method explained prediction results for 87.4% of patients who were correctly predicted by the model to have a type 2 diabetes diagnosis within the next year. Our demonstration showed the feasibility of automatically explaining results for any machine learning predictive model without degrading accuracy.

  8. Automatic prediction of facial trait judgments: appearance vs. structural models.

    Directory of Open Access Journals (Sweden)

    Mario Rojas

    Full Text Available Evaluating other individuals with respect to personality characteristics plays a crucial role in human relations and is the focus of attention for research in diverse fields such as psychology and interactive computer systems. In psychology, face perception has been recognized as a key component of this evaluation system. Multiple studies suggest that observers use face information to infer personality characteristics. Interactive computer systems are trying to take advantage of these findings and apply them to increase the natural aspect of interaction and to improve their performance. Here, we experimentally test whether the automatic prediction of facial trait judgments (e.g. dominance) can be made by using the full appearance information of the face, and whether a reduced representation of its structure is sufficient. We evaluate two separate approaches: a holistic representation model using the facial appearance information, and a structural model constructed from the relations among facial salient points. State-of-the-art machine learning methods are applied to (a) derive a facial trait judgment model from training data and (b) predict a facial trait value for any face. Furthermore, we address the issue of whether there are specific structural relations among facial points that predict perception of facial traits. Experimental results over a set of labeled data (9 different trait evaluations) and classification rules (4 rules) suggest that (a) prediction of perception of facial traits is learnable by both holistic and structural approaches; (b) the most reliable prediction of facial trait judgments is obtained by certain types of holistic descriptions of the face appearance; and (c) for some traits such as attractiveness and extroversion, there are relationships between specific structural features and social perceptions.
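    To make the "structural model" idea concrete, here is a minimal sketch of one plausible structural representation: pairwise distances among facial salient points, normalized by an interocular distance. The landmark ordering and normalization are assumptions for illustration, not the paper's exact feature set:

```python
import numpy as np

# Toy "structural model" features: pairwise distances among facial salient
# points, normalized by the distance between the first two points (assumed
# here to be the eye centers). Landmark choice and ordering are illustrative.

def structural_features(landmarks):
    """landmarks: sequence of (x, y) salient-point coordinates."""
    pts = np.asarray(landmarks, dtype=float)
    interocular = np.linalg.norm(pts[0] - pts[1])
    feats = [np.linalg.norm(pts[i] - pts[j]) / interocular
             for i in range(len(pts)) for j in range(i + 1, len(pts))]
    return np.array(feats)
```

    A trait model would then be, for example, a regression from these distance ratios to rated dominance or attractiveness.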

  9. Prediction Governors for Input-Affine Nonlinear Systems and Application to Automatic Driving Control

    Directory of Open Access Journals (Sweden)

    Yuki Minami

    2018-04-01

    Full Text Available In recent years, automatic driving control has attracted attention. To achieve satisfactory driving control performance, the prediction accuracy of the traveling route is important. If a highly accurate prediction method can be used, an accurate traveling route can be obtained. Despite the considerable efforts that have been invested in improving prediction methods, prediction errors do occur in general. Thus, a method to minimize the influence of prediction errors on automatic driving control systems is required. This need motivated us to focus on the design of a mechanism for shaping prediction signals, called a prediction governor. In this study, we first extended our previous study to the input-affine nonlinear system case. Then, we analytically derived a solution to an optimal design problem of prediction governors. Finally, we applied the solution to an automatic driving control system and demonstrated its usefulness through a numerical example and an experiment using a radio-controlled car.

  10. Decadal climate prediction (project GCEP).

    Science.gov (United States)

    Haines, Keith; Hermanson, Leon; Liu, Chunlei; Putt, Debbie; Sutton, Rowan; Iwi, Alan; Smith, Doug

    2009-03-13

    Decadal prediction uses climate models forced by changing greenhouse gases, as in the Intergovernmental Panel on Climate Change projections, but unlike longer-range predictions it also requires initialization with observations of the current climate. In particular, the upper-ocean heat content and circulation have a critical influence. Decadal prediction is still in its infancy and there is an urgent need to understand the important processes that determine predictability on these timescales. We have taken the first Hadley Centre Decadal Prediction System (DePreSys) and implemented it on several NERC institute compute clusters in order to study a wider range of initial condition impacts on decadal forecasting, eventually including the state of the land and cryosphere. The eScience methods are used to manage submission and output from the many ensemble model runs required to assess predictive skill. Early results suggest initial condition skill may extend for several years, even over land areas, but this depends sensitively on the definition used to measure skill, and alternatives are presented. The Grid for Coupled Ensemble Prediction (GCEP) system will allow the UK academic community to contribute to international experiments being planned to explore decadal climate predictability.

  11. Piloted Simulation Evaluation of a Model-Predictive Automatic Recovery System to Prevent Vehicle Loss of Control on Approach

    Science.gov (United States)

    Litt, Jonathan S.; Liu, Yuan; Sowers, Thomas S.; Owen, A. Karl; Guo, Ten-Huei

    2014-01-01

    This paper describes a model-predictive automatic recovery system for aircraft on the verge of a loss-of-control situation. The system determines when it must intervene to prevent an imminent accident, resulting from a poor approach. It estimates the altitude loss that would result from a go-around maneuver at the current flight condition. If the loss is projected to violate a minimum altitude threshold, the maneuver is automatically triggered. The system deactivates to allow landing once several criteria are met. Piloted flight simulator evaluation showed the system to provide effective envelope protection during extremely unsafe landing attempts. The results demonstrate how flight and propulsion control can be integrated to recover control of the vehicle automatically and prevent a potential catastrophe.
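    The trigger logic described above can be sketched as follows. The altitude-loss model, threshold, and parameter values are invented for illustration; the paper's estimator is model-predictive and far more detailed:

```python
# Toy sketch of the recovery trigger: estimate the altitude a go-around
# would cost at the current flight condition and fire the automatic
# recovery if the projected floor violates a minimum-altitude threshold.
# The loss model, threshold, and spool-up time are invented placeholders.

MIN_ALTITUDE_FT = 50.0          # hypothetical hard floor

def estimated_goaround_loss_ft(sink_rate_fps: float, spool_up_s: float) -> float:
    """Crude estimate: the current sink rate persists while the engines spool up."""
    return sink_rate_fps * spool_up_s

def should_trigger_recovery(altitude_ft: float, sink_rate_fps: float,
                            spool_up_s: float = 4.0) -> bool:
    loss = estimated_goaround_loss_ft(sink_rate_fps, spool_up_s)
    return altitude_ft - loss < MIN_ALTITUDE_FT
```

    For example, at 100 ft with a 20 ft/s sink rate, the projected floor of 20 ft violates the threshold and the maneuver fires; at 500 ft with 10 ft/s it does not.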

  12. The Predictive Validity of Projective Measures.

    Science.gov (United States)

    Suinn, Richard M.; Oskamp, Stuart

    Written for use by clinical practitioners as well as psychological researchers, this book surveys recent literature (1950-1965) on projective test validity by reviewing and critically evaluating studies which shed light on what may reliably be predicted from projective test results. Two major instruments are covered: the Rorschach and the Thematic…

  13. Lynx: Automatic Elderly Behavior Prediction in Home Telecare

    Science.gov (United States)

    Lopez-Guede, Jose Manuel; Moreno-Fernandez-de-Leceta, Aitor; Martinez-Garcia, Alexeiw; Graña, Manuel

    2015-01-01

    This paper introduces Lynx, an intelligent system for personal safety in home environments, oriented to elderly people living independently, which encompasses a decision support machine for automatic home risk prevention, tested in real-life environments to respond to real-time situations. The automatic system described in this paper prevents such risks by advanced analytic methods supported by an expert knowledge system. It is minimally intrusive, using plug-and-play sensors and machine learning algorithms to learn the elder's daily activity, taking into account even their health records. If the system detects that something unusual happens (in a wide sense) or that something is wrong relative to the user's health habits or medical recommendations, it sends a real-time alarm to the family, care center, or medical agents, without human intervention. The system feeds on information from sensors deployed in the home and knowledge of the subject's physical activities, which can be collected by mobile applications and enriched by personalized health information from clinical reports encoded in the system. The system's usability and reliability have been tested in real-life conditions, with an accuracy greater than 81%. PMID:26783514

  14. Lynx: Automatic Elderly Behavior Prediction in Home Telecare

    Directory of Open Access Journals (Sweden)

    Jose Manuel Lopez-Guede

    2015-01-01

    Full Text Available This paper introduces Lynx, an intelligent system for personal safety in home environments, oriented to elderly people living independently, which encompasses a decision support machine for automatic home risk prevention, tested in real-life environments to respond to real-time situations. The automatic system described in this paper prevents such risks by advanced analytic methods supported by an expert knowledge system. It is minimally intrusive, using plug-and-play sensors and machine learning algorithms to learn the elder's daily activity, taking into account even their health records. If the system detects that something unusual happens (in a wide sense) or that something is wrong relative to the user's health habits or medical recommendations, it sends a real-time alarm to the family, care center, or medical agents, without human intervention. The system feeds on information from sensors deployed in the home and knowledge of the subject's physical activities, which can be collected by mobile applications and enriched by personalized health information from clinical reports encoded in the system. The system's usability and reliability have been tested in real-life conditions, with an accuracy greater than 81%.

  15. CERAPP: Collaborative estrogen receptor activity prediction project

    DEFF Research Database (Denmark)

    Mansouri, Kamel; Abdelaziz, Ahmed; Rybacka, Aleksandra

    2016-01-01

    ). Risk assessors need tools to prioritize chemicals for evaluation in costly in vivo tests, for instance, within the U.S. EPA Endocrine Disruptor Screening Program. Objectives: We describe a large-scale modeling project called CERAPP (Collaborative Estrogen Receptor Activity Prediction Project...... States and Europe to predict ER activity of a common set of 32,464 chemical structures. Quantitative structure-activity relationship models and docking approaches were employed, mostly using a common training set of 1,677 chemical structures provided by the U.S. EPA, to build a total of 40 categorical...... Results: Individual model scores ranged from 0.69 to 0.85, showing high prediction reliabilities. Out of the 32,464 chemicals, the consensus model predicted 4,001 chemicals (12.3%) as high-priority actives and 6,742 potential actives (20.8%) to be considered for further testing. Conclusion: This project demonstrated...

  16. Automatic detection of adverse events to predict drug label changes using text and data mining techniques.

    Science.gov (United States)

    Gurulingappa, Harsha; Toldo, Luca; Rajput, Abdul Mateen; Kors, Jan A; Taweel, Adel; Tayrouz, Yorki

    2013-11-01

    The aim of this study was to assess the impact of automatically detected adverse event signals from text and open-source data on the prediction of drug label changes. Open-source adverse effect data were collected from the FAERS, Yellow Cards and SIDER databases. A shallow linguistic relation extraction system (JSRE) was applied for extraction of adverse effects from MEDLINE case reports. A statistical approach was applied to the extracted datasets for signal detection and subsequent prediction of label changes issued for 29 drugs by the UK Regulatory Authority in 2009. 76% of drug label changes were automatically predicted; of these, 6% were detected only by text mining. JSRE enabled precise identification of four adverse drug events from MEDLINE that were undetectable otherwise. Changes in drug labels can be predicted automatically using data and text mining techniques. Text mining technology is mature and well placed to support pharmacovigilance tasks. Copyright © 2013 John Wiley & Sons, Ltd.
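    The abstract does not name the statistical method used for signal detection; a common choice in pharmacovigilance is the proportional reporting ratio (PRR) over a 2x2 contingency table of spontaneous reports, sketched here as an assumed stand-in:

```python
# Hedged sketch: proportional reporting ratio (PRR), a standard
# disproportionality measure in pharmacovigilance. The study's exact
# statistical approach is not specified in the abstract.

def proportional_reporting_ratio(a: int, b: int, c: int, d: int) -> float:
    """a: reports of (drug, event); b: drug, other events;
    c: other drugs, event; d: other drugs, other events."""
    return (a / (a + b)) / (c / (c + d))

def is_signal(a, b, c, d, threshold=2.0, min_reports=3):
    """Common screening rule: PRR >= 2 with at least 3 reports."""
    return a >= min_reports and proportional_reporting_ratio(a, b, c, d) >= threshold
```

    With 10 of 100 reports for a drug mentioning the event versus 10 of 900 for all other drugs, the PRR is 9 and the pair would be flagged for review.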

  17. Automatic Offline Formulation of Robust Model Predictive Control Based on Linear Matrix Inequalities Method

    Directory of Open Access Journals (Sweden)

    Longge Zhang

    2013-01-01

    Full Text Available Two automatic robust model predictive control strategies are presented for uncertain polytopic linear plants with input and output constraints. A sequence of nested, geometrically proportioned, asymptotically stable ellipsoids and controllers is constructed offline first. Then, in the first strategy, the feedback controllers are automatically selected with the receding horizon online. Finally, a modified automatic offline robust MPC approach is constructed to improve the closed-loop system's performance. The newly proposed strategies not only reduce conservatism but also decrease the online computation. Numerical examples are given to illustrate their effectiveness.
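    The offline/online split can be illustrated with a toy version of the online step: given a precomputed sequence of nested ellipsoids, each paired with a feedback gain, the controller picks the smallest ellipsoid containing the current state. The matrices and gains below are invented placeholders, not from the paper:

```python
import numpy as np

# Offline result (toy values): nested ellipsoids {x : x^T P x <= 1},
# ordered innermost to outermost, each paired with a feedback gain K.
lookup = [
    (np.diag([4.0, 4.0]), np.array([[-2.0, -1.0]])),    # smallest ellipsoid
    (np.diag([1.0, 1.0]), np.array([[-1.0, -0.5]])),
    (np.diag([0.25, 0.25]), np.array([[-0.5, -0.2]])),  # largest ellipsoid
]

def select_gain(x):
    """Online step: gain of the smallest stored ellipsoid containing x."""
    for P, K in lookup:
        if x @ P @ x <= 1.0:
            return K
    raise ValueError("state outside all offline ellipsoids")
```

    The control input is then u = select_gain(x) @ x, recomputed at each sampling instant, so the expensive LMI optimization never runs online.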

  18. Automatic stimulation of experiments and learning based on prediction failure recognition

    NARCIS (Netherlands)

    Juarez Cordova, A.G.; Kahl, B.; Henne, T.; Prassler, E.

    2009-01-01

    In this paper we focus on the task of automatically and autonomously initiating experimentation and learning based on the recognition of prediction failure. We present a mechanism that utilizes conceptual knowledge to predict the outcome of robot actions, observes their execution and indicates when

  19. EVA: continuous automatic evaluation of protein structure prediction servers.

    Science.gov (United States)

    Eyrich, V A; Martí-Renom, M A; Przybylski, D; Madhusudhan, M S; Fiser, A; Pazos, F; Valencia, A; Sali, A; Rost, B

    2001-12-01

    Evaluation of protein structure prediction methods is difficult and time-consuming. Here, we describe EVA, a web server for assessing protein structure prediction methods, in an automated, continuous and large-scale fashion. Currently, EVA evaluates the performance of a variety of prediction methods available through the internet. Every week, the sequences of the latest experimentally determined protein structures are sent to prediction servers, results are collected, performance is evaluated, and a summary is published on the web. EVA has so far collected data for more than 3000 protein chains. These results may provide valuable insight to both developers and users of prediction methods. http://cubic.bioc.columbia.edu/eva. eva@cubic.bioc.columbia.edu
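    EVA reports several evaluation measures; for secondary-structure servers one standard score is Q3, the fraction of residues predicted in the correct three-state class. A minimal sketch, assuming H/E/C state strings of equal length:

```python
def q3_accuracy(predicted: str, observed: str) -> float:
    """Q3: fraction of residues whose three-state secondary structure
    (H = helix, E = strand, C = coil) is predicted correctly."""
    if len(predicted) != len(observed):
        raise ValueError("sequences must have equal length")
    return sum(p == o for p, o in zip(predicted, observed)) / len(observed)
```

    A weekly evaluation run would apply such scores to each server's predictions for the newly released structures and publish the aggregated summary.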

  20. Shape: automatic conformation prediction of carbohydrates using a genetic algorithm

    Directory of Open Access Journals (Sweden)

    Rosen Jimmy

    2009-09-01

    Full Text Available Abstract. Background: Detailed experimental three-dimensional structures of carbohydrates are often difficult to acquire. Molecular modelling and computational conformation prediction are therefore commonly used tools for three-dimensional structure studies. Modelling procedures generally require significant training and computing resources, which is often impractical for most experimental chemists and biologists. Shape has been developed to improve the availability of modelling in this field. Results: The Shape software package has been developed for simplicity of use and conformation prediction performance. A trivial user interface coupled to an efficient genetic algorithm conformation search makes it a powerful tool for automated modelling. Carbohydrates up to a few hundred atoms in size can be investigated on common computer hardware. It has been shown to perform well for the prediction of over four hundred bioactive oligosaccharides, as well as compare favourably with previously published studies on carbohydrate conformation prediction. Conclusion: The Shape fully automated conformation prediction can be used by scientists who lack significant modelling training, and performs well on computing hardware such as laptops and desktops. It can also be deployed on computer clusters for increased capacity. The prediction accuracy under the default settings is good, as it agrees well with experimental data and previously published conformation prediction studies. This software is available both as open source and under commercial licenses.
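    A genetic-algorithm conformation search of the kind described can be sketched over torsion angles. The energy function, operators, and parameters below are toy placeholders, not Shape's actual force field:

```python
import random

random.seed(0)                      # reproducible toy run
N_ANGLES = 4                        # e.g. phi/psi for two glycosidic linkages

def energy(conf):
    """Placeholder energy: minimum at -60 degrees for every torsion."""
    return sum((a + 60.0) ** 2 for a in conf)

def random_conf():
    return [random.uniform(-180.0, 180.0) for _ in range(N_ANGLES)]

def mutate(conf, step=15.0):
    child = conf[:]
    i = random.randrange(N_ANGLES)
    child[i] = max(-180.0, min(180.0, child[i] + random.uniform(-step, step)))
    return child

def crossover(a, b):
    cut = random.randrange(1, N_ANGLES)
    return a[:cut] + b[cut:]

def search(pop_size=30, generations=200):
    pop = [random_conf() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=energy)                      # elitist selection
        elite = pop[: pop_size // 3]
        pop = elite + [mutate(crossover(random.choice(elite),
                                        random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=energy)
```

    Because the elite is carried over unchanged, the best energy never increases; a real implementation would replace `energy` with a carbohydrate force field and add ring-conformation variables.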

  1. Automatic fault tree generation in the EPR PSA project

    International Nuclear Information System (INIS)

    Villatte, N.; Nonclercq, P.; Taupy, S.

    2012-01-01

    Tools (KB3 and Atelier EPS) have been developed at EDF to assist analysts in building fault trees for PSA (Probabilistic Safety Assessment) and importing them into RiskSpectrum (a Swedish code used at EDF for PSA). System modelling is performed using the KB3 software with a knowledge base describing generic classes of components with their behaviour and failure modes. Using these classes of components, the analyst can describe (using a graphical system editor): a simplified system diagram derived from the mechanical system drawings and functional descriptions, the missions of the studied system (in the form of high-level fault trees), and its different configurations for the missions. He can also add specific knowledge about the system. Then, the analyst chooses missions and configurations to specify and launch fault tree generation. From the system description, KB3 produces, by backward-chaining on rules, detailed system fault trees. These fault trees are finally imported into RiskSpectrum (converted by Atelier EPS into a format readable by RiskSpectrum). KB3 and Atelier EPS were used to create the majority of the fault trees for the EDF EPR Probabilistic Safety Analysis conducted from November 2009 to March 2010: 25 systems were modelled, and 127 fault trees were automatically generated in a rather short time by different analysts with the help of these tools. Feedback shows many advantages of using KB3 and Atelier EPS: homogeneity and consistency between the different generated fault trees, traceability of modelling, control of modelling and, last but not least, the automation of detailed fault tree creation, which relieves the human analyst of a tedious task so that he can focus on more important ones, such as modelling the failure of a function. This industrial application has also helped us gather interesting feedback from the analysts that should help us improve the handling of the tools. We propose in this paper indeed some

  2. Automatic transfer function design for medical visualization using visibility distributions and projective color mapping.

    Science.gov (United States)

    Cai, Lile; Tay, Wei-Liang; Nguyen, Binh P; Chui, Chee-Kong; Ong, Sim-Heng

    2013-01-01

    Transfer functions play a key role in volume rendering of medical data, but transfer function manipulation is unintuitive and can be time-consuming; achieving an optimal visualization of patient anatomy or pathology is difficult. To overcome this problem, we present a system for automatic transfer function design based on visibility distribution and projective color mapping. Instead of assigning opacity directly based on voxel intensity and gradient magnitude, the opacity transfer function is automatically derived by matching the observed visibility distribution to a target visibility distribution. An automatic color assignment scheme based on projective mapping is proposed to assign colors that allow for the visual discrimination of different structures, while also reflecting the degree of similarity between them. When our method was tested on several medical volumetric datasets, the key structures within the volume were clearly visualized with minimal user intervention. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Specific predictive power of automatic spider-related affective associations for controllable and uncontrollable fear responses toward spiders

    NARCIS (Netherlands)

    Huijdlng, J; de Jong, PJ; Huijding, J.

    This study examined the predictive power of automatically activated spider-related affective associations for automatic and controllable fear responses. The Extrinsic Affective Simon Task (EAST; De Houwer, 2003) was used to indirectly assess automatic spider fear-related associations. The EAST and

  4. Automatic Code Checking Applied to Fire Fighting and Panic Projects in a BIM Environment - BIMSCIP

    Directory of Open Access Journals (Sweden)

    Marcelo Franco Porto

    2017-06-01

    Full Text Available This work presents a computational implementation of automatic conformity verification of building projects using a 3D modeling platform for BIM. The program was developed in C# and is based on the 9th Technical Instruction of the Military Fire Brigade of the State of Minas Gerais, which covers regulations on fire load in buildings and hazardous areas.

  5. Team collaborative innovation management based on primary pipes automatic welding project

    International Nuclear Information System (INIS)

    Li Jing; Wang Dong; Zhang Ke

    2012-01-01

    The welding quality of the primary pipe directly affects the safe operation of nuclear power plants. Primary pipe automatic welding, the first of its kind in China, is a complex systematic project involving many facets, such as design, manufacturing, materials, and on-site construction. An R and D team was formed by China Guangdong Nuclear Power Engineering Co., Ltd. (CNPEC) together with other domestic nuclear power design institutes and manufacturing and construction enterprises. According to the characteristics of nuclear power plant construction, and adopting a team collaborative innovation management mode, through project coordination, resource allocation, and the building of a production-education-research collaborative innovation platform, CNPEC successfully developed the primary pipe automatic welding technique, which has been widely applied to the construction of nuclear power plants, creating considerable economic benefits. (authors)

  6. US Climate Variability and Predictability Project

    Energy Technology Data Exchange (ETDEWEB)

    Patterson, Mike [University Corporation for Atmospheric Research (UCAR), Boulder, CO (United States)

    2017-11-14

    The US CLIVAR Project Office administers the US CLIVAR Program with its mission to advance understanding and prediction of climate variability and change across timescales with an emphasis on the role of the ocean and its interaction with other elements of the Earth system. The Project Office promotes and facilitates scientific collaboration within the US and international climate and Earth science communities, addressing priority topics from subseasonal to centennial climate variability and change; the global energy imbalance; the ocean’s role in climate, water, and carbon cycles; climate and weather extremes; and polar climate changes. This project provides essential one-year support of the Project Office, enabling the participation of US scientists in the meetings of the US CLIVAR bodies that guide scientific planning and implementation, including the scientific steering committee that establishes program goals and evaluates progress of activities to address them, the science team of funded investigators studying the ocean overturning circulation in the Atlantic, and two working groups tackling the priority research topics of Arctic change influence on midlatitude climate and weather extremes and the decadal-scale widening of the tropical belt.

  7. The MELANIE project: from a biopsy to automatic protein map interpretation by computer.

    Science.gov (United States)

    Appel, R D; Hochstrasser, D F; Funk, M; Vargas, J R; Pellegrini, C; Muller, A F; Scherrer, J R

    1991-10-01

    The goals of the MELANIE project are to determine if disease-associated patterns can be detected in high resolution two-dimensional polyacrylamide gel electrophoresis (HR 2D-PAGE) images and if a diagnosis can be established automatically by computer. The ELSIE/MELANIE system is a set of computer programs which automatically detect, quantify, and compare protein spots shown on HR 2D-PAGE images. Classification programs help the physician to find disease-associated patterns from a given set of two-dimensional gel electrophoresis images and to form diagnostic rules. Prototype expert systems that use these rules to establish a diagnosis from new HR 2D-PAGE images have been developed. They successfully diagnosed cirrhosis of the liver and were able to distinguish a variety of cancer types from biopsies known to be cancerous.

  8. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction

    Science.gov (United States)

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    The Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical properties of natural images, so its detection performance is not satisfying. PCA has been extended into kernel PCA in order to capture higher-order statistics; however, a kernel FKT (KFKT) had not previously been formulated, nor its detection performance studied. To accurately detect potential small targets in infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. Then a framework based on Kalman prediction and KFKT, which can automatically detect and track small targets, is developed. Experimental results show that KFKT outperforms FKT and that the proposed framework is competent to automatically detect and track infrared point targets.
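    The Kalman-prediction half of the framework can be sketched with a constant-velocity filter: the predict step gives the expected next target position, so detection can be restricted to a small window around it. Noise covariances here are illustrative assumptions:

```python
import numpy as np

# Hedged sketch: 1-D constant-velocity Kalman filter for predicting the
# next position of a detected point target. Noise levels are invented.

class TargetTracker:
    def __init__(self, dt=1.0):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])   # position/velocity model
        self.H = np.array([[1.0, 0.0]])              # position is measured
        self.Q = 0.01 * np.eye(2)                    # process noise (assumed)
        self.R = np.array([[0.1]])                   # measurement noise (assumed)
        self.x = np.zeros(2)                         # state: [position, velocity]
        self.P = np.eye(2)                           # state covariance

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[0]                             # predicted position

    def update(self, z):
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.array([z]) - self.H @ self.x)
        self.P = (np.eye(2) - K @ self.H) @ self.P
```

    Interleaving predict() and update() with each incoming detection keeps the search window centered on the target; a 2-D tracker simply doubles the position/velocity states.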

  9. A Domain Specific Embedded Language in C++ for Automatic Differentiation, Projection, Integration and Variational Formulations

    Directory of Open Access Journals (Sweden)

    Christophe Prud'homme

    2006-01-01

    In this article, we present a domain specific embedded language in C++ that can be used in various contexts such as numerical projection onto a functional space, numerical integration, variational formulations and automatic differentiation. Although these tools operate in very different ways, the language accommodates them all by decoupling expression construction from expression evaluation. The language is implemented using expression templates and meta-programming techniques and uses various Boost libraries. It is exercised on a number of non-trivial examples, and a benchmark presents its performance on a few test problems.
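    The central design idea, decoupling expression construction from evaluation, can be illustrated outside C++. The toy Python sketch below is illustrative only (the article's implementation uses C++ expression templates); operator overloading builds an expression tree, which is evaluated only when values are bound.

```python
# Toy illustration in Python (not the article's C++ expression templates)
# of decoupling expression construction from evaluation: operators build
# a tree; nothing is computed until values are bound via eval().

class Expr:
    def __add__(self, other): return Node('+', self, wrap(other))
    def __radd__(self, other): return Node('+', wrap(other), self)
    def __mul__(self, other): return Node('*', self, wrap(other))
    def __rmul__(self, other): return Node('*', wrap(other), self)

class Var(Expr):
    def __init__(self, name): self.name = name
    def eval(self, env): return env[self.name]

class Const(Expr):
    def __init__(self, value): self.value = value
    def eval(self, env): return self.value

class Node(Expr):
    def __init__(self, op, left, right):
        self.op, self.left, self.right = op, left, right
    def eval(self, env):
        a, b = self.left.eval(env), self.right.eval(env)
        return a + b if self.op == '+' else a * b

def wrap(x):
    return x if isinstance(x, Expr) else Const(x)

# Construction phase: only a tree is built, no arithmetic happens.
x = Var('x')
expression = 3 * x * x + 1

# Evaluation phase: the same tree can be evaluated for any binding.
# expression.eval({'x': 2}) -> 13
```

    The same separation is what lets one expression serve projection, integration and differentiation alike: each is just a different traversal of the tree.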

  10. An Automatic Prediction of Epileptic Seizures Using Cloud Computing and Wireless Sensor Networks.

    Science.gov (United States)

    Sareen, Sanjay; Sood, Sandeep K; Gupta, Sunil Kumar

    2016-11-01

    Epilepsy is one of the most common neurological disorders and is characterized by the spontaneous and unforeseeable occurrence of seizures. Automatic seizure prediction can protect patients from accidents and save lives. In this article, we propose a mobile-based framework that automatically predicts seizures using the information contained in electroencephalography (EEG) signals. Wireless sensor technology is used to capture the patients' EEG signals, and cloud-based services collect and analyze the EEG data from each patient's mobile phone. Features are extracted from the EEG signal using the fast Walsh-Hadamard transform (FWHT), and higher order spectral analysis (HOSA) is applied to the FWHT coefficients in order to select the features relevant to the normal, preictal, and ictal seizure states. We then use the selected features as input to a k-means classifier to detect epileptic seizure states in a reasonable time. The performance of the proposed model was tested on the Amazon EC2 cloud and compared in terms of execution time and accuracy. The findings show that with the selected HOS-based features, we achieved a classification accuracy of 94.6%.
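    The FWHT feature-extraction step admits a compact sketch. The pure-Python transform below is a standard in-place fast Walsh-Hadamard transform (a generic textbook version, not the authors' code; it assumes each EEG window has been truncated or padded to a power-of-two length).

```python
def fwht(a):
    """Fast Walsh-Hadamard transform (natural order) of a sequence
    whose length is a power of two; returns a new list."""
    a = list(a)
    h, n = 1, len(a)
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y   # butterfly step
        h *= 2
    return a

# fwht([1, 0, 1, 0]) -> [2, 2, 0, 0]
```

    In a pipeline like the one described, the resulting coefficients would then be passed to the HOSA-based feature selection stage before classification.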

  11. Automatic generation of bioinformatics tools for predicting protein-ligand binding sites.

    Science.gov (United States)

    Komiyama, Yusuke; Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-03-15

    Predictive tools that model protein-ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands; an automated production pipeline that can rapidly and efficiently generate user-friendly protein-ligand binding predictors would therefore be useful. We developed a system for automatically generating protein-ligand binding predictions. Implementation of this system as a pipeline of Semantic Web technique-based web tools allows users to specify a ligand and receive the tool within 0.5-1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. The source code and web application are freely available for download at http://utprot.net; they are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  12. Gene prediction using the Self-Organizing Map: automatic generation of multiple gene models.

    Science.gov (United States)

    Mahony, Shaun; McInerney, James O; Smith, Terry J; Golden, Aaron

    2004-03-05

    Many current gene prediction methods use only one model to represent protein-coding regions in a genome, and so are less likely to predict the location of genes with an atypical sequence composition. Future improvements in gene finding will likely involve methods that can adequately deal with intra-genomic compositional variation. This work explores a new approach to gene prediction, based on the Self-Organizing Map (SOM), that can automatically identify multiple gene models within a genome. The current implementation, named RescueNet, uses relative synonymous codon usage as the indicator of protein-coding potential. While its raw accuracy can be lower than that of other methods, RescueNet consistently identifies some genes that other methods miss, and should therefore be of interest to gene-prediction software developers and genome annotation teams alike. RescueNet is recommended for use in conjunction with, or as a complement to, other gene prediction methods.
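    RescueNet's core machinery can be sketched in miniature. The tiny 1-D SOM below is a generic illustration, not RescueNet itself (unit count, epochs and decay schedules are arbitrary); it trains unit weight vectors on codon-usage-like feature vectors, so each trained unit plays the role of one learned "gene model".

```python
import math, random

def train_som(samples, n_units=4, epochs=200, lr=0.5, seed=0):
    """Train a tiny 1-D SOM on feature vectors (e.g. relative synonymous
    codon usage); each trained unit acts as one learned 'gene model'."""
    rng = random.Random(seed)
    dim = len(samples[0])
    units = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for t in range(epochs):
        alpha = lr * (1.0 - t / epochs)                        # decaying learning rate
        radius = max(1.0, (n_units / 2.0) * (1.0 - t / epochs))  # shrinking neighbourhood
        for s in samples:
            # best-matching unit (BMU) by squared Euclidean distance
            bmu = min(range(n_units),
                      key=lambda u: sum((units[u][d] - s[d]) ** 2 for d in range(dim)))
            # pull the BMU and its map neighbours toward the sample
            for u in range(n_units):
                h = math.exp(-((u - bmu) ** 2) / (2.0 * radius ** 2))
                for d in range(dim):
                    units[u][d] += alpha * h * (s[d] - units[u][d])
    return units
```

    After training, a sequence window is scored against each unit, so regions matching any of the learned compositional models can be flagged as putative coding regions.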

  13. Bottlenecks in Software Defect Prediction Implementation in Industrial Projects

    OpenAIRE

    Hryszko Jarosław; Madeyski Lech

    2015-01-01

    Case studies of software defect prediction in real, industrial software development projects are extremely rare. We report on a dedicated R&D project, established in cooperation between Wroclaw University of Technology and one of the leading automotive software development companies, to research the possibilities of introducing software defect prediction using an open source, extensible software measurement and defect prediction framework called DePress (Defect Prediction in Software Syst...

  14. Predicting Software Projects Cost Estimation Based on Mining Historical Data

    OpenAIRE

    Najadat, Hassan; Alsmadi, Izzat; Shboul, Yazan

    2012-01-01

    In this research, a hybrid cost estimation model is proposed to produce a realistic prediction model that takes into consideration software project, product, process, and environmental elements. A cost estimation dataset is built from a large number of open source projects. Those projects are divided into three domains: communication, finance, and game projects. Several data mining techniques are used to classify software projects in terms of their development complexity. Data mining techniqu...

  15. Assess and Predict Automatic Generation Control Performances for Thermal Power Generation Units Based on Modeling Techniques

    Science.gov (United States)

    Zhao, Yan; Yang, Zijiang; Gao, Song; Liu, Jinbiao

    2018-02-01

    Automatic generation control (AGC) is a key technology for maintaining the real-time balance between power generation and load, and for ensuring the quality of the power supply. Power grids require each generation unit to deliver a satisfactory AGC performance, as specified in two detailed rules that provide a set of indices for measuring the AGC performance of a generation unit. However, the commonly used method of calculating these indices relies on particular data samples from AGC responses and can produce incorrect results in practice. This paper proposes a new method that estimates the AGC performance indices via system identification techniques. In addition, a nonlinear regression model between the performance indices and the load command is built in order to predict the AGC performance indices. The effectiveness of the proposed method is validated through industrial case studies.
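    As a much-simplified illustration of the system-identification idea (not the paper's method), a unit's response to a step in the load command can be approximated as first order, with the gain read from the steady state and the time constant taken as the time to reach 63.2% of the final value.

```python
def identify_first_order(t, y, u_step=1.0):
    """Estimate gain K and time constant tau of a first-order step
    response y(t) ~ K*u*(1 - exp(-t/tau)) from sampled data."""
    K = y[-1] / u_step              # steady-state gain from the final value
    target = 0.632 * y[-1]          # 63.2% of the final value
    for ti, yi in zip(t, y):
        if yi >= target:
            return K, ti            # tau ~ time to reach 63.2%
    return K, t[-1]
```

    Performance indices such as response speed and regulation accuracy could then be computed from the identified (K, tau) pair rather than from raw, possibly unrepresentative, data samples.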

  16. Automatic generation of predictive dynamic models reveals nuclear phosphorylation as the key Msn2 control mechanism.

    Science.gov (United States)

    Sunnåker, Mikael; Zamora-Sillero, Elias; Dechant, Reinhard; Ludwig, Christina; Busetto, Alberto Giovanni; Wagner, Andreas; Stelling, Joerg

    2013-05-28

    Predictive dynamical models are critical for the analysis of complex biological systems. However, methods to systematically develop and discriminate among systems biology models are still lacking. We describe a computational method that incorporates all hypothetical mechanisms about the architecture of a biological system into a single model and automatically generates a set of simpler models compatible with observational data. As a proof of principle, we analyzed the dynamic control of the transcription factor Msn2 in Saccharomyces cerevisiae, specifically the short-term mechanisms mediating the cells' recovery after release from starvation stress. Our method determined that 12 of 192 possible models were compatible with available Msn2 localization data. Iterations between model predictions and rationally designed phosphoproteomics and imaging experiments identified a single-circuit topology with a relative probability of 99% among the 192 models. Model analysis revealed that the coupling of dynamic phenomena in Msn2 phosphorylation and transport could lead to efficient stress response signaling by establishing a rate-of-change sensor. Similar principles could apply to mammalian stress response pathways. Systematic construction of dynamic models may yield detailed insight into nonobvious molecular mechanisms.

  17. Predictability of monthly temperature and precipitation using automatic time series forecasting methods

    Science.gov (United States)

    Papacharalampous, Georgia; Tyralis, Hristos; Koutsoyiannis, Demetris

    2018-02-01

    We investigate the predictability of monthly temperature and precipitation by applying automatic univariate time series forecasting methods to a sample of 985 40-year-long monthly temperature and 1552 40-year-long monthly precipitation time series. The methods include a naïve one based on the monthly values of the last year, as well as the random walk (with drift), AutoRegressive Fractionally Integrated Moving Average (ARFIMA), exponential smoothing state-space model with Box-Cox transformation, ARMA errors, Trend and Seasonal components (BATS), simple exponential smoothing, Theta and Prophet methods. Prophet is a recently introduced model inspired by the nature of time series forecasted at Facebook and has not been applied to hydrometeorological time series before, while the use of random walk, BATS, simple exponential smoothing and Theta is rare in hydrology. The methods are tested in performing multi-step ahead forecasts for the last 48 months of the data. We further investigate how different choices of handling the seasonality and non-normality affect the performance of the models. The results indicate that: (a) all the examined methods apart from the naïve and random walk ones are accurate enough to be used in long-term applications; (b) monthly temperature and precipitation can be forecasted to a level of accuracy which can barely be improved using other methods; (c) the externally applied classical seasonal decomposition results mostly in better forecasts compared to the automatic seasonal decomposition used by the BATS and Prophet methods; and (d) Prophet is competitive, especially when it is combined with externally applied classical seasonal decomposition.
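    The externally applied classical seasonal decomposition that the study favours can be sketched in a few lines. The hypothetical additive version below removes per-month means, carries the deseasonalised level forward naively, and adds the seasonal effects back.

```python
def seasonal_naive_forecast(series, horizon, period=12):
    """Additive classical seasonal decomposition + naive level forecast:
    remove per-month means, carry the deseasonalised level forward,
    then add the seasonal effects back."""
    n = len(series)
    totals, counts = [0.0] * period, [0] * period
    for i, v in enumerate(series):
        totals[i % period] += v
        counts[i % period] += 1
    monthly = [s / c for s, c in zip(totals, counts)]
    overall = sum(series) / n
    seasonal = [m - overall for m in monthly]          # centred seasonal effects
    level = series[-1] - seasonal[(n - 1) % period]    # deseasonalised last value
    return [level + seasonal[(n + h) % period] for h in range(horizon)]
```

    In the study's setup, the deseasonalised remainder would instead be handed to a method such as ARFIMA, BATS, Theta or Prophet before the seasonal component is restored.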

  18. Predicting heat stress index in Sasso hens using automatic linear modeling and artificial neural network

    Science.gov (United States)

    Yakubu, A.; Oluremi, O. I. A.; Ekpo, E. I.

    2018-03-01

    There is an increasing use of robust analytical algorithms in the prediction of heat stress. The present investigation was therefore carried out to forecast the heat stress index (HSI) in Sasso laying hens. One hundred and sixty-seven records on the thermo-physiological parameters of the birds, reared on deep litter and battery cage systems, were utilized. Data were collected when the birds were 42 and 52 weeks of age. The independent variables fitted were housing system, age of birds, rectal temperature (RT), pulse rate (PR), and respiratory rate (RR); the response variable was HSI. Data were analyzed using automatic linear modeling (ALM) and artificial neural network (ANN) procedures. The ALM model building method involved forward stepwise selection using the F-statistic criterion. For the ANN, a multilayer perceptron (MLP) with a back-propagation network was used; the network was trained on 90% of the data set, with the remaining 10% dedicated to testing for model validation. RR and PR were the two most important parameters in the prediction of HSI, with the fractional importance of RR higher than that of PR in both the ALM (0.947 versus 0.053) and ANN (0.677 versus 0.274) models. Both models predicted HSI with a high degree of accuracy [r = 0.980, R² = 0.961, adjusted R² = 0.961, and RMSE = 0.05168 (ALM); r = 0.983, R² = 0.966, adjusted R² = 0.966, and RMSE = 0.04806 (ANN)]. The present information may be exploited in the development of a heat stress chart based largely on RR, which may aid the detection of thermal discomfort in a poultry house under tropical and subtropical conditions.
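    As a hedged, much-simplified stand-in for the ALM fit (the study's actual model is a forward-stepwise multivariable one), a single-predictor ordinary least squares fit of HSI on respiratory rate looks like this; the example data are hypothetical.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, e.g. a one-predictor
    stand-in relating heat stress index to respiratory rate."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# fit_line([1, 2, 3, 4], [3, 5, 7, 9]) -> (1.0, 2.0)
```

    A heat stress chart "based largely on RR" amounts to reading predicted HSI off such a fitted line at observed respiratory rates.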

  19. Automatic evidence quality prediction to support evidence-based decision making.

    Science.gov (United States)

    Sarker, Abeed; Mollá, Diego; Paris, Cécile

    2015-06-01

    Evidence-based medicine requires practitioners to obtain the best available medical evidence and to appraise its quality when making clinical decisions. Primarily due to the plethora of electronically available data from the medical literature, manually appraising the quality of evidence is a time-consuming process. We present a fully automatic approach for predicting the quality of medical evidence in order to aid practitioners at point-of-care. Our approach extracts relevant information from medical article abstracts and utilises data from a specialised corpus to apply supervised machine learning for the prediction of quality grades. Following an in-depth analysis of their usefulness, features (e.g., the publication types of articles) are extracted from the text via rule-based approaches and from the meta-data associated with the articles, and then applied in the supervised classification model. We propose a highly scalable and portable approach using a sequence of high-precision classifiers, and introduce a simple evaluation metric, average error distance (AED), that simplifies the comparison of systems. We also perform elaborate human evaluations to compare the performance of our system against human judgments. We test and evaluate our approaches on a publicly available, specialised, annotated corpus containing 1132 evidence-based recommendations. Our rule-based approach performs exceptionally well at the automatic extraction of publication types, with F-scores of up to 0.99 for high-quality publication types. For evidence quality classification, our approach obtains an accuracy of 63.84% and an AED of 0.271. The human evaluations show that the performance of our system, in terms of AED and accuracy, is comparable to that of humans on the same data.
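    An AED-style metric is straightforward to compute. The sketch below assumes an ordinal grade scale; the grade labels A-D are illustrative, not necessarily the corpus's exact scale.

```python
def average_error_distance(predicted, actual, grades=('A', 'B', 'C', 'D')):
    """Average error distance (AED): mean absolute distance, in grade
    steps, between predicted and true quality grades."""
    index = {g: i for i, g in enumerate(grades)}
    return (sum(abs(index[p] - index[a]) for p, a in zip(predicted, actual))
            / len(actual))

# average_error_distance(['A', 'B'], ['A', 'D']) -> 1.0
```

    Unlike plain accuracy, this penalises a grade-A recommendation misclassified as D more heavily than one misclassified as B.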

  20. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    Science.gov (United States)

    Lawrence N. Hudson; Joseph M. Wunderle; and others

    2016-01-01

    The PREDICTS project—Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)—has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of human impacts relating to land use. We have used this evidence base to...

  1. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    NARCIS (Netherlands)

    Hudson, Lawrence N; Newbold, Tim; Contu, Sara; Hill, Samantha L L; Lysenko, Igor; De Palma, Adriana; Phillips, Helen R P; Alhusseini, Tamera I; Bedford, Felicity E; Bennett, Dominic J; Booth, Hollie; Burton, Victoria J; Chng, Charlotte W T; Choimes, Argyrios; Correia, David L P; Day, Julie; Echeverría-Londoño, Susy; Emerson, Susan R; Gao, Di; Garon, Morgan; Harrison, Michelle L K; Ingram, Daniel J; Jung, Martin; Kemp, Victoria; Kirkpatrick, Lucinda; Martin, Callum D; Pan, Yuan; Pask-Hale, Gwilym D; Pynegar, Edwin L; Robinson, Alexandra N; Sanchez-Ortiz, Katia; Senior, Rebecca A; Simmons, Benno I; White, Hannah J; Zhang, Hanbin; Aben, Job; Abrahamczyk, Stefan; Adum, Gilbert B; Aguilar-Barquero, Virginia; Aizen, Marcelo A; Albertos, Belén; Alcala, E L; Del Mar Alguacil, Maria; Alignier, Audrey; Ancrenaz, Marc; Andersen, Alan N; Arbeláez-Cortés, Enrique; Armbrecht, Inge; Arroyo-Rodríguez, Víctor; Aumann, Tom; Axmacher, Jan C; Azhar, Badrul; Azpiroz, Adrián B; Baeten, Lander; Bakayoko, Adama; Báldi, András; Banks, John E; Baral, Sharad K; Barlow, Jos; Barratt, Barbara I P; Barrico, Lurdes; Bartolommei, Paola; Barton, Diane M; Basset, Yves; Batáry, Péter; Bates, Adam J; Baur, Bruno; Bayne, Erin M; Beja, Pedro; Benedick, Suzan; Berg, Åke; Bernard, Henry; Berry, Nicholas J; Bhatt, Dinesh; Bicknell, Jake E; Bihn, Jochen H; Blake, Robin J; Bobo, Kadiri S; Bóçon, Roberto; Boekhout, Teun; Böhning-Gaese, Katrin; Bonham, Kevin J; Borges, Paulo A V; Borges, Sérgio H; Boutin, Céline; Bouyer, Jérémy; Bragagnolo, Cibele; Brandt, Jodi S; Brearley, Francis Q; Brito, Isabel; Bros, Vicenç; Brunet, Jörg; Buczkowski, Grzegorz; Buddle, Christopher M; Bugter, Rob; Buscardo, Erika; Buse, Jörn; Cabra-García, Jimmy; Cáceres, Nilton C; Cagle, Nicolette L; Calviño-Cancela, María; Cameron, Sydney A; Cancello, Eliana M; Caparrós, Rut; Cardoso, Pedro; Carpenter, Dan; Carrijo, Tiago F; Carvalho, Anelena L; Cassano, Camila R; Castro, Helena; Castro-Luna, Alejandro A; Rolando, Cerda B; Cerezo, 
Alexis; Chapman, Kim Alan; Chauvat, Matthieu; Christensen, Morten; Clarke, Francis M; Cleary, Daniel F R; Colombo, Giorgio; Connop, Stuart P; Craig, Michael D; Cruz-López, Leopoldo; Cunningham, Saul A; D'Aniello, Biagio; D'Cruze, Neil; da Silva, Pedro Giovâni; Dallimer, Martin; Danquah, Emmanuel; Darvill, Ben; Dauber, Jens; Davis, Adrian L V; Dawson, Jeff; de Sassi, Claudio; de Thoisy, Benoit; Deheuvels, Olivier; Dejean, Alain; Devineau, Jean-Louis; Diekötter, Tim; Dolia, Jignasu V; Domínguez, Erwin; Dominguez-Haydar, Yamileth; Dorn, Silvia; Draper, Isabel; Dreber, Niels; Dumont, Bertrand; Dures, Simon G; Dynesius, Mats; Edenius, Lars; Eggleton, Paul; Eigenbrod, Felix; Elek, Zoltán; Entling, Martin H; Esler, Karen J; de Lima, Ricardo F; Faruk, Aisyah; Farwig, Nina; Fayle, Tom M; Felicioli, Antonio; Felton, Annika M; Fensham, Roderick J; Fernandez, Ignacio C; Ferreira, Catarina C; Ficetola, Gentile F; Fiera, Cristina; Filgueiras, Bruno K C; Fırıncıoğlu, Hüseyin K; Flaspohler, David; Floren, Andreas; Fonte, Steven J; Fournier, Anne; Fowler, Robert E; Franzén, Markus; Fraser, Lauchlan H; Fredriksson, Gabriella M; Freire, Geraldo B; Frizzo, Tiago L M; Fukuda, Daisuke; Furlani, Dario; Gaigher, René; Ganzhorn, Jörg U; García, Karla P; Garcia-R, Juan C; Garden, Jenni G; Garilleti, Ricardo; Ge, Bao-Ming; Gendreau-Berthiaume, Benoit; Gerard, Philippa J; Gheler-Costa, Carla; Gilbert, Benjamin; Giordani, Paolo; Giordano, Simonetta; Golodets, Carly; Gomes, Laurens G L; Gould, Rachelle K; Goulson, Dave; Gove, Aaron D; Granjon, Laurent; Grass, Ingo; Gray, Claudia L; Grogan, James; Gu, Weibin; Guardiola, Moisès; Gunawardene, Nihara R; Gutierrez, Alvaro G; Gutiérrez-Lamus, Doris L; Haarmeyer, Daniela H; Hanley, Mick E; Hanson, Thor; Hashim, Nor R; Hassan, Shombe N; Hatfield, Richard G; Hawes, Joseph E; Hayward, Matt W; Hébert, Christian; Helden, Alvin J; Henden, John-André; Henschel, Philipp; Hernández, Lionel; Herrera, James P; Herrmann, Farina; Herzog, Felix; Higuera-Diaz, 
Diego; Hilje, Branko; Höfer, Hubert; Hoffmann, Anke; Horgan, Finbarr G; Hornung, Elisabeth; Horváth, Roland; Hylander, Kristoffer; Isaacs-Cubides, Paola; Ishida, Hiroaki; Ishitani, Masahiro; Jacobs, Carmen T; Jaramillo, Víctor J; Jauker, Birgit; Hernández, F Jiménez; Johnson, McKenzie F; Jolli, Virat; Jonsell, Mats; Juliani, S Nur; Jung, Thomas S; Kapoor, Vena; Kappes, Heike; Kati, Vassiliki; Katovai, Eric; Kellner, Klaus; Kessler, Michael; Kirby, Kathryn R; Kittle, Andrew M; Knight, Mairi E; Knop, Eva; Kohler, Florian; Koivula, Matti; Kolb, Annette; Kone, Mouhamadou; Kőrösi, Ádám; Krauss, Jochen; Kumar, Ajith; Kumar, Raman; Kurz, David J; Kutt, Alex S; Lachat, Thibault; Lantschner, Victoria; Lara, Francisco; Lasky, Jesse R; Latta, Steven C; Laurance, William F; Lavelle, Patrick; Le Féon, Violette; LeBuhn, Gretchen; Légaré, Jean-Philippe; Lehouck, Valérie; Lencinas, María V; Lentini, Pia E; Letcher, Susan G; Li, Qi; Litchwark, Simon A; Littlewood, Nick A; Liu, Yunhui; Lo-Man-Hung, Nancy; López-Quintero, Carlos A; Louhaichi, Mounir; Lövei, Gabor L; Lucas-Borja, Manuel Esteban; Luja, Victor H; Luskin, Matthew S; MacSwiney G, M Cristina; Maeto, Kaoru; Magura, Tibor; Mallari, Neil Aldrin; Malone, Louise A; Malonza, Patrick K; Malumbres-Olarte, Jagoba; Mandujano, Salvador; Måren, Inger E; Marin-Spiotta, Erika; Marsh, Charles J; Marshall, E J P; Martínez, Eliana; Martínez Pastur, Guillermo; Moreno Mateos, David; Mayfield, Margaret M; Mazimpaka, Vicente; McCarthy, Jennifer L; McCarthy, Kyle P; McFrederick, Quinn S; McNamara, Sean; Medina, Nagore G; Medina, Rafael; Mena, Jose L; Mico, Estefania; Mikusinski, Grzegorz; Milder, Jeffrey C; Miller, James R; Miranda-Esquivel, Daniel R; Moir, Melinda L; Morales, Carolina L; Muchane, Mary N; Muchane, Muchai; Mudri-Stojnic, Sonja; Munira, A Nur; Muoñz-Alonso, Antonio; Munyekenye, B F; Naidoo, Robin; Naithani, A; Nakagawa, Michiko; Nakamura, Akihiro; Nakashima, Yoshihiro; Naoe, Shoji; Nates-Parra, Guiomar; Navarrete Gutierrez, Dario 
A; Navarro-Iriarte, Luis; Ndang'ang'a, Paul K; Neuschulz, Eike L; Ngai, Jacqueline T; Nicolas, Violaine; Nilsson, Sven G; Noreika, Norbertas; Norfolk, Olivia; Noriega, Jorge Ari; Norton, David A; Nöske, Nicole M; Nowakowski, A Justin; Numa, Catherine; O'Dea, Niall; O'Farrell, Patrick J; Oduro, William; Oertli, Sabine; Ofori-Boateng, Caleb; Oke, Christopher Omamoke; Oostra, Vicencio; Osgathorpe, Lynne M; Otavo, Samuel Eduardo; Page, Navendu V; Paritsis, Juan; Parra-H, Alejandro; Parry, Luke; Pe'er, Guy; Pearman, Peter B; Pelegrin, Nicolás; Pélissier, Raphaël; Peres, Carlos A; Peri, Pablo L; Persson, Anna S; Petanidou, Theodora; Peters, Marcell K; Pethiyagoda, Rohan S; Phalan, Ben; Philips, T Keith; Pillsbury, Finn C; Pincheira-Ulbrich, Jimmy; Pineda, Eduardo; Pino, Joan; Pizarro-Araya, Jaime; Plumptre, A J; Poggio, Santiago L; Politi, Natalia; Pons, Pere; Poveda, Katja; Power, Eileen F; Presley, Steven J; Proença, Vânia; Quaranta, Marino; Quintero, Carolina; Rader, Romina; Ramesh, B R; Ramirez-Pinilla, Martha P; Ranganathan, Jai; Rasmussen, Claus; Redpath-Downing, Nicola A; Reid, J Leighton; Reis, Yana T; Rey Benayas, José M; Rey-Velasco, Juan Carlos; Reynolds, Chevonne; Ribeiro, Danilo Bandini; Richards, Miriam H; Richardson, Barbara A; Richardson, Michael J; Ríos, Rodrigo Macip; Robinson, Richard; Robles, Carolina A; Römbke, Jörg; Romero-Duque, Luz Piedad; Rös, Matthias; Rosselli, Loreta; Rossiter, Stephen J; Roth, Dana S; Roulston, T'ai H; Rousseau, Laurent; Rubio, André V; Ruel, Jean-Claude; Sadler, Jonathan P; Sáfián, Szabolcs; Saldaña-Vázquez, Romeo A; Sam, Katerina; Samnegård, Ulrika; Santana, Joana; Santos, Xavier; Savage, Jade; Schellhorn, Nancy A; Schilthuizen, Menno; Schmiedel, Ute; Schmitt, Christine B; Schon, Nicole L; Schüepp, Christof; Schumann, Katharina; Schweiger, Oliver; Scott, Dawn M; Scott, Kenneth A; Sedlock, Jodi L; Seefeldt, Steven S; Shahabuddin, Ghazala; Shannon, Graeme; Sheil, Douglas; Sheldon, Frederick H; Shochat, Eyal; Siebert, Stefan 
J; Silva, Fernando A B; Simonetti, Javier A; Slade, Eleanor M; Smith, Jo; Smith-Pardo, Allan H; Sodhi, Navjot S; Somarriba, Eduardo J; Sosa, Ramón A; Soto Quiroga, Grimaldo; St-Laurent, Martin-Hugues; Starzomski, Brian M; Stefanescu, Constanti; Steffan-Dewenter, Ingolf; Stouffer, Philip C; Stout, Jane C; Strauch, Ayron M; Struebig, Matthew J; Su, Zhimin; Suarez-Rubio, Marcela; Sugiura, Shinji; Summerville, Keith S; Sung, Yik-Hei; Sutrisno, Hari; Svenning, Jens-Christian; Teder, Tiit; Threlfall, Caragh G; Tiitsaar, Anu; Todd, Jacqui H; Tonietto, Rebecca K; Torre, Ignasi; Tóthmérész, Béla; Tscharntke, Teja; Turner, Edgar C; Tylianakis, Jason M; Uehara-Prado, Marcio; Urbina-Cardona, Nicolas; Vallan, Denis; Vanbergen, Adam J; Vasconcelos, Heraldo L; Vassilev, Kiril; Verboven, Hans A F; Verdasca, Maria João; Verdú, José R; Vergara, Carlos H; Vergara, Pablo M; Verhulst, Jort; Virgilio, Massimiliano; Vu, Lien Van; Waite, Edward M; Walker, Tony R; Wang, Hua-Feng; Wang, Yanping; Watling, James I; Weller, Britta; Wells, Konstans; Westphal, Catrin; Wiafe, Edward D; Williams, Christopher D; Willig, Michael R; Woinarski, John C Z; Wolf, Jan H D; Wolters, Volkmar; Woodcock, Ben A; Wu, Jihua; Wunderle, Joseph M; Yamaura, Yuichi; Yoshikura, Satoko; Yu, Douglas W; Zaitsev, Andrey S; Zeidler, Juliane; Zou, Fasheng; Collen, Ben; Ewers, Rob M; Mace, Georgina M; Purves, Drew W; Scharlemann, Jörn P W; Purvis, Andy

    The PREDICTS project-Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)-has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of

  2. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    DEFF Research Database (Denmark)

    Hudson, Lawrence N; Newbold, Tim; Contu, Sara

    2017-01-01

    The PREDICTS project-Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)-has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity ...

  3. PREDICTS: Projecting Responses of Ecological Diversity in Changing Terrestrial Systems

    Directory of Open Access Journals (Sweden)

    Georgina Mace

    2012-12-01

    The PREDICTS project (www.predicts.org.uk) is a three-year NERC-funded project to model and predict, at a global scale, how local terrestrial diversity responds to human pressures such as land use, land cover, pollution, invasive species and infrastructure. PREDICTS is a collaboration between Imperial College London, the UNEP World Conservation Monitoring Centre, Microsoft Research Cambridge, UCL and the University of Sussex. In order to meet its aims, the project relies on extensive data describing the diversity and composition of biological communities at a local scale. Such data are collected on a vast scale through the committed efforts of field ecologists. If you have appropriate data that you would be willing to share with us, please get in touch (enquiries@predicts.org.uk). All contributions will be acknowledged appropriately and all data contributors will be included as co-authors on an open-access paper describing the database.

  4. Wind Plant Performance Prediction (WP3) Project

    Energy Technology Data Exchange (ETDEWEB)

    Craig, Anna [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-01-26

    The methods for analysis of operational wind plant data are highly variable across the wind industry, leading to high uncertainties in the validation and bias-correction of preconstruction energy estimation methods. Lack of credibility in the preconstruction energy estimates has significant impacts on project financing and therefore on the final levelized cost of energy for the plant. In this work, we examine how the evaluation of a wind plant's operational energy production varies with the processing methods applied to the operational data. Preliminary results indicate that the selection of the filters applied to the data, and of the filter parameters, can have significant impacts on the final computed assessment metrics.
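    The sensitivity being studied can be illustrated with a trivial sketch: the same operational power record yields different assessment metrics depending on the filter thresholds applied (the function name and thresholds below are hypothetical, not from the report).

```python
def mean_power(samples, low_cut=None, high_cut=None):
    """Mean of operational power samples after optional range filtering;
    different thresholds yield different assessment metrics."""
    kept = [p for p in samples
            if (low_cut is None or p >= low_cut)
            and (high_cut is None or p <= high_cut)]
    return sum(kept) / len(kept)

# Unfiltered and filtered means of the same record differ:
# mean_power([-5.0, 0.0, 50.0, 100.0])               -> 36.25
# mean_power([-5.0, 0.0, 50.0, 100.0], low_cut=0.0)  -> 50.0
```

    Whether sensor glitches, curtailment periods or icing events are filtered out, and at what thresholds, therefore shifts every downstream validation metric.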

  5. A Compact Methodology to Understand, Evaluate, and Predict the Performance of Automatic Target Recognition

    Science.gov (United States)

    Li, Yanpeng; Li, Xiang; Wang, Hongqiang; Chen, Yiping; Zhuang, Zhaowen; Cheng, Yongqiang; Deng, Bin; Wang, Liandong; Zeng, Yonghu; Gao, Lei

    2014-01-01

    This paper offers a compact methodology for evaluating the performance of an automatic target recognition (ATR) system: (a) a standard description of the ATR system's output is suggested, a quantity indicating the operating condition is presented based on the principle of feature extraction in pattern recognition, and a series of indexes to assess the output in different respects are developed with the application of statistics; (b) the performance of the ATR system is interpreted by a quality factor based on knowledge of engineering mathematics; (c) through a novel utility called "context-probability" estimation, proposed on the basis of probability theory, performance prediction for an ATR system is realized. The simulation results show that the performance of an ATR system can be accounted for and forecasted by the above-mentioned measures. Compared to existing technologies, the novel method offers more objective performance conclusions for an ATR system, which may be helpful in knowing the practical capability of the tested system. At the same time, the generalization performance of the proposed method is good. PMID:24967605

  6. Predicting shrinkage and warpage in injection molding: Towards automatized mold design

    Science.gov (United States)

    Zwicke, Florian; Behr, Marek; Elgeti, Stefanie

    2017-10-01

    It is an inevitable part of any plastics molding process that the material undergoes some shrinkage during solidification. Mainly due to unavoidable inhomogeneities in the cooling process, the overall shrinkage cannot be assumed to be homogeneous in all volumetric directions; the direct consequence is warpage. The accurate prediction of such shrinkage and warpage effects has been the subject of a considerable amount of research, but this behavior depends greatly on the type of material that is used as well as on the process details. Without limiting ourselves to any specific properties of certain materials or process designs, we aim to develop a method for the automatized design of a mold cavity that will produce correctly shaped moldings after solidification. Essentially, this can be stated as a shape optimization problem, in which the cavity shape is optimized to fulfill an objective function that measures defects in the molding shape. In order to develop and evaluate such a method, we first require simulation methods for the different steps involved in the injection molding process that can represent the phenomena responsible for shrinkage and warpage in a sufficiently accurate manner. As a starting point, we consider the solidification of purely amorphous materials. In this case, the material slowly transitions from fluid-like to solid-like behavior as it cools down; this behavior is modeled using adjusted viscoelastic material models. Once the material has passed a certain temperature threshold during cooling, any viscous effects are neglected and the behavior is assumed to be fully elastic. Non-linear elastic laws are used to predict the shrinkage and warpage that occur after this point. We present the current state of these simulation methods and show some first approaches towards optimizing the mold cavity shape based on these methods.

  7. Geospatial application of the Water Erosion Prediction Project (WEPP) Model

    Science.gov (United States)

    D. C. Flanagan; J. R. Frankenberger; T. A. Cochrane; C. S. Renschler; W. J. Elliot

    2011-01-01

    The Water Erosion Prediction Project (WEPP) model is a process-based technology for prediction of soil erosion by water at hillslope profile, field, and small watershed scales. In particular, WEPP utilizes observed or generated daily climate inputs to drive the surface hydrology processes (infiltration, runoff, ET) component, which subsequently impacts the rest of the...

  8. Machine learning in updating predictive models of planning and scheduling transportation projects

    Science.gov (United States)

    1997-01-01

    A method combining machine learning and regression analysis to automatically and intelligently update predictive models used in the Kansas Department of Transportation's (KDOT's) internal management system is presented. The predictive models used...

  9. Predicting automatic speech recognition performance over communication channels from instrumental speech quality and intelligibility scores

    NARCIS (Netherlands)

    Gallardo, L.F.; Möller, S.; Beerends, J.

    2017-01-01

    The performance of automatic speech recognition based on coded-decoded speech heavily depends on the quality of the transmitted signals, determined by channel impairments. This paper examines relationships between speech recognition performance and measurements of speech quality and intelligibility

  10. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time-Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    Science.gov (United States)

    Serata, S.

    2006-12-01

    The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. The development work of the past 25 years has established the Stressmeter as an automatic stress measurement system for studying the timing of forthcoming major earthquakes, in support of current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area over a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress condition of the mine, the direct cause of the earthquakes was identified, and it was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development, and the Stressmeter was introduced into the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of maximum and minimum lateral stresses in excellent agreement with the theoretical value, i.e., G = 0.25. All the conventional methods of overcoring, hydrofracturing and deformation, which were introduced to compete with the Serata method, failed, demonstrating the fundamental difficulties of the conventional methods. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be the same as in all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained in the major international Hot Dry Rock Projects conducted in the USA, England and Germany are in good agreement with the Stressmeter results obtained in Japan.
Based on this broad agreement, a solid geomechanical

  11. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    OpenAIRE

    Hudson, L. N.; Newbold, T.; Contu, S.; Hill, S. L.; Lysenko, I.; De Palma, A.; Phillips, H. R.; Alhusseini, T. I.; Bedford, F. E.; Bennett, D. J.; Booth, H.; Burton, V. J.; Chng, C. W.; Choimes, A.; Correia, D. L.

    2017-01-01

    The PREDICTS project-Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)-has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of human impacts relating to land use. We have used this evidence base to develop global and regional statistical models of how local biodiversity responds to these measures. We describe and make free...

  12. Better Metrics to Automatically Predict the Quality of a Text Summary

    Directory of Open Access Journals (Sweden)

    Judith D. Schlesinger

    2012-09-01

    In this paper we demonstrate a family of metrics for estimating the quality of a text summary relative to one or more human-generated summaries. The improved metrics are based on features automatically computed from the summaries to measure content and linguistic quality. The features are combined using one of three methods—robust regression, non-negative least squares, or canonical correlation, an eigenvalue method. The new metrics significantly outperform the previous standard for automatic text summarization evaluation, ROUGE.
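One of the three combination methods named above, non-negative least squares, can be sketched with a few lines of code. This is an illustrative toy, not the paper's implementation: the features, ratings, and the tiny projected-gradient solver are all invented here (in practice `scipy.optimize.nnls` would do the fit).

```python
def nnls_fit(X, y, lr=0.05, n_iter=5000):
    """Tiny projected-gradient solver for min ||Xw - y||^2 subject to
    w >= 0; dependency-free stand-in for scipy.optimize.nnls."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(n_iter):
        # residual r = Xw - y, gradient g = 2 X^T r, then project onto w >= 0
        r = [sum(X[i][j] * w[j] for j in range(d)) - y[i] for i in range(n)]
        g = [2.0 * sum(X[i][j] * r[i] for i in range(n)) for j in range(d)]
        w = [max(0.0, w[j] - lr * g[j]) for j in range(d)]
    return w

# invented per-summary features: [content score, linguistic-quality score]
X = [[0.9, 0.8], [0.5, 0.4], [0.2, 0.9], [0.7, 0.1], [0.4, 0.6]]
# invented human ratings, generated as 0.7*content + 0.3*linguistic
y = [0.7 * a + 0.3 * b for a, b in X]
weights = nnls_fit(X, y)
```

With noiseless synthetic ratings the fit recovers the generating weights; the non-negativity constraint keeps the combined metric interpretable (each feature can only raise the predicted quality).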

  13. Geospatial application of the Water Erosion Prediction Project (WEPP) model

    Science.gov (United States)

    D. C. Flanagan; J. R. Frankenberger; T. A. Cochrane; C. S. Renschler; W. J. Elliot

    2013-01-01

    At the hillslope profile and/or field scale, a simple Windows graphical user interface (GUI) is available to easily specify the slope, soil, and management inputs for application of the USDA Water Erosion Prediction Project (WEPP) model. Likewise, basic small watershed configurations of a few hillslopes and channels can be created and simulated with this GUI. However,...

  14. The Masculinity of Money: Automatic Stereotypes Predict Gender Differences in Estimated Salaries

    Science.gov (United States)

    Williams, Melissa J.; Paluck, Elizabeth Levy; Spencer-Rodgers, Julie

    2010-01-01

    We present the first empirical investigation of why men are assumed to earn higher salaries than women (the "salary estimation effect"). Although this phenomenon is typically attributed to conscious consideration of the national wage gap (i.e., real inequities in salary), we hypothesize instead that it reflects differential, automatic economic…

  15. Development project of an automatic sampling system for part time unmanned pipeline terminals

    Energy Technology Data Exchange (ETDEWEB)

    Moreira, Guilherme O.; De Almeida, Marcio M. G.; Ramos, Ricardo R. [Petrobras, (Brazil); Potten, Gary [Cameron Measurement Systems, (United States)

    2010-07-01

    The Sao Paulo - Brasilia Pipeline (OSBRA) is a highly automated pipeline using a SCADA system operated from a control room. A new quality management system standard was established for transportation and storage operations, requiring the products to be sampled on an automatic basis. This paper reports the development of an automatic sampling system (ASS) in accordance with the new quality control standard. The prototype was developed to be implemented through a human-machine interface (HMI) on the control room SCADA screens. A technical cooperation agreement (TCA) was drawn up for development of this new ASS product; the TCA was a joint cooperation between the Holding, the Operator and the cooperators. The prototype will be field-tested at the Senador Canedo tank farm against SPEC requirements. The current performance of the ASS establishes reasonable expectations for further successful development.

  16. SU-F-T-342: Dosimetric Constraint Prediction Guided Automatic Multi-Objective Optimization for Intensity Modulated Radiotherapy

    International Nuclear Information System (INIS)

    Song, T; Zhou, L; Li, Y

    2016-01-01

    Purpose: For intensity modulated radiotherapy, plan optimization is time consuming, with difficulties in selecting objectives and constraints and their relative weights. A fast and automatic multi-objective optimization algorithm able to predict optimal constraints and manage their trade-offs can help to solve this problem. Our purpose is to develop such a framework and algorithm for general inverse planning. Methods: The proposed multi-objective optimization framework contains three main components: prediction of initial dosimetric constraints, further adjustment of constraints, and plan optimization. We first use our previously developed in-house geometry-dosimetry correlation model to predict the optimal patient-specific dosimetric endpoints and treat them as initial dosimetric constraints. Second, we build an endpoint (organ) priority list and a constraint adjustment rule to repeatedly tighten these constraints from their initial values until every single endpoint has no room for further improvement. Lastly, we implement a voxel-independent FMO algorithm for optimization. During the optimization, a model for tuning the voxel weighting factors with respect to the constraints is created. For framework and algorithm evaluation, we randomly selected 20 IMRT prostate cases from the clinic and compared them with our automatically generated plans, in both efficiency and plan quality. Results: For each evaluated plan, the proposed multi-objective framework ran fluently and automatically. The number of voxel weighting factor iterations varied from 10 to 30 under an updated constraint, and the number of constraint tuning steps varied from 20 to 30 for every case until no stricter constraint was allowed. The average total time for the whole optimization procedure was ∼30 min. Comparing the DVHs, better OAR dose sparing could be observed in the automatically generated plans for 13 of the 20 cases, while the others were competitive

  17. SU-F-T-342: Dosimetric Constraint Prediction Guided Automatic Multi-Objective Optimization for Intensity Modulated Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Song, T; Zhou, L [Southern Medical University, Guangzhou, Guangdong (China); Li, Y [Beihang University, Beijing, Beijing (China)

    2016-06-15

    Purpose: For intensity modulated radiotherapy, plan optimization is time consuming, with difficulties in selecting objectives and constraints and their relative weights. A fast and automatic multi-objective optimization algorithm able to predict optimal constraints and manage their trade-offs can help to solve this problem. Our purpose is to develop such a framework and algorithm for general inverse planning. Methods: The proposed multi-objective optimization framework contains three main components: prediction of initial dosimetric constraints, further adjustment of constraints, and plan optimization. We first use our previously developed in-house geometry-dosimetry correlation model to predict the optimal patient-specific dosimetric endpoints and treat them as initial dosimetric constraints. Second, we build an endpoint (organ) priority list and a constraint adjustment rule to repeatedly tighten these constraints from their initial values until every single endpoint has no room for further improvement. Lastly, we implement a voxel-independent FMO algorithm for optimization. During the optimization, a model for tuning the voxel weighting factors with respect to the constraints is created. For framework and algorithm evaluation, we randomly selected 20 IMRT prostate cases from the clinic and compared them with our automatically generated plans, in both efficiency and plan quality. Results: For each evaluated plan, the proposed multi-objective framework ran fluently and automatically. The number of voxel weighting factor iterations varied from 10 to 30 under an updated constraint, and the number of constraint tuning steps varied from 20 to 30 for every case until no stricter constraint was allowed. The average total time for the whole optimization procedure was ∼30 min. Comparing the DVHs, better OAR dose sparing could be observed in the automatically generated plans for 13 of the 20 cases, while the others were competitive
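The priority-list constraint-adjustment rule described in this abstract can be sketched as a greedy tightening loop. This is a schematic reading, not the authors' code: the organ names, step size, and the `feasible` oracle (which in the real framework would mean running the FMO optimizer and checking all endpoints) are invented for illustration.

```python
def tune_constraints(initial, priority, feasible, step=0.05, floor=0.0):
    """Greedy constraint adjustment: walk the organ priority list and
    keep lowering each dose constraint until the plan can no longer
    meet it, i.e. that endpoint has no room for further improvement."""
    constraints = dict(initial)
    for organ in priority:
        while constraints[organ] - step >= floor:
            trial = dict(constraints)
            trial[organ] -= step
            if not feasible(trial):
                break  # stricter constraint not achievable; move on
            constraints = trial
    return constraints

# stand-in feasibility oracle: treats the two (hypothetical, normalized)
# OAR constraints as sharing a fixed dose "budget"
def feasible(c):
    return c["rectum"] + c["bladder"] >= 0.99

result = tune_constraints({"rectum": 0.8, "bladder": 0.7},
                          ["rectum", "bladder"], feasible)
```

With this toy oracle, the first organ in the priority list absorbs all the available tightening and later organs keep their initial values, which is exactly the trade-off behavior a priority list encodes.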

  18. Face Prediction Model for an Automatic Age-invariant Face Recognition System

    OpenAIRE

    Yadav, Poonam

    2015-01-01

    Automated face recognition and identification software is becoming part of our daily life; it finds its abode not only in Facebook's auto photo tagging, Apple's iPhoto, Google's Picasa and Microsoft's Kinect, but also in the Homeland Security Department's dedicated biometric face detection systems. Most of these automatic face identification systems fail where the effects of aging come into...

  19. Usefulness of semi-automatic volumetry compared to established linear measurements in predicting lymph node metastases in MSCT

    Energy Technology Data Exchange (ETDEWEB)

    Buerke, Boris; Puesken, Michael; Heindel, Walter; Wessling, Johannes (Dept. of Clinical Radiology, Univ. of Muenster (Germany)), email: buerkeb@uni-muenster.de; Gerss, Joachim (Dept. of Medical Informatics and Biomathematics, Univ. of Muenster (Germany)); Weckesser, Matthias (Dept. of Nuclear Medicine, Univ. of Muenster (Germany))

    2011-06-15

    Background: Volumetry of lymph nodes potentially reflects asymmetric size alterations better than metric parameters (e.g. long-axis diameter), independently of lymph node orientation. Purpose: To distinguish between benign and malignant lymph nodes by comparing 2D and semi-automatic 3D measurements in MSCT. Material and Methods: FDG-18 PET-CT was performed in 33 patients prior to therapy for malignant melanoma at stage III/IV. One hundred and eighty-six cervico-axillary, abdominal and inguinal lymph nodes were evaluated independently by two radiologists, both manually and with the use of semi-automatic segmentation software. Long axis (LAD), short axis (SAD), maximal 3D diameter, volume and elongation were obtained. PET-CT, PET-CT follow-up and/or histology served as a combined reference standard. Statistics encompassed intra-class correlation coefficients and ROC curves. Results: Compared to manual assessment, semi-automatic inter-observer variability was lower, e.g. 2.4% (95% CI 0.05-4.8) for LAD. The standard of reference revealed metastases in 90 (48%) of 186 lymph nodes. Semi-automatic prediction of lymph node metastases revealed the highest areas under the ROC curves for volume (reader 1: 0.77, 95% CI 0.64-0.90; reader 2: 0.76, 95% CI 0.59-0.86) and SAD (reader 1: 0.76, 95% CI 0.64-0.88; reader 2: 0.75, 95% CI 0.62-0.89). The findings for LAD (reader 1: 0.73, 95% CI 0.60-0.86; reader 2: 0.71, 95% CI 0.57-0.85) and maximal 3D diameter (reader 1: 0.70, 95% CI 0.53-0.86; reader 2: 0.76, 95% CI 0.50-0.80) were substantially lower, and for elongation (reader 1: 0.65, 95% CI 0.50-0.79; reader 2: 0.66, 95% CI 0.52-0.81) significantly lower (p < 0.05). Conclusion: Semi-automatic analysis of lymph nodes in malignant melanoma is supported by high segmentation quality and reproducibility. As compared to established SAD, semi-automatic lymph node volumetry does not have an additive role for categorizing lymph nodes as normal or metastatic in malignant
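The area-under-the-ROC-curve figures reported above have a simple probabilistic reading: the chance that a randomly chosen metastatic node scores higher than a randomly chosen benign one. A minimal sketch of that computation follows; the node volumes are invented example values, not data from the study.

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a positive (metastatic) case scores higher
    than a negative (benign) one, with ties counting 1/2."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            wins += 1.0 if p > n else (0.5 if p == n else 0.0)
    return wins / (len(pos_scores) * len(neg_scores))

# invented node volumes (mm^3) for metastatic vs. benign nodes
metastatic = [900.0, 1500.0, 700.0, 1200.0]
benign = [300.0, 450.0, 800.0, 250.0]
volume_auc = auc(metastatic, benign)
```

An AUC of 0.5 means the measurement does not separate the two groups at all; values toward 1.0 indicate increasingly clean separation, which is the scale on which volume, SAD, and LAD are compared in the abstract.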

  20. Summer Student Work Project Report: SCADA Bridge Tool Development Automatically Capturing Data from SCADA to the Maintenance System

    CERN Document Server

    Alhambra-Moron, Alfonso

    2015-01-01

    The main purpose of this report is to summarize the work project I have been doing at CERN during the last 3 months as a Summer Student. My name is Alfonso Alhambra Morón and on the 8th of June 2015 I joined the EN-HE-LM team as a summer student supervised by Damien Lafarge in order to collaborate in the automation of the transfer of meter readings from SCADA to Infor EAM, the computerized maintenance management system at CERN. The main objective of my project was to enable automatic updates of meters in Infor EAM by fetching data from SCADA, so as to automate a process which was previously done manually and consumed resources in terms of having to consult the meter physically, import this information into Infor EAM by hand, and detect and correct the errors that can occur when doing all of this manually. This problem is shared by several other teams at CERN apart from the Lift Maintenance team, and for this reason the main target I had when developing my solution was flexibility and scalability so as to make...

  1. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    OpenAIRE

    Hudson, LN; Newbold, T; Contu, S; Hill, SLL; Lysenko, I; De Palma, A; Phillips, HRP; Alhusseini, TI; Bedford, FE; Bennett, DJ; Booth, H; Burton, VJ; Chng, CWT; Choimes, A; Correia, DLP

    2017-01-01

    The PREDICTS project—Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)—has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of human impacts relating to land use. We have used this evidence base to develop global and regional statistical models of how local biodiversity responds to these measures. We describe and make free...

  2. Automatic selection of reference taxa for protein-protein interaction prediction with phylogenetic profiling

    DEFF Research Database (Denmark)

    Simonsen, Martin; Maetschke, S.R.; Ragan, M.A.

    2012-01-01

    Motivation: Phylogenetic profiling methods can achieve good accuracy in predicting protein–protein interactions, especially in prokaryotes. Recent studies have shown that the choice of reference taxa (RT) is critical for accurate prediction, but with more than 2500 fully sequenced taxa publicly......: We present three novel methods for automating the selection of RT, using machine learning based on known protein–protein interaction networks. One of these methods in particular, Tree-Based Search, yields greatly improved prediction accuracies. We further show that different methods for constituting...... phylogenetic profiles often require very different RT sets to support high prediction accuracy....
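Phylogenetic profiling, as used in this abstract, scores a protein pair by how similarly the two proteins are present or absent across a set of reference taxa (RT); choosing the RT subset changes the score. A minimal sketch of one common way to constitute profiles and compare them (Jaccard similarity over presence/absence vectors) is shown below; the profiles and the RT index set are invented examples, and the paper's Tree-Based Search for selecting RT is not reproduced here.

```python
def jaccard(p1, p2):
    """Similarity of two presence/absence phylogenetic profiles."""
    both = sum(1 for a, b in zip(p1, p2) if a and b)
    either = sum(1 for a, b in zip(p1, p2) if a or b)
    return both / either if either else 0.0

def jaccard_on_rt(p1, p2, rt):
    """Same similarity, restricted to a chosen set of reference taxa."""
    return jaccard([p1[i] for i in rt], [p2[i] for i in rt])

# invented profiles of two proteins over 8 candidate taxa
prof_a = [1, 1, 0, 1, 0, 1, 0, 0]
prof_b = [1, 1, 0, 0, 0, 1, 1, 0]
full_sim = jaccard(prof_a, prof_b)                    # over all taxa
rt_sim = jaccard_on_rt(prof_a, prof_b, [0, 1, 2, 5])  # over selected RT
```

The example makes the abstract's point concrete: the same protein pair looks only moderately similar over all taxa but perfectly co-occurring over a well-chosen RT subset, which is why RT selection drives prediction accuracy.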

  3. Automaticity and localisation of concurrents predicts colour area activity in grapheme-colour synaesthesia.

    Science.gov (United States)

    Gould van Praag, Cassandra D; Garfinkel, Sarah; Ward, Jamie; Bor, Daniel; Seth, Anil K

    2016-07-29

    In grapheme-colour synaesthesia (GCS), the presentation of letters or numbers induces an additional 'concurrent' experience of colour. Early functional MRI (fMRI) investigations of GCS reported activation in colour-selective area V4 during the concurrent experience. However, others have failed to replicate this key finding. We reasoned that individual differences in synaesthetic phenomenology might explain this inconsistency in the literature. To test this hypothesis, we examined fMRI BOLD responses in a group of grapheme-colour synaesthetes (n=20) and matched controls (n=20) while characterising the individual phenomenology of the synaesthetes along dimensions of 'automaticity' and 'localisation'. We used an independent functional localiser to identify colour-selective areas in both groups. Activations in these areas were then assessed during achromatic synaesthesia-inducing, and non-inducing conditions; we also explored whole brain activations, where we sought to replicate the existing literature regarding synaesthesia effects. Controls showed no significant activations in the contrast of inducing > non-inducing synaesthetic stimuli, in colour-selective ROIs or at the whole brain level. In the synaesthete group, we correlated activation within colour-selective ROIs with individual differences in phenomenology using the Coloured Letters and Numbers (CLaN) questionnaire which measures, amongst other attributes, the subjective automaticity/attention in synaesthetic concurrents, and their spatial localisation. Supporting our hypothesis, we found significant correlations between individual measures of synaesthetic phenomenology and BOLD responses in colour-selective areas, when contrasting inducing against non-inducing stimuli. Specifically, left-hemisphere colour area responses were stronger for synaesthetes scoring high on phenomenological localisation and automaticity/attention, while right-hemisphere colour area responses showed a relationship with localisation

  4. Improved predictive mapping of indoor radon concentrations using ensemble regression trees based on automatic clustering of geological units

    International Nuclear Information System (INIS)

    Kropat, Georg; Bochud, Francois; Jaboyedoff, Michel; Laedermann, Jean-Pascal; Murith, Christophe; Palacios, Martha; Baechler, Sébastien

    2015-01-01

    Purpose: According to estimations, around 230 people die as a result of radon exposure in Switzerland. This public health concern makes reliable indoor radon prediction and mapping methods necessary in order to improve risk communication to the public. The aim of this study was to develop an automated method to classify lithological units according to their radon characteristics and to develop mapping and predictive tools in order to improve local radon prediction. Method: About 240 000 indoor radon concentration (IRC) measurements in about 150 000 buildings were available for our analysis. The automated classification of lithological units was based on k-medoids clustering via pair-wise Kolmogorov distances between IRC distributions of lithological units. For IRC mapping and prediction we used random forests and Bayesian additive regression trees (BART). Results: The automated classification groups lithological units well in terms of their IRC characteristics. Especially the IRC differences in metamorphic rocks like gneiss are well revealed by this method. The maps produced by random forests soundly represent the regional differences of IRCs in Switzerland and improve the spatial detail compared to existing approaches. We could explain 33% of the variation in IRC data with random forests. Additionally, the variable importance evaluated by random forests shows that building characteristics are less important predictors for IRCs than spatial/geological influences. BART could explain 29% of the IRC variability and produced maps that indicate the prediction uncertainty. Conclusion: Ensemble regression trees are a powerful tool to model and understand the multidimensional influences on IRCs. Automatic clustering of lithological units complements this method by facilitating the interpretation of radon properties of rock types. This study provides an important element for radon risk communication. Future approaches should consider taking into account further variables
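The clustering step named in the Method section, k-medoids over pairwise Kolmogorov distances between per-unit IRC distributions, can be sketched compactly. This is an illustrative toy, not the study's implementation: the IRC samples are invented, the initialisation is simplistic, and `scipy.stats.ks_2samp` would normally supply the distance.

```python
def ks_distance(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: maximum gap between
    the empirical CDFs of two measurement samples."""
    def ecdf(xs, t):
        return sum(1 for x in xs if x <= t) / len(xs)
    grid = sorted(set(a) | set(b))
    return max(abs(ecdf(a, t) - ecdf(b, t)) for t in grid)

def k_medoids(items, dist, k, n_iter=20):
    """Plain k-medoids over a pairwise distance function, with a
    deterministic initialisation (first k items as medoids)."""
    medoids = list(range(k))
    clusters = {}
    for _ in range(n_iter):
        # assignment step: each item joins its nearest medoid
        clusters = {m: [] for m in medoids}
        for i in range(len(items)):
            nearest = min(medoids, key=lambda m: dist(items[i], items[m]))
            clusters[nearest].append(i)
        # update step: pick the member minimising total in-cluster distance
        new_medoids = [
            min(members,
                key=lambda c: sum(dist(items[c], items[j]) for j in members))
            for members in clusters.values()
        ]
        if sorted(new_medoids) == sorted(medoids):
            break
        medoids = new_medoids
    return medoids, clusters

# invented IRC samples (arbitrary units) for four lithological units
units = [
    [1.0, 2.0, 3.0, 2.0, 1.0],   # unit 0: low-radon unit
    [1.1, 2.1, 2.9],             # unit 1: similar distribution to unit 0
    [10.0, 11.0, 12.0],          # unit 2: high-radon unit
    [10.5, 11.5, 12.5],          # unit 3: similar distribution to unit 2
]
medoids, clusters = k_medoids(units, ks_distance, k=2)
```

Because the KS distance compares whole distributions rather than means, units with similar IRC distributions end up in the same cluster even though their sample sizes differ, which is the property the study exploits for grouping lithologies.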

  5. Automatic prediction of rheumatoid arthritis disease activity from the electronic medical records.

    Directory of Open Access Journals (Sweden)

    Chen Lin

    We aimed to mine the data in the Electronic Medical Record to automatically discover patients' Rheumatoid Arthritis disease activity at discrete rheumatology clinic visits. We cast the problem as a document classification task where the feature space includes concepts from the clinical narrative and lab values as stored in the Electronic Medical Record. The Training Set consisted of 2792 clinical notes and associated lab values. Test Set 1 included 1749 clinical notes and associated lab values. Test Set 2 included 344 clinical notes for which there were no associated lab values. The Apache clinical Text Analysis and Knowledge Extraction System was used to analyze the text and transform it into informative features to be combined with relevant lab values. Experiments over a range of machine learning algorithms and features were conducted. The best performing combination was linear kernel Support Vector Machines with Unified Medical Language System Concept Unique Identifier features with feature selection and lab values. The Area Under the Receiver Operating Characteristic Curve (AUC) is 0.831 (σ = 0.0317), statistically significant as compared to two baselines (AUC = 0.758, σ = 0.0291). Algorithms demonstrated superior performance on cases clinically defined as extreme categories of disease activity (Remission and High) compared to those defined as intermediate categories (Moderate and Low) and included laboratory data on inflammatory markers. Automatic Rheumatoid Arthritis disease activity discovery from Electronic Medical Record data is a learnable task approximating human performance. As a result, this approach might have several research applications, such as the identification of patients for genome-wide pharmacogenetic studies that require large sample sizes with precise definitions of disease activity and response to therapies.

  6. Fully automatic segmentation of arbitrarily shaped fiducial markers in cone-beam CT projections

    DEFF Research Database (Denmark)

    Bertholet, Jenny; Wan, Hanlin; Toftegaard, Jakob

    2017-01-01

    segmentation, the DPTB algorithm generates and uses a 3D marker model to create 2D templates at any projection angle. The 2D templates are used to segment the marker position as the position with highest normalized cross-correlation in a search area centered at the DP segmented position. The accuracy of the DP...... algorithm and the new DPTB algorithm was quantified as the 2D segmentation error (pixels) compared to a manual ground truth segmentation for 97 markers in the projection images of CBCT scans of 40 patients. Also the fraction of wrong segmentations, defined as 2D errors larger than 5 pixels, was calculated...

  7. Effect of Localizer Radiography Projection on Organ Dose at Chest CT with Automatic Tube Current Modulation.

    Science.gov (United States)

    Saltybaeva, Natalia; Krauss, Andreas; Alkadhi, Hatem

    2017-03-01

    Purpose To calculate the effect of localizer radiography projections on the total radiation dose, including both the dose from localizer radiography and that from subsequent chest computed tomography (CT) with tube current modulation (TCM). Materials and Methods An anthropomorphic phantom was scanned with 192-section CT without and with differently sized breast attachments. Chest CT with TCM was performed after one localizer radiographic examination with anteroposterior (AP) or posteroanterior (PA) projections. Dose distributions were obtained by means of Monte Carlo simulations based on the acquired CT data. For Monte Carlo simulations of localizer radiography, the tube position was fixed at 0° and 180°; for chest CT, a spiral trajectory with TCM was used. The effect of tube start angles on dose distribution was investigated with Monte Carlo simulations by using TCM curves with fixed start angles (0°, 90°, and 180°). Total doses for lungs, heart, and breast were calculated as the sum of the dose from localizer radiography and CT. Image noise was defined as the standard deviation of attenuation measured in 14 circular regions of interest. The Wilcoxon signed rank test, paired t test, and Friedman analysis of variance were conducted to evaluate differences in noise, TCM curves, and organ doses, respectively. Results Organ doses from localizer radiography were lower when using a PA instead of an AP projection (P = .005). The use of a PA projection resulted in higher TCM values for chest CT (P chest CT. © RSNA, 2016 Online supplemental material is available for this article.

  8. Autonomous monitoring of control hardware to predict off-normal conditions using NIF automatic alignment systems

    International Nuclear Information System (INIS)

    Awwal, Abdul A.S.; Wilhelmsen, Karl; Leach, Richard R.; Miller-Kamm, Vicki; Burkhart, Scott; Lowe-Webb, Roger; Cohen, Simon

    2012-01-01

    Highlights: ► An automatic alignment system was developed to process images of the laser beams. ► System uses processing to adjust a series of control loops until alignment criteria are satisfied. ► Monitored conditions are compared against nominal values with an off-normal alert. ► Automated health monitoring system trends off-normals with a large image history. - Abstract: The National Ignition Facility (NIF) is a high power laser system capable of supporting high-energy-density experimentation as a user facility for the next 30 years. In order to maximize the facility availability, preventive maintenance enhancements are being introduced into the system. An example of such an enhancement is a camera-based health monitoring system, integrated into the automated alignment system, which provides an opportunity to monitor trends in measurements such as average beam intensity, size of the beam, and pixel saturation. The monitoring system will generate alerts based on observed trends in measurements to allow scheduled pro-active maintenance before routine off-normal detection stops system operations requiring unscheduled intervention.

  9. Autonomous monitoring of control hardware to predict off-normal conditions using NIF automatic alignment systems

    Energy Technology Data Exchange (ETDEWEB)

    Awwal, Abdul A.S., E-mail: awwal1@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA 94550 (United States); Wilhelmsen, Karl; Leach, Richard R.; Miller-Kamm, Vicki; Burkhart, Scott; Lowe-Webb, Roger; Cohen, Simon [Lawrence Livermore National Laboratory, Livermore, CA 94550 (United States)

    2012-12-15

    Highlights: ► An automatic alignment system was developed to process images of the laser beams. ► System uses processing to adjust a series of control loops until alignment criteria are satisfied. ► Monitored conditions are compared against nominal values with an off-normal alert. ► Automated health monitoring system trends off-normals with a large image history. - Abstract: The National Ignition Facility (NIF) is a high power laser system capable of supporting high-energy-density experimentation as a user facility for the next 30 years. In order to maximize the facility availability, preventive maintenance enhancements are being introduced into the system. An example of such an enhancement is a camera-based health monitoring system, integrated into the automated alignment system, which provides an opportunity to monitor trends in measurements such as average beam intensity, size of the beam, and pixel saturation. The monitoring system will generate alerts based on observed trends in measurements to allow scheduled pro-active maintenance before routine off-normal detection stops system operations requiring unscheduled intervention.

  10. Automatic single- and multi-label enzymatic function prediction by machine learning

    Directory of Open Access Journals (Sweden)

    Shervine Amidi

    2017-03-01

    The number of protein structures in the PDB database has increased more than 15-fold since 1999. The creation of computational models predicting enzymatic function is of major importance, since such models provide the means to better understand the behavior of newly discovered enzymes when catalyzing chemical reactions. Until now, single-label classification has been widely performed for predicting enzymatic function, limiting the application to enzymes performing unique reactions and introducing errors when multi-functional enzymes are examined. Indeed, some enzymes may perform different reactions and can hence be directly associated with multiple enzymatic functions. In the present work, we propose a multi-label enzymatic function classification scheme that combines structural and amino acid sequence information. We investigate two fusion approaches (at the feature level and at the decision level) and assess the methodology for general enzymatic function prediction, indicated by the first digit of the enzyme commission (EC) code (six main classes), on 40,034 enzymes from the PDB database. The proposed single-label and multi-label models correctly predict the actual functional activities in 97.8% and 95.5% of the cases, respectively (based on Hamming loss). Also, the multi-label model predicts all possible enzymatic reactions in 85.4% of the multi-labeled enzymes when the number of reactions is unknown. Code and datasets are available at https://figshare.com/s/a63e0bafa9b71fc7cbd7.
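The Hamming-loss evaluation measure used above has a compact definition worth spelling out: the fraction of label positions (here, the six main EC classes per enzyme) on which prediction and ground truth disagree. The sketch below is illustrative only; the label vectors are invented examples.

```python
def hamming_loss(y_true, y_pred):
    """Fraction of label positions that disagree, averaged over all
    samples and all labels."""
    n_samples, n_labels = len(y_true), len(y_true[0])
    wrong = sum(t != p
                for row_t, row_p in zip(y_true, y_pred)
                for t, p in zip(row_t, row_p))
    return wrong / (n_samples * n_labels)

# each row is a 6-bit indicator over the six main EC classes
y_true = [[1, 0, 0, 0, 0, 0],   # enzyme with one function (EC 1)
          [1, 0, 1, 0, 0, 0]]   # multi-functional enzyme (EC 1 and EC 3)
y_pred = [[1, 0, 0, 0, 0, 0],
          [1, 0, 0, 0, 1, 0]]   # misses EC 3, spuriously predicts EC 5
loss = hamming_loss(y_true, y_pred)
```

Unlike exact-match accuracy, Hamming loss gives partial credit on multi-functional enzymes: the second prediction above is wrong in only 2 of 6 label slots rather than being counted as entirely wrong.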

  11. Evaluating a variety of text-mined features for automatic protein function prediction with GOstruct.

    Science.gov (United States)

    Funk, Christopher S; Kahanda, Indika; Ben-Hur, Asa; Verspoor, Karin M

    2015-01-01

    Most computational methods that predict protein function do not take advantage of the large amount of information contained in the biomedical literature. In this work we evaluate both ontology term co-mention and bag-of-words features mined from the biomedical literature and analyze their impact in the context of a structured output support vector machine model, GOstruct. We find that even simple literature-based features are useful for predicting human protein function (F-max: Molecular Function = 0.408, Biological Process = 0.461, Cellular Component = 0.608). One advantage of using literature features is their ability to offer easy verification of automated predictions. We find through manual inspection of misclassifications that some false positive predictions could be biologically valid predictions based upon support extracted from the literature. Additionally, we present a "medium-throughput" pipeline that was used to annotate a large subset of co-mentions; we suggest that this strategy could help to speed up the rate at which proteins are curated.
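As a rough illustration of the ontology term co-mention features evaluated above, one can count the sentences in which a protein and a candidate function term appear together. The sentences, protein name, and terms below are invented, and this is far simpler than the paper's actual pipeline:

```python
# Toy co-mention counting: how often does a protein name co-occur with a
# GO-term-like phrase in the same sentence? (All names here are hypothetical.)
from collections import Counter

def comention_counts(sentences, protein, go_terms):
    counts = Counter()
    for s in sentences:
        text = s.lower()
        if protein.lower() in text:
            for term in go_terms:
                if term.lower() in text:
                    counts[term] += 1
    return counts

sentences = [
    "TP53 is involved in apoptotic process regulation.",
    "TP53 localizes to the nucleus.",
    "BRCA1 participates in DNA repair.",
]
print(comention_counts(sentences, "TP53", ["apoptotic process", "nucleus"]))
```

Counts like these can then serve as features for a downstream classifier.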

  12. From Daguerreotypes to Digital Automatic Photogrammetry. Applications and Limits for the Built Heritage Project

    Science.gov (United States)

    Fassi, F.; Campanella, C.

    2017-05-01

    This paper will describe the evolutionary stages that shaped and built, over time, a robust and solid relationship between `indirect survey methods' and knowledge of the `architectural matter', aiming at producing a conservation project for the built heritage. Collecting architectural data by simply drawing it was already considered inadequate by John Ruskin in 1845. He strongly felt the need to fix such data through that `blessed' invention, the `daguerreotype'. Today taking simple photographs is not enough: it is crucial to develop systems able to provide the best graphic supports (possibly in the third dimension) for the development and editing of the architectural project. This paper will focus not only on the re-examination of historical data and on the research and representation of the `sign', but also on the evolution of technologies and `reading methods', in order to highlight their strengths and weaknesses in the real practice of the conservation project and in the use of the architectures of the past.

  13. Macroweather Predictions and Climate Projections using Scaling and Historical Observations

    Science.gov (United States)

    Hébert, R.; Lovejoy, S.; Del Rio Amador, L.

    2017-12-01

    There are two fundamental time scales that are pertinent to decadal forecasts and multidecadal projections. The first is the lifetime of planetary-scale structures, about 10 days (equal to the deterministic predictability limit), and the second is - in the anthropocene - the scale at which the forced anthropogenic variability exceeds the internal variability (around 16 - 18 years). These two time scales define three regimes of variability: weather, macroweather and climate, which are respectively characterized by increasing, decreasing and then increasing variability with scale. We discuss how macroweather temperature variability can be skilfully predicted to its theoretical stochastic predictability limits by exploiting its long-range memory with the Stochastic Seasonal and Interannual Prediction System (StocSIPS). At multi-decadal timescales, the temperature response to forcing is approximately linear, and this can be exploited to make projections with a Green's function, or Climate Response Function (CRF). To make the problem tractable, we exploit the temporal scaling symmetry and restrict our attention to global mean forcing and temperature response, using a scaling CRF characterized by the scaling exponent H and an inner scale of linearity τ. An aerosol linear scaling factor α and a non-linear volcanic damping exponent ν were introduced to account for the large uncertainty in these forcings. We estimate the model and forcing parameters by Bayesian inference using historical data; these allow us to analytically calculate a median (and likely 66% range) for the transient climate response and for the equilibrium climate sensitivity: 1.6 K ([1.5, 1.8] K) and 2.4 K ([1.9, 3.4] K) respectively. Aerosol forcing typically has large uncertainty, and we find a modern (2005) forcing very likely range (90%) of [-1.0, -0.3] Wm-2 with a median at -0.7 Wm-2. Projecting to 2100, we find that to keep the warming below 1.5 K, future emissions must undergo cuts similar to Representative
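The Green's-function projection described above can be written compactly. This is a generic sketch of a convolution with a power-law (scaling) response, with the notation assumed here rather than taken from the abstract (F is the global mean radiative forcing, G the CRF):

```latex
% Temperature response as a convolution with the Climate Response Function
\Delta T(t) = \int_{-\infty}^{t} G(t - s)\, F(s)\, \mathrm{d}s,
\qquad
G(\Delta t) \propto \left(\frac{\Delta t}{\tau}\right)^{H - 1}
\quad (\Delta t \ge \tau),
```

with H the scaling exponent and τ the inner scale of linearity below which the power-law response must be regularized.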

  14. Automatically Identifying and Predicting Unplanned Wind Turbine Stoppages Using SCADA and Alarms System Data: Case Study and Results

    Science.gov (United States)

    Leahy, Kevin; Gallagher, Colm; Bruton, Ken; O'Donovan, Peter; O'Sullivan, Dominic T. J.

    2017-11-01

    Using 10-minute wind turbine SCADA data for fault prediction offers an attractive way of gaining additional prognostic capabilities without needing to invest in extra hardware. To use these data-driven methods effectively, the historical SCADA data must be labelled with the periods when the turbine was in faulty operation, as well as the sub-system to which each fault was attributed. Manually identifying faults using maintenance logs can be effective, but it is also highly time-consuming and tedious due to the disparate nature of these logs across manufacturers, operators and even individual maintenance events. Turbine alarm systems can help to identify these periods, but the sheer volume of alarms and false positives generated makes analysing them on an individual basis ineffective. In this work, we present a new method for automatically identifying historical stoppages on the turbine using SCADA and alarms data. Each stoppage is associated with either a fault in one of the turbine's sub-systems, a routine maintenance activity, a grid-related event or one of a number of other categories. This labelling is then checked against maintenance logs for accuracy, and the labelled data are fed into a classifier for predicting when these stoppages will occur. Results show that the automated labelling process correctly identifies each type of stoppage, and can be effectively used for SCADA-based prediction of turbine faults.
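A heavily simplified sketch of the labelling idea described above: flag 10-minute SCADA rows where the turbine produces no power despite sufficient wind, then categorise the stoppage from any concurrent alarm. The cut-in speed, alarm codes, and categories are invented for illustration:

```python
# Label 10-minute SCADA rows: a stoppage is zero power with wind above
# cut-in; the category comes from whichever alarm fired (codes hypothetical).

CUT_IN = 3.5  # m/s, assumed cut-in wind speed

ALARM_CATEGORY = {101: "pitch fault", 205: "grid event", 300: "maintenance"}

def label_stoppages(rows):
    """rows: list of (wind_speed_mps, power_kw, alarm_code_or_None)."""
    labels = []
    for wind, power, alarm in rows:
        if power <= 0 and wind >= CUT_IN:
            labels.append(ALARM_CATEGORY.get(alarm, "unclassified"))
        else:
            labels.append("operating")
    return labels

rows = [(8.0, 1200.0, None), (7.5, 0.0, 101), (6.0, 0.0, None)]
print(label_stoppages(rows))  # ['operating', 'pitch fault', 'unclassified']
```

In the paper's setting the labelled periods would then be cross-checked against maintenance logs before training a classifier.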

  15. NERI PROJECT 99-119. TASK 2. DATA-DRIVEN PREDICTION OF PROCESS VARIABLES. FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyaya, B.R.

    2003-04-10

    This report describes the detailed results for Task 2 of DOE-NERI project number 99-119, "Automatic Development of Highly Reliable Control Architecture for Future Nuclear Power Plants". This project is a collaborative effort between Oak Ridge National Laboratory (ORNL), The University of Tennessee, Knoxville (UTK) and North Carolina State University (NCSU). UTK is the lead organization for Task 2 under contract number DE-FG03-99SF21906. Under Task 2 we completed the development of data-driven models for the characterization of sub-system dynamics for predicting state variables, control functions, and expected control actions. We also developed the Principal Component Analysis (PCA) approach for mapping system measurements, and a nonlinear system modeling approach, the Group Method of Data Handling (GMDH) with rational functions, which incorporates temporal data for transient characterization. The majority of the results are presented in detailed reports for Phases 1 through 3 of our research, which are attached to this report.
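As a sketch of the PCA mapping mentioned above (not the report's actual implementation), system measurements can be centred and projected onto the leading eigenvectors of their covariance matrix; the sensor data here are invented:

```python
# PCA projection of plant measurements: centre the data and project onto
# the top principal axes of the sample covariance (illustrative data only).
import numpy as np

def pca_project(X, n_components=1):
    """X: (samples x sensors). Returns scores on the leading components."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    top = eigvecs[:, ::-1][:, :n_components]  # leading eigenvectors first
    return Xc @ top

X = np.array([[1.0, 2.0], [2.0, 4.1], [3.0, 5.9], [4.0, 8.0]])
scores = pca_project(X)
print(scores.shape)  # (4, 1)
```

Residuals between measurements and their reconstruction from a few components are one common way such a mapping flags anomalous sensor behaviour.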

  16. Automatic tracking of implanted fiducial markers in cone beam CT projection images

    International Nuclear Information System (INIS)

    Marchant, T. E.; Skalski, A.; Matuszewski, B. J.

    2012-01-01

    Purpose: This paper describes a novel method for simultaneous intrafraction tracking of multiple fiducial markers. Although the proposed method is generic and can be adopted for a number of applications, including fluoroscopy-based patient position monitoring and gated radiotherapy, the tracking results presented in this paper are specific to tracking fiducial markers in a sequence of cone beam CT projection images. Methods: The proposed method is accurate and robust thanks to utilizing the mean shift and random sampling principles, respectively. The performance of the proposed method was evaluated with qualitative and quantitative methods, using data from two pancreatic cancer patients, one prostate cancer patient, and a moving phantom. The ground truth for quantitative evaluation was calculated based on manual tracking performed by three observers. Results: The average dispersion of marker position error calculated from the tracking results for the pancreas data (six markers tracked over 640 frames, 3840 marker identifications) was 0.25 mm (at isocenter), compared with an average dispersion for the manual ground truth estimated at 0.22 mm. For the prostate data (three markers tracked over 366 frames, 1098 marker identifications), the average error was 0.34 mm. The estimated tracking error in the pancreas data was < 1 mm (2 pixels) in 97.6% of cases where nearby image clutter was detected and in 100.0% of cases with no nearby image clutter. Conclusions: The proposed method has accuracy comparable to that of manual tracking and, in combination with the proposed batch postprocessing, superior robustness. Marker tracking in cone beam CT (CBCT) projections is useful for a variety of purposes, such as providing data for assessment of intrafraction motion, target tracking during rotational treatment delivery, motion correction of CBCT, and phase sorting for 4D CBCT.
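The mean shift principle underlying the tracker can be sketched in one dimension: a window repeatedly moves to the intensity-weighted centroid of the pixels it covers until it settles on the marker. The profile below is a toy example, not patient data:

```python
# Mean shift, the core idea of the tracker: iterate the window towards the
# intensity-weighted centroid of the pixels it covers (toy 1-D profile).

def mean_shift_1d(intensity, start, radius=2, iters=20):
    x = start
    for _ in range(iters):
        lo, hi = max(0, x - radius), min(len(intensity), x + radius + 1)
        weights = intensity[lo:hi]
        total = sum(weights)
        if total == 0:
            break  # no signal under the window; stay put
        centroid = sum(i * w for i, w in zip(range(lo, hi), weights)) / total
        new_x = int(round(centroid))
        if new_x == x:
            break  # converged
        x = new_x
    return x

# A bright "marker" centred at index 7 against a dark background.
profile = [0, 0, 0, 0, 0, 1, 3, 9, 3, 1, 0, 0]
print(mean_shift_1d(profile, start=5))  # converges to 7
```

In 2-D projection images the same update is applied to a window over pixel intensities, with random sampling used to make the tracker robust to clutter.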

  17. Structured Semantic Knowledge Can Emerge Automatically from Predicting Word Sequences in Child-Directed Speech

    Directory of Open Access Journals (Sweden)

    Philip A. Huebner

    2018-02-01

    Full Text Available Previous research has suggested that distributional learning mechanisms may contribute to the acquisition of semantic knowledge. However, distributional learning mechanisms, statistical learning, and contemporary “deep learning” approaches have been criticized for being incapable of learning the kind of abstract and structured knowledge that many think is required for acquisition of semantic knowledge. In this paper, we show that recurrent neural networks, trained on noisy naturalistic speech to children, do in fact learn what appears to be abstract and structured knowledge. We trained two types of recurrent neural networks (Simple Recurrent Network and Long Short-Term Memory) to predict word sequences in a 5-million-word corpus of speech directed to children ages 0–3 years old, and assessed what semantic knowledge they acquired. We found that learned internal representations are encoding various abstract grammatical and semantic features that are useful for predicting word sequences. Assessing the organization of semantic knowledge in terms of the similarity structure, we found evidence of emergent categorical and hierarchical structure in both models. We found that the Long Short-Term Memory (LSTM) and the SRN are both learning very similar kinds of representations, but the LSTM achieved higher levels of performance on a quantitative evaluation. We also trained a non-recurrent neural network, Skip-gram, on the same input to compare our results to the state-of-the-art in machine learning. We found that Skip-gram achieves relatively similar performance to the LSTM, but represents words more in terms of thematic compared to taxonomic relations, and we provide reasons why this might be the case. Our findings show that a learning system that derives abstract, distributed representations for the purpose of predicting sequential dependencies in naturalistic language may provide insight into emergence of many properties of the developing

  18. Structured Semantic Knowledge Can Emerge Automatically from Predicting Word Sequences in Child-Directed Speech

    Science.gov (United States)

    Huebner, Philip A.; Willits, Jon A.

    2018-01-01

    Previous research has suggested that distributional learning mechanisms may contribute to the acquisition of semantic knowledge. However, distributional learning mechanisms, statistical learning, and contemporary “deep learning” approaches have been criticized for being incapable of learning the kind of abstract and structured knowledge that many think is required for acquisition of semantic knowledge. In this paper, we show that recurrent neural networks, trained on noisy naturalistic speech to children, do in fact learn what appears to be abstract and structured knowledge. We trained two types of recurrent neural networks (Simple Recurrent Network, and Long Short-Term Memory) to predict word sequences in a 5-million-word corpus of speech directed to children ages 0–3 years old, and assessed what semantic knowledge they acquired. We found that learned internal representations are encoding various abstract grammatical and semantic features that are useful for predicting word sequences. Assessing the organization of semantic knowledge in terms of the similarity structure, we found evidence of emergent categorical and hierarchical structure in both models. We found that the Long Short-term Memory (LSTM) and SRN are both learning very similar kinds of representations, but the LSTM achieved higher levels of performance on a quantitative evaluation. We also trained a non-recurrent neural network, Skip-gram, on the same input to compare our results to the state-of-the-art in machine learning. We found that Skip-gram achieves relatively similar performance to the LSTM, but is representing words more in terms of thematic compared to taxonomic relations, and we provide reasons why this might be the case. Our findings show that a learning system that derives abstract, distributed representations for the purpose of predicting sequential dependencies in naturalistic language may provide insight into emergence of many properties of the developing semantic system.

  19. Predicting new-onset of postoperative atrial fibrillation in patients undergoing cardiac surgery using semi-automatic reading of perioperative electrocardiograms

    DEFF Research Database (Denmark)

    Gu, Jiwei; Graff, Claus; Melgaard, Jacob

    2015-01-01

    P10. Predicting new-onset of postoperative atrial fibrillation in patients undergoing cardiac surgery using semi-automatic reading of perioperative electrocardiograms. Jiwei Gu, Claus Graff, Jacob Melgaard, Søren Lundbye-Christensen, Erik Berg Schmidt, Christian Torp-Pedersen, Kristinn Thorsteinsson, Jan Jesper Andreasen. Aalborg, Denmark. Background: Postoperative new-onset atrial fibrillation (POAF) is the most common arrhythmia after cardiac surgery. The aim of this study was to evaluate whether semi-automatic reading of perioperative electrocardiograms (ECGs) is of any value in predicting POAF after ... ECG monitoring. A semi-automatic machine capable of reading different parameters of digitized ECGs was used to read both lead-specific (P/QRS/T amplitudes/intervals) and global measurements (P-duration/QRS-duration/PR-interval/QT/heart rate/hypertrophy). Results: We divided the patients into two ...

  20. The MicroActive project: automatic detection of disease-related molecular cell activity

    Science.gov (United States)

    Furuberg, Liv; Mielnik, Michal; Johansen, Ib-Rune; Voitel, Jörg; Gulliksen, Anja; Solli, Lars; Karlsen, Frank; Bayer, Tobias; Schönfeld, Friedhelm; Drese, Klaus; Keegan, Helen; Martin, Cara; O'Leary, John; Riegger, Lutz; Koltay, Peter

    2007-05-01

    The aim of the MicroActive project is to develop an instrument for molecular diagnostics. The instrument will first be tested for patient screening for a group of viruses causing cervical cancer. Two disposable polymer chips with reagents stored on-chip will be inserted into the instrument for each patient sample. The first chip performs sample preparation of the epithelial cervical cells, while mRNA amplification and fluorescent detection take place in the second chip. More than 10 different virus markers will be analysed in one chip. We report results on sub-functions of the amplification chip. The sample is split into smaller droplets, and the droplets move in parallel channels containing different dried reagents for the different analyses. We report experimental results on parallel droplet movement control using only one external pump combined with hydrophobic valves. Valve burst pressures are controlled by geometry. We show droplet control using valves with burst pressures between 800 and 4500 Pa. We also monitored the re-hydration times for two necessary dried reagents. After sample insertion, uniform concentration of the reagents in the droplet was reached after 60 s and 10 min, respectively. These times are acceptable for successful amplification. Finally, we have shown positive amplification of HPV type 16 using dried enzymes stored in micro chambers.
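The geometry dependence of the burst pressure mentioned above is commonly modelled, for a rectangular channel, with a Young-Laplace-type relation. This is a generic sketch rather than a formula taken from the paper (γ is the liquid surface tension, θ the contact angle on the hydrophobic patch, w and h the channel width and depth):

```latex
\Delta P_{\mathrm{burst}} \approx -2\gamma\cos\theta\left(\frac{1}{w} + \frac{1}{h}\right).
```

For a hydrophobic surface θ > 90°, so the cosine is negative and the burst pressure is positive, growing as the channel cross-section narrows, which is consistent with burst pressures being set by geometry.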

  1. Automatic Power Control for Daily Load-following Operation using Model Predictive Control Method

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Keuk Jong; Kim, Han Gon [KH, Daejeon (Korea, Republic of)

    2009-10-15

    Given circumstances in which nuclear power accounts for more than 50% of generation, nuclear power plants are required to operate in load-following mode in order to allow effective management of the electric grid system and enhanced responsiveness to rapid changes in power demand. Conventional reactors such as the OPR1000 and APR1400 have a regulating system that controls the average temperature of the reactor core in relation to the reference temperature. This conventional method has the advantages of proven technology and ease of implementation. However, it is unsuitable for controlling the axial power shape, particularly during load-following operation. Accordingly, this paper reports on the development of a model predictive control method that is able to control both the reactor power and the axial shape index. The purpose of this study is to analyze the behavior of the nuclear reactor power and the axial power shape under model predictive control when the power is increased and decreased during a daily load-following operation. The study confirms that deviations in the axial shape index (ASI) remain within the operating limit.
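The receding-horizon flavour of model predictive control can be sketched on a toy first-order power model; the dynamics, horizon, and candidate control set below are invented and bear no relation to the actual OPR1000/APR1400 controller:

```python
# Toy MPC: model P[k+1] = P[k] + b*u[k]; at each step pick the control that
# minimises predicted tracking error over a short horizon (numbers invented).

def mpc_step(power, target, b=1.0, horizon=3,
             candidates=(-1.0, -0.5, 0.0, 0.5, 1.0)):
    best_u, best_cost = 0.0, float("inf")
    for u in candidates:
        p, cost = power, 0.0
        for _ in range(horizon):
            p = p + b * u          # predict assuming a constant input
            cost += (target - p) ** 2
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u  # apply only the first move, then re-plan (receding horizon)

power = 90.0   # % of rated power
target = 100.0
for _ in range(15):
    power += 1.0 * mpc_step(power, target)
print(round(power, 1))  # 99.5 - settles near the target
```

A real controller would optimise over a constrained control sequence and track both power and the axial shape index, but the re-planning loop has the same shape.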

  2. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project.

    Science.gov (United States)

    Hudson, Lawrence N; Newbold, Tim; Contu, Sara; Hill, Samantha L L; Lysenko, Igor; De Palma, Adriana; Phillips, Helen R P; Alhusseini, Tamera I; Bedford, Felicity E; Bennett, Dominic J; Booth, Hollie; Burton, Victoria J; Chng, Charlotte W T; Choimes, Argyrios; Correia, David L P; Day, Julie; Echeverría-Londoño, Susy; Emerson, Susan R; Gao, Di; Garon, Morgan; Harrison, Michelle L K; Ingram, Daniel J; Jung, Martin; Kemp, Victoria; Kirkpatrick, Lucinda; Martin, Callum D; Pan, Yuan; Pask-Hale, Gwilym D; Pynegar, Edwin L; Robinson, Alexandra N; Sanchez-Ortiz, Katia; Senior, Rebecca A; Simmons, Benno I; White, Hannah J; Zhang, Hanbin; Aben, Job; Abrahamczyk, Stefan; Adum, Gilbert B; Aguilar-Barquero, Virginia; Aizen, Marcelo A; Albertos, Belén; Alcala, E L; Del Mar Alguacil, Maria; Alignier, Audrey; Ancrenaz, Marc; Andersen, Alan N; Arbeláez-Cortés, Enrique; Armbrecht, Inge; Arroyo-Rodríguez, Víctor; Aumann, Tom; Axmacher, Jan C; Azhar, Badrul; Azpiroz, Adrián B; Baeten, Lander; Bakayoko, Adama; Báldi, András; Banks, John E; Baral, Sharad K; Barlow, Jos; Barratt, Barbara I P; Barrico, Lurdes; Bartolommei, Paola; Barton, Diane M; Basset, Yves; Batáry, Péter; Bates, Adam J; Baur, Bruno; Bayne, Erin M; Beja, Pedro; Benedick, Suzan; Berg, Åke; Bernard, Henry; Berry, Nicholas J; Bhatt, Dinesh; Bicknell, Jake E; Bihn, Jochen H; Blake, Robin J; Bobo, Kadiri S; Bóçon, Roberto; Boekhout, Teun; Böhning-Gaese, Katrin; Bonham, Kevin J; Borges, Paulo A V; Borges, Sérgio H; Boutin, Céline; Bouyer, Jérémy; Bragagnolo, Cibele; Brandt, Jodi S; Brearley, Francis Q; Brito, Isabel; Bros, Vicenç; Brunet, Jörg; Buczkowski, Grzegorz; Buddle, Christopher M; Bugter, Rob; Buscardo, Erika; Buse, Jörn; Cabra-García, Jimmy; Cáceres, Nilton C; Cagle, Nicolette L; Calviño-Cancela, María; Cameron, Sydney A; Cancello, Eliana M; Caparrós, Rut; Cardoso, Pedro; Carpenter, Dan; Carrijo, Tiago F; Carvalho, Anelena L; Cassano, Camila R; Castro, Helena; Castro-Luna, Alejandro A; Rolando, Cerda B; Cerezo, 
Alexis; Chapman, Kim Alan; Chauvat, Matthieu; Christensen, Morten; Clarke, Francis M; Cleary, Daniel F R; Colombo, Giorgio; Connop, Stuart P; Craig, Michael D; Cruz-López, Leopoldo; Cunningham, Saul A; D'Aniello, Biagio; D'Cruze, Neil; da Silva, Pedro Giovâni; Dallimer, Martin; Danquah, Emmanuel; Darvill, Ben; Dauber, Jens; Davis, Adrian L V; Dawson, Jeff; de Sassi, Claudio; de Thoisy, Benoit; Deheuvels, Olivier; Dejean, Alain; Devineau, Jean-Louis; Diekötter, Tim; Dolia, Jignasu V; Domínguez, Erwin; Dominguez-Haydar, Yamileth; Dorn, Silvia; Draper, Isabel; Dreber, Niels; Dumont, Bertrand; Dures, Simon G; Dynesius, Mats; Edenius, Lars; Eggleton, Paul; Eigenbrod, Felix; Elek, Zoltán; Entling, Martin H; Esler, Karen J; de Lima, Ricardo F; Faruk, Aisyah; Farwig, Nina; Fayle, Tom M; Felicioli, Antonio; Felton, Annika M; Fensham, Roderick J; Fernandez, Ignacio C; Ferreira, Catarina C; Ficetola, Gentile F; Fiera, Cristina; Filgueiras, Bruno K C; Fırıncıoğlu, Hüseyin K; Flaspohler, David; Floren, Andreas; Fonte, Steven J; Fournier, Anne; Fowler, Robert E; Franzén, Markus; Fraser, Lauchlan H; Fredriksson, Gabriella M; Freire, Geraldo B; Frizzo, Tiago L M; Fukuda, Daisuke; Furlani, Dario; Gaigher, René; Ganzhorn, Jörg U; García, Karla P; Garcia-R, Juan C; Garden, Jenni G; Garilleti, Ricardo; Ge, Bao-Ming; Gendreau-Berthiaume, Benoit; Gerard, Philippa J; Gheler-Costa, Carla; Gilbert, Benjamin; Giordani, Paolo; Giordano, Simonetta; Golodets, Carly; Gomes, Laurens G L; Gould, Rachelle K; Goulson, Dave; Gove, Aaron D; Granjon, Laurent; Grass, Ingo; Gray, Claudia L; Grogan, James; Gu, Weibin; Guardiola, Moisès; Gunawardene, Nihara R; Gutierrez, Alvaro G; Gutiérrez-Lamus, Doris L; Haarmeyer, Daniela H; Hanley, Mick E; Hanson, Thor; Hashim, Nor R; Hassan, Shombe N; Hatfield, Richard G; Hawes, Joseph E; Hayward, Matt W; Hébert, Christian; Helden, Alvin J; Henden, John-André; Henschel, Philipp; Hernández, Lionel; Herrera, James P; Herrmann, Farina; Herzog, Felix; Higuera-Diaz, 
Diego; Hilje, Branko; Höfer, Hubert; Hoffmann, Anke; Horgan, Finbarr G; Hornung, Elisabeth; Horváth, Roland; Hylander, Kristoffer; Isaacs-Cubides, Paola; Ishida, Hiroaki; Ishitani, Masahiro; Jacobs, Carmen T; Jaramillo, Víctor J; Jauker, Birgit; Hernández, F Jiménez; Johnson, McKenzie F; Jolli, Virat; Jonsell, Mats; Juliani, S Nur; Jung, Thomas S; Kapoor, Vena; Kappes, Heike; Kati, Vassiliki; Katovai, Eric; Kellner, Klaus; Kessler, Michael; Kirby, Kathryn R; Kittle, Andrew M; Knight, Mairi E; Knop, Eva; Kohler, Florian; Koivula, Matti; Kolb, Annette; Kone, Mouhamadou; Kőrösi, Ádám; Krauss, Jochen; Kumar, Ajith; Kumar, Raman; Kurz, David J; Kutt, Alex S; Lachat, Thibault; Lantschner, Victoria; Lara, Francisco; Lasky, Jesse R; Latta, Steven C; Laurance, William F; Lavelle, Patrick; Le Féon, Violette; LeBuhn, Gretchen; Légaré, Jean-Philippe; Lehouck, Valérie; Lencinas, María V; Lentini, Pia E; Letcher, Susan G; Li, Qi; Litchwark, Simon A; Littlewood, Nick A; Liu, Yunhui; Lo-Man-Hung, Nancy; López-Quintero, Carlos A; Louhaichi, Mounir; Lövei, Gabor L; Lucas-Borja, Manuel Esteban; Luja, Victor H; Luskin, Matthew S; MacSwiney G, M Cristina; Maeto, Kaoru; Magura, Tibor; Mallari, Neil Aldrin; Malone, Louise A; Malonza, Patrick K; Malumbres-Olarte, Jagoba; Mandujano, Salvador; Måren, Inger E; Marin-Spiotta, Erika; Marsh, Charles J; Marshall, E J P; Martínez, Eliana; Martínez Pastur, Guillermo; Moreno Mateos, David; Mayfield, Margaret M; Mazimpaka, Vicente; McCarthy, Jennifer L; McCarthy, Kyle P; McFrederick, Quinn S; McNamara, Sean; Medina, Nagore G; Medina, Rafael; Mena, Jose L; Mico, Estefania; Mikusinski, Grzegorz; Milder, Jeffrey C; Miller, James R; Miranda-Esquivel, Daniel R; Moir, Melinda L; Morales, Carolina L; Muchane, Mary N; Muchane, Muchai; Mudri-Stojnic, Sonja; Munira, A Nur; Muoñz-Alonso, Antonio; Munyekenye, B F; Naidoo, Robin; Naithani, A; Nakagawa, Michiko; Nakamura, Akihiro; Nakashima, Yoshihiro; Naoe, Shoji; Nates-Parra, Guiomar; Navarrete Gutierrez, Dario 
A; Navarro-Iriarte, Luis; Ndang'ang'a, Paul K; Neuschulz, Eike L; Ngai, Jacqueline T; Nicolas, Violaine; Nilsson, Sven G; Noreika, Norbertas; Norfolk, Olivia; Noriega, Jorge Ari; Norton, David A; Nöske, Nicole M; Nowakowski, A Justin; Numa, Catherine; O'Dea, Niall; O'Farrell, Patrick J; Oduro, William; Oertli, Sabine; Ofori-Boateng, Caleb; Oke, Christopher Omamoke; Oostra, Vicencio; Osgathorpe, Lynne M; Otavo, Samuel Eduardo; Page, Navendu V; Paritsis, Juan; Parra-H, Alejandro; Parry, Luke; Pe'er, Guy; Pearman, Peter B; Pelegrin, Nicolás; Pélissier, Raphaël; Peres, Carlos A; Peri, Pablo L; Persson, Anna S; Petanidou, Theodora; Peters, Marcell K; Pethiyagoda, Rohan S; Phalan, Ben; Philips, T Keith; Pillsbury, Finn C; Pincheira-Ulbrich, Jimmy; Pineda, Eduardo; Pino, Joan; Pizarro-Araya, Jaime; Plumptre, A J; Poggio, Santiago L; Politi, Natalia; Pons, Pere; Poveda, Katja; Power, Eileen F; Presley, Steven J; Proença, Vânia; Quaranta, Marino; Quintero, Carolina; Rader, Romina; Ramesh, B R; Ramirez-Pinilla, Martha P; Ranganathan, Jai; Rasmussen, Claus; Redpath-Downing, Nicola A; Reid, J Leighton; Reis, Yana T; Rey Benayas, José M; Rey-Velasco, Juan Carlos; Reynolds, Chevonne; Ribeiro, Danilo Bandini; Richards, Miriam H; Richardson, Barbara A; Richardson, Michael J; Ríos, Rodrigo Macip; Robinson, Richard; Robles, Carolina A; Römbke, Jörg; Romero-Duque, Luz Piedad; Rös, Matthias; Rosselli, Loreta; Rossiter, Stephen J; Roth, Dana S; Roulston, T'ai H; Rousseau, Laurent; Rubio, André V; Ruel, Jean-Claude; Sadler, Jonathan P; Sáfián, Szabolcs; Saldaña-Vázquez, Romeo A; Sam, Katerina; Samnegård, Ulrika; Santana, Joana; Santos, Xavier; Savage, Jade; Schellhorn, Nancy A; Schilthuizen, Menno; Schmiedel, Ute; Schmitt, Christine B; Schon, Nicole L; Schüepp, Christof; Schumann, Katharina; Schweiger, Oliver; Scott, Dawn M; Scott, Kenneth A; Sedlock, Jodi L; Seefeldt, Steven S; Shahabuddin, Ghazala; Shannon, Graeme; Sheil, Douglas; Sheldon, Frederick H; Shochat, Eyal; Siebert, Stefan 
J; Silva, Fernando A B; Simonetti, Javier A; Slade, Eleanor M; Smith, Jo; Smith-Pardo, Allan H; Sodhi, Navjot S; Somarriba, Eduardo J; Sosa, Ramón A; Soto Quiroga, Grimaldo; St-Laurent, Martin-Hugues; Starzomski, Brian M; Stefanescu, Constanti; Steffan-Dewenter, Ingolf; Stouffer, Philip C; Stout, Jane C; Strauch, Ayron M; Struebig, Matthew J; Su, Zhimin; Suarez-Rubio, Marcela; Sugiura, Shinji; Summerville, Keith S; Sung, Yik-Hei; Sutrisno, Hari; Svenning, Jens-Christian; Teder, Tiit; Threlfall, Caragh G; Tiitsaar, Anu; Todd, Jacqui H; Tonietto, Rebecca K; Torre, Ignasi; Tóthmérész, Béla; Tscharntke, Teja; Turner, Edgar C; Tylianakis, Jason M; Uehara-Prado, Marcio; Urbina-Cardona, Nicolas; Vallan, Denis; Vanbergen, Adam J; Vasconcelos, Heraldo L; Vassilev, Kiril; Verboven, Hans A F; Verdasca, Maria João; Verdú, José R; Vergara, Carlos H; Vergara, Pablo M; Verhulst, Jort; Virgilio, Massimiliano; Vu, Lien Van; Waite, Edward M; Walker, Tony R; Wang, Hua-Feng; Wang, Yanping; Watling, James I; Weller, Britta; Wells, Konstans; Westphal, Catrin; Wiafe, Edward D; Williams, Christopher D; Willig, Michael R; Woinarski, John C Z; Wolf, Jan H D; Wolters, Volkmar; Woodcock, Ben A; Wu, Jihua; Wunderle, Joseph M; Yamaura, Yuichi; Yoshikura, Satoko; Yu, Douglas W; Zaitsev, Andrey S; Zeidler, Juliane; Zou, Fasheng; Collen, Ben; Ewers, Rob M; Mace, Georgina M; Purves, Drew W; Scharlemann, Jörn P W; Purvis, Andy

    2017-01-01

    The PREDICTS project-Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)-has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of human impacts relating to land use. We have used this evidence base to develop global and regional statistical models of how local biodiversity responds to these measures. We describe and make freely available this 2016 release of the database, containing more than 3.2 million records sampled at over 26,000 locations and representing over 47,000 species. We outline how the database can help in answering a range of questions in ecology and conservation biology. To our knowledge, this is the largest and most geographically and taxonomically representative database of spatial comparisons of biodiversity that has been collated to date; it will be useful to researchers and international efforts wishing to model and understand the global status of biodiversity.

  3. [Usefulness and limitations of rapid automatized naming to predict reading difficulties after school entry in preschool children].

    Science.gov (United States)

    Kaneko, Masato; Uno, Akira; Haruhara, Noriko; Awaya, Noriko

    2012-01-01

    We investigated the usefulness and limitations of Rapid Automatized Naming (RAN) results in 6-year-old Japanese preschool children for estimating whether reading difficulties will be encountered after school entry. We administered a RAN task to 1,001 preschool children. After they had entered school, we performed yearly follow-up surveys to assess their reading performance when these children were in the first, second, third and fourth grades. We also examined Hiragana non-words and Kanji words at each time point to detect the children who were having difficulty with reading Hiragana and Kanji. Results of Receiver Operating Characteristic (ROC) analysis showed that the RAN result in 6-year-old preschool children was predictive of Kanji reading difficulty in the lower grades of elementary school, with an area under the curve of 0.86 in the second grade and 0.84 in the third grade. These results suggested that the RAN task is useful as a screening tool.
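The ROC analysis used above reduces to a ranking statistic: the area under the curve is the probability that a randomly chosen at-risk child gets a higher RAN score (longer naming time) than a randomly chosen unaffected child. A sketch with invented data:

```python
# AUC as a rank statistic: probability that a random positive outranks a
# random negative (ties count half). The data below are hypothetical.

def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

ran_seconds = [61, 55, 48, 40, 38, 33]   # naming time per child (invented)
difficulty  = [1, 1, 0, 1, 0, 0]         # 1 = later reading difficulty
print(auc(ran_seconds, difficulty))      # 8 of 9 pairs ranked correctly
```

An AUC of 0.5 means the score carries no screening information; values like the paper's 0.84-0.86 indicate substantially better-than-chance discrimination.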

  4. Comparative Human and Automatic Evaluation of Glass-Box and Black-Box Approaches to Interactive Translation Prediction

    Directory of Open Access Journals (Sweden)

    Torregrosa Daniel

    2017-06-01

    Full Text Available Interactive translation prediction (ITP) is a modality of computer-aided translation that assists professional translators by offering context-based computer-generated continuation suggestions as they type. While most state-of-the-art ITP systems follow a glass-box approach, meaning that they are tightly coupled to an adapted machine translation system, a black-box approach which does not need access to the inner workings of the bilingual resources used to generate the suggestions has been recently proposed in the literature: this new approach allows new sources of bilingual information to be included almost seamlessly. In this paper, we compare for the first time the glass-box and the black-box approaches by means of an automatic evaluation of translation tasks between related languages such as English–Spanish and unrelated ones such as Arabic–English and English–Chinese, showing that, with our setup, 20%–50% of keystrokes could be saved using either method and that the black-box approach outperformed the glass-box one in five out of six scenarios operating under similar conditions. We also performed a preliminary human evaluation of English to Spanish translation for both approaches. On average, the evaluators saved 10% keystrokes and were 4% faster with the black-box approach, and saved 15% keystrokes and were 12% slower with the glass-box one; but they could have saved 51% and 69% keystrokes respectively if they had used all the compatible suggestions. Users felt the suggestions helped them to translate faster and easier. All the tools used to perform the evaluation are available as free/open-source software.

  5. Research on cross-project software defect prediction based on transfer learning

    Science.gov (United States)

    Chen, Ya; Ding, Xiaoming

    2018-04-01

    To address the two challenges in cross-project software defect prediction, namely the distribution differences between the source and target project datasets and the class imbalance in the dataset, we propose a cross-project software defect prediction method based on transfer learning, named NTrA. Firstly, the class imbalance of the source project data is resolved using the Augmented Neighborhood Cleaning Algorithm. Secondly, the data gravitation method is used to assign different weights on the basis of the attribute similarity between source and target project data. Finally, a defect prediction model is constructed using the TrAdaBoost algorithm. Experiments were conducted using data from NASA and SOFTLAB, respectively, taken from the published PROMISE dataset. The results show that the method achieved good recall and F-measure values and good prediction results.
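
The abstract does not give the weighting formula, so the following is only a plausible sketch of a data-gravitation-style step: each source-project instance is weighted by how many of its attribute values fall within the target project's attribute ranges (the attribute-similarity idea; the method's exact formula may differ):

```python
def gravity_weights(source, target_mins, target_maxs):
    """Weight each source instance by the fraction of its attributes that
    fall inside the target project's attribute ranges (a simplified
    data-gravitation scheme; illustrative only)."""
    weights = []
    n_attr = len(target_mins)
    for row in source:
        inside = sum(1 for j in range(n_attr)
                     if target_mins[j] <= row[j] <= target_maxs[j])
        # More in-range attributes -> more "gravity" toward the target data.
        weights.append(inside / n_attr)
    return weights

# Hypothetical two-attribute software metrics.
source = [[0.2, 10.0], [0.9, 3.0], [0.5, 5.0]]
target_mins, target_maxs = [0.0, 4.0], [0.6, 8.0]
print(gravity_weights(source, target_mins, target_maxs))  # [0.5, 0.0, 1.0]
```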

  6. Drought Prediction for Socio-Cultural Stability Project

    Science.gov (United States)

    Peters-Lidard, Christa; Eylander, John B.; Koster, Randall; Narapusetty, Balachandrudu; Kumar, Sujay; Rodell, Matt; Bolten, John; Mocko, David; Walker, Gregory; Arsenault, Kristi; hide

    2014-01-01

    The primary objective of this project is to answer the question: "Can existing, linked infrastructures be used to predict the onset of drought months in advance?" Based on our work, the answer to this question is "yes", with the qualifiers that skill depends on both lead time and location, and especially on the associated teleconnections (e.g., ENSO, Indian Ocean Dipole) active in a given region and season. As part of this work, we successfully developed a prototype drought early warning system based on existing/mature NASA Earth science components including the Goddard Earth Observing System Data Assimilation System Version 5 (GEOS-5) forecasting model, the Land Information System (LIS) land data assimilation software framework, the Catchment Land Surface Model (CLSM), remotely sensed terrestrial water storage from the Gravity Recovery and Climate Experiment (GRACE) and remotely sensed soil moisture products from the Aqua/Advanced Microwave Scanning Radiometer - EOS (AMSR-E). We focused on a single drought year - 2011 - during which major agricultural droughts occurred with devastating impacts in the Texas-Mexico region of North America (TEXMEX) and the Horn of Africa (HOA). Our results demonstrate that GEOS-5 precipitation forecasts show skill globally at 1-month lead, and can show up to 3 months of skill regionally in the TEXMEX and HOA areas. Our results also demonstrate that the CLSM soil moisture percentiles are a good indicator of drought, as compared to the North American Drought Monitor over TEXMEX and to a combination of Famine Early Warning Systems Network (FEWS NET) data and Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) anomalies over HOA. The data assimilation experiments produced mixed results. GRACE terrestrial water storage (TWS) assimilation was found to significantly improve soil moisture and evapotranspiration, as well as drought monitoring via soil moisture percentiles, while AMSR-E soil moisture
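
The soil moisture percentiles used as a drought indicator above are simple empirical ranks of the current value against a climatology. A sketch with hypothetical values (the 20th-percentile drought cutoff is a common drought-monitor convention, not necessarily this project's threshold):

```python
def percentile_rank(climatology, value):
    """Empirical percentile of `value` within a climatological sample:
    the fraction of the sample at or below it, in percent."""
    below = sum(1 for v in climatology if v <= value)
    return 100.0 * below / len(climatology)

# Hypothetical soil-moisture climatology (m^3/m^3) for one grid cell and month.
climatology = [0.18, 0.22, 0.25, 0.27, 0.30, 0.31, 0.33, 0.35, 0.38, 0.40]
current = 0.22
p = percentile_rank(climatology, current)
print(p)          # 20.0
print(p <= 20.0)  # True -> flagged as drought under the illustrative cutoff
```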

  7. Fast, accurate, and robust automatic marker detection for motion correction based on oblique kV or MV projection image pairs

    International Nuclear Information System (INIS)

    Slagmolen, Pieter; Hermans, Jeroen; Maes, Frederik; Budiharto, Tom; Haustermans, Karin; Heuvel, Frank van den

    2010-01-01

    Purpose: A robust and accurate method is proposed for the automatic detection of fiducial markers in MV and kV projection image pairs, allowing automatic correction for inter- or intrafraction motion. Methods: Intratreatment MV projection images are acquired during each of five treatment beams of prostate cancer patients with four implanted fiducial markers. The projection images are first preprocessed using a series of marker-enhancing filters. 2D candidate marker locations are generated for each of the filtered projection images and 3D candidate marker locations are reconstructed by pairing candidates in subsequent projection images. The correct marker positions are retrieved in 3D by the minimization of a cost function that combines 2D image intensity and 3D geometric or shape information for the entire marker configuration simultaneously. This optimization problem is solved using dynamic programming such that the globally optimal configuration for all markers is always found. Translational interfraction and intrafraction prostate motion and the required patient repositioning are assessed from the position of the centroid of the detected markers in different MV image pairs. The method was validated on a phantom using CT as ground truth and on clinical data sets of 16 patients using manual marker annotations as ground truth. Results: The entire setup was confirmed to be accurate to around 1 mm by the phantom measurements. The reproducibility of the manual marker selection was less than 3.5 pixels in the MV images. In patient images, markers were correctly identified in at least 99% of the cases for anterior projection images and 96% of the cases for oblique projection images. The average marker detection accuracy was 1.4±1.8 pixels in the projection images. The centroid of all four reconstructed marker positions in 3D was positioned within 2 mm of the ground-truth position in 99.73% of all cases. Detecting four markers in a pair of MV images

  8. Increasing Prediction the Original Final Year Project of Student Using Genetic Algorithm

    Science.gov (United States)

    Saragih, Rijois Iboy Erwin; Turnip, Mardi; Sitanggang, Delima; Aritonang, Mendarissan; Harianja, Eva

    2018-04-01

    The final year project is very important for the graduation of a student. Unfortunately, many students do not take their final projects seriously; many ask someone else to do the work for them. In this paper, an application of genetic algorithms to predict whether a student's final year project is original is proposed. In the simulation, final project data from the last 5 years were collected. The genetic algorithm has several operators, namely population, selection, crossover, and mutation. The results suggest that the genetic algorithm predicts better than other comparable models, and the experimental prediction results were 70% more accurate than previous research.
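
The abstract names the standard GA operators (population, selection, crossover, mutation) without detail. A generic sketch of how they fit together, on a toy one-max fitness function rather than the authors' originality model:

```python
import random

def run_ga(fitness, n_bits=12, pop_size=20, generations=60, p_mut=0.02, seed=1):
    """Minimal genetic algorithm: tournament selection, one-point
    crossover, bit-flip mutation. `fitness` maps a bit list to a score."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # tournament of two
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)               # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [g ^ (rng.random() < p_mut) for g in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = run_ga(fitness=sum)  # one-max: maximize the number of 1 bits
print(sum(best))
```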

  9. Proposal for future diagnosis and management of vascular tumors by using automatic software for image processing and statistic prediction.

    Science.gov (United States)

    Popescu, M D; Draghici, L; Secheli, I; Secheli, M; Codrescu, M; Draghici, I

    2015-01-01

    Infantile Hemangiomas (IH) are the most frequent tumors of vascular origin, and the differential diagnosis from vascular malformations is difficult to establish. Specific types of IH, due to their location, dimensions and fast evolution, can cause important functional and aesthetic sequelae. To avoid these unfortunate consequences it is necessary to establish the exact moment to begin treatment and to decide which therapeutic procedure is most adequate. Based on clinical data collected through serial clinical observations, correlated with imaging data and processed by a computer-aided diagnosis (CAD) system, the study intended to develop a treatment algorithm to accurately predict the best final result, from the aesthetic and functional point of view, for a certain type of lesion. The preliminary database was composed of 75 patients divided into 4 groups according to the treatment they received: medical therapy, sclerotherapy, surgical excision and no treatment. The serial clinical observation was performed each month and all the data were processed using CAD. The project goal was to create software that incorporates advanced methods to accurately measure the specific IH lesions and integrates medical information, statistical methods and computational methods to correlate this information with that obtained from image processing. Based on these correlations, a prediction mechanism for the evolution of the hemangioma was established, which helps determine the best method of therapeutic intervention to minimize further complications.

  10. Using sensor data patterns from an automatic milking system to develop predictive variables for classifying clinical mastitis and abnormal milk

    NARCIS (Netherlands)

    Kamphuis, A.; Pietersma, D.; Tol, van der R.; Wiedermann, M.; Hogeveen, H.

    2008-01-01

    Dairy farmers using automatic milking are able to manage mastitis successfully with the help of mastitis attention lists. These attention lists are generated with mastitis detection models that make use of sensor data obtained throughout each quarter milking. The models tend to be limited to using

  11. A Model Suggestion to Predict Leverage Ratio for Construction Projects

    Directory of Open Access Journals (Sweden)

    Özlem Tüz

    2013-12-01

    Full Text Available Due to its nature, construction is an industry with high uncertainty and risk, and construction firms carry high leverage ratios. Firms with low equity work on big projects through the progress payment system, but in this case even a small shortfall in the planned cash flows constitutes a major risk for the company. The use of leverage makes it possible to pursue large-scale, high-profit targets with a small investment, but it also brings high risk: investors may lose all or a portion of their money. This study targets the monitoring and measurement of the leverage ratio under displaced cash inflows for construction projects, which use high leverage and low cash to do business in the sector. The cash need caused by drifting cash inflows can be observed with the model: work is done in the early stages of the project with little capital, but in the later stages a rapidly growing capital need arises. The values obtained from the model may be used to supply the required capital at the right time by anticipating the risks caused by delayed cash flows in construction projects with high leverage ratios.

  12. Changes in automatic threat processing precede and predict clinical changes with exposure-based cognitive-behavior therapy for panic disorder.

    Science.gov (United States)

    Reinecke, Andrea; Waldenmaier, Lara; Cooper, Myra J; Harmer, Catherine J

    2013-06-01

    Cognitive behavioral therapy (CBT) is an effective treatment for emotional disorders such as anxiety or depression, but the mechanisms underlying successful intervention are far from understood. Although it has been a long-held view that psychopharmacological approaches work by directly targeting automatic emotional information processing in the brain, it is usually postulated that psychological treatments affect these processes only over time, through changes in more conscious thought cycles. This study explored the role of early changes in emotional information processing in CBT action. Twenty-eight untreated patients with panic disorder were randomized to a single session of exposure-based CBT or a waiting group. Emotional information processing was measured on the day after intervention with an attentional visual probe task, and clinical symptoms were assessed on the day after intervention and at 4-week follow-up. Vigilance for threat information was decreased in the treated group, compared with the waiting group, the day after intervention, before reductions in clinical symptoms. The magnitude of this early effect on threat vigilance predicted therapeutic response after 4 weeks. Cognitive behavioral therapy rapidly affects automatic processing, and these early effects are predictive of later therapeutic change. Such results suggest very fast action on automatic processes mediating threat sensitivity, and they provide an early marker of treatment response. Furthermore, these findings challenge the notion that psychological treatments work directly on conscious thought processes before automatic information processing, and imply a greater similarity between early effects of pharmacological and psychological treatments for anxiety than previously thought. Copyright © 2013 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
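
The threat-vigilance measure from the attentional visual probe (dot-probe) task described above reduces to a reaction-time difference. A sketch with hypothetical data, not the study's:

```python
def attentional_bias(rt_probe_at_neutral, rt_probe_at_threat):
    """Dot-probe bias score: mean reaction time when the probe replaces the
    neutral stimulus minus mean reaction time when it replaces the threat
    stimulus. Positive values indicate vigilance toward threat."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_probe_at_neutral) - mean(rt_probe_at_threat)

# Hypothetical reaction times in milliseconds.
neutral_trials = [520.0, 540.0, 530.0]
threat_trials = [490.0, 500.0, 510.0]
print(attentional_bias(neutral_trials, threat_trials))  # 30.0
```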

  13. New concepts in automatic enforcement. The "Escape" Project, Deliverable 6. Project funded by the European Commission under the Transport RTD Programme of the 4th Framework Programme.

    NARCIS (Netherlands)

    Heidstra, J., Goldenbeld, C., Mäkinen, T., Nilsson, G. & Sagberg, F.

    2010-01-01

    One main reason for automatic enforcement, apart from the safety situation, is that in some environments the police cannot take direct action against each detected violator through normal police enforcement activities. By using detectors and camera technology the violators can be identified and

  14. ECLogger: Cross-Project Catch-Block Logging Prediction Using Ensemble of Classifiers

    Directory of Open Access Journals (Sweden)

    Sangeeta Lal

    2017-01-01

    Full Text Available Background: Software developers insert log statements in the source code to record program execution information. However, optimizing the number of log statements in the source code is challenging. Machine learning based within-project logging prediction tools, proposed in previous studies, may not be suitable for new or small software projects. For such software projects, we can use cross-project logging prediction. Aim: The aim of the study presented here is to investigate cross-project logging prediction methods and techniques. Method: The proposed method is ECLogger, a novel, ensemble-based, cross-project, catch-block logging prediction model. Nine base classifiers were used and combined using ensemble techniques. The performance of ECLogger was evaluated on three open-source Java projects: Tomcat, CloudStack and Hadoop. Results: ECLogger Bagging, ECLogger AverageVote, and ECLogger MajorityVote show a considerable improvement in the average Logged F-measure (LF) on 3, 5, and 4 source -> target project pairs, respectively, compared to the baseline classifiers. ECLogger AverageVote performs best and shows improvements of 3.12% (average LF) and 6.08% (average ACC, accuracy). Conclusion: Classifiers based on ensemble techniques, such as bagging, average vote, and majority vote, outperform the baseline classifier. Overall, the ECLogger AverageVote model performs best. The results show that the CloudStack project is more generalizable than the other projects.
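
The ensemble combiners named above, majority vote and average vote, are straightforward to sketch. The base-classifier outputs below are hypothetical, not ECLogger's actual classifiers:

```python
def majority_vote(predictions):
    """Combine binary labels from several classifiers: predict 1 when more
    than half of the classifiers vote 1 (simplified MajorityVote)."""
    return [int(sum(votes) * 2 > len(votes)) for votes in zip(*predictions)]

def average_vote(probabilities, threshold=0.5):
    """Average the predicted probabilities, then threshold (AverageVote)."""
    return [int(sum(ps) / len(ps) >= threshold) for ps in zip(*probabilities)]

# Three hypothetical base classifiers scoring four catch blocks.
labels = [[1, 0, 1, 0],
          [1, 1, 0, 0],
          [0, 1, 1, 0]]
print(majority_vote(labels))   # [1, 1, 1, 0]
probs = [[0.9, 0.3, 0.6, 0.1],
         [0.8, 0.7, 0.3, 0.2],
         [0.4, 0.6, 0.7, 0.1]]
print(average_vote(probs))     # [1, 1, 1, 0]
```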

  15. Feature Subset Selection and Instance Filtering for Cross-project Defect Prediction - Classification and Ranking

    Directory of Open Access Journals (Sweden)

    Faimison Porto

    2016-12-01

    Full Text Available Defect prediction models can be a good tool for organizing a project's test resources. The models can be constructed with two main goals: 1) to classify the software parts as defective or not; or 2) to rank the most defective parts in decreasing order. However, not all companies maintain an appropriate set of historical defect data. In this case, a company can build an appropriate dataset from known external projects, called Cross-project Defect Prediction (CPDP). CPDP models, however, present low prediction performance due to the heterogeneity of the data. Recently, Instance Filtering methods were proposed to reduce this heterogeneity by selecting the most similar instances from the training dataset. Originally, the similarity is calculated based on all the available dataset features (or independent variables). We propose that using only the most relevant features in the similarity calculation can result in more accurate filtered datasets and better prediction performance. In this study we extend our previous work. We analyse both prediction goals, Classification and Ranking, and present an empirical evaluation of 41 different methods by associating Instance Filtering methods with Feature Selection methods. We used 36 versions of 11 open-source projects in the experiments. The results show similar evidence for both prediction goals. First, the defect prediction performance of CPDP models can be improved by associating Feature Selection and Instance Filtering. Second, no evaluated method presented generally better performance. Indeed, the most appropriate method can vary according to the characteristics of the project being predicted.
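
Associating the two steps can be sketched as follows: a variance-based selector stands in for the Feature Selection step, and a nearest-neighbour (Burak-style) filter for the Instance Filtering step. The paper's 41 combinations use various concrete methods; this is only one illustrative pairing:

```python
import math

def top_k_by_variance(rows, k):
    """Rank feature indices by variance and keep the top k
    (a simple stand-in for the feature selection step)."""
    n = len(rows)
    variances = []
    for j in range(len(rows[0])):
        col = [r[j] for r in rows]
        m = sum(col) / n
        variances.append((sum((x - m) ** 2 for x in col) / n, j))
    return [j for _, j in sorted(variances, reverse=True)[:k]]

def filter_instances(train, target, feats, keep):
    """Keep the `keep` training instances closest (Euclidean distance on
    the selected features only) to any target instance."""
    def dist(a, b):
        return math.sqrt(sum((a[j] - b[j]) ** 2 for j in feats))
    scored = [(min(dist(tr, tg) for tg in target), i) for i, tr in enumerate(train)]
    return [train[i] for _, i in sorted(scored)[:keep]]

train = [[1.0, 5.0], [9.0, 5.1], [1.2, 4.9], [8.8, 5.0]]
target = [[1.1, 5.0]]
feats = top_k_by_variance(train, 1)
print(feats)  # [0] -- feature 0 has far higher variance
print(filter_instances(train, target, feats, keep=2))
```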

  16. A Model Suggestion to Predict Leverage Ratio for Construction Projects

    OpenAIRE

    Özlem Tüz; Şafak Ebesek

    2013-01-01

    Due to its nature, construction is an industry with high uncertainty and risk, and construction firms carry high leverage ratios. Firms with low equity work on big projects through the progress payment system, but in this case even a small shortfall in the planned cash flows constitutes a major risk for the company. The use of leverage makes it possible to pursue large-scale, high-profit targets with a small investment, but it also brings high risk. Investors may lose all or a portion of th...

  17. Automatic Imitation

    Science.gov (United States)

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  18. Predicting Defects Using Information Intelligence Process Models in the Software Technology Project.

    Science.gov (United States)

    Selvaraj, Manjula Gandhi; Jayabal, Devi Shree; Srinivasan, Thenmozhi; Balasubramanie, Palanisamy

    2015-01-01

    A key differentiator in a competitive marketplace is customer satisfaction. As per a 2012 Gartner report, only 75%-80% of IT projects are successful. Customer satisfaction should be considered a part of business strategy. The associated project parameters should be proactively managed and the project outcome needs to be predicted by a technical manager. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. Focus should instead be on proactively managing and shifting left in the software life cycle engineering model: identify the problem upfront in the project cycle rather than waiting for lessons to be learnt and taking reactive steps. This paper shows the practical applicability of predictive models and illustrates their use in a project to predict system testing defects, thus helping to reduce residual defects.

  19. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings...... on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...

  20. Commercial effectiveness assessment of implementation the energy efficiency raising of the building project due to introduction of automatic heat consumption control

    Directory of Open Access Journals (Sweden)

    Zvonareva Yu.N.

    2017-01-01

    Full Text Available The introduction of automated metering and control units (AUU) located directly in the heated building not only creates comfortable indoor conditions but also reduces thermal energy consumption. The expected annual effect of the proposed measures (installation of metering stations and automatic control) can reach up to 22% of the consumed and, no less importantly, paid thermal energy. In general, the efficiency of a project introducing AUU is characterized by a considerable decrease in the building's heat consumption and, correspondingly, a reduced payment for the consumed energy resources. In this paper we evaluated the effectiveness of an investment project for raising the energy efficiency of a building (hereinafter SP). We calculated the ratio of expenses to results of the considered measures for the inhabitants of an apartment house located in Kazan after installation of weather-dependent regulation. A simulation model built from the baseline data and the investment project plan yielded the main estimates of the Project's economic efficiency. To analyze and increase the reliability of the efficiency assessment, calculations were carried out for different sets of baseline data.
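
At the claimed savings level, the commercial case reduces to a payback computation. A sketch with hypothetical figures (the paper's actual cost data are not reproduced here; only the 22% savings fraction comes from the abstract):

```python
def simple_payback(capex, annual_heat_cost, savings_fraction):
    """Years to recoup the metering/control installation cost from the
    reduced heat bill (ignores discounting and tariff growth)."""
    annual_saving = annual_heat_cost * savings_fraction
    return capex / annual_saving

# Hypothetical figures for one apartment building:
# 1.2M installed cost, 2.0M/year heat bill, 22% saving.
years = simple_payback(capex=1_200_000, annual_heat_cost=2_000_000,
                       savings_fraction=0.22)
print(round(years, 2))  # 2.73
```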

  1. Development of new geoinformation methods for modelling and prediction of sea level change over different timescales - overview of the project

    Science.gov (United States)

    Niedzielski, T.; Włosińska, M.; Miziński, B.; Hewelt, M.; Migoń, P.; Kosek, W.; Priede, I. G.

    2012-04-01

    The poster aims to provide a broad scientific audience with a general overview of a project on sea level change modelling and prediction that has just commenced at the University of Wrocław, Poland. The initiative into which the project fits, the Homing Plus programme, is organised by the Foundation for Polish Science and financially supported by the European Union through the European Regional Development Fund and the Innovative Economy Programme. There are two key research objectives of the project that complement each other. First, emphasis is put on modern satellite altimetric gridded time series from the Archiving, Validation and Interpretation of Satellite Oceanographic data (AVISO) repository. Daily sea level anomaly maps, access to which in near-real time is courtesy of AVISO, are being steadily downloaded every day to our local server in Wrocław, Poland. These data will be processed within a general framework of modelling and prediction of sea level change in the short, medium and long term. Secondly, sea level change over geological time is scrutinised in order to cover very long time scales that go far beyond the history of altimetric and tide-gauge measurements. The aforementioned approaches comprise a few tasks that aim to solve the following detailed problems. Within the first one, our objective is to seek spatio-temporal dependencies in the gridded sea level anomaly time series. Subsequently, predictions that make use of such cross-correlations shall be derived, and a near-real-time service for automatic update with validation will be implemented. Concurrently (i.e. apart from spatio-temporal dependencies and their use in the process of forecasting variable sea level topography), threshold models shall be utilised for predicting the El Niño/Southern Oscillation (ENSO) signal that is normally present in sea level anomaly time series of the equatorial Pacific. Within the second approach, however, entirely different methods are proposed. Links between

  2. Developing a stochastic traffic volume prediction model for public-private partnership projects

    Science.gov (United States)

    Phong, Nguyen Thanh; Likhitruangsilp, Veerasak; Onishi, Masamitsu

    2017-11-01

    Transportation projects require an enormous amount of capital investment resulting from their tremendous size, complexity, and risk. Due to the limitation of public finances, the private sector is invited to participate in transportation project development. The private sector can entirely or partially invest in transportation projects in the form of the Public-Private Partnership (PPP) scheme, which has been an attractive option for several developing countries, including Vietnam. There are many factors affecting the success of PPP projects. The accurate prediction of traffic volume is considered one of the key success factors of PPP transportation projects. However, only a few research works have investigated how to predict traffic volume over a long period of time. Moreover, conventional traffic volume forecasting methods are usually based on deterministic models, which predict a single value of traffic volume but do not consider risk and uncertainty. This knowledge gap makes it difficult for concessionaires to estimate PPP transportation project revenues accurately. The objective of this paper is to develop a probabilistic traffic volume prediction model. First, traffic volumes were estimated following the Geometric Brownian Motion (GBM) process. The Monte Carlo technique is then applied to simulate different scenarios. The results show that this stochastic approach can systematically analyze variations in the traffic volume and yield more reliable estimates for PPP projects.
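
A GBM-plus-Monte-Carlo traffic model like the one described can be sketched in a few lines. The drift, volatility and initial volume below are illustrative assumptions, not the paper's calibration:

```python
import random, math

def simulate_gbm_paths(v0, mu, sigma, years, n_paths, seed=42):
    """Monte Carlo traffic-volume scenarios under Geometric Brownian Motion,
    with an annual time step:
        V(t+1) = V(t) * exp((mu - sigma^2 / 2) + sigma * Z),  Z ~ N(0, 1)."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        v, path = v0, [v0]
        for _ in range(years):
            z = rng.gauss(0.0, 1.0)
            v *= math.exp(mu - 0.5 * sigma ** 2 + sigma * z)
            path.append(v)
        paths.append(path)
    return paths

# Hypothetical PPP toll-road inputs: 10,000 vehicles/day initial volume,
# 5%/yr drift, 10%/yr volatility, 20-year concession, 2,000 scenarios.
paths = simulate_gbm_paths(v0=10_000, mu=0.05, sigma=0.10, years=20, n_paths=2_000)
finals = sorted(p[-1] for p in paths)
print(round(finals[len(finals) // 2]))   # median final-year volume
print(round(finals[len(finals) // 20]))  # 5th-percentile (downside) scenario
```

The spread between the median and the downside percentile is exactly the risk information a single-valued deterministic forecast cannot provide.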

  3. Development of a wind farm noise propagation prediction model - project progress to date

    International Nuclear Information System (INIS)

    Robinson, P.; Bullmore, A.; Bass, J.; Sloth, E.

    1998-01-01

    This paper describes a twelve-month measurement campaign which is part of a European project (CEC Project JOR3-CT95-0051) with the aim of substantially reducing the uncertainties involved in predicting environmentally radiated noise levels from wind farms (1). This will be achieved by comparing noise levels measured at varying distances from single and multiple sources over differing complexities of terrain with those predicted using a number of currently adopted sound propagation models. Specific objectives within the project are to: establish the important parameters controlling the propagation of wind farm noise to the far field; develop a planning tool for predicting wind farm noise emission levels under practically encountered conditions; place confidence limits on the upper and lower bounds of the noise levels predicted, thus enabling developers to quantify the risk that noise emission from wind farms will cause nuisance to nearby residents. (Author)
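
As a baseline for the propagation models being compared, the simplest prediction is spherical spreading from a point source plus a linear atmospheric-absorption term. A sketch; the 105 dB source level and 0.005 dB/m absorption coefficient are illustrative assumptions, and real wind-farm models add ground effect, barriers and meteorology on top of this:

```python
import math

def sound_pressure_level(lw, r, alpha=0.005):
    """Free-field sound pressure level (dB) at distance r (m) from a point
    source with sound power level lw (dB): spherical spreading
    (20*log10(r) + 11) plus a simple linear absorption term alpha (dB/m)."""
    return lw - 20.0 * math.log10(r) - 11.0 - alpha * r

# Hypothetical 105 dB(A) turbine heard at 350 m:
lp = sound_pressure_level(lw=105.0, r=350.0)
print(round(lp, 1))  # 41.4
```

Doubling the distance costs about 6 dB from spreading plus whatever the absorption term adds, which is why far-field predictions are so sensitive to the absorption assumptions.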

  4. Prediction of breast cancer risk using a machine learning approach embedded with a locality preserving projection algorithm

    Science.gov (United States)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Hollingsworth, Alan B.; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qiu, Yuchen; Liu, Hong; Zheng, Bin

    2018-02-01

    In order to automatically identify a set of effective mammographic image features and build an optimal breast cancer risk stratification model, this study aims to investigate the advantages of applying a machine learning approach embedded with a locality preserving projection (LPP) based feature combination and regeneration algorithm to predict short-term breast cancer risk. A dataset involving negative mammograms acquired from 500 women was assembled. This dataset was divided into two age-matched classes: 250 high-risk cases in which cancer was detected in the next subsequent mammography screening and 250 low-risk cases, which remained negative. First, a computer-aided image processing scheme was applied to segment fibro-glandular tissue depicted on mammograms and initially compute 44 features related to the bilateral asymmetry of mammographic tissue density distribution between left and right breasts. Next, a multi-feature fusion based machine learning classifier was built to predict the risk of cancer detection in the next mammography screening. A leave-one-case-out (LOCO) cross-validation method was applied to train and test the machine learning classifier embedded with the LPP algorithm, which generated a new operational vector with 4 features using a maximal variance approach in each LOCO process. Results showed a 9.7% increase in risk prediction accuracy when using this LPP-embedded machine learning approach. An increasing trend of adjusted odds ratios was also detected, with odds ratios increasing from 1.0 to 11.2. This study demonstrated that applying the LPP algorithm effectively reduced feature dimensionality, and yielded higher and potentially more robust performance in predicting short-term breast cancer risk.
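
The leave-one-case-out (LOCO) protocol used to train and test the classifier is easy to sketch: refit on all but one case, predict the held-out case, and average over all folds. The toy 1-nearest-neighbour model below is illustrative only; the paper's classifier and per-fold LPP step are not reproduced:

```python
def leave_one_case_out(cases, labels, train_fn, predict_fn):
    """LOCO validation: refit on all-but-one case, predict the held-out
    case, and report accuracy over all folds."""
    correct = 0
    for i in range(len(cases)):
        train_x = cases[:i] + cases[i + 1:]
        train_y = labels[:i] + labels[i + 1:]
        model = train_fn(train_x, train_y)
        correct += int(predict_fn(model, cases[i]) == labels[i])
    return correct / len(cases)

# Toy 1-nearest-neighbour classifier on 1-D "risk scores".
def train_nn(xs, ys):
    return list(zip(xs, ys))

def predict_nn(model, x):
    return min(model, key=lambda p: abs(p[0] - x))[1]

scores = [0.1, 0.2, 0.15, 0.8, 0.9, 0.85]
labels = [0, 0, 0, 1, 1, 1]
print(leave_one_case_out(scores, labels, train_nn, predict_nn))  # 1.0
```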

  5. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan; Calo, Victor M.

    2010-01-01

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques

  6. ROBIN: a platform for evaluating automatic target recognition algorithms: I. Overview of the project and presentation of the SAGEM DS competition

    Science.gov (United States)

    Duclos, D.; Lonnoy, J.; Guillerm, Q.; Jurie, F.; Herbin, S.; D'Angelo, E.

    2008-04-01

    The last five years have seen a renewal of Automatic Target Recognition applications, mainly because of the latest advances in machine learning techniques. In this context, large collections of image datasets are essential for training algorithms as well as for their evaluation. Indeed, the recent proliferation of recognition algorithms, generally applied to slightly different problems, makes their comparison through clean evaluation campaigns necessary. The ROBIN project tries to fulfil these two needs by putting unclassified datasets, ground truths, competitions and metrics for the evaluation of ATR algorithms at the disposal of the scientific community. The scope of this project includes single- and multi-class generic target detection and generic target recognition, in military and security contexts. To our knowledge, it is the first time that a database of this importance (several hundred thousand visible and infrared hand-annotated images) has been publicly released. Funded by the French Ministry of Defence (DGA) and by the French Ministry of Research, ROBIN is one of the ten Techno-vision projects. Techno-vision is a large and ambitious government initiative for building evaluation means for computer vision technologies, for various application contexts. ROBIN's consortium includes major companies and research centres involved in Computer Vision R&D in the field of defence: Bertin Technologies, CNES, ECA, DGA, EADS, INRIA, ONERA, MBDA, SAGEM, THALES. This paper, which first gives an overview of the whole project, focuses on one of ROBIN's key competitions, the SAGEM Defence Security database. This dataset contains more than eight hundred ground and aerial infrared images of six different vehicles in cluttered scenes including distracters. Two different sets of data are available for each target. The first set includes different views of each vehicle at close range in a "simple" background, and can be used to train algorithms. The second set

  7. Cliff: the automatized zipper

    NARCIS (Netherlands)

    Baharom, M.Z.; Toeters, M.J.; Delbressine, F.L.M.; Bangaru, C.; Feijs, L.M.G.

    2016-01-01

    It is our strong belief that fashion - more specifically apparel - can support us much more in our daily life than it currently does. The Cliff project takes the opportunity to create a generic automatized zipper. It is a response to the struggle of the elderly, people with physical disability, and

  8. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which has successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  9. Adapting the Water Erosion Prediction Project (WEPP) model for forest applications

    Science.gov (United States)

    Shuhui Dun; Joan Q. Wu; William J. Elliot; Peter R. Robichaud; Dennis C. Flanagan; James R. Frankenberger; Robert E. Brown; Arthur C. Xu

    2009-01-01

    There has been an increasing public concern over forest stream pollution by excessive sedimentation due to natural or human disturbances. Adequate erosion simulation tools are needed for sound management of forest resources. The Water Erosion Prediction Project (WEPP) watershed model has proved useful in forest applications where Hortonian flow is the major form of...

  10. Examination of imaging detectors for combined radiography procedures in the ACCIS joint project. Automatic cargo container inspection system. Final report

    International Nuclear Information System (INIS)

    Dangendorf, Volker

    2014-01-01

    Currently used screening systems for air cargo are based on X-rays from bremsstrahlung generators. With such systems, different substances composed of light elements of approximately the same density are difficult to distinguish; e.g. the image contrast between explosives or drugs and harmless organic substances, such as plastic parts or foodstuffs, is low and requires extensive follow-up investigations. X-ray methods also yield low image contrast for heavy elements, e.g. Special Nuclear Materials (SNM) such as Pu and U, whose detection becomes very difficult when they are transported in a lead container for camouflage and mixed with goods made of other heavy metals. Within the framework of the ACCIS joint project, a new inspection system for air freight based on neutron and gamma irradiation was researched. Within this framework, the PTB subproject covered the following tasks: 1. research and development of laboratory prototypes of imaging radiation detectors; 2. development of a measuring station for the evaluation of the screening method at the PTB accelerator facility; 3. cooperation in the development of a concept for a pulsed radiation source, in particular design and investigation of the beam-producing target; 4. determination of the physical and dosimetric parameters relevant to radiation protection; 5. examination of the conditions of application and requirements for an operational facility, and end-user contacts; 6. coordination of the German partners, in particular organization of the project meetings of the German and Israeli partners. [de

  11. ESIP's Earth Science Knowledge Graph (ESKG) Testbed Project: An Automatic Approach to Building Interdisciplinary Earth Science Knowledge Graphs to Improve Data Discovery

    Science.gov (United States)

    McGibbney, L. J.; Jiang, Y.; Burgess, A. B.

    2017-12-01

    Big Earth observation data have been produced, archived and made available online, but discovering the right data in a manner that precisely and efficiently satisfies user needs presents a significant challenge to the Earth Science (ES) community. An emerging trend in the information retrieval community is to utilize knowledge graphs to assist users in quickly finding desired information across knowledge sources. This is particularly prevalent within the fields of social media and complex multimodal information processing, to name but a few; however, building a domain-specific knowledge graph is labour-intensive and hard to keep up to date. In this work, we report our progress on the Earth Science Knowledge Graph (ESKG) project, an ESIP-funded testbed project which provides an automatic approach to building a dynamic knowledge graph for ES to improve interdisciplinary data discovery by leveraging implicit, latent knowledge present across several U.S. Federal Agencies, e.g. NASA, NOAA and USGS. ESKG strengthens ties between observations and user communities by: 1) developing a knowledge graph derived from various sources, e.g. Web pages, Web services, etc., via natural language processing and knowledge extraction techniques; 2) allowing users to traverse, explore, query, reason over and navigate ES data via knowledge graph interaction. ESKG has the potential to revolutionize the way in which ES communities interact with ES data in the open world through the entity, spatial and temporal linkages and characteristics that make it up. This project advances the ESIP collaboration areas of Discovery and Semantic Technologies by putting graph information right at our fingertips in an interactive, modern manner and by reducing the effort of constructing ontologies. To demonstrate the ESKG concept, we will demonstrate use of our framework across NASA JPL's PO.DAAC, NOAA's Earth Observation Requirements Evaluation System (EORES) and various USGS

  12. A prediction rule for the development of delirium among patients in medical wards: Chi-Square Automatic Interaction Detector (CHAID) decision tree analysis model.

    Science.gov (United States)

    Kobayashi, Daiki; Takahashi, Osamu; Arioka, Hiroko; Koga, Shinichiro; Fukui, Tsuguya

    2013-10-01

    To predict development of delirium among patients in medical wards using a Chi-Square Automatic Interaction Detector (CHAID) decision tree model. This was a retrospective cohort study of all adult patients admitted to medical wards at a large community hospital. The subject patients were randomly assigned to either a derivation or a validation group (2:1) by computed random number generation. Baseline data and clinically relevant factors were collected from the electronic chart. The primary outcome was the development of delirium during hospitalization. All potential predictors were included in a forward stepwise logistic regression model. CHAID decision tree analysis was also performed to build another prediction model with the same group of patients. Receiver operating characteristic curves were drawn, and the areas under the curves (AUCs) were calculated for both models. In the validation group, these receiver operating characteristic curves and AUCs were calculated based on the rules from the derivation. A total of 3,570 patients were admitted: 2,400 patients assigned to the derivation group and 1,170 to the validation group. A total of 91 and 51 patients, respectively, developed delirium. Statistically significant predictors were delirium history, age, underlying malignancy, and impairment of activities of daily living in the CHAID decision tree model, resulting in six distinct groups by level of risk. The AUC was 0.82 in derivation and 0.82 in validation with the CHAID model, and 0.78 in derivation and 0.79 in validation with the logistic model. We propose a validated CHAID decision tree prediction model to predict the development of delirium among medical patients. Copyright © 2013 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
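    The reported AUCs (e.g. 0.82 for the CHAID model) can be illustrated with a minimal pure-Python sketch of how AUC is computed from predicted risks and observed outcomes via the Mann-Whitney U statistic; the risk scores and labels below are hypothetical, not the study's data.

```python
def roc_auc(scores, labels):
    """AUC as the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case is scored above a random negative
    (ties count as half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos for n in neg
    )
    return wins / (len(pos) * len(neg))

# Hypothetical predicted delirium risks and observed outcomes (1 = delirium)
risks = [0.05, 0.10, 0.40, 0.70, 0.20, 0.90]
labels = [0, 0, 1, 1, 0, 1]
print(roc_auc(risks, labels))  # 1.0 here: every positive outranks every negative
```

    A perfect ranking gives AUC 1.0, random scoring gives 0.5; the study's 0.82 sits between the two.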

  13. THE PROJECT OF ADMINISTRATIVE AND METHODICAL MANAGEMENT AUTOMATIZATION IN EDUCATIONAL INSTITUTION AS A TERM OF EDUCATION PROCESS QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Анна Игоревна Яценко

    2017-12-01

    Full Text Available The article is devoted to the practice of implementing information technologies in the educational process in the context of the informatization of education. The relevance of the article's main concept is confirmed by the trend of widespread introduction of information technologies in education, both by the state and by business. Given the increased attention to achieving high results in the educational process, information technology tools can significantly improve the quality of education. In this regard, the article provides examples of various information systems used to administer the educational process, along with their advantages and disadvantages. The author then formulates the problem of the lack of integrated information systems. The development of information technologies, meanwhile, is oriented towards the worldwide network, which reaches a vast audience of users, and educational institutions are involved in the electronic processes supported by the electronic environment of educational development. As a result of the study of this issue and a review of modern trends, the author proposes a project description for optimizing the management of an educational organization through the use of an integrated information system on the Internet.

  14. Revisiting EOR Projects in Indonesia through Integrated Study: EOR Screening, Predictive Model, and Optimisation

    KAUST Repository

    Hartono, A. D.; Hakiki, Farizal; Syihab, Z.; Ambia, F.; Yasutra, A.; Sutopo, S.; Efendi, M.; Sitompul, V.; Primasari, I.; Apriandi, R.

    2017-01-01

    EOR preliminary analysis is pivotal at an early stage of assessment in order to elucidate EOR feasibility. This study proposes an in-depth analysis toolkit for EOR preliminary evaluation. The toolkit incorporates EOR screening, predictive, economic, risk-analysis and optimisation modules. The screening module introduces algorithms that take both statistical and engineering notions into consideration. The United States Department of Energy (U.S. DOE) predictive models were implemented in the predictive module. The economic module is available to assess project attractiveness, while Monte Carlo simulation is applied to quantify the risk and uncertainty of the evaluated project. Optimisation scenarios of EOR practice can be evaluated using the optimisation module, in which the stochastic methods of Genetic Algorithms (GA), Particle Swarm Optimization (PSO) and Evolutionary Strategy (ES) were applied. The modules were combined into an integrated package for EOR preliminary assessment. Finally, we utilised the toolkit to evaluate several Indonesian oil fields for EOR evaluation (past projects) and feasibility (future projects). The attempt was able to update previous considerations regarding EOR attractiveness and open new opportunities for EOR implementation in Indonesia.
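    The Monte Carlo risk module described above can be sketched in a few lines of pure Python; all figures here (capex, price and recovery distributions) are hypothetical illustrations, not values from the study.

```python
import random


def monte_carlo_npv(n_trials=10_000, seed=42):
    """Toy Monte Carlo risk analysis for an EOR project: sample uncertain
    oil price and incremental recovery, and report the probability that
    the (undiscounted) project value exceeds the up-front cost."""
    rng = random.Random(seed)
    capex = 50e6  # assumed up-front cost, USD
    positives = 0
    for _ in range(n_trials):
        price = rng.gauss(60, 15)            # USD/bbl, assumed distribution
        recovery = rng.gauss(1.2e6, 0.4e6)   # incremental barrels, assumed
        npv = price * recovery - capex
        positives += npv > 0
    return positives / n_trials


p = monte_carlo_npv()
print(f"P(NPV > 0) ~ {p:.2f}")
```

    Fixing the seed makes the risk estimate reproducible; in a real workflow the distributions would come from field data and the cash flow would be discounted.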

  15. Revisiting EOR Projects in Indonesia through Integrated Study: EOR Screening, Predictive Model, and Optimisation

    KAUST Repository

    Hartono, A. D.

    2017-10-17

    EOR preliminary analysis is pivotal at an early stage of assessment in order to elucidate EOR feasibility. This study proposes an in-depth analysis toolkit for EOR preliminary evaluation. The toolkit incorporates EOR screening, predictive, economic, risk-analysis and optimisation modules. The screening module introduces algorithms that take both statistical and engineering notions into consideration. The United States Department of Energy (U.S. DOE) predictive models were implemented in the predictive module. The economic module is available to assess project attractiveness, while Monte Carlo simulation is applied to quantify the risk and uncertainty of the evaluated project. Optimisation scenarios of EOR practice can be evaluated using the optimisation module, in which the stochastic methods of Genetic Algorithms (GA), Particle Swarm Optimization (PSO) and Evolutionary Strategy (ES) were applied. The modules were combined into an integrated package for EOR preliminary assessment. Finally, we utilised the toolkit to evaluate several Indonesian oil fields for EOR evaluation (past projects) and feasibility (future projects). The attempt was able to update previous considerations regarding EOR attractiveness and open new opportunities for EOR implementation in Indonesia.

  16. Genomic Prediction from Whole Genome Sequence in Livestock: The 1000 Bull Genomes Project

    DEFF Research Database (Denmark)

    Hayes, Benjamin J; MacLeod, Iona M; Daetwyler, Hans D

    Advantages of using whole genome sequence data to predict genomic estimated breeding values (GEBV) include better persistence of accuracy of GEBV across generations and more accurate GEBV across breeds. The 1000 Bull Genomes Project provides a database of whole genome sequenced key ancestor bulls. ... In a dairy data set, predictions using BayesRC and imputed sequence data from 1000 Bull Genomes were 2% more accurate than with 800k data. We could demonstrate the method identified causal mutations in some cases. Further improvements will come from more accurate imputation of sequence variant genotypes...

  17. SU-F-R-05: Multidimensional Imaging Radiomics-Geodesics: A Novel Manifold Learning Based Automatic Feature Extraction Method for Diagnostic Prediction in Multiparametric Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, V [The Johns Hopkins University, Computer Science. Baltimore, MD (United States); Jacobs, MA [The Johns Hopkins University School of Medicine, Dept of Radiology and Oncology. Baltimore, MD (United States)

    2016-06-15

    Purpose: Multiparametric radiological imaging is used for diagnosis in patients. Automatically extracting useful features specific to a patient’s pathology would be a crucial step towards personalized medicine and assessing treatment options. In order to extract features directly from multiparametric radiological imaging datasets, we developed an advanced unsupervised machine learning algorithm called multidimensional imaging radiomics-geodesics (MIRaGe). Methods: Seventy-six breast tumor patients who underwent 3T breast MRI were used for this study. We tested the MIRaGe algorithm by extracting features for the classification of breast tumors as benign or malignant. The MRI parameters used were T1-weighted, T2-weighted, dynamic contrast-enhanced MR imaging (DCE-MRI) and diffusion-weighted imaging (DWI). The MIRaGe algorithm extracted radiomics-geodesics features (RGFs) from the multiparametric MRI datasets, enabling our method to learn the intrinsic manifold representations corresponding to the patients. To determine the informative RGFs, a modified Isomap algorithm (t-Isomap) was created for a radiomics-geodesics feature space (tRGFS) to avoid overfitting. Final classification was performed using an SVM. The predictive power of the RGFs was tested and validated using k-fold cross-validation. Results: The RGFs extracted by the MIRaGe algorithm successfully classified malignant lesions from benign lesions with a sensitivity of 93% and a specificity of 91%. The top 50 RGFs identified as the most predictive by the t-Isomap procedure were consistent with the radiological parameters known to be associated with breast cancer diagnosis and were categorized as kinetic-curve-characterizing, wash-in-rate-characterizing, wash-out-rate-characterizing and morphology-characterizing RGFs. Conclusion: In this paper, we developed a novel feature extraction algorithm for multiparametric radiological imaging. The results demonstrated the power of the MIRa
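    The reported sensitivity (93%) and specificity (91%) follow directly from a confusion matrix; a minimal sketch, with hypothetical counts chosen only to reproduce those rates (the paper's actual confusion matrix is not given in the abstract):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP)
    from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)


# Hypothetical counts: 43 malignant (40 caught) and 33 benign (30 cleared)
sens, spec = sens_spec(tp=40, fn=3, tn=30, fp=3)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```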

  18. SU-F-R-05: Multidimensional Imaging Radiomics-Geodesics: A Novel Manifold Learning Based Automatic Feature Extraction Method for Diagnostic Prediction in Multiparametric Imaging

    International Nuclear Information System (INIS)

    Parekh, V; Jacobs, MA

    2016-01-01

    Purpose: Multiparametric radiological imaging is used for diagnosis in patients. Automatically extracting useful features specific to a patient’s pathology would be a crucial step towards personalized medicine and assessing treatment options. In order to extract features directly from multiparametric radiological imaging datasets, we developed an advanced unsupervised machine learning algorithm called multidimensional imaging radiomics-geodesics (MIRaGe). Methods: Seventy-six breast tumor patients who underwent 3T breast MRI were used for this study. We tested the MIRaGe algorithm by extracting features for the classification of breast tumors as benign or malignant. The MRI parameters used were T1-weighted, T2-weighted, dynamic contrast-enhanced MR imaging (DCE-MRI) and diffusion-weighted imaging (DWI). The MIRaGe algorithm extracted radiomics-geodesics features (RGFs) from the multiparametric MRI datasets, enabling our method to learn the intrinsic manifold representations corresponding to the patients. To determine the informative RGFs, a modified Isomap algorithm (t-Isomap) was created for a radiomics-geodesics feature space (tRGFS) to avoid overfitting. Final classification was performed using an SVM. The predictive power of the RGFs was tested and validated using k-fold cross-validation. Results: The RGFs extracted by the MIRaGe algorithm successfully classified malignant lesions from benign lesions with a sensitivity of 93% and a specificity of 91%. The top 50 RGFs identified as the most predictive by the t-Isomap procedure were consistent with the radiological parameters known to be associated with breast cancer diagnosis and were categorized as kinetic-curve-characterizing, wash-in-rate-characterizing, wash-out-rate-characterizing and morphology-characterizing RGFs. Conclusion: In this paper, we developed a novel feature extraction algorithm for multiparametric radiological imaging. The results demonstrated the power of the MIRa

  19. Are automatic systems the future of motorcycle safety? A novel methodology to prioritize potential safety solutions based on their projected effectiveness.

    Science.gov (United States)

    Gil, Gustavo; Savino, Giovanni; Piantini, Simone; Baldanzini, Niccolò; Happee, Riender; Pierini, Marco

    2017-11-17

    Motorcycle riders are involved in significantly more crashes per kilometer driven than passenger car drivers. Nonetheless, the development and implementation of motorcycle safety systems lags far behind that of passenger cars. This research addresses the identification of the most effective motorcycle safety solutions in the context of different countries. A knowledge-based system of motorcycle safety (KBMS) was developed to assess the potential for various safety solutions to mitigate or avoid motorcycle crashes. First, a set of 26 common crash scenarios was identified from the analysis of multiple crash databases. Second, the relative effectiveness of 10 safety solutions was assessed for the 26 crash scenarios by a panel of experts. Third, relevant information about crashes was used to weigh the importance of each crash scenario in the region studied. The KBMS method was applied with an Italian database, with a total of more than 1 million motorcycle crashes in the period 2000-2012. When applied to the Italian context, the KBMS suggested that automatic systems designed to compensate for riders' or drivers' errors of commission or omission are the potentially most effective safety solution. The KBMS method showed an effective way to compare the potential of various safety solutions, through a scored list with the expected effectiveness of each safety solution for the region to which the crash data belong. A comparison of our results with a previous study that attempted a systematic prioritization of safety systems for motorcycles (PISa project) showed an encouraging agreement. Current results revealed that automatic systems have the greatest potential to improve motorcycle safety. Accumulating and encoding expertise in crash analysis from a range of disciplines into a scalable and reusable analytical tool, as proposed with the use of KBMS, has the potential to guide research and development of effective safety systems. As the expert assessment of the crash
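    The KBMS scoring step (expert effectiveness ratings per crash scenario, weighted by each scenario's share of regional crashes) can be sketched as a scenario-weighted sum; all ratings, weights and solution names below are hypothetical placeholders for the study's 10 solutions and 26 scenarios.

```python
def rank_solutions(effectiveness, weights):
    """Score each safety solution as the scenario-weighted sum of its
    expert effectiveness ratings, then rank from best to worst."""
    scores = {
        sol: sum(eff[s] * weights[s] for s in weights)
        for sol, eff in effectiveness.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)


# Hypothetical ratings (0-1) for 3 crash scenarios; weights are each
# scenario's share of crashes in the region's database.
effectiveness = {
    "autonomous braking": {"s1": 0.8, "s2": 0.6, "s3": 0.4},
    "curve warning":      {"s1": 0.2, "s2": 0.7, "s3": 0.3},
}
weights = {"s1": 0.5, "s2": 0.3, "s3": 0.2}
print(rank_solutions(effectiveness, weights))
```

    The output is the "scored list" the abstract describes: solutions ordered by their expected effectiveness for the region whose crash data supplied the weights.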

  20. GI-POP: a combinational annotation and genomic island prediction pipeline for ongoing microbial genome projects.

    Science.gov (United States)

    Lee, Chi-Ching; Chen, Yi-Ping Phoebe; Yao, Tzu-Jung; Ma, Cheng-Yu; Lo, Wei-Cheng; Lyu, Ping-Chiang; Tang, Chuan Yi

    2013-04-10

    Sequencing of microbial genomes is important because microbes carry genes for antibiotic resistance and pathogenesis. However, even with the help of new assembly software, finishing a whole genome is a time-consuming task. In most bacteria, pathogenesis- or antibiotic-related genes are carried on genomic islands. Therefore, a quick genomic island (GI) prediction method is useful for genomes that are still being sequenced. In this work, we built a Web server called GI-POP (http://gipop.life.nthu.edu.tw) which integrates a sequence assembly tool, a functional annotation pipeline, and a high-performance GI prediction module based on a support vector machine (SVM) method called genomic island genomic profile scanning (GI-GPS). The draft genomes of ongoing genome projects, in contigs or scaffolds, can be submitted to our Web server, which returns functional annotation and high-confidence GI predictions. GI-POP is a comprehensive annotation Web server designed for ongoing genome project analysis. Researchers can perform annotation and obtain pre-analytic information including possible GIs, coding/non-coding sequences and functional analyses of their draft genomes. This pre-analytic system can provide useful information for finishing a genome sequencing project. Copyright © 2012 Elsevier B.V. All rights reserved.
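    Genomic-profile scanning rests on the observation that genomic islands often have atypical sequence composition. As a stand-in for GI-GPS's actual SVM features (not detailed in the abstract), a sliding-window GC-content profile illustrates the scanning idea:

```python
def gc_profile(seq, window=8, step=4):
    """Sliding-window GC-content profile of a DNA sequence. Windows whose
    GC content deviates strongly from the genome average are candidate
    genomic-island regions in profile-scanning approaches."""
    profile = []
    for i in range(0, len(seq) - window + 1, step):
        win = seq[i:i + window]
        gc = (win.count("G") + win.count("C")) / window
        profile.append((i, gc))
    return profile


# AT-rich flanks around a GC-rich insert: the middle windows stand out
seq = "ATATATATGCGCGCGCATATATAT"
for start, gc in gc_profile(seq):
    print(start, gc)
```

    A real GI predictor would feed many such compositional features per window into the trained SVM rather than thresholding GC content alone.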

  1. Mizunami Underground Research Laboratory project. A project on research stage of investigating prediction from ground surface. Project report at fiscal year of 2000 to 2004

    International Nuclear Information System (INIS)

    2000-04-01

    This report presents the detailed plan, from fiscal year 2000 onward, for the first stage, the 'research stage of investigating prediction from the ground surface', of the three research stages carried out at the Mizunami Underground Research Laboratory (MIU) according to the 'Basic plan on research of underground science at MIU', based on the progress of investigation and research up to fiscal year 1999. The project has the following three general targets: establishment of comprehensive investigation techniques for the geological environment, collection of information on the deep underground environment, and development of a foundation of engineering technology for the super-deep underground. The targets of the surface-based prediction stage include acquiring geological environment data through investigations from the ground surface in order to predict changes in the environment accompanying the underground geological setting and the construction of the experimental tunnel, determining a method for evaluating the prediction results, and establishing the plan for the investigation stage accompanying excavation of the tunnel by carrying out its detailed design. The report also covers the results and problems of the first-phase investigation, the integration of investigation results, and the investigations and research on geology and geological structure, hydrology and groundwater geochemistry, the mechanical properties of rocks, and mass transfer. (G.K.)

  2. Rainfall and Extratropical Transition of Tropical Cyclones: Simulation, Prediction, and Projection

    Science.gov (United States)

    Liu, Maofeng

    Rainfall and associated flood hazards are one of the major threats of tropical cyclones (TCs) to coastal and inland regions. The interaction of TCs with extratropical systems can lead to enhanced precipitation over enlarged areas through extratropical transition (ET). To achieve a comprehensive understanding of rainfall and ET associated with TCs, this thesis conducts weather-scale analyses focusing on individual storms and climate-scale analyses focusing on seasonal predictability and on the changing properties of the climatology under global warming. The temporal and spatial rainfall evolution of individual storms, including Hurricane Irene (2011), Hurricane Hanna (2008), and Hurricane Sandy (2012), is explored using the Weather Research and Forecasting (WRF) model and a variety of hydrometeorological datasets. ET and orographic mechanisms are two key players in the rainfall distribution of Irene over the regions that experienced the most severe flooding. The change of TC rainfall under global warming is explored with the Forecast-oriented Low Ocean Resolution (FLOR) climate model under the representative concentration pathway (RCP) 4.5 scenario. Despite decreased TC frequency, FLOR projects increased landfalling TC rainfall over most regions of the eastern United States, highlighting the risk of increased flood hazards. Increased storm rain rate is an important driver of increased landfalling TC rainfall. A higher-atmospheric-resolution version of FLOR (HiFLOR) projects increased TC rainfall at global scales. The increase of TC intensity and the increase of environmental water vapor content, scaled by the Clausius-Clapeyron relation, are two key factors that explain the projected increase of TC rainfall. Analyses of the simulation, prediction, and projection of ET activity with FLOR are conducted for the North Atlantic. The FLOR model exhibits good skill in simulating many aspects of the present-day ET climatology. The 21st-century projection under the RCP4.5 scenario demonstrates the dominant role of ET

  3. COMPARISON OF TREND PROJECTION METHODS AND BACKPROPAGATION PROJECTIONS METHODS TREND IN PREDICTING THE NUMBER OF VICTIMS DIED IN TRAFFIC ACCIDENT IN TIMOR TENGAH REGENCY, NUSA TENGGARA

    Directory of Open Access Journals (Sweden)

    Aleksius Madu

    2016-10-01

    Full Text Available The purpose of this study is to predict the number of traffic accident fatalities in Timor Tengah Regency with the Trend Projection method and the Backpropagation method, to compare the two methods based on their error rates, and to predict the number of traffic accident victims in Timor Tengah Regency for the coming years. The research was conducted in Timor Tengah Regency, using data obtained from the Police Unit there on the number of traffic accidents from 2000-2013, analysed quantitatively with the Trend Projection and Backpropagation methods. Predicting the number of victims with the Trend Projection method, the best model obtained is the quadratic trend model with equation Yk = 39.786 + 3.297X + 0.13X². With the Backpropagation method, the optimum network obtained consists of 2 inputs, 3 hidden layers, and 1 output. Based on the error rates obtained, the Backpropagation method outperforms the Trend Projection method, meaning that Backpropagation is the more accurate method for predicting the number of traffic accident victims in Timor Tengah Regency. The predicted numbers of victims for the next 5 years (2014-2018) are 106, 115, 115, 119 and 120 persons, respectively. Keywords: Trend Projection, Backpropagation, Predicting.
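    The quadratic trend model quoted in the abstract can be evaluated directly. Note the abstract does not state how the year index X is defined, so the indexing below (X = 0 for the year 2000) is an assumption, and the computed values need not reproduce the paper's forecasts exactly.

```python
def quadratic_trend(x):
    """Quadratic trend model from the study:
    Yk = 39.786 + 3.297*X + 0.13*X**2,
    where X indexes years from the start of the 2000-2013 series
    (indexing convention assumed, not stated in the abstract)."""
    return 39.786 + 3.297 * x + 0.13 * x ** 2


# One plausible indexing for 2014-2018 would be X = 15..19
for x in range(15, 20):
    print(x, round(quadratic_trend(x), 1))
```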

  4. Prediction of Protein Hotspots from Whole Protein Sequences by a Random Projection Ensemble System

    Directory of Open Access Journals (Sweden)

    Jinjian Jiang

    2017-07-01

    Full Text Available Hotspot residues are important in the determination of protein-protein interactions, and they always perform specific functions in biological processes. Hotspot residues are commonly determined by alanine scanning mutagenesis experiments, which are costly and time consuming. To address this issue, computational methods have been developed. Most of them are structure based, i.e., they use the information of solved protein structures. However, the number of solved protein structures is far smaller than the number of sequences. Moreover, almost all of the predictors identify hotspots from the interfaces of protein complexes, seldom from whole protein sequences. Therefore, determining hotspots from whole protein sequences using sequence information alone is an urgent need. To address the issue of hotspot prediction from whole protein sequences, we propose an ensemble system with random projections that uses statistical physicochemical properties of amino acids. First, an encoding scheme involving sequence profiles of residues and physicochemical properties from the AAindex1 dataset is developed. The random projection technique is then adopted to project the encoded instances into a reduced space, and several better-performing random projections are obtained by training an IBk classifier on the training dataset; these are then applied to the test dataset, yielding the ensemble of random projection classifiers. Experimental results show that although the performance of our method is not yet good enough for real applications, it is very promising for the determination of hotspot residues from whole sequences.
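    The random projection step itself is simple: multiply each high-dimensional feature vector by a random Gaussian matrix to obtain a lower-dimensional representation. A minimal pure-Python sketch (the toy feature vectors stand in for the paper's encoded residue instances):

```python
import random


def random_projection(X, k, seed=0):
    """Project d-dimensional feature vectors to k dimensions using a
    random Gaussian matrix (Johnson-Lindenstrauss style), as in the
    ensemble's dimensionality-reduction step."""
    rng = random.Random(seed)
    d = len(X[0])
    R = [[rng.gauss(0, 1) / k ** 0.5 for _ in range(k)] for _ in range(d)]
    return [[sum(x[i] * R[i][j] for i in range(d)) for j in range(k)]
            for x in X]


# Toy encoded residue features: 4 instances, 10-dimensional
X = [[float(i + j) for j in range(10)] for i in range(4)]
Z = random_projection(X, k=3)
print(len(Z), len(Z[0]))  # 4 3
```

    An ensemble is built by repeating this with different seeds, training a classifier (IBk, i.e. k-nearest-neighbour, in the paper) on each projection, and keeping the projections that perform best.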

  5. A GLOBAL ASSESSMENT OF SOLAR ENERGY RESOURCES: NASA's Prediction of Worldwide Energy Resources (POWER) Project

    Science.gov (United States)

    Zhang, T.; Stackhouse, P. W., Jr.; Chandler, W.; Hoell, J. M.; Westberg, D.; Whitlock, C. H.

    2010-12-01

    NASA's POWER project, or the Prediction of Worldwide Energy Resources project, synthesizes and analyzes data on a global scale. The products of the project find valuable applications in the solar and wind energy sectors of the renewable energy industries. The primary source data for the POWER project are NASA's World Climate Research Project (WCRP)/Global Energy and Water cycle Experiment (GEWEX) Surface Radiation Budget (SRB) project (Release 3.0) and the Global Modeling and Assimilation Office (GMAO) Goddard Earth Observing System (GEOS) assimilation model (V 4.0.3). Users of the POWER products access the data through NASA's Surface meteorology and Solar Energy (SSE, Version 6.0) website (http://power.larc.nasa.gov). Over 200 parameters are available to the users. The spatial resolution is currently 1 degree by 1 degree, with finer resolutions planned. The data cover July 1983 to December 2007, a time span of 24.5 years, and are provided as 3-hourly, daily and monthly means. As of now, there have been over 18 million web hits and over 4 million data file downloads. The POWER products have been systematically validated against ground-based measurements, in particular data from the Baseline Surface Radiation Network (BSRN) archive, and also against the National Solar Radiation Data Base (NSRDB). Parameters such as minimum, maximum and daily mean temperature and dew point, relative humidity and surface pressure are validated against National Climate Data Center (NCDC) data. SSE feeds data directly into decision support systems, including the RETScreen International clean energy project analysis software, which is available in 36 languages and has more than 260,000 users worldwide.

  6. Impacts of Process and Prediction Uncertainties on Projected Hanford Waste Glass Amount

    Energy Technology Data Exchange (ETDEWEB)

    Gervasio, V.; Kim, D. S.; Vienna, J. D.; Kruger, A. A.

    2018-03-08

    Analyses were performed to evaluate the impacts of using the advanced glass models, constraints (Vienna et al. 2016), and uncertainty descriptions on projected Hanford glass mass. The maximum allowable waste oxide loading (WOL) was estimated for waste compositions while simultaneously satisfying all applicable glass property and composition constraints with sufficient confidence. Different components of prediction and composition/process uncertainties were systematically included in the calculations to evaluate their impacts on glass mass. The analyses estimated the production of 23,360 MT of immobilized high-level waste (IHLW) glass when no uncertainties were taken into account. Accounting for prediction and composition/process uncertainties resulted in 5.01 relative percent increase in estimated glass mass of 24,531 MT. Roughly equal impacts were found for prediction uncertainties (2.58 RPD) and composition/process uncertainties (2.43 RPD). The immobilized low-activity waste (ILAW) mass was predicted to be 282,350 MT without uncertainty and with waste loading “line” rules in place. Accounting for prediction and composition/process uncertainties resulted in only 0.08 relative percent increase in estimated glass mass of 282,562 MT. Without application of line rules the glass mass decreases by 10.6 relative percent (252,490 MT) for the case with no uncertainties. Addition of prediction uncertainties increases glass mass by 1.32 relative percent and the addition of composition/process uncertainties increase glass mass by an additional 7.73 relative percent (9.06 relative percent increase combined). The glass mass estimate without line rules (275,359 MT) was 2.55 relative percent lower than that with the line rules (282,562 MT), after accounting for all applicable uncertainties.

  7. Impacts of Process and Prediction Uncertainties on Projected Hanford Waste Glass Amount

    Energy Technology Data Exchange (ETDEWEB)

    Gervasio, Vivianaluxa [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vienna, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kim, Dong-Sang [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kruger, Albert A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2018-02-19

    Analyses were performed to evaluate the impacts of using the advanced glass models, constraints (Vienna et al. 2016), and uncertainty descriptions on projected Hanford glass mass. The maximum allowable WOL was estimated for waste compositions while simultaneously satisfying all applicable glass property and composition constraints with sufficient confidence. Different components of prediction and composition/process uncertainties were systematically included in the calculations to evaluate their impacts on glass mass. The analyses estimated the production of 23,360 MT of IHLW glass when no uncertainties were taken into account. Accounting for prediction and composition/process uncertainties resulted in a 5.01 relative percent increase in estimated glass mass, to 24,531 MT. Roughly equal impacts were found for prediction uncertainties (2.58 RPD) and composition/process uncertainties (2.43 RPD). ILAW mass was predicted to be 282,350 MT without uncertainty and with waste loading “line” rules in place. Accounting for prediction and composition/process uncertainties resulted in only a 0.08 relative percent increase in estimated glass mass, to 282,562 MT. Without application of the line rules, the glass mass decreases by 10.6 relative percent (252,490 MT) for the case with no uncertainties. Addition of prediction uncertainties increases glass mass by 1.32 relative percent, and the addition of composition/process uncertainties increases glass mass by an additional 7.73 relative percent (a 9.06 relative percent increase combined). The glass mass estimate without line rules (275,359 MT) was 2.55 relative percent lower than that with the line rules (282,562 MT), after accounting for all applicable uncertainties.

  8. A Hybrid Instance Selection Using Nearest-Neighbor for Cross-Project Defect Prediction

    Institute of Scientific and Technical Information of China (English)

    Duksan Ryu; Jong-In Jang; Jongmoon Baik

    2015-01-01

    Software defect prediction (SDP) is an active research field in software engineering that identifies defect-prone modules. Thanks to SDP, limited testing resources can be effectively allocated to defect-prone modules. Although SDP requires sufficient local data within a company, there are cases where local data are not available, e.g., pilot projects. Companies without local data can employ cross-project defect prediction (CPDP), using external data to build classifiers. The major challenge of CPDP is the different distributions between training and test data. To tackle this, instances of source data similar to target data are selected to build classifiers. Software datasets have a class imbalance problem, meaning the ratio of the defective class to the clean class is very low, which usually lowers the performance of classifiers. We propose a Hybrid Instance Selection Using Nearest-Neighbor (HISNN) method that performs hybrid classification, selectively learning local knowledge (via k-nearest neighbors) and global knowledge (via naïve Bayes). Instances having strong local knowledge are identified via nearest neighbors with the same class label. Previous studies showed low PD (probability of detection) or high PF (probability of false alarm), which is impractical in use. The experimental results show that HISNN produces high overall performance as well as high PD and low PF.
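
    The selective local/global decision rule described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the real HISNN also performs instance selection on the source data, and the k value, distance metric, and toy features here are arbitrary choices.

    ```python
    import math
    from collections import defaultdict

    def knn(train, x, k):
        """k nearest (features, label) pairs by Euclidean distance."""
        return sorted(train, key=lambda t: math.dist(t[0], x))[:k]

    def naive_bayes(train, x):
        """Gaussian naive Bayes over the full (global) training set."""
        groups = defaultdict(list)
        for feats, label in train:
            groups[label].append(feats)
        best, best_score = None, float("-inf")
        for label, rows in groups.items():
            score = math.log(len(rows) / len(train))          # log prior
            for j, xj in enumerate(x):
                vals = [r[j] for r in rows]
                mu = sum(vals) / len(vals)
                var = sum((v - mu) ** 2 for v in vals) / len(vals) + 1e-9
                score += -0.5 * math.log(2 * math.pi * var) - (xj - mu) ** 2 / (2 * var)
            if score > best_score:
                best, best_score = label, score
        return best

    def hisnn_predict(train, x, k=3):
        """If the k nearest neighbors agree (strong local knowledge), trust
        the local k-NN vote; otherwise fall back to the global model."""
        labels = [label for _, label in knn(train, x, k)]
        if len(set(labels)) == 1:
            return labels[0]
        return naive_bayes(train, x)

    # toy data: label 1 = defective cluster, label 0 = clean cluster
    train = [((0, 0), 1), ((0, 1), 1), ((1, 0), 1),
             ((10, 10), 0), ((10, 11), 0), ((11, 10), 0)]
    print(hisnn_predict(train, (0.5, 0.5)))   # deep inside cluster 1 -> local k-NN
    print(hisnn_predict(train, (5.5, 5.5)))   # neighbors disagree -> naive Bayes
    ```

    The design point is that a target instance surrounded by same-label source instances carries reliable local knowledge, while ambiguous instances are better served by a smoother global model.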

  9. On the reliability of predictions of geomechanical response - Project COSA in perspective

    International Nuclear Information System (INIS)

    Knowles, N.C.; Lowe, M.J.S.; Come, B.

    1990-01-01

    Project COSA (Comparison of computer codes for Salt) was set up by the CEC as an international benchmark exercise to compare the reliability of predictions of the thermo-mechanical response of HLW repositories in salt. The first phase (COSA I) was conducted between 1984 and 1986, and attention was directed at code verification issues. The second phase (COSA II), carried out in the period 1986-1988, addressed code validation and other issues. Specifically, a series of experimental heat and pressure tests carried out at the Asse Mine in West Germany were modelled and predictions of the thermo-mechanical behaviour were compared. Ten European organisations participated. A key feature of this exercise was that, as far as possible, the calculations were performed blind (i.e. without any knowledge of the observed behaviour), using the best information available a priori to describe the physical situation to be modelled. Interest centred on the various constitutive models (of material behaviour) for rock-salt and the assumptions about the in situ state of stress. The paper gives an overview of the project, presents some broad conclusions and attempts to assess their significance. 17 refs., 6 figs., 2 tabs

  10. Predicting outcome following psychological therapy in IAPT (PROMPT): a naturalistic project protocol.

    Science.gov (United States)

    Grant, Nina; Hotopf, Matthew; Breen, Gerome; Cleare, Anthony; Grey, Nick; Hepgul, Nilay; King, Sinead; Moran, Paul; Pariante, Carmine M; Wingrove, Janet; Young, Allan H; Tylee, André

    2014-06-09

    Depression and anxiety are highly prevalent and represent a significant and well described public health burden. Whilst first line psychological treatments are effective for nearly half of attenders, there remain a substantial number of patients who do not benefit. The main objective of the present project is to establish an infrastructure platform for the identification of factors that predict lack of response to psychological treatment for depression and anxiety, in order to better target treatments as well as to support translational and experimental medicine research in mood and anxiety disorders. Predicting outcome following psychological therapy in IAPT (PROMPT) is a naturalistic observational project that began patient recruitment in January 2014. The project is currently taking place in Southwark Psychological Therapies Service, an Improving Access to Psychological Therapies (IAPT) service currently provided by the South London and Maudsley NHS Foundation Trust (SLaM). However, the aim is to roll-out the project across other IAPT services. Participants are approached before beginning treatment and offered a baseline interview whilst they are waiting for therapy to begin. This allows us to test for relationships between predictor variables and patient outcome measures. At the baseline interview, participants complete a diagnostic interview; are asked to give blood and hair samples for relevant biomarkers, and complete psychological and social questionnaire measures. Participants then complete their psychological therapy as offered by Southwark Psychological Therapies Service. Response to psychological therapy will be measured using standard IAPT outcome data, which are routinely collected at each appointment. This project addresses a need to understand treatment response rates in primary care psychological therapy services for those with depression and/or anxiety. Measurement of a range of predictor variables allows for the detection of bio

  11. Prediction of irradiation damage effects by multi-scale modelling: EURATOM 6th Framework integrated project PERFECT

    International Nuclear Information System (INIS)

    Massoud, J.P.; Bugat, St.; Marini, B.; Lidbury, D.; Van Dyck, St.; Debarberis, L.

    2008-01-01

    Full text of publication follows. In nuclear PWRs, materials undergo degradation due to severe irradiation conditions that may limit their operational life. Utilities operating these reactors must quantify the aging and potential degradation of reactor pressure vessels and also of internal structures to ensure safe and reliable plant operation. The EURATOM 6th Framework Integrated Project PERFECT (Prediction of Irradiation Damage Effects in Reactor Components) addresses irradiation damage in RPV materials and components by multi-scale modelling. This state-of-the-art approach offers potential advantages over the conventional empirical methods used in current practice of nuclear plant lifetime management. Launched in January 2004, this 48-month project focuses on two main components of nuclear power plants which are subject to irradiation damage: the ferritic steel reactor pressure vessel and the austenitic steel internals. The project is also an opportunity to integrate the fragmented research and experience that currently exists within Europe in the field of numerical simulation of radiation damage, and it creates links with international organisations involved in similar projects throughout the world. Continuous progress in the physical understanding of the phenomena involved in irradiation damage, together with continuous progress in computer science, makes possible the development of multi-scale numerical tools able to simulate the effects of irradiation on material microstructure. The consequences of irradiation on the mechanical and corrosion properties of materials are also tentatively modelled using such multi-scale modelling. This requires developing different mechanistic models at different levels of physics and engineering and extending the state of knowledge in several scientific fields. The links between these different kinds of models are particularly delicate to deal with and need specific work. 
Practically the main objective of PERFECT is to build

  12. Accessories for Enhancement of the Semi-Automatic Welding Processes

    National Research Council Canada - National Science Library

    Wheeler, Douglas M; Sawhill, James M

    2000-01-01

    The project's objective is to identify specific areas of the semi-automatic welding operation that is performed with the major semi-automatic processes, which would be more productive if a suitable...

  13. Recording A Sunrise: A Citizen Science Project to Enhance Sunrise/set Prediction Times

    Science.gov (United States)

    Wilson, Teresa; Chizek Frouard, Malynda; Bartlett, Jennifer L.

    2017-01-01

    Smartphones, with their ever-increasing capabilities, are becoming quite instrumental for data acquisition in a number of fields. Understanding refraction and how it affects what we see on the horizon is no exception. Current algorithms that predict sunrise and sunset times have an error of one to four minutes at mid-latitudes (0° - 55° N/S) due to limitations in the atmospheric models they incorporate. At higher latitudes, slight changes in refraction can cause significant discrepancies, including difficulties determining when the Sun appears to rise or set. A thorough investigation of the problem requires a substantial data set of observed rise/set times and corresponding meteorological data from around the world, which is currently lacking. We have developed a mobile application so that these data can be taken using smartphones as part of a citizen science project. The app allows the viewer to submit a video of sunrise/set and attaches the geographic location along with meteorological data taken from a local weather station. The project will help increase scientific awareness in the public by allowing members of the community to participate in the data-taking process, and give them a greater awareness of the scientific significance of phenomena they witness every day. The data from the observations will lead to more complete rise/set models that will provide more accurate times to the benefit of astronomers, navigators, and outdoorsmen. The app will be available on the Google Play Store.

  14. The Accumulating Data to Optimally Predict Obesity Treatment (ADOPT) Core Measures Project: Rationale and Approach.

    Science.gov (United States)

    MacLean, Paul S; Rothman, Alexander J; Nicastro, Holly L; Czajkowski, Susan M; Agurs-Collins, Tanya; Rice, Elise L; Courcoulas, Anita P; Ryan, Donna H; Bessesen, Daniel H; Loria, Catherine M

    2018-04-01

    Individual variability in response to multiple modalities of obesity treatment is well documented, yet our understanding of why some individuals respond while others do not is limited. The etiology of this variability is multifactorial; however, at present, we lack a comprehensive evidence base to identify which factors or combination of factors influence treatment response. This paper provides an overview and rationale of the Accumulating Data to Optimally Predict obesity Treatment (ADOPT) Core Measures Project, which aims to advance the understanding of individual variability in response to adult obesity treatment. This project provides an integrated model for how factors in the behavioral, biological, environmental, and psychosocial domains may influence obesity treatment responses and identify a core set of measures to be used consistently across adult weight-loss trials. This paper provides the foundation for four companion papers that describe the core measures in detail. The accumulation of data on factors across the four ADOPT domains can inform the design and delivery of effective, tailored obesity treatments. ADOPT provides a framework for how obesity researchers can collectively generate this evidence base and is a first step in an ongoing process that can be refined as the science advances. © 2018 The Obesity Society.

  15. The DLR project Wirbelschleppe. Detecting, characterizing, controlling, attenuating, understanding, and predicting aircraft wake vortices

    Energy Technology Data Exchange (ETDEWEB)

    Holzaepfel, F. (ed.)

    2008-07-01

    This collection of reports presents an excerpt of the investigations that were performed in the framework of the DLR Projekt Wirbelschleppe. A similar sample of reports was presented as part of three dedicated wake vortex sessions at the 1st European Air and Space Conference (CEAS 2007) and Deutscher Luft- und Raumfahrtkongress 2007 in Berlin. The Projekt Wirbelschleppe was conducted in two phases in the time frame from 1999 to 2007, with five contributing DLR institutes: Institute of Atmospheric Physics, Institute of Aerodynamics and Flow Technology, Institute of Flight Systems, Institute of Flight Guidance, and Institute of Robotics and Mechatronics, together with the Institute of Aeronautics and Astronautics of the University of Technology Berlin. The project unified a multitude of different aspects and disciplines of wake vortex research, which can be characterized by four main themes: - minimization of wake vortices by measures at the aircraft; - development and demonstration of a system for wake vortex prediction and observation; - airborne wake vortex detection and active control; - integration of systems into air traffic control. The Projekt Wirbelschleppe greatly benefited from the European projects AWIATOR, ATC-Wake, Credos, C-Wake, Eurowake, FAR-Wake, FLYSAFE, I-Wake, S-Wake, WakeNet, WakeNet2-Europe, WakeNet3-Europe, and Wavenc. DLR's wake vortex activities will be continued in the Projekt Wetter und Fliegen (2008-2011). Because the current compilation represents only a limited extract of the accomplished work, it is completed by a list of references emerging from the project. (orig.)

  16. EKORISK project - an information system for prediction and expert evaluation of environmental impact

    International Nuclear Information System (INIS)

    Zaimov, V.; Antonov, A.

    1993-01-01

    The aim of this project is to create an expert system for prediction, evaluation and decision making support in case of accidents. The system consists of the following modules: 1) A data base containing information about the situation - geographical and demographical data for the region of the accident as well as data about the contaminants. The data about geographic objects (boundaries, rivers, roads, towns, soils, etc.) is managed and visualized by a geographic information system (GIS), which produces multi-layer geographical maps showing different viewpoints of the region of interest. Information about the pollutants, their use and storage, as well as data about the available resources for action in case of accidents, is stored in relational data bases which guarantee easy access, search, sorting and proper visualisation. 2) Prediction of the propagation of contamination, using actual meteorological information and applying mathematical models for propagation of the spilled substances in the air, water and ground. These models calculate the concentration of the substance as a function of time and distance from the initial spill location. The choice of the proper model is made by applying expert knowledge for evaluation of the situation and comparing the model characteristics. 3) Suggestion of actions for minimising the accident's impact. Expert knowledge is used for recommendations concerning deactivation of the region as well as actions for reducing the absorbed radiation doses of the population. The modern technologies for knowledge processing and the object-oriented approach ensure flexibility and integration of all subsystems. (author)
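
    The abstract does not name the dispersion models the prediction module uses. As an illustration of the kind of calculation such a module performs, here is the standard steady-state Gaussian plume formula with ground reflection; the linear dispersion coefficients below are a crude stand-in for the tabulated stability-class (e.g. Pasquill-Gifford) curves a real model would use:

    ```python
    import math

    def gaussian_plume(Q, u, x, y, z, H, a=0.08, b=0.06):
        """Steady-state Gaussian plume concentration at (x, y, z).
        Q: source strength (g/s), u: wind speed (m/s), H: effective release
        height (m). sigma_y/sigma_z grow linearly with downwind distance x."""
        sigma_y, sigma_z = a * x, b * x
        lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
        vertical = (math.exp(-(z - H) ** 2 / (2 * sigma_z ** 2))
                    + math.exp(-(z + H) ** 2 / (2 * sigma_z ** 2)))  # ground reflection
        return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

    # ground-level centerline concentration falls off far downwind of the source
    near = gaussian_plume(Q=100, u=5, x=500, y=0, z=0, H=20)
    far = gaussian_plume(Q=100, u=5, x=2000, y=0, z=0, H=20)
    print(near > far)  # True
    ```

    A decision-support system would evaluate such a function over a grid and overlay the resulting concentration contours on the GIS layers described above.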

  17. EU Framework 6 Project: Predictive Toxicology (PredTox)-overview and outcome

    International Nuclear Information System (INIS)

    Suter, Laura; Schroeder, Susanne; Meyer, Kirstin; Gautier, Jean-Charles; Amberg, Alexander; Wendt, Maria; Gmuender, Hans; Mally, Angela; Boitier, Eric; Ellinger-Ziegelbauer, Heidrun; Matheis, Katja; Pfannkuch, Friedlieb

    2011-01-01

    In this publication, we report the outcome of the integrated EU Framework 6 Project: Predictive Toxicology (PredTox), including methodological aspects and overall conclusions. Specific details including data analysis and interpretation are reported in separate articles in this issue. The project, partly funded by the EU, was carried out by a consortium of 15 pharmaceutical companies, 2 SMEs, and 3 universities. The effects of 16 test compounds were characterized using conventional toxicological parameters and 'omics' technologies. The three major observed toxicities, liver hypertrophy, bile duct necrosis and/or cholestasis, and kidney proximal tubular damage were analyzed in detail. The combined approach of 'omics' and conventional toxicology proved a useful tool for mechanistic investigations and the identification of putative biomarkers. In our hands and in combination with histopathological assessment, target organ transcriptomics was the most prolific approach for the generation of mechanistic hypotheses. Proteomics approaches were relatively time-consuming and required careful standardization. NMR-based metabolomics detected metabolite changes accompanying histopathological findings, providing limited additional mechanistic information. Conversely, targeted metabolite profiling with LC/GC-MS was very useful for the investigation of bile duct necrosis/cholestasis. In general, both proteomics and metabolomics were supportive of other findings. Thus, the outcome of this program indicates that 'omics' technologies can help toxicologists to make better informed decisions during exploratory toxicological studies. The data support that hypothesis on mode of action and discovery of putative biomarkers are tangible outcomes of integrated 'omics' analysis. Qualification of biomarkers remains challenging, in particular in terms of identification, mechanistic anchoring, appropriate specificity, and sensitivity.

  18. An Object-Based Approach to Evaluation of Climate Variability Projections and Predictions

    Science.gov (United States)

    Ammann, C. M.; Brown, B.; Kalb, C. P.; Bullock, R.

    2017-12-01

    Evaluations of the performance of earth system model predictions and projections are of critical importance to enhance usefulness of these products. Such evaluations need to address specific concerns depending on the system and decisions of interest; hence, evaluation tools must be tailored to inform about specific issues. Traditional approaches that summarize grid-based comparisons of analyses and models, or between current and future climate, often do not reveal important information about the models' performance (e.g., spatial or temporal displacements; the reason behind a poor score) and are unable to accommodate these specific information needs. For example, summary statistics such as the correlation coefficient or the mean-squared error provide minimal information to developers, users, and decision makers regarding what is "right" and "wrong" with a model. New spatial and temporal-spatial object-based tools from the field of weather forecast verification (where comparisons typically focus on much finer temporal and spatial scales) have been adapted to more completely answer some of the important earth system model evaluation questions. In particular, the Method for Object-based Diagnostic Evaluation (MODE) tool and its temporal (three-dimensional) extension (MODE-TD) have been adapted for these evaluations. More specifically, these tools can be used to address spatial and temporal displacements in projections of El Nino-related precipitation and/or temperature anomalies, ITCZ-associated precipitation areas, atmospheric rivers, seasonal sea-ice extent, and other features of interest. Examples of several applications of these tools in a climate context will be presented, using output of the CESM large ensemble. In general, these tools provide diagnostic information about model performance - accounting for spatial, temporal, and intensity differences - that cannot be achieved using traditional (scalar) model comparison approaches. Thus, they can provide more
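
    A minimal sketch of the object-based idea (not the actual MODE implementation, which adds smoothing, fuzzy matching, and many more object attributes): threshold a field, label connected components as "objects", and compare object centroids between two grids to expose a spatial displacement that a grid-point correlation would hide.

    ```python
    def find_objects(grid, thresh):
        """Label 4-connected components of grid cells >= thresh; return a list
        of (row_centroid, col_centroid, n_cells) object attributes."""
        rows, cols = len(grid), len(grid[0])
        seen, objects = set(), []
        for r in range(rows):
            for c in range(cols):
                if (r, c) in seen or grid[r][c] < thresh:
                    continue
                stack, cells = [(r, c)], []
                seen.add((r, c))
                while stack:  # depth-first flood fill of one object
                    i, j = stack.pop()
                    cells.append((i, j))
                    for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                        if (0 <= ni < rows and 0 <= nj < cols
                                and (ni, nj) not in seen and grid[ni][nj] >= thresh):
                            seen.add((ni, nj))
                            stack.append((ni, nj))
                objects.append((sum(i for i, _ in cells) / len(cells),
                                sum(j for _, j in cells) / len(cells),
                                len(cells)))
        return objects

    # an "observed" anomaly blob and a "modeled" blob displaced two columns east
    obs = [[0.0] * 5 for _ in range(5)]
    mod = [[0.0] * 5 for _ in range(5)]
    for r in (1, 2):
        for c in (1, 2):
            obs[r][c] = 1.0
            mod[r][c + 2] = 1.0
    (oy, ox, _), (my, mx, _) = find_objects(obs, 0.5)[0], find_objects(mod, 0.5)[0]
    print(mx - ox, my - oy)  # 2.0 0.0 -> pure eastward displacement
    ```

    Both grids here would score identically under a mean-squared-error comparison of totals, yet the object view immediately reports the two-column displacement, which is exactly the diagnostic information the paragraph above argues scalar scores cannot provide.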

  19. Aeroheating Testing and Predictions for Project Orion CEV at Turbulent Conditions

    Science.gov (United States)

    Hollis, Brian R.; Berger, Karen T.; Horvath, Thomas J.; Coblish, Joseph J.; Norris, Joseph D.; Lillard, Randolph P.; Kirk, Benjamin S.

    2009-01-01

    An investigation of the aeroheating environment of the Project Orion Crew Exploration Vehicle was performed in the Arnold Engineering Development Center Hypervelocity Wind Tunnel No. 9 Mach 8 and Mach 10 nozzles and in the NASA Langley Research Center 20-Inch Mach 6 Air Tunnel. Heating data were obtained using a thermocouple-instrumented approximately 0.035-scale model (0.1778-m/7-inch diameter) of the flight vehicle. Runs were performed in the Tunnel 9 Mach 10 nozzle at free stream unit Reynolds numbers of 1×10^6/ft to 20×10^6/ft, in the Tunnel 9 Mach 8 nozzle at free stream unit Reynolds numbers of 8×10^6/ft to 48×10^6/ft, and in the 20-Inch Mach 6 Air Tunnel at free stream unit Reynolds numbers of 1×10^6/ft to 7×10^6/ft. In both facilities, enthalpy levels were low and the test gas (N2 in Tunnel 9 and air in the 20-Inch Mach 6) behaved as a perfect gas. These test conditions produced laminar, transitional and turbulent data in the Tunnel 9 Mach 10 nozzle, transitional and turbulent data in the Tunnel 9 Mach 8 nozzle, and laminar and transitional data in the 20-Inch Mach 6 Air Tunnel. Laminar and turbulent predictions were generated for all wind tunnel test conditions and comparisons were performed with the experimental data to help define the accuracy of the computational method. In general, it was found that both laminar and turbulent data and predictions agreed to within the estimated 12% experimental uncertainty. Laminar heating distributions from all three data sets were shown to correlate well and demonstrated Reynolds number independence when expressed in terms of the Stanton number based on adiabatic wall-recovery enthalpy. Transition onset locations on the leeside centerline were determined from the data and correlated in terms of boundary-layer parameters. Finally, turbulent heating augmentation ratios were determined for several body-point locations and correlated in terms of the

  20. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Automatic systems have brought many revolutions to existing technologies. One technology seeing greater development is the solar-powered automatic shrimp feeding system: solar power, a renewable energy source, can be an alternative solution to the energy crisis while also reducing manpower through automatic operation. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar-powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer, set to intervals preferred by the user, that undergoes a continuous process. A magnetic contactor acts as a switch connected to the 10-hour timer and controls the activation or termination of electrical loads; it is powered by a solar panel outputting electrical power, with a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  1. Watershed-scale evaluation of the Water Erosion Prediction Project (WEPP) model in the Lake Tahoe basin

    Science.gov (United States)

    Erin S. Brooks; Mariana Dobre; William J. Elliot; Joan Q. Wu; Jan Boll

    2016-01-01

    Forest managers need methods to evaluate the impacts of management at the watershed scale. The Water Erosion Prediction Project (WEPP) has the ability to model disturbed forested hillslopes, but has difficulty addressing some of the critical processes that are important at a watershed scale, including baseflow and water yield. In order to apply WEPP to...

  2. Projected climate change impacts and short term predictions on staple crops in Sub-Saharan Africa

    Science.gov (United States)

    Mereu, V.; Spano, D.; Gallo, A.; Carboni, G.

    2013-12-01

    Multiple combinations of soils and climate conditions, crop management and varieties were considered for the different Agro-Ecological Zones. The climate impact was assessed using future climate predictions, statistically and/or dynamically downscaled, for specific areas. Direct and indirect effects of the different CO2 concentrations projected for the future periods were explored separately to estimate their effects on crops. Several adaptation strategies (e.g., introduction of full irrigation, shift of the ordinary sowing/planting date, changes in the ordinary fertilization management) were also evaluated with the aim of reducing the negative impact of climate change on crop production. The results of the study, analyzed at local, AEZ and country level, will be discussed.

  3. The RadGenomics project. Prediction for radio-susceptibility of individuals with genetic predisposition

    International Nuclear Information System (INIS)

    Imai, Takashi

    2003-01-01

    The ultimate goal of our project, named RadGenomics, is to elucidate the heterogeneity of the response to ionizing radiation arising from genetic variation among individuals, for the purpose of developing personalized radiation therapy regimens for cancer patients. Cancer patients exhibit patient-to-patient variability in normal tissue reactions after radiotherapy. Several observations support the hypothesis that the radiosensitivity of normal tissue is influenced by genetic factors. The rapid progression of human genome sequencing and the recent development of new technologies in molecular biology are providing new opportunities for elucidating the genetic basis of individual differences in susceptibility to radiation exposure. The development of a sufficiently robust, predictive assay enabling individual dose adjustment would improve the outcome of radiation therapy in patients. Our strategy for identification of DNA polymorphisms that contribute to individual radiosensitivity is as follows. First, we have been categorizing DNA samples obtained from cancer patients, who have been kindly introduced to us through many collaborators, according to their clinical characteristics, including the method and effect of treatment and side effects as scored by toxicity criteria, and also the result of an in vitro radiosensitivity assay, e.g., the micronuclei assay of their lymphocytes. Second, we have identified candidate genes for genotyping mainly by using our custom-designed oligonucleotide array with RNA samples, in which the probes were obtained from more than 40 cancer and 3 fibroblast cell lines whose radiosensitivity level was quite heterogeneous. We have also been studying the modification of proteins after irradiation of cells, which may be caused mainly by phosphorylation or dephosphorylation, using mass spectrometry. Genes encoding the modified proteins and/or other proteins with which they interact, such as specific protein kinases and phosphatases, are also

  4. Inhibition, Updating Working Memory, and Shifting Predict Reading Disability Symptoms in a Hybrid Model: Project KIDS.

    Science.gov (United States)

    Daucourt, Mia C; Schatschneider, Christopher; Connor, Carol M; Al Otaiba, Stephanie; Hart, Sara A

    2018-01-01

    Recent achievement research suggests that executive function (EF), a set of regulatory processes that control both thought and action necessary for goal-directed behavior, is related to typical and atypical reading performance. This project examines the relation of EF, as measured by its components, Inhibition, Updating Working Memory, and Shifting, with a hybrid model of reading disability (RD). Our sample included 420 children who participated in a broader intervention project when they were in KG-third grade (age M = 6.63 years, SD = 1.04 years, range = 4.79-10.40 years). At the time their EF was assessed, using a parent-report Behavior Rating Inventory of Executive Function (BRIEF), they had a mean age of 13.21 years ( SD = 1.54 years; range = 10.47-16.63 years). The hybrid model of RD was operationalized as a composite consisting of four symptoms, and set so that any child could have any one, any two, any three, any four, or none of the symptoms included in the hybrid model. The four symptoms include low word reading achievement, unexpected low word reading achievement, poorer reading comprehension compared to listening comprehension, and dual-discrepancy response-to-intervention, requiring both low achievement and low growth in word reading. The results of our multilevel ordinal logistic regression analyses showed a significant relation between all three components of EF (Inhibition, Updating Working Memory, and Shifting) and the hybrid model of RD, and that the strength of EF's predictive power for RD classification was the highest when RD was modeled as having at least one or more symptoms. Importantly, the chances of being classified as having RD increased as EF performance worsened and decreased as EF performance improved. 
The question of whether any one EF component would emerge as a superior predictor was also examined and results showed that Inhibition, Updating Working Memory, and Shifting were equally valuable as predictors of the hybrid model of RD
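
    The hybrid composite described above reduces to a simple counting rule; a minimal sketch of that operationalization (the symptom encoding as a four-element binary list is our own, assumed for illustration):

    ```python
    def rd_classification(symptoms, min_symptoms=1):
        """Hybrid-model RD status: True if at least `min_symptoms` of the four
        binary indicators are present. Assumed symptom order:
        [low word reading, unexpected low word reading,
         reading < listening comprehension, dual-discrepancy RTI]."""
        assert len(symptoms) == 4
        return sum(symptoms) >= min_symptoms

    print(rd_classification([1, 0, 0, 0]))     # True: "one or more symptoms"
    print(rd_classification([0, 0, 0, 0]))     # False: no symptoms present
    print(rd_classification([1, 1, 0, 0], 3))  # False under a stricter cutoff
    ```

    Varying `min_symptoms` from 1 to 4 reproduces the family of RD definitions the study compares, from the most inclusive (any one symptom) to the most stringent (all four).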

  5. Inhibition, Updating Working Memory, and Shifting Predict Reading Disability Symptoms in a Hybrid Model: Project KIDS

    Directory of Open Access Journals (Sweden)

    Mia C. Daucourt

    2018-03-01

    Full Text Available Recent achievement research suggests that executive function (EF), a set of regulatory processes that control both thought and action necessary for goal-directed behavior, is related to typical and atypical reading performance. This project examines the relation of EF, as measured by its components, Inhibition, Updating Working Memory, and Shifting, with a hybrid model of reading disability (RD). Our sample included 420 children who participated in a broader intervention project when they were in KG-third grade (age M = 6.63 years, SD = 1.04 years, range = 4.79–10.40 years). At the time their EF was assessed, using a parent-report Behavior Rating Inventory of Executive Function (BRIEF), they had a mean age of 13.21 years (SD = 1.54 years; range = 10.47–16.63 years). The hybrid model of RD was operationalized as a composite consisting of four symptoms, and set so that any child could have any one, any two, any three, any four, or none of the symptoms included in the hybrid model. The four symptoms include low word reading achievement, unexpected low word reading achievement, poorer reading comprehension compared to listening comprehension, and dual-discrepancy response-to-intervention, requiring both low achievement and low growth in word reading. The results of our multilevel ordinal logistic regression analyses showed a significant relation between all three components of EF (Inhibition, Updating Working Memory, and Shifting) and the hybrid model of RD, and that the strength of EF's predictive power for RD classification was the highest when RD was modeled as having at least one or more symptoms. Importantly, the chances of being classified as having RD increased as EF performance worsened and decreased as EF performance improved. The question of whether any one EF component would emerge as a superior predictor was also examined and results showed that Inhibition, Updating Working Memory, and Shifting were equally valuable as predictors of the hybrid model of RD.

  6. Evaluation of numerical weather predictions performed in the context of the project DAPHNE

    Science.gov (United States)

    Tegoulias, Ioannis; Pytharoulis, Ioannis; Bampzelis, Dimitris; Karacostas, Theodore

    2014-05-01

    The region of Thessaly in central Greece is one of the main areas of agricultural production in Greece. Severe weather phenomena affect the agricultural production in this region with adverse effects for farmers and the national economy. For this reason, the project DAPHNE aims to tackle the problem of drought by means of weather modification, through the development of the necessary tools to support the application of a rainfall enhancement program. In the present study the numerical weather prediction system WRF-ARW is used in order to assess its ability to represent extreme weather phenomena in the region of Thessaly. WRF is integrated in three domains covering Europe, the Eastern Mediterranean and Central-Northern Greece (Thessaly and a large part of Macedonia) using telescoping nesting with grid spacings of 15 km, 5 km and 1.667 km, respectively. The cases examined span the transitional and warm period (April to September) of the years 2008 to 2013, including days with thunderstorm activity. Model results are evaluated against all available surface observations and radar products, taking into account the spatial characteristics and intensity of the storms. Preliminary results indicate a good level of agreement between the simulated and observed fields as far as the standard parameters (such as temperature, humidity and precipitation) are concerned. Moreover, the model generally exhibits a potential to represent the occurrence of the convective activity, but not its exact spatiotemporal characteristics. Acknowledgements: This research work has been co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness and Entrepreneurship" and Regions in Transition (OPC II, NSRF 2007-2013).
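    Station-level verification of such simulations against surface observations reduces, at its simplest, to paired error statistics such as bias and RMSE; a minimal sketch (the temperature pairs below are invented for illustration, not DAPHNE data):

```python
import math

def verify(forecast, observed):
    """Return (bias, RMSE) of paired forecast/observation values."""
    errors = [f - o for f, o in zip(forecast, observed)]
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rmse

# Illustrative 2 m temperature pairs (deg C) at hypothetical stations
fcst = [21.3, 24.1, 19.8, 22.5]
obs = [20.9, 24.6, 19.1, 22.8]
bias, rmse = verify(fcst, obs)
print(f"bias={bias:+.2f} C, rmse={rmse:.2f} C")
```

    Categorical scores for convective occurrence (hit/miss tables) would be layered on top of such point statistics.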

  7. FY 1975 Report on results of Sunshine Project. Development of techniques of digging high-temperature beds (Conceptual designs of automatic digging systems); 1975 nendo koon chiso kussaku gijutsu no kaihatsu seika hokokusho. Jido kussaku system no gainen sekkei

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1976-03-31

    This project is aimed at the development of an automatic rotary-table digging apparatus capable of digging high-temperature beds of 3,000 to 5,000 m in depth, 400 degrees C and 500 kg/cm{sup 2}. The automatic excavation apparatus is designed on the concept that the driller is freed from running the drawworks for other work. However, it is operated manually in the traditional manner when the bed is under complex conditions, or when the bed properties require frequent changes in digging conditions with respect to bit load and speed of rotation. When the bed is considered to be under essentially constant conditions at a fairly high depth, the apparatus works in such a way that a combination of bit load and speed of rotation is set by the driller, and the work line wound on the drawworks drum is let out to keep the bit load constant. It is equipped with a controller, converter and brake operating device. The controller detects the suspension load; when the load increases, it accelerates the air motor after comparing the load with the reference level and relaxes the brake, to keep the suspension load at the set level. When the suspension load decreases, the air motor is decelerated and the brake tightened. (NEDO)
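    The constant-bit-load behaviour described above is essentially a set-point controller with a dead band; a minimal sketch of that decision logic (the set point, tolerance and action names are invented for illustration, not taken from the report):

```python
SET_LOAD = 100.0  # set suspension load (arbitrary units)
DEADBAND = 2.0    # tolerance band around the set level

def control_step(suspension_load):
    """Mirror the report's logic: compare the measured suspension load
    with the set level and return the (air motor, brake) action."""
    if suspension_load > SET_LOAD + DEADBAND:
        # load increased: speed up the air motor and relax the brake
        return ("accelerate_motor", "relax_brake")
    if suspension_load < SET_LOAD - DEADBAND:
        # load decreased: slow the air motor and tighten the brake
        return ("decelerate_motor", "tighten_brake")
    return ("hold", "hold")

print(control_step(105.0))
print(control_step(97.0))
```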

  8. Physics-based Space Weather Forecasting in the Project for Solar-Terrestrial Environment Prediction (PSTEP) in Japan

    Science.gov (United States)

    Kusano, K.

    2016-12-01

    The Project for Solar-Terrestrial Environment Prediction (PSTEP) is a recently launched Japanese nationwide research collaboration. PSTEP aims to develop a synergistic interaction between predictive and scientific studies of the solar-terrestrial environment and to establish the basis for next-generation space weather forecasting using state-of-the-art observation systems and physics-based models. For this project, we coordinate four research groups, which develop (1) the integration of the space weather forecast system, (2) the physics-based solar storm prediction, (3) the predictive models of magnetosphere and ionosphere dynamics, and (4) the model of solar cycle activity and its impact on climate, respectively. In this project, we will build a coordinated physics-based model to answer the fundamental questions concerning the onset of solar eruptions and the mechanism for radiation belt dynamics in the Earth's magnetosphere. In this paper, we show the strategy of PSTEP and discuss the role and prospects of the physics-based space weather forecasting system being developed by PSTEP.

  9. Automatic Adviser on stationary devices status identification and anticipated change

    Science.gov (United States)

    Shabelnikov, A. N.; Liabakh, N. N.; Gibner, Ya M.; Pushkarev, E. A.

    2018-05-01

    The task is defined of synthesizing an Automatic Adviser that identifies the status of stationary automation-system devices using an autoregressive model of changes in their key parameters. The choice of the applied model type is justified, and an algorithm for the monitoring process of the research objects was developed. A complex for simulating the status operation of mobile objects and analyzing prediction results is proposed. The research results are illustrated with a specific example of a hump yard compressor station. The work was supported by the Russian Fundamental Research Fund, project No. 17-20-01040.
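    As a rough illustration of the autoregressive approach mentioned above, an AR(1) model of a single key parameter can be fit by least squares and used for one-step-ahead prediction; the "compressor" series below is simulated, not data from the paper:

```python
import random

def fit_ar1(series):
    """Least-squares fit of x[t] = a * x[t-1] + b."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    return a, b

# Simulated key parameter of a monitored device (true process: a=0.9, b=5)
random.seed(0)
series = [50.0]
for _ in range(200):
    series.append(0.9 * series[-1] + 5.0 + random.gauss(0, 0.5))

a, b = fit_ar1(series)
next_value = a * series[-1] + b  # one-step-ahead prediction
print(round(a, 3), round(next_value, 1))
```

    A large gap between predicted and measured values on later samples would then flag an anticipated status change.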

  10. Predicting the 10-Year Risks of Atherosclerotic Cardiovascular Disease in Chinese Population: The China-PAR Project (Prediction for ASCVD Risk in China).

    Science.gov (United States)

    Yang, Xueli; Li, Jianxin; Hu, Dongsheng; Chen, Jichun; Li, Ying; Huang, Jianfeng; Liu, Xiaoqing; Liu, Fangchao; Cao, Jie; Shen, Chong; Yu, Ling; Lu, Fanghong; Wu, Xianping; Zhao, Liancheng; Wu, Xigui; Gu, Dongfeng

    2016-11-08

    The accurate assessment of individual risk can be of great value to guiding and facilitating the prevention of atherosclerotic cardiovascular disease (ASCVD). However, prediction models in common use were formulated primarily in white populations. The China-PAR project (Prediction for ASCVD Risk in China) is aimed at developing and validating 10-year risk prediction equations for ASCVD from 4 contemporary Chinese cohorts. Two prospective studies followed up together with a unified protocol were used as the derivation cohort to develop 10-year ASCVD risk equations in 21 320 Chinese participants. The external validation was evaluated in 2 independent Chinese cohorts with 14 123 and 70 838 participants. Furthermore, model performance was compared with the Pooled Cohort Equations reported in the American College of Cardiology/American Heart Association guideline. Over 12 years of follow-up in the derivation cohort with 21 320 Chinese participants, 1048 subjects developed a first ASCVD event. Sex-specific equations had C statistics of 0.794 (95% confidence interval, 0.775-0.814) for men and 0.811 (95% confidence interval, 0.787-0.835) for women. The predicted rates were similar to the observed rates, as indicated by a calibration χ² of 13.1 for men (P=0.16) and 12.8 for women (P=0.17). Good internal and external validations of our equations were achieved in subsequent analyses. Compared with the Chinese equations, the Pooled Cohort Equations had lower C statistics and much higher calibration χ² values in men. Our project developed effective tools with good performance for 10-year ASCVD risk prediction among a Chinese population that will help to improve the primary prevention and management of cardiovascular disease. © 2016 American Heart Association, Inc.
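    The C statistic reported above is the probability that, for a random pair of one event and one non-event subject, the model assigns the higher risk to the event subject. A self-contained sketch for the binary case, ignoring the censoring that a survival-data C index must additionally handle (scores and events are toy values, not China-PAR data):

```python
def c_statistic(scores, events):
    """Concordance over all (event, non-event) pairs; ties count 0.5."""
    pos = [s for s, e in zip(scores, events) if e == 1]
    neg = [s for s, e in zip(scores, events) if e == 0]
    concordant = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                concordant += 1.0
            elif p == q:
                concordant += 0.5
    return concordant / (len(pos) * len(neg))

# Toy predicted 10-year risks and observed outcomes (1 = ASCVD event)
scores = [0.02, 0.08, 0.15, 0.30, 0.05, 0.22]
events = [0, 1, 1, 0, 0, 1]
auc = c_statistic(scores, events)
print(round(auc, 3))
```

    A value of 0.5 means no discrimination; the 0.79-0.81 reported above indicates good separation of future cases from non-cases.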

  11. What Automaticity Deficit? Activation of Lexical Information by Readers with Dyslexia in a Rapid Automatized Naming Stroop-Switch Task

    Science.gov (United States)

    Jones, Manon W.; Snowling, Margaret J.; Moll, Kristina

    2016-01-01

    Reading fluency is often predicted by rapid automatized naming (RAN) speed, which as the name implies, measures the automaticity with which familiar stimuli (e.g., letters) can be retrieved and named. Readers with dyslexia are considered to have less "automatized" access to lexical information, reflected in longer RAN times compared with…

  12. Differential subsidence and its effect on subsurface infrastructure: predicting probability of pipeline failure (STOOP project)

    Science.gov (United States)

    de Bruijn, Renée; Dabekaussen, Willem; Hijma, Marc; Wiersma, Ane; Abspoel-Bukman, Linda; Boeije, Remco; Courage, Wim; van der Geest, Johan; Hamburg, Marc; Harmsma, Edwin; Helmholt, Kristian; van den Heuvel, Frank; Kruse, Henk; Langius, Erik; Lazovik, Elena

    2017-04-01

    Due to heterogeneity of the subsurface in the delta environment of the Netherlands, differential subsidence over short distances results in tension and subsequent wear of subsurface infrastructure, such as water and gas pipelines. Due to uncertainties in the build-up of the subsurface, however, it is unknown where this problem is the most prominent. This is a problem for asset managers deciding when a pipeline needs replacement: damaged pipelines endanger security of supply and pose a significant threat to safety, yet premature replacement raises needless expenses. In both cases, costs - financial or other - are high. Therefore, an interdisciplinary research team of geotechnicians, geologists and Big Data engineers from research institutes TNO, Deltares and SkyGeo developed a stochastic model to predict differential subsidence and the probability of consequent pipeline failure on a (sub-)street level. In this project pipeline data from company databases is combined with a stochastic geological model and information on (historical) groundwater levels and overburden material. Probability of pipeline failure is modelled by a coupling with a subsidence model and two separate models on pipeline behaviour under stress, using a probabilistic approach. The total length of pipelines (approx. 200,000 km operational in the Netherlands) and the complexity of the model chain that is needed to calculate a chance of failure result in large computational challenges, as massive evaluation of possible scenarios is required to reach the required level of confidence. To cope with this, a scalable computational infrastructure has been developed, composing a model workflow in which components have a heterogeneous technological basis. Three pilot areas covering an urban, a rural and a mixed environment, characterised by different groundwater-management strategies and different overburden histories, are used to evaluate the differences in subsidence and uncertainties that come with
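    The "massive evaluation of possible scenarios" is, in essence, Monte Carlo estimation of a failure probability; a minimal sketch with an invented limit state (settlement demand vs. pipe-joint tolerance), not the STOOP model chain:

```python
import random

def failure_probability(n_trials, seed=42):
    """Estimate P(differential settlement exceeds what a pipe joint
    tolerates) by sampling the uncertain inputs."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        # Illustrative distributions for the uncertain quantities
        settlement = rng.lognormvariate(1.0, 0.5)  # mm over the pipe span
        tolerance = rng.gauss(6.0, 1.0)            # mm the joint tolerates
        if settlement > tolerance:
            failures += 1
    return failures / n_trials

p_fail = failure_probability(100_000)
print(p_fail)
```

    The real model chain replaces these two draws with the coupled subsidence and pipe-stress models, which is what makes the computation expensive.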

  13. Specifying general activity clusters for ERP projects aimed at effort prediction

    NARCIS (Netherlands)

    Janssens, G.; Kusters, R.J.; Heemstra, F.J.; Gunasekaran, A.; Shea, T.

    2010-01-01

    ERP implementation projects affect large parts of an implementing organization and lead to changes in the way an organization performs its tasks. The costs needed for the effort to implement these systems are hard to estimate. Research indicates that the size of an ERP project can be a useful

  14. Predicting and Mapping Potential Whooping Crane Stopover Habitat to Guide Site Selection for Wind Energy Projects

    Science.gov (United States)

    Migration is one of the most poorly understood components of a bird’s life cycle. For that reason, migratory stopover habitats are often not part of conservation planning and may be overlooked when planning new development projects. This project highlights and addresses an overl...

  15. An empirical evaluation of classification algorithms for fault prediction in open source projects

    Directory of Open Access Journals (Sweden)

    Arvinder Kaur

    2018-01-01

    Full Text Available Creating software with high quality has become difficult these days given that the size and complexity of the developed software are high. Predicting the quality of software in early phases helps to reduce testing resources. Various statistical and machine learning techniques are used for prediction of the quality of the software. In this paper, six machine learning models have been used for software quality prediction on five open source software projects. A variety of metrics has been evaluated for the software, including the C&K, Henderson & Sellers, and McCabe metric suites. Results show that Random Forest and Bagging produce good results while Naïve Bayes is the least preferable for prediction.
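    Bagging, one of the two best-performing techniques reported, trains each base learner on a bootstrap resample and predicts by majority vote. A from-scratch sketch using single-metric decision stumps on invented complexity/fault data (in practice one would use scikit-learn's Bagging or Random Forest implementations):

```python
import random

def train_stump(xs, ys):
    """Pick the threshold on one metric that best separates the classes."""
    best_t, best_acc = xs[0], 0.0
    for t in sorted(set(xs)):
        acc = sum((x > t) == (y == 1) for x, y in zip(xs, ys)) / len(xs)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def bagged_predict(xs, ys, x_new, n_estimators=25, seed=1):
    """Majority vote of stumps, each trained on a bootstrap resample."""
    rng = random.Random(seed)
    n = len(xs)
    votes = 0
    for _ in range(n_estimators):
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap resample
        t = train_stump([xs[i] for i in idx], [ys[i] for i in idx])
        votes += x_new > t
    return votes > n_estimators / 2

# Invented data: cyclomatic complexity vs. fault-proneness (1 = faulty)
complexity = [2, 3, 4, 5, 11, 13, 15, 20]
faulty = [0, 0, 0, 0, 1, 1, 1, 1]
print(bagged_predict(complexity, faulty, x_new=12))
print(bagged_predict(complexity, faulty, x_new=3))
```

    Averaging over resamples is what stabilizes the high-variance base learners; Naïve Bayes, by contrast, relies on an independence assumption the code metrics often violate.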

  16. Simulation and Prediction of Groundwater Pollution from Planned Feed Additive Project in Nanning City Based on GMS Model

    Science.gov (United States)

    Liang, Yimin; Lan, Junkang; Wen, Zhixiong

    2018-01-01

    In order to predict the pollution of underground aquifers and rivers by the proposed project, a specialized hydrogeological investigation was carried out. Through hydrogeological surveying and mapping, drilling, and groundwater level monitoring, the extent of the hydrogeological unit and the regional hydrogeological conditions were determined. The permeability coefficients of the aquifers were also obtained by borehole water injection tests. In order to predict the impact on the groundwater environment by the project, GMS software was used for numerical simulation. The simulation results show that if an unexpected sewage leakage accident happens, the pollutants will be gradually diluted by groundwater, and the diluted contaminants will slowly spread southeast with the groundwater flow, eventually discharging into the Gantang River. However, the process of the pollutants discharging into the river is very long, and the long-term dilution of the river water will keep the Gantang River from being polluted.

  17. Finding weak points automatically

    International Nuclear Information System (INIS)

    Archinger, P.; Wassenberg, M.

    1999-01-01

    Operators of nuclear power stations have to carry out material tests on selected components at regular intervals. A fully automated test, which achieves clearly higher reproducibility than partially automated variants, would therefore provide a solution. In addition, the fully automated test reduces the radiation dose for the test person. (orig.) [de]

  18. Phonological Awareness and Rapid Automatized Naming Predicting Early Development in Reading and Spelling: Results from a Cross-Linguistic Longitudinal Study

    Science.gov (United States)

    Furnes, Bjarte; Samuelsson, Stefan

    2010-01-01

    In this study, the relationship between latent constructs of phonological awareness (PA) and rapid automatized naming (RAN) was investigated and related to later measures of reading and spelling in children learning to read in different alphabetic writing systems (i.e., Norwegian/Swedish vs. English). 750 U.S./Australian children and 230 Scandinavian children were followed longitudinally between kindergarten and 2nd grade. PA and RAN were measured in kindergarten and Grade 1, while word recognition, phonological decoding, and spelling were measured in kindergarten, Grade 1, and Grade 2. In general, high stability was observed for the various reading and spelling measures, such that little additional variance was left open for PA and RAN. However, results demonstrated that RAN was more related to reading than spelling across orthographies, with the opposite pattern shown for PA. In addition, tests of measurement invariance showed that the factor loadings of each observed indicator on the latent PA factor were the same across U.S./Australia and Scandinavia. Similar findings were obtained for RAN. In general, tests of structural invariance show that models of early literacy development are highly transferable across languages. PMID:21359098

  19. IRSS: a web-based tool for automatic layout and analysis of IRES secondary structure prediction and searching system in silico

    Directory of Open Access Journals (Sweden)

    Hong Jun-Jie

    2009-05-01

    Full Text Available Abstract Background Internal ribosomal entry sites (IRESs) provide alternative, cap-independent translation initiation sites in eukaryotic cells. IRES elements are important factors in viral genomes and are also useful tools for bi-cistronic expression vectors. Most existing RNA structure prediction programs are unable to deal with IRES elements. Results We designed an IRES search system, named IRSS, to obtain better results for IRES prediction. RNA secondary structure prediction and comparison software programs were implemented to construct our two-stage strategy for the IRSS. Two software programs formed the backbone of IRSS: the RNAL fold program, used to predict local RNA secondary structures by the minimum free energy method; and the RNA Align program, used to compare predicted structures. After a complete viral genome database search, IRSS has a low error rate and up to 72.3% sensitivity with appropriate parameters. Conclusion IRSS is freely available at this website http://140.135.61.9/ires/. In addition, all source codes, precompiled binaries, examples and documentations are downloadable for local execution. This new search approach for IRES elements will provide a useful research tool on IRES related studies.
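    As a hedged stand-in for the minimum-free-energy folding step performed by RNAL fold, the classic Nussinov dynamic program below maximizes the number of base pairs, a simplification of the thermodynamic models real folding software uses:

```python
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def nussinov(seq, min_loop=3):
    """Maximum base-pair count over the whole sequence, by dynamic
    programming: base i is either unpaired or paired with some k."""
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                       # i left unpaired
            for k in range(i + min_loop + 1, j + 1):  # i pairs with k
                if (seq[i], seq[k]) in PAIRS:
                    right = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, 1 + dp[i + 1][k - 1] + right)
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov("GGGAAAUCC"))
```

    Energy-based methods score stacked pairs and loop penalties instead of raw pair counts, which is why dedicated programs are used in practice.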

  20. Projectables

    DEFF Research Database (Denmark)

    Rasmussen, Troels A.; Merritt, Timothy R.

    2017-01-01

    CNC cutting machines have become essential tools for designers and architects enabling rapid prototyping, model-building and production of high quality components. Designers often cut from new materials, discarding the irregularly shaped remains. We introduce ProjecTables, a visual augmented reality system for interactive packing of model parts onto sheet materials. ProjecTables enables designers to (re)use scrap materials for CNC cutting that would have been previously thrown away, at the same time supporting aesthetic choices related to wood grain, avoiding surface blemishes, and other relevant material properties. We conducted evaluations of ProjecTables with design students from Aarhus School of Architecture, demonstrating that participants could quickly and easily place and orient model parts reducing material waste. Contextual interviews and ideation sessions led to a deeper...

  1. Predicting fatigue service life extension of RC bridges with externally bonded CFRP repairs : [project brief].

    Science.gov (United States)

    2015-12-01

    Externally bonded carbon fiber reinforced polymer composites (CFRPs) are increasingly used to repair concrete bridges. CFRP design techniques are a proven approach for enhancing the strength of existing structures. This project investigated the d...

  2. Automatic meter reading and PowerPlus services: Concept to implementation

    Energy Technology Data Exchange (ETDEWEB)

    Perks, D.R. [Alberta Power Ltd., Edmonton, AB (Canada)]

    1995-12-31

    The Distribution Control System Inc.'s Two Way Automatic Communication System (TWACS) was implemented with GE Canada's 170S automatic meter reader (AMR) at Alberta Power Ltd. Core and extended features are being marketed as PowerPlus{sup TM}. The technology used in the systems, design philosophy, system components, outbound communication, inbound communication, throughput, and AMR were described. Objectives for the pilot project were to test reliability, accuracy and cost of implementation. The scope of the pilot and the project results were presented. Business aspects of PowerPlus{sup TM} marketing were described. The implementation schedule, constraints, technical problems, training, communication plan, strategy and 1994 year-end status of the project were reviewed. Plans for continued development were described. It was predicted that the versatility of the TWACS system and the hard work of every department of Alberta Power will ensure the complete success of the implementation program. 5 figs.

  3. Demographic models and IPCC climate projections predict the decline of an emperor penguin population

    Science.gov (United States)

    Jenouvrier, Stéphanie; Caswell, Hal; Barbraud, Christophe; Holland, Marika; Stroeve, Julienne; Weimerskirch, Henri

    2009-01-01

    Studies have reported important effects of recent climate change on Antarctic species, but there has been to our knowledge no attempt to explicitly link those results to forecasted population responses to climate change. Antarctic sea ice extent (SIE) is projected to shrink as concentrations of atmospheric greenhouse gases (GHGs) increase, and emperor penguins (Aptenodytes forsteri) are extremely sensitive to these changes because they use sea ice as a breeding, foraging and molting habitat. We project emperor penguin population responses to future sea ice changes, using a stochastic population model that combines a unique long-term demographic dataset (1962–2005) from a colony in Terre Adélie, Antarctica and projections of SIE from General Circulation Models (GCM) of Earth's climate included in the most recent Intergovernmental Panel on Climate Change (IPCC) assessment report. We show that the increased frequency of warm events associated with projected decreases in SIE will reduce the population viability. The probability of quasi-extinction (a decline of 95% or more) is at least 36% by 2100. The median population size is projected to decline from ≈6,000 to ≈400 breeding pairs over this period. To avoid extinction, emperor penguins will have to adapt, migrate or change the timing of their growth stages. However, given the future projected increases in GHGs and its effect on Antarctic climate, evolution or migration seem unlikely for such long lived species at the remote southern end of the Earth. PMID:19171908
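    A stochastic projection of this kind can be sketched as repeated multiplication by randomly drawn annual growth rates, counting the trajectories that cross the quasi-extinction threshold; every number below is invented for illustration and is not a Terre Adélie parameter:

```python
import random

def quasi_extinction_prob(n0=6000, years=95, threshold=0.05,
                          trials=10_000, seed=7):
    """Fraction of stochastic trajectories that decline by 95% or more."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        n = float(n0)
        for _ in range(years):
            # 'Warm event' years depress growth; frequencies are invented
            if rng.random() < 0.4:
                growth = rng.gauss(0.93, 0.04)  # warm-event year
            else:
                growth = rng.gauss(1.00, 0.03)  # ordinary year
            n *= growth
        if n < threshold * n0:
            hits += 1
    return hits / trials

p_qe = quasi_extinction_prob()
print(p_qe)
```

    The published model replaces the scalar growth rate with a full stage-structured matrix whose vital rates depend on the GCM-projected sea ice, but the counting logic is the same.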

  4. A Comparative Study between SVM and Fuzzy Inference System for the Automatic Prediction of Sleep Stages and the Assessment of Sleep Quality

    Directory of Open Access Journals (Sweden)

    John Gialelis

    2015-11-01

    Full Text Available This paper compares two supervised learning algorithms for predicting the sleep stages based on the human brain activity. The first step of the presented work regards feature extraction from real human electroencephalography (EEG) data together with its corresponding sleep stages, which are utilized for training a support vector machine (SVM) and a fuzzy inference system (FIS) algorithm. Then, the trained algorithms are used to predict the sleep stages of real human patients. Extended comparison results are demonstrated which indicate that both classifiers could be utilized as a basis for an unobtrusive sleep quality assessment.

  5. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...

  6. Automatization Project for the Carl-Zeiss-Jena Coudè Telescope of the Simón Bolívar Planetarium I. The Electro-Mechanic System

    Science.gov (United States)

    Núñez, A.; Maharaj, A.; Muñoz, A. G.

    2009-05-01

    The ``Complejo Científico, Cultural y Turístico Simón Bolívar'' (CCCTSB), located in Maracaibo, Venezuela, houses the Simón Bolívar Planetarium and a 150 mm aperture, 2250 mm focal length Carl-Zeiss-Jena Coudè refractor telescope. In this work we discuss the schematics for the automatization project of this telescope, the planned improvements, methodology, engines, micro-controllers, interfaces and the up-to-date status of the project. This project works on the first two levels of the automation pyramid: the sensor-actuator level and the control or plant floor level. The process control level corresponds to the software-related section. This means that the project deals directly with the electrical, electronic and mechanical components, and with assembler micro-controller language. All the PC-related tasks, like GUIs (graphical user interfaces), remote control, the Grid database, and others, correspond to the next two automation pyramid levels. The idea is that little human intervention will be required to manipulate the telescope, only giving a pair of coordinates to locate and follow an object in the sky. A set of three servomotors, coupled to the telescope through a gear box, are going to drive the right ascension, declination and focus movements. For the dome rotation, a three-phase induction motor will be used. For dome aperture/closure a DC motor powered with solar panels is suggested. All these actuators are controlled by an 8-bit micro-controller, which receives the coordinate input and the signals from the position sensors, and implements the PID control algorithm. This algorithm is tuned based on the mathematical model of the telescope's electro-mechanical instrumentation.
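    The PID algorithm mentioned for the 8-bit micro-controller can be sketched in a few lines; the gains and the toy one-axis motor model below are invented for illustration, not taken from the project:

```python
def make_pid(kp, ki, kd, dt):
    """Discrete PID controller as a stateful step function."""
    state = {"integral": 0.0, "prev_err": 0.0}
    def step(setpoint, measured):
        err = setpoint - measured
        state["integral"] += err * dt
        deriv = (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return step

# Drive one telescope axis (deg) toward a 45 deg target
pid = make_pid(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
angle, velocity = 0.0, 0.0
for _ in range(300):
    u = pid(45.0, angle)                   # control effort
    velocity = (velocity + u * 0.1) * 0.7  # toy motor inertia + friction
    angle += velocity * 0.1
print(round(angle, 2))
```

    On the real hardware the same three terms would be computed in fixed-point arithmetic inside the micro-controller's timer interrupt.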

  7. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.
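    The L2 projection at the core of this procedure finds the function in the chosen transfinite space closest to the terrain data in the L2 norm; expanding the unknown in a basis turns this into a linear system with the mass matrix (standard finite element notation, assumed here rather than quoted from the paper):

```latex
u_h = \operatorname*{arg\,min}_{v_h \in V_h} \; \| f - v_h \|_{L^2(\Omega)}^2
\quad\Longleftrightarrow\quad
\sum_j M_{ij}\, c_j = b_i ,
\qquad
M_{ij} = \int_\Omega \phi_i \, \phi_j \, d\Omega ,
\qquad
b_i = \int_\Omega f \, \phi_i \, d\Omega ,
```

    where f is the discrete terrain data, the phi_i are the transfinite basis functions, and refinement adds basis functions where the projection error is large.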

  8. Determination of polycyclic aromatic compounds. Part project 11: Practice-oriented adaption and field testing of various automatic measuring devices; Messung polycyclischer aromatischer Verbindungen. Teilvorhaben 11: Praxisbezogene Anpassung und Felderprobung verschiedener automatischer Messeinrichtungen

    Energy Technology Data Exchange (ETDEWEB)

    Wilbring, P.; Jockel, W.

    1997-05-01

    The purpose of the present study was to examine various automatic emission measuring devices. The task was to determine polycyclic aromatic hydrocarbons (PAH) on-line. The following measuring devices were used: photoelectric aerosol sensor; emission mass spectrometer; laser-induced aerosol fluorescence; chemical ionisation mass spectrometer; photoelectric aerosol sensor. Most of the above-named measuring devices for automatic PAH monitoring had already demonstrated their general suitability in the course of extensive studies carried out in precursor projects. The next step, performed in this study, was to test the measuring devices' fitness for use. First, practice-oriented laboratory tests were carried out on the devices, whose measuring principles are highly diverse. These tests focussed on the identification of process parameters (e.g., detection limit, cross-sensitivity, availability, drift behaviour) and examination of the devices' analysis function and hence of their calibratability. (orig./SR)

  9. Energy Yield Prediction of Offshore Wind Farm Clusters at the EERA-DTOC European Project

    DEFF Research Database (Denmark)

    Cantero, E.; Hasager, Charlotte Bay; Réthoré, Pierre-Elouan

    2014-01-01

    third-party models. Wake models have been benchmarked on the Horns Rev and, currently, on the Lillgrund wind farm test cases. Dedicated experiments from ‘BARD Offshore 1’ wind farm using scanning lidars will produce new data for the validation of wake models. Furthermore, the project includes power

  10. Energy yield prediction of offshore wind farm clusters at the EERA-DTOC European project

    DEFF Research Database (Denmark)

    Cantero, E.; Sanz, J.; Lorenzo, S.

    2013-01-01

    third-party models. Wake models have been benchmarked on the Horns Rev and, currently, on the Lillgrund wind farm test cases. Dedicated experiments from ‘BARD Offshore 1’ wind farm using scanning lidars will produce new data for the validation of wake models. Furthermore, the project includes power

  11. Spatial regression methods capture prediction uncertainty in species distribution model projections through time

    Science.gov (United States)

    Alan K. Swanson; Solomon Z. Dobrowski; Andrew O. Finley; James H. Thorne; Michael K. Schwartz

    2013-01-01

    The uncertainty associated with species distribution model (SDM) projections is poorly characterized, despite its potential value to decision makers. Error estimates from most modelling techniques have been shown to be biased due to their failure to account for spatial autocorrelation (SAC) of residual error. Generalized linear mixed models (GLMM) have the ability to...

  12. Prediction and optimization methods for electric vehicle charging schedules in the EDISON project

    DEFF Research Database (Denmark)

    Aabrandt, Andreas; Andersen, Peter Bach; Pedersen, Anders Bro

    2012-01-01

    project has been launched to investigate various areas relevant to electric vehicle integration. As part of EDISON an electric vehicle aggregator has been developed to demonstrate smart charging of electric vehicles. The emphasis of this paper is the mathematical methods on which the EDISON aggregator...

  13. Around the laboratories: Rutherford: Successful tests on bubble chamber target technique; Stanford (SLAC): New storage rings proposal; Berkeley: The HAPPE project to examine cosmic rays with superconducting magnets; The 60th birthday of Professor N.N. Bogolyubov; Argonne: Performance of the automatic film measuring system POLLY II

    CERN Multimedia

    1969-01-01


  14. Transdiagnostic Risk Calculator for the Automatic Detection of Individuals at Risk and the Prediction of Psychosis: Second Replication in an Independent National Health Service Trust.

    Science.gov (United States)

    Fusar-Poli, Paolo; Werbeloff, Nomi; Rutigliano, Grazia; Oliver, Dominic; Davies, Cathy; Stahl, Daniel; McGuire, Philip; Osborn, David

    2018-06-12

    The benefits of indicated primary prevention among individuals at Clinical High Risk for Psychosis (CHR-P) are limited by the difficulty in detecting these individuals. To overcome this problem, a transdiagnostic, clinically based, individualized risk calculator has recently been developed and subjected to a first external validation in 2 different catchment areas of the South London and Maudsley (SLaM) NHS Trust. This study is a second external validation in a real-world, real-time, electronic clinical register-based cohort. All individuals who received a first ICD-10 index diagnosis of nonorganic and nonpsychotic mental disorder within the Camden and Islington (C&I) NHS Trust between 2009 and 2016 were included. The previously validated model included age, gender, ethnicity, age by gender, and ICD-10 index diagnosis to predict the development of any ICD-10 nonorganic psychosis. The model's performance was measured using Harrell's C-index. This study included a total of 13,702 patients with an average age of 40 (range 16-99); 52% were female, and most were of white ethnicity (64%). There were no CHR-P or child/adolescent services in the C&I Trust. The C&I and SLaM Trust samples also differed significantly in terms of age, gender, ethnicity, and distribution of index diagnosis. Despite these significant differences, the original model retained an acceptable predictive performance (Harrell's C of 0.73), which is comparable to that of CHR-P tools currently recommended for clinical use. This risk calculator may pragmatically support an improved transdiagnostic detection of at-risk individuals and psychosis prediction even in NHS Trusts in the United Kingdom where CHR-P services are not provided.
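    Harrell's C-index used to evaluate the model above is the probability that, of two randomly chosen comparable patients, the one the model scores as higher-risk is the one who develops the outcome sooner. A minimal sketch (the function name and toy data are illustrative, and censoring, which a full survival-analysis implementation must handle, is ignored here for clarity):

    ```python
    def harrells_c(event_times, risk_scores):
        """Concordance index: fraction of comparable pairs whose predicted
        risk ordering matches the observed event-time ordering (no censoring)."""
        concordant, ties, comparable = 0, 0, 0
        n = len(event_times)
        for i in range(n):
            for j in range(i + 1, n):
                if event_times[i] == event_times[j]:
                    continue  # pair not comparable
                comparable += 1
                # the subject with the earlier event should carry the higher risk
                early, late = (i, j) if event_times[i] < event_times[j] else (j, i)
                if risk_scores[early] > risk_scores[late]:
                    concordant += 1
                elif risk_scores[early] == risk_scores[late]:
                    ties += 1  # tied scores count as half-concordant
        return (concordant + 0.5 * ties) / comparable

    # risk scores perfectly ordered against event times -> C = 1.0
    print(harrells_c([2, 5, 9], [0.9, 0.5, 0.1]))  # 1.0
    ```

    A value of 0.5 corresponds to chance-level discrimination, so the 0.73 reported above sits between chance and perfect ranking.
    
    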

  15. On the monitoring and prediction of flash floods in small and medium-sized catchments - the EXTRUSO project

    Science.gov (United States)

    Wiemann, Stefan; Eltner, Anette; Sardemann, Hannes; Spieler, Diana; Singer, Thomas; Thanh Luong, Thi; Janabi, Firas Al; Schütze, Niels; Bernard, Lars; Bernhofer, Christian; Maas, Hans-Gerd

    2017-04-01

    Flash floods regularly cause severe socio-economic damage worldwide. In parallel, climate change is very likely to increase the number of such events, due to an increasing frequency of extreme precipitation events (EASAC 2013). Whereas recent work primarily addresses the resilience of large catchment areas, the major impact of hydro-meteorological extremes caused by heavy precipitation is on small areas. Those are very difficult to observe and predict, due to sparse monitoring networks and only few means for hydro-meteorological modelling, especially in small catchment areas. The objective of the EXTRUSO project is to identify and implement appropriate means to close this gap by an interdisciplinary approach, combining comprehensive research expertise from meteorology, hydrology, photogrammetry and geoinformatics. The project targets innovative techniques for achieving spatio-temporal densified monitoring and simulations for the analysis, prediction and warning of local hydro-meteorological extreme events. The following four aspects are of particular interest: 1. The monitoring, analysis and combination of relevant hydro-meteorological parameters from various sources, including existing monitoring networks, ground radar, specific low-cost sensors and crowdsourcing. 2. The determination of relevant hydro-morphological parameters from different photogrammetric sensors (e.g. camera, laser scanner) and sensor platforms (e.g. UAV (unmanned aerial vehicle) and UWV (unmanned water vehicle)). 3. The continuous hydro-meteorological modelling of precipitation, soil moisture and water flows by means of conceptual and data-driven modelling. 4. The development of a collaborative, web-based service infrastructure as an information and communication point, especially in the case of an extreme event. 
There are three major applications for the planned information system: First, the warning of local extreme events for the population in potentially affected areas, second, the support

  16. GRECOS project. The use of genetics to predict the vascular recurrence after stroke

    Science.gov (United States)

    Fernández-Cadenas, Israel; Mendióroz, Maite; Giralt, Dolors; Nafria, Cristina; Garcia, Elena; Carrera, Caty; Gallego-Fabrega, Cristina; Domingues-Montanari, Sophie; Delgado, Pilar; Ribó, Marc; Castellanos, Mar; Martínez, Sergi; Freijo, Mari Mar; Jiménez-Conde, Jordi; Rubiera, Marta; Alvarez-Sabín, José; Molina, Carlos A.; Font, Maria Angels; Olivares, Marta Grau; Palomeras, Ernest; de la Ossa, Natalia Perez; Martinez-Zabaleta, Maite; Masjuan, Jaime; Moniche, Francisco; Canovas, David; Piñana, Carlos; Purroy, Francisco; Cocho, Dolores; Navas, Inma; Tejero, Carlos; Aymerich, Nuria; Cullell, Natalia; Muiño, Elena; Serena, Joaquín; Rubio, Francisco; Davalos, Antoni; Roquer, Jaume; Arenillas, Juan Francisco; Martí-Fábregas, Joan; Keene, Keith; Chen, Wei-Min; Worrall, Bradford; Sale, Michele; Arboix, Adrià; Krupinski, Jerzy; Montaner, Joan

    2017-01-01

    Background and Purpose Vascular recurrence occurs in 11% of patients during the first year after ischemic stroke (IS) or transient ischemic attack (TIA). Clinical scores do not predict the whole vascular recurrence risk; therefore, we aimed to find genetic variants associated with recurrence that might improve the clinical predictive models in IS. Methods We analyzed 256 polymorphisms from 115 candidate genes in three patient cohorts comprising 4,482 IS or TIA patients. The discovery cohort was prospectively recruited and included 1,494 patients, 6.2% of whom developed a new IS during the first year of follow-up. Replication analysis was performed in 2,988 patients using SNPlex or HumanOmni1-Quad technology. We generated a predictive model using Cox regression (GRECOS score) and generated risk groups using a classification tree method. Results The analyses revealed that rs1800801 in the MGP gene (HR: 1.33, p= 9×10−03), a gene related to artery calcification, was associated with new IS during the first year of follow-up. This polymorphism was replicated in a Spanish cohort (n=1,305); however, it was not significantly associated in a North American cohort (n=1,683). The GRECOS score predicted new IS (p= 3.2×10−09) and could classify patients from low risk of stroke recurrence (1.9%) to high risk (12.6%). Moreover, the addition of genetic risk factors to the GRECOS score improves the prediction compared to the previous SPI-II score (p=0.03). Conclusions The use of genetics could be useful to estimate vascular recurrence risk after IS. Genetic variability in the MGP gene was associated with vascular recurrence in the Spanish population. PMID:28411264

  17. Predicting persistence in the sediment compartment with a new automatic software based on the k-Nearest Neighbor (k-NN) algorithm.

    Science.gov (United States)

    Manganaro, Alberto; Pizzo, Fabiola; Lombardo, Anna; Pogliaghi, Alberto; Benfenati, Emilio

    2016-02-01

    The ability of a substance to resist degradation and persist in the environment needs to be readily identified in order to protect the environment and human health. Many regulations require the assessment of persistence for substances commonly manufactured and marketed. Besides laboratory-based testing methods, in silico tools may be used to obtain a computational prediction of persistence. We present a new program to develop k-Nearest Neighbor (k-NN) models. The k-NN algorithm is a similarity-based approach that predicts the property of a substance in relation to the experimental data for its most similar compounds. We employed this software to identify persistence in the sediment compartment. Data on half-life (HL) in sediment were obtained from different sources and, after careful data pruning, the final dataset, containing 297 organic compounds, was divided into four experimental classes. We developed several models giving satisfactory performances, considering that both the training and test set accuracy ranged between 0.90 and 0.96. We finally selected one model which will be made available in the near future in the freely available software platform VEGA. This model offers a valuable in silico tool that may be really useful for fast and inexpensive screening. Copyright © 2015 Elsevier Ltd. All rights reserved.
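    The similarity-based k-NN idea described above can be sketched in a few lines: predict a compound's persistence class from the majority class of its most similar neighbours in descriptor space. The descriptors, class labels, and function name below are illustrative assumptions, not the VEGA implementation:

    ```python
    import math
    from collections import Counter

    def knn_predict(train_X, train_y, query, k=3):
        """Predict the class of `query` as the majority class of its k most
        similar training compounds (Euclidean distance on descriptor vectors)."""
        dists = sorted((math.dist(x, query), y) for x, y in zip(train_X, train_y))
        top_k = [y for _, y in dists[:k]]
        return Counter(top_k).most_common(1)[0][0]

    # toy 2-D molecular descriptors and half-life classes (purely illustrative)
    X = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
    y = ["not persistent", "not persistent", "persistent", "persistent"]
    print(knn_predict(X, y, [0.85, 0.85]))  # persistent
    ```

    In practice the choice of descriptors, the similarity metric, and k are tuned against the experimental training set, as the authors did when selecting among several models.
    
    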

  18. Automatic digitization of SMA data

    Science.gov (United States)

    Väänänen, Mika; Tanskanen, Eija

    2017-04-01

    In the 1970's and 1980's the Scandinavian Magnetometer Array produced large amounts of excellent data from over 30 stations In Norway, Sweden and Finland. 620 film reels and 20 kilometers of film have been preserved and the longest time series produced in the campaign span almost uninterrupted for five years, but the data has never seen widespread use due to the choice of medium. Film is a difficult medium to digitize efficiently. Previously events of interest were searched for by hand and digitization was done by projecting the film on paper and plotting it by hand. We propose a method of automatically digitizing geomagnetic data stored on film and extracting the numerical values from the digitized data. The automatic digitization process helps in preserving old, valuable data that might otherwise go unused.

  19. Automatic Photoelectric Telescope Service

    International Nuclear Information System (INIS)

    Genet, R.M.; Boyd, L.J.; Kissell, K.E.; Crawford, D.L.; Hall, D.S.; BDM Corp., McLean, VA; Kitt Peak National Observatory, Tucson, AZ; Dyer Observatory, Nashville, TN)

    1987-01-01

    Automatic observatories have the potential of gathering sizable amounts of high-quality astronomical data at low cost. The Automatic Photoelectric Telescope Service (APT Service) has realized this potential and is routinely making photometric observations of a large number of variable stars. However, without observers to provide on-site monitoring, it was necessary to incorporate special quality checks into the operation of the APT Service at its multiple automatic telescope installation on Mount Hopkins. 18 references

  20. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Policies or institutions built into an economic system that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  1. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. [comp.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.

  2. Evaluation of Different Topographic Corrections for Landsat TM Data by Prediction of Foliage Projective Cover (FPC) in Topographically Complex Landscapes

    Directory of Open Access Journals (Sweden)

    Sisira Ediriweera

    2013-12-01

    The reflected radiance in topographically complex areas is severely affected by variations in topography; thus, topographic correction is considered a necessary pre-processing step when retrieving biophysical variables from these images. We assessed the performance of five topographic corrections: (i) C correction (C), (ii) Minnaert, (iii) Sun Canopy Sensor (SCS), (iv) SCS + C and (v) the Processing Scheme for Standardised Surface Reflectance (PSSSR) on the Landsat-5 Thematic Mapper (TM) reflectance in the context of prediction of Foliage Projective Cover (FPC) in hilly landscapes in north-eastern Australia. The performance of topographic corrections on the TM reflectance was assessed by (i) visual comparison and (ii) statistically comparing TM predicted FPC with ground measured FPC and LiDAR (Light Detection and Ranging)-derived FPC estimates. In the majority of cases, the PSSSR method performed best in terms of eliminating topographic effects, providing the best relationship and lowest residual error when comparing ground measured FPC and LiDAR FPC with TM predicted FPC. The Minnaert, C and SCS + C showed the poorest performance. Finally, the use of TM surface reflectance, which includes atmospheric correction and broad Bidirectional Reflectance Distribution Function (BRDF) effects, seemed to account for most topographic variation when predicting biophysical variables, such as FPC.
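    Of the corrections compared, the C correction is the simplest to sketch: regress band reflectance against the illumination cosine cos(i), form the empirical constant c = b/m from the fitted slope m and intercept b, then rescale each pixel as if its surface were horizontal. The pure-Python function below is an illustrative sketch of that formula, not the authors' processing chain:

    ```python
    import math

    def c_correction(reflectance, cos_i, sun_zenith_deg):
        """C topographic correction: fit reflectance ~ m*cos(i) + b by least
        squares, then rescale each pixel with factor (cos(sz) + c)/(cos(i) + c)."""
        n = len(reflectance)
        mean_x = sum(cos_i) / n
        mean_y = sum(reflectance) / n
        m = sum((x - mean_x) * (y - mean_y) for x, y in zip(cos_i, reflectance)) \
            / sum((x - mean_x) ** 2 for x in cos_i)
        b = mean_y - m * mean_x
        c = b / m  # empirical C constant
        cos_sz = math.cos(math.radians(sun_zenith_deg))
        return [r * (cos_sz + c) / (ci + c) for r, ci in zip(reflectance, cos_i)]
    ```

    On a band whose reflectance is exactly linear in cos(i), the correction removes all illumination dependence, leaving every pixel at its horizontal-equivalent value.
    
    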

  3. Student nurse selection and predictability of academic success: The Multiple Mini Interview project.

    Science.gov (United States)

    Gale, Julia; Ooms, Ann; Grant, Robert; Paget, Kris; Marks-Maran, Di

    2016-05-01

    With recent reports of public enquiries into failure to care, universities are under pressure to ensure that candidates selected for undergraduate nursing programmes demonstrate academic potential as well as characteristics and values such as compassion, empathy and integrity. The Multiple Mini Interview (MMI) was used in one university as a way of ensuring that candidates had the appropriate numeracy and literacy skills, a range of communication, empathy, decision-making and problem-solving skills, and ethical insight, integrity, initiative and team-work. To ascertain whether there is evidence of bias in MMIs (gender, age, nationality and location of secondary education) and to determine the extent to which the MMI is predictive of academic success in nursing. A longitudinal retrospective analysis of student demographics, MMI data and the assessment marks for years 1, 2 and 3. One university in southwest London. One cohort of students who commenced their programme in September 2011, including students in all four fields of nursing (adult, child, mental health and learning disability). Inferential statistics and a Bayesian multilevel model. The MMI in conjunction with the MMI numeracy test and MMI literacy test shows little or no bias in terms of age, gender, nationality or location of secondary school education. Although the MMI in conjunction with numeracy and literacy testing is predictive of academic success, it is only weakly predictive. The MMI used in conjunction with literacy and numeracy testing appears to be a successful technique for selecting candidates for nursing. However, other selection methods such as psychological profiling or testing of emotional intelligence may add to the extent to which selection methods are predictive of academic success in nursing. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Performance of the operational high-resolution numerical weather predictions of the Daphne project

    Science.gov (United States)

    Tegoulias, Ioannis; Pytharoulis, Ioannis; Karacostas, Theodore; Kartsios, Stergios; Kotsopoulos, Stelios; Bampzelis, Dimitrios

    2015-04-01

    In the framework of the DAPHNE project, the Department of Meteorology and Climatology (http://meteo.geo.auth.gr) of the Aristotle University of Thessaloniki, Greece, utilizes the nonhydrostatic Weather Research and Forecasting model with the Advanced Research dynamic solver (WRF-ARW) in order to produce high-resolution weather forecasts over Thessaly in central Greece. The aim of the DAPHNE project is to tackle the problem of drought in this area by means of weather modification. Cloud seeding assists convective clouds to produce rain more efficiently or reduces hailstone size in favour of raindrops. The most favourable conditions for such a weather modification programme in Thessaly occur in the period from March to October, when convective clouds are triggered more frequently. Three model domains, using 2-way telescoping nesting, cover: i) Europe, the Mediterranean Sea and northern Africa (D01), ii) Greece (D02) and iii) the wider region of Thessaly (D03; at selected periods) at horizontal grid-spacings of 15 km, 5 km and 1 km, respectively. This research work intends to describe the atmospheric model setup and analyse its performance during a selected period of the operational phase of the project. The statistical evaluation of the high-resolution operational forecasts is performed using surface observations, gridded fields and radar data. Well-established point verification methods, combined with novel object-based methods, provide an in-depth analysis of the model skill. Spatial characteristics are adequately captured, but a variable time lag between forecast and observation is noted.
Acknowledgments: This research work has been co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness

  5. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  6. Joint Applications Pilot of the National Climate Predictions and Projections Platform and the North Central Climate Science Center: Delivering climate projections on regional scales to support adaptation planning

    Science.gov (United States)

    Ray, A. J.; Ojima, D. S.; Morisette, J. T.

    2012-12-01

    The DOI North Central Climate Science Center (NC CSC) and the NOAA/NCAR National Climate Predictions and Projections (NCPP) Platform have initiated a joint pilot study to collaboratively explore the "best available climate information" to support key land management questions and how to provide this information. NCPP's mission is to support state of the art approaches to develop and deliver comprehensive regional climate information and facilitate its use in decision making and adaptation planning. This presentation will describe the evolving joint pilot as a tangible, real-world demonstration of linkages between climate science, ecosystem science and resource management. Our joint pilot is developing a deliberate, ongoing interaction to prototype how NCPP will work with CSCs to develop and deliver needed climate information products, including translational information to support climate data understanding and use. This pilot also will build capacity in the North Central CSC by working with NCPP to use climate information as input to ecological modeling. We will discuss lessons to date on developing and delivering needed climate information products based on this strategic partnership. Four projects have been funded to collaborate to incorporate climate information as part of an ecological modeling project, which in turn will address key DOI stakeholder priorities in the region: Riparian Corridors: Projecting climate change effects on cottonwood and willow seed dispersal phenology, flood timing, and seedling recruitment in western riparian forests. Sage Grouse & Habitats: Integrating climate and biological data into land management decision models to assess species and habitat vulnerability. Grasslands & Forests: Projecting future effects of land management, natural disturbance, and CO2 on woody encroachment in the Northern Great Plains. The value of climate information: Supporting management decisions in the Plains and Prairie Potholes LCC. 
NCCSC's role in

  7. Development of Procedures for Assessing the Impact of Vocational Education Research and Development on Vocational Education (Project IMPACT). Volume 8--A Field Study of Predicting Impact of Research and Development Projects in Vocational and Technical Education.

    Science.gov (United States)

    Malhorta, Man Mohanlal

    As part of Project IMPACT's effort to identify and develop procedures for complying with the impact requirements of Public Law 94-482, a field study was conducted to identify and validate variables and their order of importance in predicting and evaluating impact of research and development (R&D) projects in vocational and technical education.…

  8. Prediction of optimal deployment projection for transcatheter aortic valve replacement: angiographic 3-dimensional reconstruction of the aortic root versus multidetector computed tomography.

    OpenAIRE

    Binder Ronald K; Leipsic Jonathon; Wood David; Moore Teri; Toggweiler Stefan; Willson Alex; Gurvitch Ronen; Freeman Melanie; Webb John G

    2012-01-01

    BACKGROUND Identifying the optimal fluoroscopic projection of the aortic valve is important for successful transcatheter aortic valve replacement (TAVR). Various imaging modalities, including multidetector computed tomography (MDCT), have been proposed for prediction of the optimal deployment projection. We evaluated a method that provides 3-dimensional angiographic reconstructions (3DA) of the aortic root for prediction of the optimal deployment angle and compared it with MDCT. METHODS AND RES...

  9. Predicting diabetes mellitus using SMOTE and ensemble machine learning approach: The Henry Ford ExercIse Testing (FIT) project.

    Science.gov (United States)

    Alghamdi, Manal; Al-Mallah, Mouaz; Keteyian, Steven; Brawner, Clinton; Ehrman, Jonathan; Sakr, Sherif

    2017-01-01

    Machine learning is becoming a popular and important approach in the field of medical research. In this study, we investigate the relative performance of various machine learning methods such as Decision Tree, Naïve Bayes, Logistic Regression, Logistic Model Tree and Random Forests for predicting incident diabetes using medical records of cardiorespiratory fitness. In addition, we apply different techniques to uncover potential predictors of diabetes. This FIT project study used data of 32,555 patients who are free of any known coronary artery disease or heart failure who underwent clinician-referred exercise treadmill stress testing at Henry Ford Health Systems between 1991 and 2009 and had a complete 5-year follow-up. At the completion of the fifth year, 5,099 of those patients have developed diabetes. The dataset contained 62 attributes classified into four categories: demographic characteristics, disease history, medication use history, and stress test vital signs. We developed an Ensembling-based predictive model using 13 attributes that were selected based on their clinical importance, Multiple Linear Regression, and Information Gain Ranking methods. The negative effect of the imbalance class of the constructed model was handled by Synthetic Minority Oversampling Technique (SMOTE). The overall performance of the predictive model classifier was improved by the Ensemble machine learning approach using the Vote method with three Decision Trees (Naïve Bayes Tree, Random Forest, and Logistic Model Tree) and achieved high accuracy of prediction (AUC = 0.92). The study shows the potential of ensembling and SMOTE approaches for predicting incident diabetes using cardiorespiratory fitness data.
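    The class-imbalance step named above can be illustrated independently of the FIT data. SMOTE synthesizes new minority-class samples by interpolating between a minority point and one of its k nearest minority neighbours. The function below is a minimal pure-Python sketch of that idea (names and toy data are illustrative; production work would typically use a library implementation such as imbalanced-learn):

    ```python
    import math
    import random

    def smote(minority, n_synthetic, k=3, seed=0):
        """Minimal SMOTE sketch: for each synthetic sample, pick a minority
        point, pick one of its k nearest minority neighbours, and interpolate
        a random fraction of the way along the line between them."""
        rng = random.Random(seed)
        synthetic = []
        for _ in range(n_synthetic):
            base = rng.choice(minority)
            neighbours = sorted(
                (p for p in minority if p is not base),
                key=lambda p: math.dist(p, base),
            )[:k]
            nb = rng.choice(neighbours)
            frac = rng.random()
            synthetic.append([b + frac * (n - b) for b, n in zip(base, nb)])
        return synthetic
    ```

    Because each synthetic point lies on a segment between two real minority samples, the oversampled class stays inside the region the minority data already occupies, rather than simply duplicating records.
    
    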

  10. Predicting diabetes mellitus using SMOTE and ensemble machine learning approach: The Henry Ford ExercIse Testing (FIT) project.

    Directory of Open Access Journals (Sweden)

    Manal Alghamdi

    Machine learning is becoming a popular and important approach in the field of medical research. In this study, we investigate the relative performance of various machine learning methods such as Decision Tree, Naïve Bayes, Logistic Regression, Logistic Model Tree and Random Forests for predicting incident diabetes using medical records of cardiorespiratory fitness. In addition, we apply different techniques to uncover potential predictors of diabetes. This FIT project study used data of 32,555 patients who are free of any known coronary artery disease or heart failure who underwent clinician-referred exercise treadmill stress testing at Henry Ford Health Systems between 1991 and 2009 and had a complete 5-year follow-up. At the completion of the fifth year, 5,099 of those patients have developed diabetes. The dataset contained 62 attributes classified into four categories: demographic characteristics, disease history, medication use history, and stress test vital signs. We developed an Ensembling-based predictive model using 13 attributes that were selected based on their clinical importance, Multiple Linear Regression, and Information Gain Ranking methods. The negative effect of the imbalance class of the constructed model was handled by Synthetic Minority Oversampling Technique (SMOTE). The overall performance of the predictive model classifier was improved by the Ensemble machine learning approach using the Vote method with three Decision Trees (Naïve Bayes Tree, Random Forest, and Logistic Model Tree) and achieved high accuracy of prediction (AUC = 0.92). The study shows the potential of ensembling and SMOTE approaches for predicting incident diabetes using cardiorespiratory fitness data.

  11. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that, across practice, processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  12. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic systems. It covers the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, measuring instruments, linearization of systems, the state space concept and state-space analysis of electrical systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the root locus concept and procedures for drawing root loci, frequency response, and control system design.

  13. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual...

  14. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are automatically found. The method is illustrated by simple examples. Source code in FORTRAN is provided
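    The chain-rule propagation described above can be demonstrated with forward-mode automatic differentiation using dual numbers. The report itself provides FORTRAN source; the hypothetical `Dual` class below is only a Python sketch of the principle, restricted to `+` and `*`:

    ```python
    class Dual:
        """Forward-mode automatic differentiation: carry a (value, derivative)
        pair through every elementary operation via the chain rule."""
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.der + o.der)  # sum rule
        __radd__ = __add__
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val,
                        self.der * o.val + self.val * o.der)  # product rule
        __rmul__ = __mul__

    def derivative(f, x):
        """Derivative of f at x: seed the dual part with 1 and read it back."""
        return f(Dual(x, 1.0)).der

    # d/dx (x*x + 3*x) at x = 2 is 2*2 + 3 = 7 -- exact, no truncation error
    print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
    ```

    As the abstract notes, the result is exact at the evaluation point: no finite-difference step size is involved, so there is no truncation error.
    
    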

  15. Prediction of single-component NAPL behavior for the TEVES Project using T2VOC

    International Nuclear Information System (INIS)

    Webb, S.W.; Phelan, J.M.

    1995-01-01

    Detailed simulations have been performed for the TEVES (Thermal Enhanced Vapor Extraction System) Project using the TOUGH2 code considering air, water, and a single-component NAPL. A critical parameter varied in the simulations is the borehole vacuum which directly affects air flow through the system and indirectly influences soil temperatures and water and NAPL fluid masses. Contaminant migration from the heated zone into the unheated soil can occur if the borehole vacuum, or borehole flow rate, is not sufficient. Under these conditions, evaporation of liquids (water and NAPL) due to the heating can cause flow from the heated zone into the unheated soil. Insufficient air sweep may be indicated by a vapor dominated mass flow rate into the borehole, at least for the present configuration. Sufficient air flow through the heated zone must be provided to contain the contaminants within the heated zone

  16. Numerical Analysis of Soil Settlement Prediction and Its Application In Large-Scale Marine Reclamation Artificial Island Project

    Directory of Open Access Journals (Sweden)

    Zhao Jie

    2017-11-01

    In an artificial island construction project based on large-scale marine reclamation, soil settlement is a key factor affecting the later safe operation of the whole field. To analyze the factors driving soil settlement in a marine reclamation project, scanning electron microscopy (SEM), a soil micro-structural analysis method, was used to study six representative soil samples from the area, including silt, mucky silty clay, silty clay and clay. The structural characteristics that affect soil settlement are obtained by observing the SEM images at different depths. By combining the numerical calculation methods of Terzaghi's one-dimensional and Biot's two-dimensional consolidation theories, one-dimensional and two-dimensional creep models are established and the numerical results of the two consolidation theories are compared in order to predict the maximum settlement of the soils 100 years after completion. The results indicate that the micro-structural characteristics are the essential factor affecting settlement in this area. Based on the numerical analysis of one-dimensional and two-dimensional settlement, the settlement laws and trends obtained by the two numerical methods are similar. This analysis can provide reference and guidance for projects involving marine reclamation land.
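    For settlement prediction, Terzaghi's one-dimensional consolidation theory mentioned above reduces to computing the average degree of consolidation U from the dimensionless time factor Tv = cv·t/H². The sketch below uses the standard textbook approximation for U(Tv) (function names are illustrative, and this is the generic theory, not the paper's creep model):

    ```python
    import math

    def degree_of_consolidation(Tv):
        """Average degree of consolidation U for time factor Tv
        (Terzaghi 1-D theory, standard two-branch approximation)."""
        U = math.sqrt(4.0 * Tv / math.pi)            # branch valid while U <= 0.6
        if U > 0.6:
            # Tv = 1.781 - 0.933*log10(100 - U%)  inverted for U
            U = 1.0 - 10 ** ((1.781 - Tv) / 0.933) / 100.0
        return U

    def settlement(t, cv, H, s_final):
        """Settlement at time t: s(t) = U(Tv) * s_final, with Tv = cv*t/H^2
        (cv: coefficient of consolidation, H: drainage path length)."""
        Tv = cv * t / H ** 2
        return degree_of_consolidation(Tv) * s_final
    ```

    The familiar benchmark values fall out directly: Tv ≈ 0.197 gives 50% consolidation and Tv ≈ 0.848 gives 90%, so scaling by the final settlement yields the settlement-time curve.
    
    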

  17. 12th Portuguese Conference on Automatic Control

    CERN Document Server

    Soares, Filomena; Moreira, António

    2017-01-01

    The biennial CONTROLO conferences are the main events promoted by the Portuguese Association for Automatic Control – APCA, national member organization of the International Federation of Automatic Control – IFAC. CONTROLO 2016 – the 12th Portuguese Conference on Automatic Control, Guimarães, Portugal, September 14th to 16th – was organized by Algoritmi, School of Engineering, University of Minho, in partnership with INESC TEC. The seventy-five papers published in this volume cover a wide range of topics. Thirty-one of them, of a more theoretical nature, are distributed among the first five parts: Control Theory; Optimal and Predictive Control; Fuzzy, Neural and Genetic Control; Modeling and Identification; Sensing and Estimation. The papers range from cutting-edge theoretical research to innovative control applications and show clearly how Automatic Control can be used to increase the well-being of people.

  18. NASA's Evolutionary Xenon Thruster (NEXT) Project Qualification Propellant Throughput Milestone: Performance, Erosion, and Thruster Service Life Prediction After 450 kg

    Science.gov (United States)

    Herman, Daniel A.

    2010-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) program is tasked with significantly improving and extending the capabilities of the current state-of-the-art NSTAR thruster. The service life capability of the NEXT ion thruster is being assessed by thruster wear testing and life-modeling of critical thruster components, such as the ion optics and cathodes. The NEXT Long-Duration Test (LDT) was initiated to validate and qualify the NEXT thruster propellant throughput capability. The NEXT thruster completed the primary goal of the LDT, namely to demonstrate the project qualification throughput of 450 kg, by the end of calendar year 2009. The NEXT LDT has demonstrated 28,500 hr of operation and processed 466 kg of xenon throughput, more than double the throughput demonstrated by the NSTAR flight spare. Thruster performance changes have been consistent with a priori predictions. Thruster erosion has been minimal and consistent with the thruster service life assessment, which predicts the first failure mode at greater than 750 kg throughput. The life-limiting failure mode for NEXT is predicted to be loss of structural integrity of the accelerator grid due to erosion by charge-exchange ions.

  19. Group Dynamics in Automatic Imitation.

    Science.gov (United States)

    Gleibs, Ilka H; Wilson, Neil; Reddy, Geetha; Catmur, Caroline

    Imitation, matching the configural body movements of another individual, plays a crucial part in social interaction. We investigated whether automatic imitation is influenced not only by whom we imitate (ingroup vs. outgroup member) but also by the nature of an expected interaction situation (competitive vs. cooperative). In line with assumptions from Social Identity Theory, we predicted that both social group membership and the expected situation impact the level of automatic imitation. We adopted a 2 (group membership of target: ingroup, outgroup) × 2 (situation: cooperative, competitive) design. The dependent variable was the degree to which participants imitated the target in a reaction-time automatic imitation task. Ninety-nine female students from two British universities participated. We found a significant two-way interaction on the imitation effect. When interacting in expectation of cooperation, imitation was stronger for an ingroup target than for an outgroup target. However, this was not the case in the competitive condition, where imitation did not differ between ingroup and outgroup targets. This demonstrates that the goal structure of an expected interaction determines the extent to which intergroup relations influence imitation, supporting a social identity approach.

  20. On the Relationship Between Automatic Attitudes and Self-Reported Sexual Assault in Men

    Science.gov (United States)

    Widman, Laura; Olson, Michael

    2013-01-01

    Research and theory suggest rape supportive attitudes are important predictors of sexual assault; yet, to date, rape supportive attitudes have been assessed exclusively through self-report measures that are methodologically and theoretically limited. To address these limitations, the objectives of the current project were to: (1) develop a novel implicit rape attitude assessment that captures automatic attitudes about rape and does not rely on self-reports, and (2) examine the association between automatic rape attitudes and sexual assault perpetration. We predicted that automatic rape attitudes would be a significant unique predictor of sexual assault even when self-reported rape attitudes (i.e., rape myth acceptance and hostility toward women) were controlled. We tested the generalizability of this prediction in two independent samples: a sample of undergraduate college men (n = 75, M age = 19.3 years) and a sample of men from the community (n = 50, M age = 35.9 years). We found the novel implicit rape attitude assessment was significantly associated with the frequency of sexual assault perpetration in both samples and contributed unique variance in explaining sexual assault beyond rape myth acceptance and hostility toward women. We discuss the ways in which future research on automatic rape attitudes may significantly advance measurement and theory aimed at understanding and preventing sexual assault. PMID:22618119

  1. Prediction of optimal deployment projection for transcatheter aortic valve replacement: angiographic 3-dimensional reconstruction of the aortic root versus multidetector computed tomography.

    Science.gov (United States)

    Binder, Ronald K; Leipsic, Jonathon; Wood, David; Moore, Teri; Toggweiler, Stefan; Willson, Alex; Gurvitch, Ronen; Freeman, Melanie; Webb, John G

    2012-04-01

    Identifying the optimal fluoroscopic projection of the aortic valve is important for successful transcatheter aortic valve replacement (TAVR). Various imaging modalities, including multidetector computed tomography (MDCT), have been proposed for prediction of the optimal deployment projection. We evaluated a method that provides 3-dimensional angiographic reconstructions (3DA) of the aortic root for prediction of the optimal deployment angle and compared it with MDCT. Forty patients undergoing transfemoral TAVR at St Paul's Hospital, Vancouver, Canada, were evaluated. All underwent preimplant 3DA and 68% underwent preimplant MDCT. Three-dimensional angiographic reconstructions were generated from images of a C-arm rotational aortic root angiogram during breath-hold, rapid ventricular pacing, and injection of 32 mL of contrast medium at 8 mL/s. Two independent operators prospectively predicted perpendicular valve projections. The implant angle was chosen at the discretion of the physician performing TAVR. The angles from 3DA, the angles from MDCT, the implant angle, and the postdeployment perpendicular prosthesis view were compared. The shortest distance from the postdeployment perpendicular prosthesis projection to the regression line of predicted perpendicular projections was calculated. All but 1 patient had adequate image quality for reproducible angle predictions. There was a significant correlation between 3DA and MDCT for prediction of perpendicular valve projections (r=0.682). The shortest distance from the regression line of predicted angles to the postdeployment prosthesis view was 5.1±4.6° for 3DA and 7.9±4.9° for MDCT (P=0.01). Three-dimensional angiographic reconstructions and MDCT are safe, practical, and accurate imaging modalities for identifying the optimal perpendicular valve deployment projection during TAVR.
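The study's accuracy metric is the shortest (perpendicular) distance from the postdeployment view to the regression line of predicted projections. A minimal sketch of that point-to-line distance, assuming the angles are expressed in a 2-D plane such as (RAO/LAO, cranial/caudal) degrees; the example coordinates are hypothetical, not patient data.

```python
import math

def distance_to_line(slope, intercept, x0, y0):
    """Shortest distance from point (x0, y0) to the line y = slope*x + intercept."""
    return abs(slope * x0 - y0 + intercept) / math.sqrt(slope ** 2 + 1.0)

# Hypothetical regression of predicted projections and postdeployment view:
d = distance_to_line(slope=-1.2, intercept=5.0, x0=10.0, y0=-12.0)
```

Averaging this distance over patients yields figures comparable to the reported 5.1±4.6° (3DA) and 7.9±4.9° (MDCT).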

  2. Longitudinal and postural changes of blood pressure predict dementia: the Malmö Preventive Project.

    Science.gov (United States)

    Holm, Hannes; Nägga, Katarina; Nilsson, Erik D; Melander, Olle; Minthon, Lennart; Bachus, Erasmus; Fedorowski, Artur; Magnusson, Martin

    2017-04-01

    The role of blood pressure (BP) changes in dementia is debatable. We aimed to analyse how resting and postural BP changes relate to incident dementia over a long-term follow-up. In the prospective population-based Malmö Preventive Project, 18,240 study participants (mean age: 45 ± 7 years, 63% male) were examined between 1974 and 1992 with resting and standing BP measurement, and re-examined between 2002 and 2006 at a mean age of 68 ± 6 years with resting BP. A total of 428 participants (2.3%) were diagnosed with dementia through December 31, 2009. The association of resting and postural BP changes with risk of dementia was studied using multivariable-adjusted Cox regression models controlling for traditional risk factors. Diastolic BP (DBP) decrease on standing indicated higher risk of dementia [hazard ratio (HR) per 10 mmHg: 1.22; 95% confidence interval (CI) 1.01-1.44, p = 0.036], which was mainly driven by increased risk in normotensive individuals. Higher systolic (SBP) and diastolic BP at re-examination were associated with lower risk of dementia (HR per 10 mmHg: 0.94; 95% CI 0.89-0.99, p = 0.011; and 0.87; 0.78-0.96, p = 0.006, respectively). Extreme decrease in SBP/DBP between baseline and re-examination (4th quartile; -7 ± 12/-15 ± 7 mmHg, respectively) indicated higher risk of dementia (HR 1.46; 95% CI 1.11-1.93, p = 0.008, and 1.54; 95% CI 1.14-2.08, p = 0.005; respectively) compared with the reference group characterised by pronounced BP increase over the same period (1st quartile; +44 ± 13/+15 ± 7 mmHg). Diastolic BP decrease on standing in middle age, decline in BP between middle and advanced age, and lower BP in advanced age are independent risk factors for developing dementia.
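The hazard ratios above are reported per 10 mmHg. Under the log-linearity assumption of the proportional-hazards model, such a ratio can be rescaled to any other BP change; a minimal sketch using the reported HR of 1.22 as an example input (the 15 mmHg figure is illustrative, not from the study):

```python
import math

def scale_hazard_ratio(hr_per_10, delta_mmHg):
    """Rescale a Cox hazard ratio reported per 10 mmHg to another BP change,
    using the log-linear (proportional-hazards) assumption."""
    beta = math.log(hr_per_10) / 10.0  # log-hazard per 1 mmHg
    return math.exp(beta * delta_mmHg)

# HR 1.22 per 10 mmHg orthostatic DBP fall, rescaled to a 15 mmHg fall:
hr_15 = scale_hazard_ratio(1.22, 15.0)
```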

  3. The CHilean Automatic Supernova sEarch

    DEFF Research Database (Denmark)

    Hamuy, M.; Pignata, G.; Maza, J.

    2012-01-01

    The CHilean Automatic Supernova sEarch (CHASE) project began in 2007 with the goal to discover young, nearby southern supernovae in order to (1) better understand the physics of exploding stars and their progenitors, and (2) refine the methods to derive extragalactic distances. During the first...

  4. Automatic Differentiation and its Program Realization

    Czech Academy of Sciences Publication Activity Database

    Hartman, J.; Lukšan, Ladislav; Zítko, J.

    2009-01-01

    Roč. 45, č. 5 (2009), s. 865-883 ISSN 0023-5954 R&D Projects: GA AV ČR IAA1030405 Institutional research plan: CEZ:AV0Z10300504 Keywords : automatic differentiation * modeling languages * systems of optimization Subject RIV: BA - General Mathematics Impact factor: 0.445, year: 2009 http://dml.cz/handle/10338.dmlcz/140037

  5. PLC Based Automatic Multistoried Car Parking System

    OpenAIRE

    Swanand S .Vaze; Rohan S. Mithari

    2014-01-01

    This project work presents the study and design of a PLC-based Automatic Multistoried Car Parking System. Multistoried car parking is an arrangement used to park a large number of vehicles in the least possible space. Realizing such an arrangement in practice requires highly sophisticated instruments. In this project a prototype of such a model is made, accommodating twelve cars at a time. Availability of the space for parking is detected...

  6. Thai Automatic Speech Recognition

    National Research Council Canada - National Science Library

    Suebvisai, Sinaporn; Charoenpornsawat, Paisarn; Black, Alan; Woszczyna, Monika; Schultz, Tanja

    2005-01-01

    .... We focus on the discussion of the rapid deployment of ASR for Thai under limited time and data resources, including rapid data collection issues, acoustic model bootstrap, and automatic generation of pronunciations...

  7. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  8. Automatic Test Systems Aquisition

    National Research Council Canada - National Science Library

    1994-01-01

    We are providing this final memorandum report for your information and use. This report discusses the efforts to achieve commonality in standards among the Military Departments as part of the DoD policy for automatic test systems (ATS...

  9. Constructing Predictive Estimates for Worker Exposure to Radioactivity During Decommissioning: Analysis of Completed Decommissioning Projects - Master Thesis

    Energy Technology Data Exchange (ETDEWEB)

    Dettmers, Dana Lee; Eide, Steven Arvid

    2002-10-01

    An analysis of completed decommissioning projects is used to construct predictive estimates for worker exposure to radioactivity during decommissioning activities. The preferred organizational method for the completed decommissioning project data is to divide the data by type of facility, whether decommissioning was performed on part of the facility or the complete facility, and the level of radiation within the facility prior to decommissioning (low, medium, or high). Additional data analysis shows that there is no downward trend in worker exposure data over time. Also, the use of a standard estimate for worker exposure to radioactivity may be a best estimate for low complete storage, high partial storage, and medium reactor facilities; a conservative estimate for some low-radiation facilities (reactor complete, research complete, pits/ponds, other), medium partial process facilities, and high complete research facilities; and an underestimate for the remaining facilities. Limited data are available to compare different decommissioning alternatives, so the available data are reported and no conclusions can be drawn. It is recommended that all DOE sites and the NRC use a similar method to document worker hours, worker exposure to radiation (person-rem), and standard industrial accidents, injuries, and deaths for all completed decommissioning activities.

  10. Convergence on the Prediction of Ice Particle Mass and Projected Area in Ice Clouds

    Science.gov (United States)

    Mitchell, D. L.

    2013-12-01

    Ice particle mass- and area-dimensional power law (henceforth m-D and A-D) relationships are building blocks for formulating microphysical processes and optical properties in cloud and climate models, and they are critical for ice cloud remote sensing algorithms, affecting the retrieval accuracy. They can be estimated by (1) directly measuring the sizes, masses and areas of individual ice particles at ground level and (2) using aircraft probes to simultaneously measure the ice water content (IWC) and ice particle size distribution. A third, indirect method is to use observations from method 1 to develop an m-A relationship representing mean conditions in ice clouds. Owing to a tighter correlation (relative to m-D data), this m-A relationship can be used to estimate m from aircraft probe measurements of A. This has the advantage of estimating m at small sizes, down to 10 μm using the 2D-S (stereo) probe. In this way, 2D-S measurements of maximum dimension D can be related to corresponding estimates of m to develop ice cloud type and temperature dependent m-D expressions. However, these expressions are no longer linear in log-log space, but are slowly varying curves covering most of the size range of natural ice particles. This work compares all three of the above methods and demonstrates close agreement between them. Regarding (1), 4869 ice particles and corresponding melted hemispheres were measured during a field campaign to obtain D and m. Selecting only those unrimed habits that formed between -20°C and -40°C, the mean mass values for selected size intervals are within 35% of the corresponding masses predicted by the Method 3 curve based on a similar temperature range. Moreover, the most recent m-D expression based on Method 2 differs by no more than 50% from the m-D curve from Method 3. Method 3 appears to be the most accurate over the observed ice particle size range (10-4000 μm). An m-D/A-D scheme was developed by which self-consistent m-D and A-D power laws
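Power laws such as m = a·D^b are conventionally fitted by linear least squares in log-log space. A minimal sketch of that fit, checked here against synthetic data (the coefficients 0.005 and 2.1 are arbitrary, not values from the study):

```python
import math

def fit_power_law(D, m):
    """Least-squares fit of m = a * D**b in log-log space; returns (a, b)."""
    xs = [math.log(d) for d in D]
    ys = [math.log(v) for v in m]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    a = math.exp(ybar - b * xbar)
    return a, b

# Synthetic check: data generated from m = 0.005 * D**2.1 is recovered exactly
D = [10, 50, 100, 500, 1000]
m = [0.005 * d ** 2.1 for d in D]
a, b = fit_power_law(D, m)
```

The curved (non-linear in log-log space) m-D expressions described in the abstract would instead require a slowly varying exponent b(D).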

  11. Brand and automaticity

    OpenAIRE

    Liu, J.

    2008-01-01

    A presumption of most consumer research is that consumers endeavor to maximize the utility of their choices and are in complete control of their purchasing and consumption behavior. However, everyday life experience suggests that many of our choices are not all that reasoned or conscious. Indeed, automaticity, one facet of behavior, is indispensable to complete the portrait of consumers. Despite its importance, little attention is paid to how the automatic side of behavior can be captured and...

  12. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book covers methods of position determination and their characteristics; control methods for position determination and design considerations; sensor selection for position detectors; position determination in digital control systems; application of clutch and brake in high-frequency position determination; automation techniques for position determination; position determination by electromagnetic clutch and brake, air cylinder, cam and solenoid; stop position control of automatic guided vehicles; and stacker cranes and automatic transfer control.

  13. Automatic intelligent cruise control

    OpenAIRE

    Stanton, NA; Young, MS

    2006-01-01

    This paper reports a study on the evaluation of automatic intelligent cruise control (AICC) from a psychological perspective. It was anticipated that AICC would have an effect upon the psychology of driving: namely, make drivers feel they have less control, reduce the level of trust in the vehicle, and make drivers less situationally aware, but it might also reduce workload and make driving less stressful. Drivers were asked to drive in a driving simulator under manual and automatic inte...

  14. IMPACT OF DIFFERENT TOPOGRAPHIC CORRECTIONS ON PREDICTION ACCURACY OF FOLIAGE PROJECTIVE COVER (FPC) IN A TOPOGRAPHICALLY COMPLEX TERRAIN

    Directory of Open Access Journals (Sweden)

    S. Ediriweera

    2012-07-01

    Quantitative retrieval of land surface biological parameters (e.g. foliage projective cover [FPC] and leaf area index) is crucial for forest management, ecosystem modelling, and global change monitoring applications. Currently, remote sensing is a widely adopted method for rapid estimation of surface biological parameters at a landscape scale. Topographic correction is a necessary pre-processing step in remote sensing applications for topographically complex terrain. Selection of a suitable topographic correction method for remotely sensed spectral information is still an unresolved problem. The purpose of this study is to assess the impact of topographic corrections on the prediction of FPC in hilly terrain using an established regression model. Five established topographic corrections [C, Minnaert, SCS, SCS+C and the processing scheme for standardised surface reflectance (PSSSR)] were evaluated on Landsat TM5 imagery acquired under low and high sun angles in closed-canopied subtropical rainforest and eucalyptus-dominated open-canopied forest in north-eastern Australia. The effectiveness of the methods at normalizing topographic influence, preserving biophysical spectral information, and retaining internal data variability was assessed by statistical analysis and by comparison with field-collected FPC data. The statistical analyses show that SCS+C and PSSSR perform significantly better than the other corrections, overcorrecting faintly illuminated slopes less. However, the best relationship between FPC and Landsat spectral responses was obtained with PSSSR, which produced the least residual error. The SCS method was poor at correcting the topographic effect when predicting FPC in topographically complex terrain.
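Of the five corrections evaluated, the C-correction has a particularly simple closed form. A minimal sketch of that correction for one band; the band parameter c (intercept/slope of the radiance-vs-cos(i) regression) and the pixel values are illustrative assumptions, not values from the study.

```python
import math

def c_correction(radiance, cos_i, solar_zenith_deg, c):
    """C-correction of topographic illumination:
    L_corrected = L * (cos(theta_z) + c) / (cos(i) + c),
    where cos_i is the cosine of the local illumination angle and c is the
    band-specific intercept/slope ratio from the L-vs-cos(i) regression."""
    cos_z = math.cos(math.radians(solar_zenith_deg))
    return radiance * (cos_z + c) / (cos_i + c)

# A flat, fully illuminated pixel (cos_i == cos_z) is left unchanged:
flat = c_correction(100.0, math.cos(math.radians(40.0)), 40.0, c=0.3)
```

Faintly illuminated pixels (small cos_i) are brightened; the parameter c damps the correction and is what distinguishes C and SCS+C from the plain cosine correction.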

  15. Applying a machine learning model using a locally preserving projection based feature regeneration algorithm to predict breast cancer risk

    Science.gov (United States)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qian, Wei; Zheng, Bin

    2018-03-01

    Both conventional and deep machine learning have been used to develop decision-support tools applied in medical imaging informatics. In order to take advantage of both conventional and deep learning approaches, this study aims to investigate the feasibility of applying a locally preserving projection (LPP) based feature regeneration algorithm to build a new machine learning classifier model to predict short-term breast cancer risk. First, a computer-aided image processing scheme was used to segment and quantify breast fibro-glandular tissue volume. Next, 44 initially computed image features related to bilateral mammographic tissue density asymmetry were extracted. Then, an LPP-based feature combination method was applied to regenerate a new operational feature vector using a maximal variance approach. Last, a k-nearest neighborhood (KNN) algorithm based machine learning classifier using the LPP-generated feature vectors was developed to predict breast cancer risk. A testing dataset involving negative mammograms acquired from 500 women was used. Among them, 250 were positive and 250 remained negative in the next subsequent mammography screening. Applied to this dataset, the LPP-generated feature vector reduced the number of features from 44 to 4. Using a leave-one-case-out validation method, the area under the ROC curve produced by the KNN classifier significantly increased from 0.62 to 0.68 for predicting breast cancer detected in the next subsequent mammography screening.
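The classification stage described above is a KNN vote evaluated with leave-one-case-out validation. A minimal sketch of that loop (the LPP feature-regeneration step is not reimplemented here, and the toy 2-D points are illustrative, not mammographic features):

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=5):
    """Plain k-nearest-neighbour majority vote with Euclidean distance."""
    dists = sorted((math.dist(x, xi), yi) for xi, yi in zip(train_X, train_y))
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

def leave_one_out_accuracy(X, y, k=5):
    """Leave-one-case-out validation: each case is classified by a model
    trained on all remaining cases."""
    hits = sum(
        knn_predict(X[:i] + X[i + 1:], y[:i] + y[i + 1:], X[i], k) == y[i]
        for i in range(len(X))
    )
    return hits / len(X)

# Toy, linearly separated example (two clusters, labels 0 and 1):
X = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
y = [0, 0, 0, 1, 1, 1]
acc = leave_one_out_accuracy(X, y, k=3)
```

The study reports area under the ROC curve rather than accuracy; that would replace the hit count with a ranking of KNN vote fractions.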

  16. The eTOX Data-Sharing Project to Advance in Silico Drug-Induced Toxicity Prediction

    Directory of Open Access Journals (Sweden)

    Montserrat Cases

    2014-11-01

    The high-quality in vivo preclinical safety data produced by the pharmaceutical industry during drug development, which follows numerous strict guidelines, are mostly not available in the public domain. These safety data are sometimes published as a condensed summary for the few compounds that reach the market, but the majority of studies are never made public and are often difficult to access in an automated way, even sometimes within the owning company itself. It is evident from many academic and industrial examples that useful data mining and model development require large and representative data sets and careful curation of the collected data. In 2010, under the auspices of the Innovative Medicines Initiative, the eTOX project started with the objective of extracting and sharing preclinical study data from paper or pdf archives of the toxicology departments of the 13 participating pharmaceutical companies, and of using such data to establish a detailed, well-curated database, which could then serve as a source for read-across approaches (early assessment of the potential toxicity of a drug candidate by comparison with compounds of similar structure and/or effects) and for training predictive models. The paper describes the efforts undertaken to allow effective data sharing with intellectual property (IP) protection, to set up adequate controlled vocabularies, and to establish the database (currently with over 4000 studies contributed by the pharma companies, corresponding to more than 1400 compounds). In addition, the status of predictive model building and some specific features of the eTOX predictive system (eTOXsys) are presented as decision-support knowledge-based tools for the early stages of the drug development process.
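Read-across hinges on retrieving database compounds structurally similar to the query. A minimal sketch of that retrieval using Tanimoto similarity over fingerprint bit sets; the compound identifiers, findings, and the 0.7 threshold are hypothetical, and the actual eTOXsys similarity machinery is considerably richer.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) similarity between two fingerprint bit sets."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def read_across(query_fp, database, threshold=0.7):
    """Return (compound_id, similarity, findings) for database entries
    similar enough to the query to support a read-across argument."""
    hits = [
        (cid, tanimoto(query_fp, fp), findings)
        for cid, (fp, findings) in database.items()
        if tanimoto(query_fp, fp) >= threshold
    ]
    return sorted(hits, key=lambda h: -h[1])

# Hypothetical fingerprints (bit indices) and study findings:
db = {"cmpdA": ({1, 2, 3, 4}, "hepatotoxic"), "cmpdB": ({7, 8}, "no findings")}
hits = read_across({1, 2, 3}, db, threshold=0.7)
```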

  17. Automatic Knowledge Extraction and Knowledge Structuring for a National Term Bank

    DEFF Research Database (Denmark)

    Lassen, Tine; Madsen, Bodil Nistrup; Erdman Thomsen, Hanne

    2011-01-01

    This paper gives an introduction to the plans and ongoing work in a project, the aim of which is to develop methods for automatic knowledge extraction and automatic construction and updating of ontologies. The project also aims at developing methods for automatic merging of terminological data from various existing sources, as well as methods for target-group-oriented knowledge dissemination. In this paper, we mainly focus on the plans for automatic knowledge extraction and knowledge structuring that will result in ontologies for a national term bank.

  18. A Game Theoretic Approach to Cyber Attack Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Peng Liu

    2005-11-28

    The area investigated by this project is cyber attack prediction. With a focus on correlation-based prediction, current attack prediction methodologies overlook the strategic nature of cyber attack-defense scenarios. As a result, current cyber attack prediction methodologies are very limited in predicting strategic behaviors of attackers enforcing nontrivial cyber attacks such as DDoS attacks, and may yield low accuracy in correlation-based predictions. This project develops a game theoretic framework for cyber attack prediction, in which an automatic game-theory-based attack prediction method is proposed. Being able to quantitatively predict the likelihood of (sequences of) attack actions, our attack prediction methodology can predict fine-grained strategic behaviors of attackers and may greatly improve the accuracy of correlation-based prediction. To the best of our knowledge, this project develops the first comprehensive framework for incentive-based modeling and inference of attack intent, objectives, and strategies, and the first method that can predict fine-grained strategic behaviors of attackers. The significance of this research and the benefit to the public can be demonstrated to a certain extent by (a) the severe threat of cyber attacks to the critical infrastructures of the nation, including many infrastructures overseen by the Department of Energy, (b) the importance of cyber security to critical infrastructure protection, and (c) the importance of cyber attack prediction to achieving cyber security.
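At its core, game-theoretic attack prediction evaluates the attacker's expected payoff against the defender's (possibly mixed) strategy and predicts the incentive-maximizing action. A minimal sketch with a toy 2×2 payoff matrix; the actions and payoff values are invented for illustration and are not from the project's framework.

```python
def attacker_best_response(payoff, defender_mix):
    """Attacker's best pure action against a mixed defender strategy.
    payoff[a][d] = attacker's payoff when attacker plays a and defender plays d."""
    expected = [
        sum(p * payoff[a][d] for d, p in enumerate(defender_mix))
        for a in range(len(payoff))
    ]
    return max(range(len(expected)), key=expected.__getitem__), expected

# Toy DDoS game: attacker 0 = "launch", 1 = "wait"; defender 0 = "patch", 1 = "monitor"
payoff = [[-1.0, 3.0],
          [0.0, 0.0]]
action, exp = attacker_best_response(payoff, [0.5, 0.5])
```

The predicted likelihoods of attack sequences described in the abstract would come from solving such games repeatedly over attack stages rather than from a single best response.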

  19. The automatic component of habit in health behavior: habit as cue-contingent automaticity.

    Science.gov (United States)

    Orbell, Sheina; Verplanken, Bas

    2010-07-01

    Habit might be usefully characterized as a form of automaticity that involves the association of a cue and a response. Three studies examined habitual automaticity in regard to different aspects of the cue-response relationship characteristic of unhealthy and healthy habits. In each study, habitual automaticity was assessed by the Self-Report Habit Index (SRHI). In Study 1, SRHI scores correlated with attentional bias to smoking cues in a Stroop task. Study 2 examined the ability of a habit cue to elicit an unwanted habit response. In a prospective field study, habitual automaticity in relation to smoking when drinking alcohol in a licensed public house (pub) predicted the likelihood of cigarette-related action slips 2 months later, after smoking in pubs had become illegal. In Study 3, experimental group participants formed an implementation intention to floss in response to a specified situational cue. Habitual automaticity of dental flossing was rapidly enhanced compared to controls. The studies provided three different demonstrations of the importance of cues in the automatic operation of habits. Habitual automaticity assessed by the SRHI captured aspects of a habit that go beyond mere frequency or consistency of the behavior.

  20. Automatic map generalisation from research to production

    Science.gov (United States)

    Nyberg, Rose; Johansson, Mikael; Zhang, Yang

    2018-05-01

    The manual work of map generalisation is known to be a complex and time-consuming task. With the development of technology and societies, the demands for more flexible map products with higher quality are growing. The Swedish mapping, cadastral and land registration authority Lantmäteriet has manual production lines for databases in five different scales: 1:10 000 (SE10), 1:50 000 (SE50), 1:100 000 (SE100), 1:250 000 (SE250) and 1:1 million (SE1M). To streamline this work, Lantmäteriet started a project to automatically generalise geographic information. The planned timespan for the project is 2015-2022. The project background and the methods for automatic generalisation are described below, followed by a description of results and conclusions.

  1. Sparse encoding of automatic visual association in hippocampal networks

    DEFF Research Database (Denmark)

    Hulme, Oliver J; Skov, Martin; Chadwick, Martin J

    2014-01-01

    Intelligent action entails exploiting predictions about associations between elements of one's environment. The hippocampus and mediotemporal cortex are endowed with the network topology, physiology, and neurochemistry to automatically and sparsely code sensori-cognitive associations that can...

  2. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of automatic program derivation. The book also includes some papers by members of the IFIP Working Group 2.1, of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers...

  3. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS). We present a recent state of the art. The book shows the main problems of ADS, the difficulties, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been developed...
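As a concrete instance of the statistical approaches mentioned, here is a minimal frequency-based extractive summarizer; it is a generic textbook technique, not an algorithm taken from this particular book.

```python
import re
from collections import Counter

def summarize(text, n_sentences=2):
    """Minimal statistical extractive summarizer: score each sentence by the
    summed corpus frequency of its words, then keep the top n in source order."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'[a-z]+', text.lower()))
    scored = sorted(
        range(len(sentences)),
        key=lambda i: -sum(freq[w] for w in re.findall(r'[a-z]+', sentences[i].lower())),
    )
    keep = sorted(scored[:n_sentences])  # restore source order
    return ' '.join(sentences[i] for i in keep)

summary = summarize("Cats purr. Cats sleep. Dogs bark.", 1)
```

Linguistic and symbolic approaches covered by the book replace the raw frequency score with syntactic, discourse, or knowledge-based evidence.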

  4. Automatic NAA. Saturation activities

    International Nuclear Information System (INIS)

    Westphal, G.P.; Grass, F.; Kuhnert, M.

    2008-01-01

    A system for Automatic NAA is based on a list of specific saturation activities determined for one irradiation position at a given neutron flux and a single detector geometry. Originally compiled from measurements of standard reference materials, the list may also be extended by the calculation of saturation activities from k0 and Q0 factors, and the f and α values of the irradiation position. A systematic improvement of the SRM approach is currently being performed by pseudo-cyclic activation analysis, to reduce counting errors. From these measurements, the list of saturation activities is recalculated in an automatic procedure. (author)
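A tabulated saturation activity is converted to an expected count rate with the standard activation-analysis correction factors for finite irradiation, decay, and counting times. A minimal sketch of that conversion; the parameter names and the per-gram normalization are assumptions for illustration, not the system's actual interface.

```python
import math

def expected_counts(sat_activity, mass_g, t_irr, t_decay, t_count, half_life):
    """Expected count rate for one nuclide, scaled from its saturation
    activity (counts/s/g at saturation) with the usual S, D, C factors.
    All times and the half-life must share one unit."""
    lam = math.log(2) / half_life
    S = 1.0 - math.exp(-lam * t_irr)                         # saturation factor
    D = math.exp(-lam * t_decay)                             # decay factor
    C = (1.0 - math.exp(-lam * t_count)) / (lam * t_count)   # counting factor
    return sat_activity * mass_g * S * D * C
```

For an irradiation much longer than the half-life, S approaches 1 and the measured rate approaches the tabulated saturation value.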

  5. Oocytes Polar Body Detection for Automatic Enucleation

    Directory of Open Access Journals (Sweden)

    Di Chen

    2016-02-01

    Full Text Available Enucleation is a crucial step in cloning. In order to achieve automatic blind enucleation, we should detect the polar body of the oocyte automatically. Conventional polar body detection approaches have low success rates or low efficiency. We propose a polar body detection method based on machine learning in this paper. On the one hand, an improved Histogram of Oriented Gradients (HOG) algorithm is employed to extract features of polar body images, which increases the success rate. On the other hand, a position prediction method is put forward to narrow the search range of the polar body, which improves efficiency. Experimental results show that the success rate is 96% for various types of polar bodies. Furthermore, the method is applied to an enucleation experiment and improves the degree of automation of enucleation.
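
    The abstract does not give the details of the improved HOG variant; as a rough sketch of the standard descriptor it builds on, a minimal HOG-style feature extractor computes per-cell histograms of gradient orientation weighted by gradient magnitude (function name and parameters are illustrative only):

```python
import numpy as np

def hog_descriptor(image, cell=8, bins=9):
    """Minimal HOG-style descriptor: per-cell histograms of unsigned
    gradient orientation, weighted by gradient magnitude."""
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned orientation
    h, w = image.shape
    feats = []
    for y in range(0, h - cell + 1, cell):
        for x in range(0, w - cell + 1, cell):
            m = mag[y:y + cell, x:x + cell].ravel()
            a = ang[y:y + cell, x:x + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0.0, 180.0), weights=m)
            feats.append(hist / (np.linalg.norm(hist) + 1e-6))  # L2 normalise
    return np.concatenate(feats)
```

    The resulting fixed-length vector is what a downstream classifier would consume when deciding whether an image patch contains a polar body.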

  6. Summary of ground motion prediction results for Nevada Test Site underground nuclear explosions related to the Yucca Mountain project

    International Nuclear Information System (INIS)

    Walck, M.C.

    1996-10-01

    This report summarizes available data on ground motions from underground nuclear explosions recorded on and near the Nevada Test Site, with emphasis on the ground motions recorded at stations on Yucca Mountain, the site of a potential high-level radioactive waste repository. Sandia National Laboratories, through the Weapons Test Seismic Investigations project, collected and analyzed ground motion data from NTS explosions over a 14-year period, from 1977 through 1990. By combining these data with available data from earlier, larger explosions, prediction equations for several ground motion parameters have been developed for the Test Site area for underground nuclear explosion sources. Also presented are available analyses of the relationship between surface and downhole motions and spectra and relevant crustal velocity structure information for Yucca Mountain derived from the explosion data. The data and associated analyses demonstrate that ground motions at Yucca Mountain from nuclear tests have been at levels lower than would be expected from moderate to large earthquakes in the region; thus nuclear explosions, while located relatively close, would not control seismic design criteria for the potential repository

  7. Summary of ground motion prediction results for Nevada Test Site underground nuclear explosions related to the Yucca Mountain project

    Energy Technology Data Exchange (ETDEWEB)

    Walck, M.C.

    1996-10-01

    This report summarizes available data on ground motions from underground nuclear explosions recorded on and near the Nevada Test Site, with emphasis on the ground motions recorded at stations on Yucca Mountain, the site of a potential high-level radioactive waste repository. Sandia National Laboratories, through the Weapons Test Seismic Investigations project, collected and analyzed ground motion data from NTS explosions over a 14-year period, from 1977 through 1990. By combining these data with available data from earlier, larger explosions, prediction equations for several ground motion parameters have been developed for the Test Site area for underground nuclear explosion sources. Also presented are available analyses of the relationship between surface and downhole motions and spectra and relevant crustal velocity structure information for Yucca Mountain derived from the explosion data. The data and associated analyses demonstrate that ground motions at Yucca Mountain from nuclear tests have been at levels lower than would be expected from moderate to large earthquakes in the region; thus nuclear explosions, while located relatively close, would not control seismic design criteria for the potential repository.

  8. The Ensembl genome database project.

    Science.gov (United States)

    Hubbard, T; Barker, D; Birney, E; Cameron, G; Chen, Y; Clark, L; Cox, T; Cuff, J; Curwen, V; Down, T; Durbin, R; Eyras, E; Gilbert, J; Hammond, M; Huminiecki, L; Kasprzyk, A; Lehvaslaiho, H; Lijnzaad, P; Melsopp, C; Mongin, E; Pettett, R; Pocock, M; Potter, S; Rust, A; Schmidt, E; Searle, S; Slater, G; Smith, J; Spooner, W; Stabenau, A; Stalker, J; Stupka, E; Ureta-Vidal, A; Vastrik, I; Clamp, M

    2002-01-01

    The Ensembl (http://www.ensembl.org/) database project provides a bioinformatics framework to organise biology around the sequences of large genomes. It is a comprehensive source of stable automatic annotation of the human genome sequence, with confirmed gene predictions that have been integrated with external data sources, and is available as either an interactive web site or as flat files. It is also an open source software engineering project to develop a portable system able to handle very large genomes and associated requirements, from sequence analysis to data storage and visualisation. The Ensembl site is one of the leading sources of human genome sequence annotation and provided much of the analysis for the publication of the draft genome by the international human genome project. The Ensembl system is being installed around the world in both companies and academic sites, on machines ranging from supercomputers to laptops.

  9. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstract...

  10. Automatic Oscillating Turret.

    Science.gov (United States)

    1981-03-01

    Final Report, February 1978 - September 1980: Automatic Oscillating Turret System. (Only fragments of the report documentation page and table of contents survive, including: Appendix P-4, Oscillating Bumper Turret; A. Description; 1. Turret Controls.) Other criteria requirements were: 1. Turret controls inside cab. 2. Automatic oscillation with fixed elevation to range from 20° below the horizontal to ...

  11. Reactor component automatic grapple

    International Nuclear Information System (INIS)

    Greenaway, P.R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment. (author)

  12. Automatic sweep circuit

    International Nuclear Information System (INIS)

    Keefe, D.J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input is described. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found
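
    The circuit's search strategy can be sketched in software: average the repeated trials for statistical accuracy, then step a detection window across the record until the evoked response is found. This is an illustrative digital analogue of the approach, not the patented circuit itself:

```python
import numpy as np

def find_response_window(trials, window, threshold):
    """Sweep successive time windows over trial-averaged data until the
    averaged signal in some window exceeds the detection threshold.
    Returns the window's sample offset, or None if no response is found."""
    avg = np.mean(trials, axis=0)  # repeat-and-average for statistical accuracy
    for start in range(0, len(avg) - window + 1, window):
        if np.mean(np.abs(avg[start:start + window])) > threshold:
            return start
    return None
```

    Averaging first suppresses uncorrelated noise, so a response time-locked to the trigger stands out in exactly one window.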

  13. Automatic sweep circuit

    Science.gov (United States)

    Keefe, Donald J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found.

  14. Recursive automatic classification algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bauman, E V; Dorofeyuk, A A

    1982-03-01

    A variational statement of the automatic classification problem is given. The dependence of the form of the optimal partition surface on the form of the classification objective functional is investigated. A recursive algorithm is proposed for maximising a functional of reasonably general form. The convergence problem is analysed in connection with the proposed algorithm. 8 references.
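
    The authors' variational functional is not given in the abstract; a common recursive scheme in this family updates class centres sample-by-sample with a decreasing gain, in the style of stochastic approximation. A minimal sketch under that assumption (not the authors' exact algorithm):

```python
import numpy as np

def recursive_classify(points, centers, lr0=1.0):
    """Recursive (sample-by-sample) classification: assign each incoming
    point to the nearest class centre, then move that centre toward the
    point with a decreasing gain lr0 / count."""
    centers = np.array(centers, dtype=float)
    counts = np.zeros(len(centers))
    labels = []
    for p in np.asarray(points, dtype=float):
        j = int(np.argmin(np.linalg.norm(centers - p, axis=1)))
        counts[j] += 1
        centers[j] += (lr0 / counts[j]) * (p - centers[j])  # decreasing gain
        labels.append(j)
    return labels, centers
```

    With gain 1/count, each centre converges to the running mean of the points assigned to it, which is the usual convergence argument for such recursions.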

  15. Fuels planning: science synthesis and integration; environmental consequences fact sheet 12: Water Erosion Prediction Project (WEPP) Fuel Management (FuMe) tool

    Science.gov (United States)

    William Elliot; David Hall

    2005-01-01

    The Water Erosion Prediction Project (WEPP) Fuel Management (FuMe) tool was developed to estimate sediment generated by fuel management activities. WEPP FuMe estimates sediment generated for 12 fuel-related conditions from a single input. This fact sheet identifies the intended users and uses, required inputs, what the model does, and tells the user how to obtain the...

  16. Autonomous Propellant Loading Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The AES Autonomous Propellant Loading (APL) project consists of three activities. The first is to develop software that will automatically control loading of...

  17. Automatic blood detection in capsule endoscopy video

    Czech Academy of Sciences Publication Activity Database

    Novozámský, Adam; Flusser, Jan; Tachecí, I.; Sulík, L.; Bureš, J.; Krejcar, O.

    2016-01-01

    Vol. 21, No. 12 (2016), pp. 1-8, article No. 126007. ISSN 1083-3668 R&D Projects: GA ČR GA15-16928S Institutional support: RVO:67985556 Keywords: Automatic blood detection * capsule endoscopy video Subject RIV: JD - Computer Applications, Robotics Impact factor: 2.530, year: 2016 http://library.utia.cas.cz/separaty/2016/ZOI/flusser-0466936.pdf

  18. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre; Fluhr, Christian.

    1975-06-01

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet team from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of the automatic building of thesauri and of automatic classification are examined. [fr]

  19. Automatization of welding

    International Nuclear Information System (INIS)

    Iwabuchi, Masashi; Tomita, Jinji; Nishihara, Katsunori.

    1978-01-01

    Automatization of welding is one of the effective measures for securing a high degree of quality of nuclear power equipment, as well as for coping with the working environment at the plant site. As the latest of the automatic welders in practical use for welding nuclear power apparatus in the factories of Toshiba and IHI, those for pipes and lining tanks are described here. The pipe welder performs the buttering welding on the inside of the pipe end as the so-called IGSCC countermeasure, and the succeeding butt welding, through the same controller. The lining tank welder is able to perform simultaneous welding of two parallel weld lines on a large thin-plate lining tank. Both types of welders are demonstrating excellent performance at the shops as well as at the plant site. (author)

  20. Automatic structural scene digitalization.

    Science.gov (United States)

    Tang, Rui; Wang, Yuhan; Cosker, Darren; Li, Wenbin

    2017-01-01

    In this paper, we present an automatic system for the analysis and labeling of structural scenes, i.e., floor plan drawings in Computer-Aided Design (CAD) format. The proposed system applies a fusion strategy to detect and recognize various components of CAD floor plans, such as walls, doors, windows and other ambiguous assets. Technically, a general rule-based filter parsing method is first adopted to extract effective information from the original floor plan. Then, an image-processing based recovery method is employed to correct information extracted in the first step. Our proposed method is fully automatic and real-time. The analysis system provides high accuracy and is also evaluated on a public website that, on average, receives more than ten thousand effective uses per day and reaches a relatively high satisfaction rate.

  1. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.
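
    The book's own algorithms are not reproduced here, but its Monte Carlo evaluation method can be sketched: generate artificial time series with a known trend plus noise, run a trend estimator on each realization, and measure the spread of the estimates. The least-squares slope fit below is a stand-in estimator for illustration, not one of the book's algorithms:

```python
import numpy as np

def trend_rmse(slope, noise_sd, n=200, runs=500, seed=0):
    """Monte Carlo check of a trend estimator: fit a linear trend to
    artificial series with known slope and report the RMS slope error."""
    rng = np.random.default_rng(seed)
    t = np.arange(n)
    errs = []
    for _ in range(runs):
        series = slope * t + rng.normal(0.0, noise_sd, n)
        est_slope = np.polyfit(t, series, 1)[0]  # least-squares fit
        errs.append(est_slope - slope)
    return float(np.sqrt(np.mean(np.square(errs))))
```

    Because the true trend of each artificial series is known by construction, the accuracy of any estimation algorithm can be quantified directly, which is the core idea of the numerical experiment method.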

  2. Comparison of manual versus automatic continuous positive airway pressure titration and the development of a predictive equation for therapeutic continuous positive airway pressure in Chinese patients with obstructive sleep apnoea.

    Science.gov (United States)

    Luo, Jiaying; Xiao, Sichang; Qiu, Zhihui; Song, Ning; Luo, Yuanming

    2013-04-01

    Whether the therapeutic nasal continuous positive airway pressure (CPAP) derived from manual titration is the same as that derived from automatic titration is controversial. The purpose of this study was to compare the therapeutic pressure derived from manual titration with that derived from automatic titration. Fifty-one patients with obstructive sleep apnoea (OSA) (mean apnoea/hypopnoea index (AHI) = 50.6 ± 18.6 events/h) who were newly diagnosed after an overnight full polysomnography and who were willing to accept CPAP as a long-term treatment were recruited for the study. Manual titration during full polysomnography monitoring and unattended automatic titration with an automatic CPAP device (REMstar Auto) were performed. A separate cohort study of one hundred patients with OSA (AHI = 54.3 ± 18.9 events/h) was also performed by observing the efficacy of CPAP derived from manual titration. The treatment pressure derived from automatic titration (9.8 ± 2.2 cmH2O) was significantly higher than that derived from manual titration (7.3 ± 1.5 cmH2O). Manually titrated CPAP was effective in reducing the AHI (54.3 ± 18.9 events/h before treatment and 3.3 ± 1.7 events/h after treatment). The titration pressure derived from REMstar Auto is usually higher than the pressure derived from manual titration. © 2013 The Authors. Respirology © 2013 Asian Pacific Society of Respirology.

  3. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices.

  4. Automatic LOD selection

    OpenAIRE

    Forsman, Isabelle

    2017-01-01

    In this paper, a method to automatically generate transition distances for LOD, improving image stability and performance, is presented. Three different methods were tested, all measuring the change between two levels of detail using the spatial frequency. The methods were implemented as an optional pre-processing step in order to determine the transition distances from multiple view directions. During run-time, both view-direction-based selection and the furthest distance for each direction was ...

  5. Characteristics and design improvement of AP1000 automatic depressurization system

    International Nuclear Information System (INIS)

    Jin Fei

    2012-01-01

    The automatic depressurization system, a distinctive feature of the AP1000 design, enhances the plant's capability of mitigating design basis accidents. The advancement of the system is discussed by comparing it with the traditional PWR design and analyzing system functions such as depressurizing and venting. System design improvements made during the execution of the China Project are also described. At the end, suggestions for the system in the China Project are listed. (author)

  6. Automatic code generation for distributed robotic systems

    International Nuclear Information System (INIS)

    Jones, J.P.

    1993-01-01

    Hetero Helix is a software environment which supports relatively large robotic system development projects. The environment supports a heterogeneous set of message-passing LAN-connected common-bus multiprocessors, but the programming model seen by software developers is a simple shared memory. The conceptual simplicity of shared memory makes it an extremely attractive programming model, especially in large projects where coordinating a large number of people can itself become a significant source of complexity. We present results from three system development efforts conducted at Oak Ridge National Laboratory over the past several years. Each of these efforts used automatic software generation to create 10 to 20 percent of the system

  7. Genome3D: a UK collaborative project to annotate genomic sequences with predicted 3D structures based on SCOP and CATH domains.

    Science.gov (United States)

    Lewis, Tony E; Sillitoe, Ian; Andreeva, Antonina; Blundell, Tom L; Buchan, Daniel W A; Chothia, Cyrus; Cuff, Alison; Dana, Jose M; Filippis, Ioannis; Gough, Julian; Hunter, Sarah; Jones, David T; Kelley, Lawrence A; Kleywegt, Gerard J; Minneci, Federico; Mitchell, Alex; Murzin, Alexey G; Ochoa-Montaño, Bernardo; Rackham, Owen J L; Smith, James; Sternberg, Michael J E; Velankar, Sameer; Yeats, Corin; Orengo, Christine

    2013-01-01

    Genome3D, available at http://www.genome3d.eu, is a new collaborative project that integrates UK-based structural resources to provide a unique perspective on sequence-structure-function relationships. Leading structure prediction resources (DomSerf, FUGUE, Gene3D, pDomTHREADER, Phyre and SUPERFAMILY) provide annotations for UniProt sequences to indicate the locations of structural domains (structural annotations) and their 3D structures (structural models). Structural annotations and 3D model predictions are currently available for three model genomes (Homo sapiens, E. coli and baker's yeast), and the project will extend to other genomes in the near future. As these resources exploit different strategies for predicting structures, the main aim of Genome3D is to enable comparisons between all the resources so that biologists can see where predictions agree and are therefore more trusted. Furthermore, as these methods differ in whether they build their predictions using CATH or SCOP, Genome3D also contains the first official mapping between these two databases. This has identified pairs of similar superfamilies from the two resources at various degrees of consensus (532 bronze pairs, 527 silver pairs and 370 gold pairs).

  8. Investigation of an automatic trim algorithm for restructurable aircraft control

    Science.gov (United States)

    Weiss, J.; Eterno, J.; Grunberg, D.; Looze, D.; Ostroff, A.

    1986-01-01

    This paper develops and solves an automatic trim problem for restructurable aircraft control. The trim solution is applied as a feed-forward control to reject measurable disturbances following control element failures. Disturbance rejection and command following performances are recovered through the automatic feedback control redesign procedure described by Looze et al. (1985). For this project the existence of a failure detection mechanism is assumed, and methods to cope with potential detection and identification inaccuracies are addressed.
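
    The restructurable-control literature often casts such a trim problem as a least-squares fit: choose control deflections whose modelled effect cancels the measured disturbance. A minimal sketch under that assumption (B is a hypothetical linearised control-effectiveness matrix and d a measured disturbance vector; these are not the paper's actual model):

```python
import numpy as np

def feedforward_trim(B, d):
    """Least-squares trim: find control deflections u minimising
    ||B @ u + d||, so the modelled control effect cancels the
    measured disturbance d."""
    u, *_ = np.linalg.lstsq(B, -np.asarray(d, dtype=float), rcond=None)
    return u
```

    After a control element failure, B would be replaced by the post-failure effectiveness matrix supplied by the failure detection and identification stage, which is why detection inaccuracies matter for trim quality.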

  9. Experiences in automatic keywording of particle physics literature

    CERN Document Server

    Montejo Ráez, Arturo

    2001-01-01

    Attributing keywords can assist in the classification and retrieval of documents in the particle physics literature. As information services face a future with less available manpower and more and more documents being written, the possibility of keyword attribution being assisted by automatic classification software is explored. A project being carried out at CERN (the European Laboratory for Particle Physics) for the development and integration of automatic keywording is described.

  10. Automatic annotation of protein motif function with Gene Ontology terms

    Directory of Open Access Journals (Sweden)

    Gopalakrishnan Vanathi

    2004-09-01

    Full Text Available Abstract Background Conserved protein sequence motifs are short stretches of amino acid sequence patterns that potentially encode the function of proteins. Several sequence pattern searching algorithms and programs exist for identifying candidate protein motifs at the whole genome level. However, a much needed and important task is to determine the functions of the newly identified protein motifs. The Gene Ontology (GO) project is an endeavor to annotate the function of genes or protein sequences with terms from a dynamic, controlled vocabulary, and these annotations serve well as a knowledge base. Results This paper presents methods to mine the GO knowledge base and use the association between the GO terms assigned to a sequence and the motifs matched by the same sequence as evidence for predicting the functions of novel protein motifs automatically. The task of assigning GO terms to protein motifs is viewed as both a binary classification and an information retrieval problem, where PROSITE motifs are used as samples for model training and functional prediction. The mutual information of a motif and a GO term association is found to be a very useful feature. We take advantage of the known motifs to train a logistic regression classifier, which allows us to combine mutual information with other frequency-based features and obtain a probability of correct association. The trained logistic regression model has intuitively meaningful and logically plausible parameter values, and performs very well empirically according to our evaluation criteria. Conclusions In this research, different methods for automatic annotation of protein motifs have been investigated. Empirical results demonstrated that the methods have great potential for detecting and augmenting information about the functions of newly discovered candidate protein motifs.
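
    The paper's exact feature set is not reproduced in the abstract, but the mutual information of a motif/GO-term association can be computed from a 2×2 contingency table of sequence counts. A minimal sketch (the table layout and function name are assumptions for illustration):

```python
import math

def mutual_information(n11, n10, n01, n00):
    """Mutual information (in bits) between motif occurrence and GO-term
    assignment, from a 2x2 table of sequence counts:
    n11 = motif present & term assigned, n10 = motif present & term absent,
    n01 = motif absent & term assigned,  n00 = motif absent & term absent."""
    n = float(n11 + n10 + n01 + n00)
    p_motif = (n11 + n10) / n   # P(motif present)
    p_term = (n11 + n01) / n    # P(term assigned)
    mi = 0.0
    for has_motif, has_term, nij in ((1, 1, n11), (1, 0, n10),
                                     (0, 1, n01), (0, 0, n00)):
        if nij == 0:
            continue
        p_ij = nij / n
        pa = p_motif if has_motif else 1.0 - p_motif
        pb = p_term if has_term else 1.0 - p_term
        mi += p_ij * math.log2(p_ij / (pa * pb))
    return mi
```

    Independent motif and term counts give MI = 0, while a perfectly coupled table gives 1 bit, so the score ranks motif/term pairs by how strongly their co-occurrence departs from chance; such scores could then feed a classifier alongside frequency-based features.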

  11. The DanTermBank Project

    DEFF Research Database (Denmark)

    Lassen, Tine; Madsen, Bodil Nistrup; Pram Nielsen, Louise

    This paper gives an introduction to the plans and ongoing work in a project, the aim of which is to develop methods for automatic knowledge extraction and automatic construction and updating of ontologies. The project also aims at developing methods for automatic merging of terminological data from various existing sources, as well as methods for target group oriented knowledge dissemination. In this paper, we mainly focus on the plans for automatic knowledge extraction and knowledge structuring that will result in ontologies for a national term bank.

  12. Investigating the Relationship between Stable Personality Characteristics and Automatic Imitation.

    Directory of Open Access Journals (Sweden)

    Emily E Butler

    Full Text Available Automatic imitation is a cornerstone of nonverbal communication that fosters rapport between interaction partners. Recent research has suggested that stable dimensions of personality are antecedents to automatic imitation, but the empirical evidence linking imitation with personality traits is restricted to a few studies with modest sample sizes. Additionally, atypical imitation has been documented in autism spectrum disorders and schizophrenia, but the mechanisms underpinning these behavioural profiles remain unclear. Using a larger sample than prior studies (N=243), the current study tested whether performance on a computer-based automatic imitation task could be predicted by personality traits associated with social behaviour (extraversion and agreeableness) and with disorders of social cognition (autistic-like and schizotypal traits). Further personality traits (narcissism and empathy) were assessed in a subsample of participants (N=57). Multiple regression analyses showed that personality measures did not predict automatic imitation. In addition, using a similar analytical approach to prior studies, no differences in imitation performance emerged when only the highest and lowest 20 participants on each trait variable were compared. These data weaken support for the view that stable personality traits are antecedents to automatic imitation and that neural mechanisms thought to support automatic imitation, such as the mirror neuron system, are dysfunctional in autism spectrum disorders or schizophrenia. In sum, the impact that personality variables have on automatic imitation is less universal than initial reports suggest.

  13. Investigating the Relationship between Stable Personality Characteristics and Automatic Imitation.

    Science.gov (United States)

    Butler, Emily E; Ward, Robert; Ramsey, Richard

    2015-01-01

    Automatic imitation is a cornerstone of nonverbal communication that fosters rapport between interaction partners. Recent research has suggested that stable dimensions of personality are antecedents to automatic imitation, but the empirical evidence linking imitation with personality traits is restricted to a few studies with modest sample sizes. Additionally, atypical imitation has been documented in autism spectrum disorders and schizophrenia, but the mechanisms underpinning these behavioural profiles remain unclear. Using a larger sample than prior studies (N=243), the current study tested whether performance on a computer-based automatic imitation task could be predicted by personality traits associated with social behaviour (extraversion and agreeableness) and with disorders of social cognition (autistic-like and schizotypal traits). Further personality traits (narcissism and empathy) were assessed in a subsample of participants (N=57). Multiple regression analyses showed that personality measures did not predict automatic imitation. In addition, using a similar analytical approach to prior studies, no differences in imitation performance emerged when only the highest and lowest 20 participants on each trait variable were compared. These data weaken support for the view that stable personality traits are antecedents to automatic imitation and that neural mechanisms thought to support automatic imitation, such as the mirror neuron system, are dysfunctional in autism spectrum disorders or schizophrenia. In sum, the impact that personality variables have on automatic imitation is less universal than initial reports suggest.

  14. Exposure to violent video games increases automatic aggressiveness.

    Science.gov (United States)

    Uhlmann, Eric; Swanson, Jane

    2004-02-01

    The effects of exposure to violent video games on automatic associations with the self were investigated in a sample of 121 students. Playing the violent video game Doom led participants to associate themselves with aggressive traits and actions on the Implicit Association Test. In addition, self-reported prior exposure to violent video games predicted automatic aggressive self-concept, above and beyond self-reported aggression. Results suggest that playing violent video games can lead to the automatic learning of aggressive self-views.

  15. The Meta-Analysis of Clinical Judgment Project: Fifty-Six Years of Accumulated Research on Clinical Versus Statistical Prediction

    Science.gov (United States)

    Aegisdottir, Stefania; White, Michael J.; Spengler, Paul M.; Maugherman, Alan S.; Anderson, Linda A.; Cook, Robert S.; Nichols, Cassandra N.; Lampropoulos, Georgios K.; Walker, Blain S.; Cohen, Genna; Rush, Jeffrey D.

    2006-01-01

    Clinical predictions made by mental health practitioners are compared with those using statistical approaches. Sixty-seven studies were identified from a comprehensive search of 56 years of research; 92 effect sizes were derived from these studies. The overall effect of clinical versus statistical prediction showed a somewhat greater accuracy for…

  16. Automatic quantitative renal scintigraphy

    International Nuclear Information System (INIS)

    Valeyre, J.; Deltour, G.; Delisle, M.J.; Bouchard, A.

    1976-01-01

    Renal scintigraphy data may be analyzed automatically by the use of a processing system coupled to an Anger camera (TRIDAC-MULTI 8 or CINE 200). The computing sequence is as follows: normalization of the images; background noise subtraction on both images; evaluation of mercury-197 uptake by the liver and spleen; calculation of the activity fractions of each kidney with respect to the injected dose, taking into account the kidney depth, with the results referred to normal values; and edition of the results. Automation minimizes scattering of the parameters and, by its simplification, is a great asset in routine work. [fr]

  17. AUTOMATIC FREQUENCY CONTROL SYSTEM

    Science.gov (United States)

    Hansen, C.F.; Salisbury, J.D.

    1961-01-10

    A control is described for automatically matching the frequency of a resonant cavity to that of a driving oscillator. The driving oscillator is disconnected from the cavity and a secondary oscillator is actuated in which the cavity is the frequency-determining element. A low frequency is mixed with the output of the driving oscillator and the resultant lower and upper sidebands are separately derived. The frequencies of the sidebands are compared with the secondary oscillator frequency, deriving a servo control signal to adjust a tuning element in the cavity and match the cavity frequency to that of the driving oscillator. The driving oscillator may then be connected to the cavity.

  18. Automatic dipole subtraction

    International Nuclear Information System (INIS)

    Hasegawa, K.

    2008-01-01

    The Catani-Seymour dipole subtraction is a general procedure to treat infrared divergences in real emission processes at next-to-leading order in QCD. We have automated the procedure in a computer code. The code is useful especially for processes with many parton legs. In this talk, we first explain the algorithm of the dipole subtraction and the overall structure of our code. After that we show results for some processes in which the infrared divergences of real emission processes are subtracted. (author)

  19. Automatic programmable air ozonizer

    International Nuclear Information System (INIS)

    Gubarev, S.P.; Klosovsky, A.V.; Opaleva, G.P.; Taran, V.S.; Zolototrubova, M.I.

    2015-01-01

    In this paper we describe a compact, economical, easy-to-manage automatic air ozonizer developed at the Institute of Plasma Physics of the NSC KIPT. It is designed for sanitation and disinfection of premises and for cleaning the air of foreign odors. A distinctive feature of the developed device is the generation of a given concentration of ozone, approximately 0.7 of the maximum allowable concentration (MAC), and automatic maintenance of that specified level. This allows people to remain inside the treated premises during operation. A microprocessor controller was developed to control the operation of the ozonizer.

  20. Solent Disturbance and Mitigation Project Phase II: Predicting the impact of human disturbance on overwintering birds in the Solent.

    OpenAIRE

    Stillman, Richard A.; West, Andrew D.; Clarke, Ralph T.; Liley, D.

    2012-01-01

    The Solent coastline provides feeding grounds for internationally protected populations of overwintering waders and wildfowl, and is also extensively used for recreation. In response to concerns over the impact of recreational pressure on birds within protected areas in the Solent, the Solent Forum initiated the Solent Disturbance and Mitigation Project to determine visitor access patterns around the coast and how their activities may influence the birds. The project has been divided into two...

  1. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    Lattin, Kenneth R.

    1978-01-01

    United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand-and-shoe monitors. In addition, this prototype system has a number of new features, including: microcomputer control and readout, nineteen large-area gas flow detectors, real-time background compensation, self-checking for system failures, and card-reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company-controlled boundaries. Investigation of the commercially available portal and hand-and-shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but also makes it difficult to bypass. (author)

  2. Piloted Simulation of a Model-Predictive Automated Recovery System

    Science.gov (United States)

    Liu, James (Yuan); Litt, Jonathan; Sowers, T. Shane; Owens, A. Karl; Guo, Ten-Huei

    2014-01-01

    This presentation describes a model-predictive automatic recovery system for aircraft on the verge of a loss-of-control situation. The system determines when it must intervene to prevent an imminent accident, resulting from a poor approach. It estimates the altitude loss that would result from a go-around maneuver at the current flight condition. If the loss is projected to violate a minimum altitude threshold, the maneuver is automatically triggered. The system deactivates to allow landing once several criteria are met. Piloted flight simulator evaluation showed the system to provide effective envelope protection during extremely unsafe landing attempts. The results demonstrate how flight and propulsion control can be integrated to recover control of the vehicle automatically and prevent a potential catastrophe.
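    The trigger logic this abstract describes — estimate the altitude loss of a go-around maneuver, and intervene if a minimum-altitude threshold would be violated — can be sketched as below. All names, numbers and the crude spool-up model are invented stand-ins for illustration, not NASA's actual vehicle model:

```python
# Hypothetical sketch of the go-around trigger decision. The real system
# estimates altitude loss from a full vehicle/propulsion model; here we use
# a trivial "sink rate times engine spool-up time" placeholder.
def predicted_altitude_loss(sink_rate_mps, spool_up_s):
    """Crude estimate: altitude lost while engines spool up to go-around thrust."""
    return sink_rate_mps * spool_up_s  # metres lost before the climb begins

def should_trigger_go_around(altitude_m, sink_rate_mps,
                             min_altitude_m=15.0, spool_up_s=6.0):
    """Trigger when the projected post-maneuver altitude violates the threshold."""
    loss = predicted_altitude_loss(sink_rate_mps, spool_up_s)
    return altitude_m - loss < min_altitude_m

# Unsafe approach: 60 m altitude, 9 m/s sink rate -> projected 6 m < 15 m floor
print(should_trigger_go_around(altitude_m=60.0, sink_rate_mps=9.0))
```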

  3. Automatic document navigation for digital content remastering

    Science.gov (United States)

    Lin, Xiaofan; Simske, Steven J.

    2003-12-01

    This paper presents a novel method of automatically adding navigation capabilities to re-mastered electronic books. We first analyze the need for a generic and robust system to automatically construct navigation links into re-mastered books. We then introduce the core algorithm based on text matching for building the links. The proposed method utilizes the tree-structured dictionary and directional graph of the table of contents to efficiently conduct the text matching. Information fusion further increases the robustness of the algorithm. The experimental results on the MIT Press digital library project are discussed and the key functional features of the system are illustrated. We have also investigated how the quality of the OCR engine affects the linking algorithm. In addition, the analogy between this work and Web link mining has been pointed out.
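    The core text-matching step can be illustrated with a stdlib-only sketch: fuzzy-match each table-of-contents entry against candidate headings in the body text. The TOC entries, page numbers and headings below are invented, and the paper's tree-structured dictionary and directional graph are omitted:

```python
import difflib

# Hypothetical miniature book: TOC entries and the headings found on pages.
toc = ["1 Introduction", "2 Core Algorithm", "3 Experimental Results"]
pages = {
    4: "Chapter 1: Introduction",
    17: "2 The Core Algorithm",
    55: "3 Experimental Results and Discussion",
}

def link_toc(toc_entries, page_headings, cutoff=0.5):
    """Link each TOC entry to the page whose heading matches it best."""
    links = {}
    for entry in toc_entries:
        best_page, best_score = None, cutoff
        for page, heading in page_headings.items():
            score = difflib.SequenceMatcher(
                None, entry.lower(), heading.lower()).ratio()
            if score > best_score:
                best_page, best_score = page, score
        links[entry] = best_page  # None if nothing beats the cutoff
    return links

print(link_toc(toc, pages))
```

A production system would add the information-fusion step the abstract mentions (e.g. combining match scores with TOC ordering constraints) to reject spurious matches.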

  4. Intercomparison of Different Energy Prediction Methods Within the European Project "Performance" - Results of the 1st Round Robin

    NARCIS (Netherlands)

    Friesen, G.; Gottschalg, R.; Beyer, H.G.; Williams, S.R.; van Sark, W.G.J.H.M.; Guérin de Montgareuil, A.; van der Borg, N; Huld, T.; Müller, B.; de Keizer, A.C.; Niu, Y.

    2007-01-01

    Eight separate energy prediction methods, developed independently across European universities and research centres, have been compared with respect to their estimated DC energy generation for five different photovoltaic (PV) module technologies at seven different sites distributed across Europe.

  5. Next generation paradigm for urban pluvial flood modelling, prediction, management and vulnerability reduction - Interaction between RainGain and Blue Green Dream projects

    Science.gov (United States)

    Maksimovic, C.

    2012-04-01

    The effects of climate change and increasing urbanisation call for a new paradigm for efficient planning, management and retrofitting of urban developments to increase resilience to climate change and to maximise ecosystem services. Improved management of urban floods from all sources is required. The time scale of well-documented fluvial and coastal floods allows a timely response, but surface (pluvial) flooding caused by intense local storms has not been given appropriate attention (Pitt Review, UK). Urban surface flood prediction requires fine-scale data and model resolutions, and has to be tackled locally by combining central inputs (meteorological services) with the efforts of local entities. Although a significant breakthrough in modelling pluvial flooding has been made, short-term prediction of both rainfall and surface flooding needs further enhancement. These issues are dealt with in the EU Interreg project RainGain (RG). A breakthrough in urban flood mitigation can only be achieved by the combined effects of advanced planning, design, construction and management of urban water (blue) assets in interaction with urban vegetated (green) assets. Changes in the design and operation of blue and green assets, which currently operate as two separate systems, are urgently required. Gaps in knowledge and technology will be addressed by EIT Climate-KIC's Blue Green Dream (BGD) project. The RG and BGD projects provide synergy between the "decoupled" blue and green systems to deliver multiple benefits for urban amenity, flood management, heat-island mitigation, biodiversity, and resilience to drought and hence energy requirements, increasing the quality of urban life at lower cost. Urban pluvial flood management will address two priority areas: short-term rainfall forecasting and short-term surface flood forecasting. A spatial resolution of short-term rainfall forecasts below 0.5 km2 and a lead time of a few hours are needed. Improvements are achievable by combining data sources of raingauge networks

  6. EBFA project

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    An engineering project office was established during the fall of 1976 to manage and coordinate all of the activities of the Electron Beam Fusion Project. The goal of the project is to develop the Electron Beam Fusion Accelerator (EBFA) and its supporting systems, and to integrate these systems into the new Electron Beam Fusion Facility (EBFF). Supporting systems for EBFA include a control/monitor system, a data acquisition/automatic data processing system, the liquid transfer systems, the insulating gas transfer systems, etc. Engineers and technicians were assigned to the project office to carry out the engineering design, initiate procurement, monitor the fabrication, perform the assembly and assist the pulsed power research group in the activation of the EBFA

  7. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples

  8. Colour transformations and K-means segmentation for automatic cloud detection

    Directory of Open Access Journals (Sweden)

    Martin Blazek

    2015-08-01

    The main aim of this work is to find simple criteria for the automatic recognition of several meteorological phenomena using optical digital sensors (e.g., wide-field cameras, automatic DSLR cameras or robotic telescopes). The output of those sensors is commonly represented in RGB channels, which contain information about both colour and luminosity even when normalised. Transformation into other colour spaces (e.g., CIE 1931 xyz, CIE L*a*b*, YCbCr) can separate colour from luminosity, which is especially useful in the image processing of automatic cloud-boundary recognition. Different colour transformations provide different sectorizations of cloudy images. Hence, the analysed meteorological phenomena (cloud types, clear sky) project differently into the colour diagrams of each international colour system. In such diagrams, statistical tools can be applied in search of criteria which could distinguish a clear sky from a covered one and possibly even perform a meteorological classification of cloud types. For the purpose of this work, a database of sky images (both clear and cloudy) was acquired, with emphasis on a variety of different observation conditions (e.g., time, altitude, solar angle). The effectiveness of several colour transformations for meteorological application is discussed and the representation of different clouds (or clear sky) in those colour systems is analysed. Utilisation of this algorithm would be useful in all-sky surveys, supplementary meteorological observations, solar-cell effectiveness predictions or daytime astronomical solar observations.
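    The colour-transform plus clustering pipeline described in this record can be illustrated with a tiny, stdlib-only sketch. The pixel values are invented, a simple chromaticity transform stands in for the paper's CIE/YCbCr transforms, and the 2-means initialisation is deterministic rather than random:

```python
# Hypothetical sketch: separate "sky" from "cloud" pixels by clustering in a
# luminosity-free chromaticity space with 2-means.
def chromaticity(rgb):
    """Map RGB to (r, b) chromaticity, discarding overall luminosity."""
    r, g, b = rgb
    s = r + g + b
    return (r / s, b / s)

def kmeans2(points, iters=10):
    """Plain 2-means; deterministic init from the first and last point."""
    centres = [points[0], points[-1]]
    for _ in range(iters):
        clusters = [[], []]
        for p in points:
            d0 = (p[0] - centres[0][0]) ** 2 + (p[1] - centres[0][1]) ** 2
            d1 = (p[0] - centres[1][0]) ** 2 + (p[1] - centres[1][1]) ** 2
            clusters[0 if d0 <= d1 else 1].append(p)
        centres = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else centres[i]
            for i, cl in enumerate(clusters)
        ]
    return centres

sky = [(60, 90, 200), (70, 100, 210), (55, 85, 190)]        # bluish clear sky
cloud = [(200, 200, 205), (180, 180, 185), (220, 220, 225)]  # grey cloud
pts = [chromaticity(p) for p in sky + cloud]
print(kmeans2(pts))  # one blue-dominated centre, one neutral-grey centre
```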

  9. Automatic Transmission Of Liquid Nitrogen

    Directory of Open Access Journals (Sweden)

    Sumedh Mhatre

    2015-08-01

    Liquid nitrogen (LN2) is widely used as a chiller in industry, for example in ice-cream factories, milk dairies, and blood banks for the storage of blood samples. It helps to keep a product at a low temperature for preservation purposes. LN2 cannot be fully utilised: in practice, if 3.75 litres of LN2 are used in a single day, around 12% (450 ml) is wasted due to vaporisation. A pressure relief valve is provided to create a pressure difference; if there is no pressure difference between the cylinder carrying LN2 and its surroundings, the result is damage to the container as well as wastage of LN2. Transmission of LN2 from TA55 to BA3 is currently carried out manually, so care must be taken during transmission to avoid wastage. With this project concept, the transmission of LN2 is carried out automatically, reducing the wastage that occurs with manual operation.

  10. Automatic identification in mining

    Energy Technology Data Exchange (ETDEWEB)

    Puckett, D; Patrick, C [Mine Computers and Electronics Inc., Morehead, KY (United States)

    1998-06-01

    The feasibility of monitoring the locations and vital statistics of equipment and personnel in surface and underground mining operations has increased with advancements in radio frequency identification (RFID) technology. This paper addresses the use of RFID technology, which is relatively new to the mining industry, to track surface equipment in mine pits, loading points and processing facilities. Specific applications are discussed, including both simplified and complex truck tracking systems and an automatic pit ticket system. This paper concludes with a discussion of the future possibilities of using RFID technology in mining including monitoring heart and respiration rates, body temperatures and exertion levels; monitoring repetitious movements for the study of work habits; and logging air quality via personnel sensors. 10 refs., 5 figs.

  11. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through a description of the Micro-Videomat automatic image analysis system, applied to the volumetric percentage of perlite in nodular cast irons, the porosity and average grain size in high-density sintered UO2 pellets, and the grain size of ferritic steel. The techniques adopted are described and the results obtained are compared with those of the direct counting processes: counting of systematic points (grid) to measure volume, and the intersection method, utilizing a circumference of known radius, for the average grain size. The technique adopted for nodular cast iron was dictated by the small difference in optical reflectivity between graphite and perlite. Porosity evaluation of sintered UO2 pellets is also analysed [pt]

  12. Semi-automatic fluoroscope

    International Nuclear Information System (INIS)

    Tarpley, M.W.

    1976-10-01

    Extruded aluminum-clad uranium-aluminum alloy fuel tubes must pass many quality control tests before irradiation in Savannah River Plant nuclear reactors. Nondestructive test equipment has been built to automatically detect high and low density areas in the fuel tubes using x-ray absorption techniques with a video analysis system. The equipment detects areas as small as 0.060-in. dia with 2 percent penetrameter sensitivity. These areas are graded as to size and density by an operator using electronic gages. Video image enhancement techniques permit inspection of ribbed cylindrical tubes and make possible the testing of areas under the ribs. Operation of the testing machine, the special low light level television camera, and analysis and enhancement techniques are discussed

  13. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.

    1976-01-01

    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system which employed an ''Image Processing System'' was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs

  14. AUTOMATIC ARCHITECTURAL STYLE RECOGNITION

    Directory of Open Access Journals (Sweden)

    M. Mathias

    2012-09-01

    Procedural modeling has proven to be a very valuable tool in the field of architecture. In the last few years, research has surged toward automatically creating procedural models from images. However, current algorithms for this process of inverse procedural modeling rely on the assumption that the building style is known. So far, the determination of the building style has remained a manual task. In this paper, we propose an algorithm which automates this process through classification of architectural styles from facade images. Our classifier first identifies the images containing buildings, then separates individual facades within an image and determines the building style. This information could then be used to initialize the building reconstruction process. We have trained our classifier to distinguish between several distinct architectural styles, namely Flemish Renaissance, Haussmannian and Neoclassical. Finally, we demonstrate our approach on various street-side images.

  15. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.

  16. Bayesian spatial prediction of the site index in the study of the Missouri Ozark Forest Ecosystem Project

    Science.gov (United States)

    Xiaoqian Sun; Zhuoqiong He; John Kabrick

    2008-01-01

    This paper presents a Bayesian spatial method for analysing the site index data from the Missouri Ozark Forest Ecosystem Project (MOFEP). Based on ecological background and availability, we select three variables, the aspect class, the soil depth and the land type association as covariates for analysis. To allow great flexibility of the smoothness of the random field,...

  17. Detecting accuracy of flaws by manual and automatic ultrasonic inspections

    International Nuclear Information System (INIS)

    Iida, K.

    1988-01-01

    As the final-stage work in the nine-year project on proving tests of the ultrasonic inspection technique applied to the ISI of LWR plants, automatic ultrasonic inspection tests were carried out on EDM notches, surface fatigue cracks, weld defects and stress corrosion cracks, which were deliberately introduced in full-size structural components simulating a 1,100 MWe BWR. The investigated items are the performance of a newly assembled automatic inspection apparatus, the detection limit of flaws, the detection resolution of adjacent collinear or parallel EDM notches, detection reproducibility and detection accuracy. Manual ultrasonic inspection of the same flaws was also carried out in order to obtain comparative data. This paper reports the confirmation that automatic ultrasonic inspection is much superior to manual inspection in flaw detection rate and detection reproducibility

  18. Predictors of Mental Health Symptoms, Automatic Thoughts, and Self-Esteem Among University Students.

    Science.gov (United States)

    Hiçdurmaz, Duygu; İnci, Figen; Karahan, Sevilay

    2017-01-01

    University youth is a risk group regarding mental health, and many mental health problems are frequent in this group. Sociodemographic factors such as level of income, and familial factors such as the relationship with the father, are reported to be associated with mental health symptoms, automatic thoughts, and self-esteem. There are also interrelations between mental health problems, automatic thoughts, and self-esteem. The extent of the predictive effect of each of these variables on automatic thoughts, self-esteem, and mental health symptoms is not known. We aimed to determine the predictive factors of mental health symptoms, automatic thoughts, and self-esteem in university students. Participants were 530 students enrolled at a university in Turkey during the 2014-2015 academic year. Data were collected using a student information form, the Brief Symptom Inventory, the Automatic Thoughts Questionnaire, and the Rosenberg Self-Esteem Scale. Mental health symptoms, self-esteem, perception of the relationship with the father, and level of income as a student significantly predicted automatic thoughts. Automatic thoughts, mental health symptoms, participation in family decisions, and age had significant predictive effects on self-esteem. Finally, automatic thoughts, self-esteem, age, and perception of the relationship with the father had significant predictive effects on mental health symptoms. The predictive factors revealed in our study provide important information to practitioners and researchers by showing the elements that need to be screened for the mental health of university students and the issues that need to be included in counseling activities.

  19. Interaction between serotonin transporter gene variants and life events predicts response to antidepressants in the GENDEP project

    DEFF Research Database (Denmark)

    Keers, R.; Uher, R.; Huezo-Diaz, P.

    2011-01-01

    , and several polymorphisms in the serotonin transporter gene (SLC6A4) have been genotyped including the serotonin transporter-linked polymorphic region (5-HTTLPR). Stressful life events were shown to predict a significantly better response to escitalopram but had no effect on response to nortriptyline...

  20. Automatic assignment of prokaryotic genes to functional categories using literature profiling.

    Directory of Open Access Journals (Sweden)

    Raul Torrieri

    Full Text Available In the last years, there was an exponential increase in the number of publicly available genomes. Once finished, most genome projects lack financial support to review annotations. A few of these gene annotations are based on a combination of bioinformatics evidence, however, in most cases, annotations are based solely on sequence similarity to a previously known gene, which was most probably annotated in the same way. As a result, a large number of predicted genes remain unassigned to any functional category despite the fact that there is enough evidence in the literature to predict their function. We developed a classifier trained with term-frequency vectors automatically disclosed from text corpora of an ensemble of genes representative of each functional category of the J. Craig Venter Institute Comprehensive Microbial Resource (JCVI-CMR ontology. The classifier achieved up to 84% precision with 68% recall (for confidence≥0.4, F-measure 0.76 (recall and precision equally weighted in an independent set of 2,220 genes, from 13 bacterial species, previously classified by JCVI-CMR into unambiguous categories of its ontology. Finally, the classifier assigned (confidence≥0.7 to functional categories a total of 5,235 out of the ∼24 thousand genes previously in categories "Unknown function" or "Unclassified" for which there is literature in MEDLINE. Two biologists reviewed the literature of 100 of these genes, randomly picket, and assigned them to the same functional categories predicted by the automatic classifier. Our results confirmed the hypothesis that it is possible to confidently assign genes of a real world repository to functional categories, based exclusively on the automatic profiling of its associated literature. The LitProf--Gene Classifier web server is accessible at: www.cebio.org/litprofGC.

  1. GRECOS Project (Genotyping Recurrence Risk of Stroke): The Use of Genetics to Predict the Vascular Recurrence After Stroke.

    Science.gov (United States)

    Fernández-Cadenas, Israel; Mendióroz, Maite; Giralt, Dolors; Nafria, Cristina; Garcia, Elena; Carrera, Caty; Gallego-Fabrega, Cristina; Domingues-Montanari, Sophie; Delgado, Pilar; Ribó, Marc; Castellanos, Mar; Martínez, Sergi; Freijo, Marimar; Jiménez-Conde, Jordi; Rubiera, Marta; Alvarez-Sabín, José; Molina, Carlos A; Font, Maria Angels; Grau Olivares, Marta; Palomeras, Ernest; Perez de la Ossa, Natalia; Martinez-Zabaleta, Maite; Masjuan, Jaime; Moniche, Francisco; Canovas, David; Piñana, Carlos; Purroy, Francisco; Cocho, Dolores; Navas, Inma; Tejero, Carlos; Aymerich, Nuria; Cullell, Natalia; Muiño, Elena; Serena, Joaquín; Rubio, Francisco; Davalos, Antoni; Roquer, Jaume; Arenillas, Juan Francisco; Martí-Fábregas, Joan; Keene, Keith; Chen, Wei-Min; Worrall, Bradford; Sale, Michele; Arboix, Adrià; Krupinski, Jerzy; Montaner, Joan

    2017-05-01

    Vascular recurrence occurs in 11% of patients during the first year after ischemic stroke (IS) or transient ischemic attack. Clinical scores do not predict the whole vascular recurrence risk; therefore, we aimed to find genetic variants associated with recurrence that might improve the clinical predictive models in IS. We analyzed 256 polymorphisms from 115 candidate genes in 3 patient cohorts comprising 4482 IS or transient ischemic attack patients. The discovery cohort was prospectively recruited and included 1494 patients, 6.2% of whom developed a new IS during the first year of follow-up. Replication analysis was performed in 2988 patients using SNPlex or HumanOmni1-Quad technology. We generated a predictive model using Cox regression (GRECOS score [Genotyping Recurrence Risk of Stroke]) and generated risk groups using a classification tree method. The analyses revealed that rs1800801 in the MGP gene (hazard ratio, 1.33; P=9×10^-3), a gene related to artery calcification, was associated with new IS during the first year of follow-up. This polymorphism was replicated in a Spanish cohort (n=1,305); however, it was not significantly associated in a North American cohort (n=1,683). The GRECOS score predicted new IS (P=3.2×10^-9) and could classify patients from low risk of stroke recurrence (1.9%) to high risk (12.6%). Moreover, the addition of genetic risk factors to the GRECOS score improves the prediction compared with the previous Stroke Prognosis Instrument-II score (P=0.03). The use of genetics could be useful to estimate vascular recurrence risk after IS. Genetic variability in the MGP gene was associated with vascular recurrence in the Spanish population. © 2017 American Heart Association, Inc.
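    The score-then-stratify structure of such a model can be sketched as follows. All coefficients, cutoffs and variables are invented for illustration; the actual GRECOS score comes from a fitted Cox model and a classification tree, neither of which is reproduced here. Only the reported per-allele hazard ratio of 1.33 is taken from the abstract, entered on the log-hazard scale:

```python
import math

# Hypothetical linear predictor combining clinical covariates with a genetic term.
def grecos_like_score(age, hypertension, diabetes, mgp_risk_alleles,
                      beta_clinical=(0.03, 0.5, 0.4), hr_per_allele=1.33):
    b_age, b_htn, b_dm = beta_clinical          # invented clinical weights
    clinical = b_age * age + b_htn * hypertension + b_dm * diabetes
    genetic = mgp_risk_alleles * math.log(hr_per_allele)  # log-hazard scale
    return clinical + genetic

def risk_group(score, low_cut=2.5, high_cut=3.5):
    """Stratify the linear predictor into coarse risk groups (invented cutoffs)."""
    if score < low_cut:
        return "low"
    return "intermediate" if score < high_cut else "high"

s = grecos_like_score(age=70, hypertension=1, diabetes=1, mgp_risk_alleles=2)
print(round(s, 2), risk_group(s))
```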

  2. Testing projected wild bee distributions in agricultural habitats: predictive power depends on species traits and habitat type.

    Science.gov (United States)

    Marshall, Leon; Carvalheiro, Luísa G; Aguirre-Gutiérrez, Jesús; Bos, Merijn; de Groot, G Arjen; Kleijn, David; Potts, Simon G; Reemer, Menno; Roberts, Stuart; Scheper, Jeroen; Biesmeijer, Jacobus C

    2015-10-01

    Species distribution models (SDMs) are increasingly used to understand the factors that regulate variation in biodiversity patterns and to help plan conservation strategies. However, these models are rarely validated with independently collected data, and it is unclear whether SDM performance is maintained across distinct habitats and for species with different functional traits. Highly mobile species, such as bees, can be particularly challenging to model. Here, we use independent sets of occurrence data collected systematically in several agricultural habitats to test how the predictive performance of SDMs for wild bee species depends on species traits, habitat type, and sampling technique. We used a species distribution modeling approach parametrized for the Netherlands, with presence records from 1990 to 2010 for 193 Dutch wild bees. For each species, we built a Maxent model based on 13 climate and landscape variables. We tested the predictive performance of the SDMs with independent datasets collected from orchards and arable fields across the Netherlands from 2010 to 2013, using transect surveys or pan traps. Model predictive performance depended on species traits and habitat type. Occurrence of bee species specialized in habitat and diet was better predicted than that of generalist bees. Predictions of habitat suitability were also more precise for habitats that are temporally more stable (orchards) than for habitats that undergo regular alterations (arable), particularly for small, solitary bees. As a conservation tool, SDMs are better suited to modeling rarer, specialist species than more generalist ones, and will work best in long-term stable habitats. The variability of complex, short-term habitats is difficult to capture in such models, and historical land use generally has low thematic resolution. To improve SDMs' usefulness, models require explanatory variables and collection data that include detailed landscape characteristics, for example, variability of crops and
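    The validation step this study performs — scoring model predictions against independently collected presence/absence records — is commonly summarised with the area under the ROC curve (AUC). A minimal stdlib sketch, with invented suitability scores and survey outcomes standing in for Maxent outputs and field data:

```python
# AUC in its Mann-Whitney form: the probability that a randomly chosen
# presence site scores higher than a randomly chosen absence site.
def auc(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted suitability at independently surveyed sites;
# 1 = species observed there, 0 = not observed.
suitability = [0.9, 0.8, 0.75, 0.4, 0.3, 0.2]
observed    = [1,   1,   0,    1,   0,   0]
print(auc(suitability, observed))
```

An AUC near 0.5 means the model predicts no better than chance at those sites, which is roughly what the study reports for generalist species in unstable habitats.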

  3. The CI-Flow Project: A System for Total Water Level Prediction from the Summit to the Sea

    Science.gov (United States)

    2011-11-01

    round and may be applied to all types of coastal storms, including intense cool-season extratropical cyclones (i.e., nor'easters). In addition...associated with waves, tides, storm surge, rivers, and rainfall, including interactions at the tidal/surge interface. Within this project, CI-FLOW addresses...presented for Hurricane Isabel (2003), Hurricane Earl (2010), and Tropical Storm Nicole (2010) for the Tar-Pamlico and Neuse River basins of North

  4. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 1: Theoretical development and application to yearly predictions for selected cities in the United States

    Science.gov (United States)

    Manning, Robert M.

    1986-01-01

    A rain attenuation prediction model is described for use in calculating satellite communication link availability for any specific location in the world that is characterized by an extended record of rainfall. Such a formalism is necessary for the accurate assessment of such availability predictions in the case of the small user-terminal concept of the Advanced Communication Technology Satellite (ACTS) Project. The model employs the theory of extreme value statistics to generate the necessary statistical rainrate parameters from rain data in the form compiled by the National Weather Service. These location dependent rain statistics are then applied to a rain attenuation model to obtain a yearly prediction of the occurrence of attenuation on any satellite link at that location. The predictions of this model are compared to those of the Crane Two-Component Rain Model and some empirical data and found to be very good. The model is then used to calculate rain attenuation statistics at 59 locations in the United States (including Alaska and Hawaii) for the 20 GHz downlinks and 30 GHz uplinks of the proposed ACTS system. The flexibility of this modeling formalism is such that it allows a complete and unified treatment of the temporal aspects of rain attenuation that leads to the design of an optimum stochastic power control algorithm, the purpose of which is to efficiently counter such rain fades on a satellite link.
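    The extreme-value step the model relies on can be illustrated with a stdlib-only sketch: fit a Gumbel distribution to annual-maximum rain rates by the method of moments, then read off the rain rate exceeded with a chosen yearly probability. The rain-rate values are invented, and this simple moment fit stands in for the paper's full statistical formalism:

```python
import math
import statistics

# Hypothetical annual-maximum rain rates (mm/h) from a station record.
annual_max_rain_mm_h = [62, 75, 58, 90, 70, 66, 84, 72, 61, 79]

# Gumbel fit by the method of moments.
mean = statistics.mean(annual_max_rain_mm_h)
sd = statistics.stdev(annual_max_rain_mm_h)
beta = sd * math.sqrt(6) / math.pi    # scale parameter
mu = mean - 0.5772156649 * beta       # location (Euler-Mascheroni constant)

def rain_rate_exceeded(p_exceed):
    """Rain rate whose annual exceedance probability is p_exceed."""
    return mu - beta * math.log(-math.log(1.0 - p_exceed))

# Roughly the 1-in-100-year rain rate under this toy fit.
print(round(rain_rate_exceeded(0.01), 1))
```

The fitted rain-rate statistics would then feed a rain-attenuation relation to yield link-availability predictions, as described above.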

  5. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

    Kim, Jae Hee; Eom, Heung Seop; Lee, Jae Cheol; Choi, Yoo Raek; Moon, Soon Seung

    2002-02-01

    Using recent technologies in mobile robotics and computer science, we developed an automatic inspection system for weld lines of the reactor vessel. The ultrasonic inspection of the reactor pressure vessel is currently performed by commercialized robot manipulators. However, since the conventional fixed-type robot manipulator is very large, heavy and expensive, it needs a long inspection time and is hard to handle and maintain. In order to resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. According to our conceptual design, we developed the reactor inspection system including an underwater inspection robot, a laser position control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on the reactor vessel mockup, and on a real reactor ready for Ulchine nuclear power plant unit 6 at Dusan Heavy Industry in Korea. After this project, we plan to commercialize our inspection system. Using this system, we can expect a large reduction of the inspection time, performance enhancement, automatic management of inspection history, etc. From the economic point of view, we can also expect import substitution of more than 4 million dollars. The established essential technologies for intelligent control and automation are expected to be synthetically applied to the automation of similar systems in nuclear power plants

  6. Predicting the Extent of Inundation due to Sea-Level Rise: Al Hamra Development, Ras Al Khaimah, UAE. A Pilot Project

    Directory of Open Access Journals (Sweden)

    Arthur Robert M.

    2016-06-01

Full Text Available As new information is received, predictions of sea-level rise resulting from global warming continue to be revised upwards. Measurements indicate that sea-level rise is continuing at, or close to, the worst-case forecasts (Kellet et al. 2014). Coastal areas are coming under increasing risk of inundation and flooding as storms are predicted to increase in frequency and severity, adding to the risk posed by higher sea levels. Stakeholders, government agencies, developers and land owners require accurate, up-to-date information to be able to protect coastal areas. Geographic Information Systems (GIS), along with accurate remote sensing technologies such as LiDAR, provide the best means of delivering this information. Using these technologies, this paper predicts the risk posed to a large multi-use development in the emirate of Ras Al Khaimah, UAE. This development, Al Hamra Village, is situated on the coast of the Arabian Gulf. Al Hamra's physical relationship to the Gulf is shared by other developments in Ras Al Khaimah, and for this reason it has been used as a pilot project. The resulting GIS model shows that Al Hamra is indeed at risk from predicted flood events. How this information can be used as a planning tool for numerous strategies is discussed in this paper.

  7. Using prediction uncertainty analysis to design hydrologic monitoring networks: Example applications from the Great Lakes water availability pilot project

    Science.gov (United States)

    Fienen, Michael N.; Doherty, John E.; Hunt, Randall J.; Reeves, Howard W.

    2010-01-01

The importance of monitoring networks for resource-management decisions is becoming more recognized, in both theory and application. Quantitative computer models provide a science-based framework to evaluate the efficacy and efficiency of existing and possible future monitoring networks. In the study described herein, two suites of tools were used to evaluate the worth of new data for specific predictions, which in turn can support efficient use of the resources needed to construct a monitoring network. The approach evaluates the uncertainty of a model prediction and, by using linear propagation of uncertainty, estimates how much that uncertainty could be reduced if the model were calibrated with additional information (increased a priori knowledge of parameter values or new observations). The theoretical underpinnings of the two suites of tools addressing this technique are compared, and their application to a hypothetical model based on a local model inset into the Great Lakes Water Availability Pilot model is described. Results show that meaningful guidance for monitoring network design can be obtained by using the methods explored. The validity of this guidance depends substantially on the parameterization; hence, parameterization must be considered not only when designing the parameter-estimation paradigm but also, importantly, when designing the prediction-uncertainty paradigm.
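The linear data-worth calculation described above can be illustrated with a small first-order, second-moment sketch. All matrices below are made-up placeholders, not values from the Great Lakes pilot model: the prediction variance is propagated through a prior parameter covariance, and the reduction in that variance from one hypothetical new observation is taken as the observation's worth.

```python
import numpy as np

# Prior parameter covariance and the prediction's sensitivity to the
# parameters (both invented for illustration).
C = np.diag([1.0, 0.5, 2.0])           # prior parameter covariance
y = np.array([0.8, -0.3, 1.2])         # d(prediction)/d(parameters)

prior_var = y @ C @ y                  # prior prediction variance

# Candidate new observation: sensitivity vector x, noise variance r.
x = np.array([0.9, 0.1, 1.0])
r = 0.1

# Bayesian (linear) update of the parameter covariance given the new data.
gain = C @ x / (x @ C @ x + r)
C_post = C - np.outer(gain, x @ C)

post_var = y @ C_post @ y              # prediction variance after the data
worth = prior_var - post_var           # uncertainty reduction = data worth
```

Ranking candidate observations by `worth` gives the monitoring-network guidance the abstract refers to.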

  8. Development of the automatic control rod operation system for JOYO. Verification of automatic control rod operation guide system

    International Nuclear Information System (INIS)

    Terakado, Tsuguo; Suzuki, Shinya; Kawai, Masashi; Aoki, Hiroshi; Ohkubo, Toshiyuki

    1999-10-01

The automatic control rod operation system was developed to control the JOYO reactor power automatically in all operation modes (critical approach, cooling-system heat-up, power ascent, power descent); development began in 1989. Prior to applying the system, verification tests of the automatic control rod operation guide system were conducted during the 32nd duty cycle of JOYO, from Dec. 1997 to Feb. 1998. The automatic control rod operation guide system consists of the control rod operation guide function and the plant operation guide function. The control rod operation guide function provides information on control rod movement and position, while the plant operation guide function provides guidance for plant operations corresponding to reactor power changes (power ascent or power descent). Control rod insertion and withdrawal are predicted by fuzzy algorithms. (J.P.N.)

  9. AUTOMATIC LIGHT CONTROL

    Science.gov (United States)

    Artzt, M.

    1957-08-27

A control system for a projection kinescope used in a facsimile scanning system and, in particular, means for maintaining substantially constant the light emanating from the flying spot on the face of the kinescope are described. In general, the invention provides a feeler member disposed in such a position with respect to a projecting lens as to intercept a portion of the light striking the lens. Suitable circuitry in conjunction with a photomultiplier tube provides a signal proportional to the light intensity of the flying spot. The grid bias on the kinescope is controlled by this signal to maintain the intensity of the spot substantially constant.

  10. Automatic EEG spike detection.

    Science.gov (United States)

    Harner, Richard

    2009-10-01

Since the 1970s, advances in science and technology during each succeeding decade have renewed the expectation of efficient, reliable automatic epileptiform spike detection (AESD). But even when reinforced with better, faster tools, clinically reliable unsupervised spike detection remains beyond our reach. Expert-selected spike parameters were the first and are still the most widely used for AESD. Thresholds for amplitude, duration, sharpness, rise-time, fall-time, after-coming slow waves, background frequency, and more have been used. It is still unclear which of these wave parameters are essential, beyond peak-to-peak amplitude and duration. Wavelet parameters are very appropriate for AESD but need to be combined with other parameters to achieve desired levels of spike detection efficiency. Artificial Neural Network (ANN) and expert-system methods may have reached peak efficiency. Support Vector Machine (SVM) technology focuses on outliers rather than centroids of spike and nonspike data clusters and should improve AESD efficiency. An exemplary spike/nonspike database is suggested as a tool for assessing parameters and methods for AESD and is available in CSV or Matlab formats from the author at brainvue@gmail.com. Exploratory Data Analysis (EDA) is presented as a graphic method for finding better spike parameters and for the step-wise evaluation of the spike detection process.
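As a toy illustration of the expert-selected parameter approach described above, the sketch below flags candidate spikes whose amplitude and sharpness exceed fixed thresholds. The synthetic signal and all threshold values are assumptions for demonstration, not clinically validated settings.

```python
import numpy as np

# Flag local peaks whose amplitude and sharpness (negative curvature)
# exceed expert-set thresholds -- the classic parameter-threshold idea.

def detect_spikes(signal, amp_thresh, sharp_thresh):
    """Return sample indices of candidate spikes."""
    d2 = np.gradient(np.gradient(signal))  # second derivative ~ sharpness
    spikes = []
    for i in range(1, len(signal) - 1):
        is_peak = signal[i] > signal[i - 1] and signal[i] > signal[i + 1]
        if is_peak and signal[i] > amp_thresh and -d2[i] > sharp_thresh:
            spikes.append(i)
    return spikes

fs = 200.0                              # sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
eeg = 10 * np.sin(2 * np.pi * 10 * t)   # synthetic 10 Hz background
eeg[150] += 80.0                        # one sharp, high-amplitude event

spikes = detect_spikes(eeg, amp_thresh=50.0, sharp_thresh=5.0)
```

Real AESD systems layer many more parameters (duration, rise/fall time, after-coming slow waves) on top of this amplitude-and-sharpness core.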

  11. The ChemScreen project to design a pragmatic alternative approach to predict reproductive toxicity of chemicals

    DEFF Research Database (Denmark)

    van der Burg, Bart; Wedebye, Eva Bay; Dietrich, Daniel R.

    2015-01-01

There is a great need for rapid testing strategies for reproductive toxicity testing, avoiding animal use. The EU Framework Program 7 project ChemScreen aimed to fill this gap in a pragmatic manner, preferably using validated existing tools and placing them in an innovative alternative testing ... to validate the test panel using mechanistic approaches. We are actively engaged in promoting regulatory acceptance of the tools developed as an essential step towards practical application, including case studies for read-across purposes. With this approach, a significant saving in animal use and associated ...

  12. Operational strategy for soil concentration predictions of strontium/yttrium-90 and cesium-137 in surface soil at the West Valley Demonstration Project site

    International Nuclear Information System (INIS)

    Myers, J.A.

    1995-01-01

In the environmental health physics field there are difficulties associated with the interpretation of field measurements, the determination of guideline protocols, and the control and disposal of soil contaminated with low-level radioactivity. Questions are raised among scientists and in public forums concerning the necessity and high costs of large-area soil remediation versus the risks of low-dose radiation health effects. As a result, accurate soil activity assessments become imperative in decontamination situations. The West Valley Demonstration Project (WVDP), a US Department of Energy facility located in West Valley, New York, is managed and operated by West Valley Nuclear Services Co., Inc. (WVNS). WVNS has identified contaminated on-site soil areas with a mixed variety of radionuclides (primarily fission products). Through the use of data obtained from a previous project performed during the summer of 1994, entitled ''Field Survey Correlation and Instrumentation Response for an In Situ Soil Measurement Program'' (Myers), the WVDP offers a unique research opportunity to investigate the possibility of predicting soil concentrations based on exposure or count-rate responses returned from a survey detector probe. In this study, correlations are developed between laboratory-measured soil beta activity and survey probe response, for the purposes of determining the optimal detector for field use and using these correlations to establish the predictability of soil activity levels.

  13. Predictions of tracer transport in interwell tracer tests at the C-Hole complex. Yucca Mountain site characterization project report milestone 4077

    International Nuclear Information System (INIS)

    Reimus, P.W.

    1996-09-01

This report presents predictions of tracer transport in interwell tracer tests that are to be conducted at the C-Hole complex at the Nevada Test Site on behalf of the Yucca Mountain Site Characterization Project. The predictions are used to make specific recommendations about the manner in which the tracer tests should be conducted to best satisfy the needs of the Project. The objective of the tracer tests is to study flow and species transport under saturated conditions in the fractured tuffs near Yucca Mountain, Nevada, the site of a potential high-level nuclear waste repository. The potential repository will be located in the unsaturated zone within Yucca Mountain. The saturated zone beneath and around the mountain represents the final barrier to transport to the accessible environment that radionuclides will encounter if they breach the engineered barriers within the repository and the barriers to flow and transport provided by the unsaturated zone. Background information on the C-Holes is provided in Section 1.1, and the planned tracer testing program is discussed in Section 1.2.

  14. A Network of Automatic Control Web-Based Laboratories

    Science.gov (United States)

    Vargas, Hector; Sanchez Moreno, J.; Jara, Carlos A.; Candelas, F. A.; Torres, Fernando; Dormido, Sebastian

    2011-01-01

    This article presents an innovative project in the context of remote experimentation applied to control engineering education. Specifically, the authors describe their experience regarding the analysis, design, development, and exploitation of web-based technologies within the scope of automatic control. This work is part of an inter-university…

  15. Experience with automatic orientation from different data sets

    DEFF Research Database (Denmark)

    Potucková, Marketa

    2003-01-01

Automatic orientation of aerial images based on existing databases was a topic of the OEEPE research project running in 1998 and 1999. Different approaches for solving this task have been published to date. The method developed at Aalborg University uses the existing topographic database...

  16. Design and construction of an automatic texture goniometer

    International Nuclear Information System (INIS)

    Lima, N.B. de; Pontes, E.W.; Monteiro, P.R.B.; Imakuma, K.

    1984-01-01

This report describes the design and construction of a two-axis automatic goniometer, operated by stepper motors, adaptable to the scanning goniometer SG-7 or SG-8 fabricated by Rigaku-Denki. Computer codes have been developed to operate this texture goniometer. (E.G.) [pt

  17. Automatic content linking: Speech-based just-in-time retrieval for multimedia archives

    NARCIS (Netherlands)

    Popescu-Belis, A.; Kilgour, J.; Poller, P.; Nanchen, A.; Boertjes, E.; Wit, J. de

    2010-01-01

    The Automatic Content Linking Device monitors a conversation and uses automatically recognized words to retrieve documents that are of potential use to the participants. The document set includes project related reports or emails, transcribed snippets of past meetings, and websites. Retrieval

  18. Automatic detection of laughter

    NARCIS (Netherlands)

    Truong, K.P.; Leeuwen, D.A. van

    2005-01-01

    In the context of detecting ‘paralinguistic events’ with the aim to make classification of the speaker’s emotional state possible, a detector was developed for one of the most obvious ‘paralinguistic events’, namely laughter. Gaussian Mixture Models were trained with Perceptual Linear Prediction

  19. Automatic variable selection method and a comparison for quantitative analysis in laser-induced breakdown spectroscopy

    Science.gov (United States)

    Duan, Fajie; Fu, Xiao; Jiang, Jiajia; Huang, Tingting; Ma, Ling; Zhang, Cong

    2018-05-01

In this work, an automatic variable selection method for quantitative analysis of soil samples using laser-induced breakdown spectroscopy (LIBS) is proposed, based on full spectrum correction (FSC) and modified iterative predictor weighting-partial least squares (mIPW-PLS). The method features automatic selection without manual intervention. To illustrate the feasibility and effectiveness of the method, a comparison with the genetic algorithm (GA) and the successive projections algorithm (SPA) for detecting different elements (copper, barium and chromium) in soil was implemented. The experimental results showed that all three methods could accomplish variable selection effectively, among which FSC-mIPW-PLS required a significantly shorter computation time (approximately 12 s for 40,000 initial variables) than the others. Moreover, improved quantification models were obtained with the variable selection approaches. The root mean square errors of prediction (RMSEP) of models using the new method were 27.47 (copper), 37.15 (barium) and 39.70 (chromium) mg/kg, showing prediction performance comparable to GA and SPA.
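The mIPW-PLS procedure above uses partial least squares; as a much-simplified, hypothetical sketch of the same iterative predictor-weighting idea, the toy version below repeatedly fits an ordinary least-squares model and discards the variables with the weakest standardized coefficients.

```python
import numpy as np

# Toy iterative predictor weighting: fit, weight each remaining variable
# by |coefficient| x its standard deviation, drop the weakest ~20% per
# round. (The paper's FSC-mIPW-PLS pipeline is more involved.)

rng = np.random.default_rng(0)
n, p = 60, 20
X = rng.normal(size=(n, p))
true_coef = np.zeros(p)
true_coef[[2, 7, 11]] = [3.0, -2.0, 1.5]        # 3 informative variables
y = X @ true_coef + 0.1 * rng.normal(size=n)

selected = list(range(p))
for _ in range(10):
    beta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
    weights = np.abs(beta) * X[:, selected].std(axis=0)
    order = np.argsort(weights)                  # weakest first
    n_drop = max(1, len(selected) // 5)
    if len(selected) - n_drop < 3:
        break
    drop = set(order[:n_drop])
    selected = [v for i, v in enumerate(selected) if i not in drop]
```

On this synthetic example the surviving variables are exactly the three informative ones.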

  20. Prediction of the Effects of Radiation FOr Reactor pressure vessel and in-core Materials using multi-scale modeling - 60 years foreseen plant lifetime (PERFORM-60 project)

    International Nuclear Information System (INIS)

    Al Mazouzi, A.; Bugat, S.; Leclercq, S.; Massoud, J.-P.; Moinereau, D.; Lidbury, D.; Van Dyck, S.; Marini, B.; Alamo, Ana

    2010-01-01

The PERFECT project of the EURATOM framework program (FP6) was a first step towards the development of a simulation platform that contains several advanced numerical tools aimed at predicting irradiation damage in both the reactor pressure vessel (RPV) and its internals using state-of-the-art knowledge. These tools allow simulation of irradiation effects on the microstructure and the constitutive behavior of RPV low alloy steels, as well as their fracture mechanics properties. For the reactor internals, the first partial models were established, describing radiation damage to the microstructure and providing a first description of the stress corrosion behaviour of austenitic steels in the primary environment, without physically linking the radiation and corrosion effects. Thus, building on the existing PERFECT roadmap, the FP7 Collaborative Project PERFORM 60 has as its main objective the development of similar tools to simulate the combined effects of irradiation and corrosion on internals, in addition to further improving the existing tools for RPVs made of bainitic steels. From the managerial viewpoint, PERFORM 60 is based on two technical sub-projects, namely (i) RPV and (ii) Internals. In addition, a Users' Group and a training scheme have been adopted to allow representatives of constructors, utilities, research organizations... from Europe, the USA and Japan to participate actively in the process of appraising the limits and potential of the developed tools, as well as their validation against qualified experimental data.

  1. TMB: Automatic Differentiation and Laplace Approximation

    Directory of Open Access Journals (Sweden)

    Kasper Kristensen

    2016-04-01

Full Text Available TMB is an open source R package that enables quick implementation of complex nonlinear random effects (latent variable) models in a manner similar to the established AD Model Builder package (ADMB, http://admb-project.org/; Fournier et al. 2011). In addition, it offers easy access to parallel computations. The user defines the joint likelihood for the data and the random effects as a C++ template function, while all other operations are done in R, e.g., reading in the data. The package evaluates and maximizes the Laplace approximation of the marginal likelihood, where the random effects are automatically integrated out. This approximation, and its derivatives, are obtained using automatic differentiation (up to order three) of the joint likelihood. The computations are designed to be fast for problems with many random effects (≈ 10^6) and parameters (≈ 10^3). Computation times using ADMB and TMB are compared on a suite of examples ranging from simple models to large spatial models where the random effects are a Gaussian random field. Speedups ranging from 1.5 to about 100 are obtained, with increasing gains for large problems. The package and examples are available at http://tmb-project.org/.
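The Laplace approximation that TMB maximizes can be demonstrated on a one-dimensional toy problem: expand the joint negative log-likelihood f around its mode and integrate the resulting Gaussian in closed form. The quartic f below is an arbitrary made-up example, not TMB code, and the finite-difference Hessian stands in for the automatic differentiation TMB actually uses.

```python
import numpy as np
from scipy import integrate, optimize

# Laplace approximation of L = integral of exp(-f(u)) du:
# L ≈ exp(-f(u_hat)) * sqrt(2*pi / f''(u_hat)), u_hat = mode of f.

def f(u):
    return 0.5 * u**2 + 0.1 * u**4   # toy non-Gaussian joint neg. log-lik.

# Mode of f and the curvature (second derivative) at the mode.
u_hat = optimize.minimize_scalar(f).x
h = 1e-4
hessian = (f(u_hat + h) - 2 * f(u_hat) + f(u_hat - h)) / h**2

# Laplace approximation vs. direct numerical integration.
laplace = np.exp(-f(u_hat)) * np.sqrt(2 * np.pi / hessian)
exact, _ = integrate.quad(lambda u: np.exp(-f(u)), -np.inf, np.inf)
```

Because this f decays faster than its Gaussian expansion, the approximation overshoots the true integral slightly; for models with nearly Gaussian random-effect posteriors the approximation is much tighter.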

  2. Growth rates and specific motor abilities as a function to predict the selection of talents taekwondo sport (Egyptian national project

    Directory of Open Access Journals (Sweden)

    Mohammed Mustafa Bakr

    2016-07-01

Full Text Available The aim of this study is to investigate the contribution ratios of growth rates and specific motor abilities as a function to predict the selection of talent in taekwondo sport. The study was carried out on a sample of 755 individuals from clubs and youth centers across the governorates of Egypt, with average age 11.64 ± 0.48 years, height 144.06 ± 7.04 cm and weight 36.86 ± 7.51 kg. Tests were conducted in the period from 7/11/2011 to 29/12/2011; the selected individuals underwent the following tests and measurements: ability, hip (pelvic) flexibility, agility, kinetic speed at trunk level, kinetic speed at face level, performance endurance, and performance. The researcher used the descriptive survey method, and SPSS was used for the statistical analysis, calculating averages, standard deviations, correlations and stepwise regression. The results showed that growth rates and specific motor abilities contribute to the selection of talented taekwondo athletes. In addition, taekwondo players are characterized by flexibility, performance endurance and kinetic speed. The study concluded that five factors affect the selection of talented junior taekwondo athletes: flexibility contributes 28.8%, performance endurance 15.1%, ability 7.8%, growth rates (age, height, weight) 5.2%, and kinetic speed (at trunk level and at face level) 1.1%. Selection of talented juniors can be predicted through the following equation: score = 49.835 + age(-0.389) + height(0.157) + weight(-0.188) + flexibility(-0.359) + ability(0.081) + agility(-2.261) + performance endurance(0.608) + kinetic speed at trunk level(0.586) + kinetic speed at face level(0.260). These results should be taken into account by the taekwondo federation and trainers for use as an indicator when selecting talent in taekwondo sport.
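The published regression equation can be evaluated directly; the sketch below encodes the coefficients reported in the abstract. Predictor units are assumed to follow the study's own tests, and the example input values (other than the sample-mean age, height and weight) are made up for illustration.

```python
# Selection-score regression with the coefficients from the abstract.
def taekwondo_selection_score(age, height, weight, flexibility, ability,
                              agility, endurance, trunk_speed, face_speed):
    return (49.835
            - 0.389 * age          # years
            + 0.157 * height       # cm
            - 0.188 * weight       # kg
            - 0.359 * flexibility
            + 0.081 * ability
            - 2.261 * agility
            + 0.608 * endurance
            + 0.586 * trunk_speed
            + 0.260 * face_speed)

# Sample-mean age/height/weight; the remaining test scores are invented.
example = taekwondo_selection_score(11.64, 144.06, 36.86,
                                    10.0, 5.0, 12.0, 8.0, 6.0, 6.0)
```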

  3. Markov random field based automatic image alignment for electron tomography.

    Science.gov (United States)

    Amat, Fernando; Moussavi, Farshid; Comolli, Luis R; Elidan, Gal; Downing, Kenneth H; Horowitz, Mark

    2008-03-01

    We present a method for automatic full-precision alignment of the images in a tomographic tilt series. Full-precision automatic alignment of cryo electron microscopy images has remained a difficult challenge to date, due to the limited electron dose and low image contrast. These facts lead to poor signal to noise ratio (SNR) in the images, which causes automatic feature trackers to generate errors, even with high contrast gold particles as fiducial features. To enable fully automatic alignment for full-precision reconstructions, we frame the problem probabilistically as finding the most likely particle tracks given a set of noisy images, using contextual information to make the solution more robust to the noise in each image. To solve this maximum likelihood problem, we use Markov Random Fields (MRF) to establish the correspondence of features in alignment and robust optimization for projection model estimation. The resulting algorithm, called Robust Alignment and Projection Estimation for Tomographic Reconstruction, or RAPTOR, has not needed any manual intervention for the difficult datasets we have tried, and has provided sub-pixel alignment that is as good as the manual approach by an expert user. We are able to automatically map complete and partial marker trajectories and thus obtain highly accurate image alignment. Our method has been applied to challenging cryo electron tomographic datasets with low SNR from intact bacterial cells, as well as several plastic section and X-ray datasets.

  4. Towards automatic exchange of information

    OpenAIRE

    Oberson, Xavier

    2015-01-01

This article describes the various steps that led towards automatic exchange of information becoming the global standard, and the issues that remain to be solved. First, the various competing models of information exchange, such as Double Tax Treaties (DTTs), TIEAs, FATCA and EU Directives, are described with a view to showing how they interact. Second, the so-called Rubik strategy is summarized and compared with automatic exchange of information (AEOI). The third part then describes ...

  5. Automatic generation of anatomic characteristics from cerebral aneurysm surface models.

    Science.gov (United States)

    Neugebauer, M; Lawonn, K; Beuing, O; Preim, B

    2013-03-01

Computer-aided research on cerebral aneurysms often depends on a polygonal mesh representation of the vessel lumen. To support a differentiated, anatomy-aware analysis, it is necessary to derive anatomic descriptors from the surface model. We present an approach for automatic decomposition of the adjacent vessels into near- and far-vessel regions and for computation of the axial plane. We also present two example applications of the geometric descriptors: automatic computation of a unique vessel order and automatic viewpoint selection. Approximation methods are employed to analyze vessel cross-sections and the vessel area profile along the centerline. The resulting transition zones between near- and far-vessel regions are used as input for an optimization process that computes the axial plane. The unique vessel order is defined via projection into the plane space of the axial plane. The viewing direction for automatic viewpoint selection is derived from the normal vector of the axial plane. The approach was successfully applied to representative data sets exhibiting broad variability with respect to the configuration of their adjacent vessels. A robustness analysis showed that the automatic decomposition is stable against noise. A survey of four medical experts showed broad agreement with the automatically defined transition zones. Due to the general nature of the underlying algorithms, this approach is applicable to most of the likely aneurysm configurations in the cerebral vasculature. Additional geometric information obtained during automatic decomposition can support correction in cases where the automatic approach fails. The resulting descriptors can be used for various applications in the field of visualization, exploration and analysis of cerebral aneurysms.

  6. The Role of Automatic Obesity Stereotypes in Real Hiring Discrimination

    Science.gov (United States)

    Agerstrom, Jens; Rooth, Dan-Olof

    2011-01-01

    This study examined whether automatic stereotypes captured by the implicit association test (IAT) can predict real hiring discrimination against the obese. In an unobtrusive field experiment, job applications were sent to a large number of real job vacancies. The applications were matched on credentials but differed with respect to the applicant's…

  7. Automatic quantification of subarachnoid hemorrhage on noncontrast CT

    NARCIS (Netherlands)

    Boers, Anna Maria Merel; Zijlstra, I.A.; Gathier, C.S.; van den Berg, R.; Slump, Cornelis H.; Marquering, H.A.; Majoie, C.B.

    2014-01-01

    Quantification of blood after SAH on initial NCCT is an important radiologic measure to predict patient outcome and guide treatment decisions. In current scales, hemorrhage volume and density are not accounted for. The purpose of this study was to develop and validate a fully automatic method for

  8. Cognitive Function and Brain Atrophy Predict Non-pharmacological Efficacy in Dementia: The Mihama-Kiho Scan Project2

    Directory of Open Access Journals (Sweden)

    Ken-ichi Tabei

    2018-04-01

    Full Text Available We aimed to determine whether neuropsychological deficits and brain atrophy could predict the efficacy of non-pharmacological interventions. Forty-six participants with mild-to-moderate dementia were monitored for 6 months; 25 underwent an intervention involving physical exercise with music, and 21 performed cognitive stimulation tasks. Participants were categorized into improvement (IMP and no-IMP subgroups. In the exercise-with-music group, the no-IMP subgroup performed worse than the IMP subgroup on the Rivermead Behavioural Memory Test at baseline. In the cognitive-stimulation group, the no-IMP subgroup performed worse than the IMP subgroup on Raven’s Colored Progressive Matrices and the cognitive functional independence measure at baseline. In the no-IMP subgroup, voxel-based morphometric analysis at baseline revealed more extensive gray matter loss in the anterior cingulate gyrus and left middle frontal gyrus in the exercise-with-music and cognitive-stimulation groups, respectively. Participants with mild-to-moderate dementia with cognitive decline and extensive cortical atrophy are less likely to show improved cognitive function after non-pharmaceutical therapy.

  9. Cognitive Function and Brain Atrophy Predict Non-pharmacological Efficacy in Dementia: The Mihama-Kiho Scan Project2.

    Science.gov (United States)

    Tabei, Ken-Ichi; Satoh, Masayuki; Ogawa, Jun-Ichi; Tokita, Tomoko; Nakaguchi, Noriko; Nakao, Koji; Kida, Hirotaka; Tomimoto, Hidekazu

    2018-01-01

    We aimed to determine whether neuropsychological deficits and brain atrophy could predict the efficacy of non-pharmacological interventions. Forty-six participants with mild-to-moderate dementia were monitored for 6 months; 25 underwent an intervention involving physical exercise with music, and 21 performed cognitive stimulation tasks. Participants were categorized into improvement (IMP) and no-IMP subgroups. In the exercise-with-music group, the no-IMP subgroup performed worse than the IMP subgroup on the Rivermead Behavioural Memory Test at baseline. In the cognitive-stimulation group, the no-IMP subgroup performed worse than the IMP subgroup on Raven's Colored Progressive Matrices and the cognitive functional independence measure at baseline. In the no-IMP subgroup, voxel-based morphometric analysis at baseline revealed more extensive gray matter loss in the anterior cingulate gyrus and left middle frontal gyrus in the exercise-with-music and cognitive-stimulation groups, respectively. Participants with mild-to-moderate dementia with cognitive decline and extensive cortical atrophy are less likely to show improved cognitive function after non-pharmaceutical therapy.

  10. Predictive Factors for Verbal Memory Performance Over Decades of Aging: Data from the Women's Healthy Ageing Project.

    Science.gov (United States)

    Szoeke, Cassandra; Lehert, Philippe; Henderson, Victor W; Dennerstein, Lorraine; Desmond, Patricia; Campbell, Stephen

    2016-10-01

Abnormalities in brain structure and function can occur several decades prior to the onset of cognitive decline. It is in the preceding decades that an intervention is most likely to be effective, when informed by an understanding of factors contributing to the disease prodrome. Few studies, however, have sufficient longitudinal data on relevant risks to determine the optimum targets for interventions to improve cognition in aging. In this article we examine the timing and exposure of factors contributing to verbal memory performance in later life. 387 participants from the population-based Women's Healthy Ageing Project, with a mean age at baseline of 49.6 years (range: 45-55 years), had complete neuropsychiatric assessments, clinical information, physical measures, and biomarkers collected at baseline, with at least three follow-up visits that included at least one cognitive reassessment. Mixed linear models were used to assess the significance of risk factors for later-life verbal memory. We explored the influence of early, contemporaneous, and cumulative exposures. Younger age and better education were associated with baseline memory test performance (CERAD). Over the 20 years of study follow-up, cumulative mid- to late-life physical activity had the strongest effect on better later-life verbal memory (0.136 [0.058, 0.214]). The next most likely contributors to verbal memory in late life were the negative effect of cumulative hypertension (-0.033 [-0.047, -0.018]) and the beneficial effect of HDL cholesterol (0.818 [0.042, 1.593]). Findings suggest that midlife interventions focused on physical activity, hypertension control, and achieving optimal levels of HDL cholesterol will help maintain later-life verbal memory skills. Copyright © 2016 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  11. SAFETY MANAGEMENT FOR WOMEN THROUGH AUTOMATIC GPS LOCATION TRACKER

    OpenAIRE

    P.Nivetha*1, S.Kiruthika2 & J.B.Kavitha3

    2018-01-01

The project “SAFETY MANAGEMENT FOR WOMEN THROUGH AUTOMATIC GPS LOCATION TRACKER” is designed using the standard Android 4.0.3 platform. The platform used to develop the application is the Eclipse IDE (Mars) with Java 1.6 Standard Edition. It is an Android app that helps people in a crucial moment. For example, if a person is in trouble and needs help, the app lets him or her contact someone for help by just clicking one button; it will automatically s...

  12. A SIMULATION ENVIRONMENT FOR AUTOMATIC NIGHT DRIVING AND VISUAL CONTROL

    OpenAIRE

    Arroyo Rubio, Fernando

    2012-01-01

This project consists of developing an automatic night-driving system in a simulation environment. The simulator used is TORCS, an open source car racing simulator written in C++. It is used as an ordinary car racing game, as an AI racing game and as a research platform. The goal of this thesis is to implement an automatic driving system to control the car under night conditions using computer vision. A camera is mounted inside the vehicle and it will detect the reflective ...

  13. Automatic Texture and Orthophoto Generation from Registered Panoramic Views

    DEFF Research Database (Denmark)

    Krispel, Ulrich; Evers, Henrik Leander; Tamke, Martin

    2015-01-01

... from range data only. In order to detect these elements, we developed a method that utilizes range data and color information from high-resolution panoramic images of indoor scenes, taken at the scanner's position. A proxy geometry is derived from the point clouds; orthographic views of the scene ... are automatically identified from the geometry and an image per view is created via projection. We combine methods of computer vision to train a classifier to detect the objects of interest from these orthographic views. Furthermore, these views can be used for automatic texturing of the proxy geometry.

  14. REDUCING UNCERTAINTIES IN MODEL PREDICTIONS VIA HISTORY MATCHING OF CO2 MIGRATION AND REACTIVE TRANSPORT MODELING OF CO2 FATE AT THE SLEIPNER PROJECT

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Chen

    2015-03-31

    An important question for the Carbon Capture, Storage, and Utility program is “can we adequately predict the CO2 plume migration?” For tracking CO2 plume development, the Sleipner project in the Norwegian North Sea provides more time-lapse seismic monitoring data than any other sites, but significant uncertainties still exist for some of the reservoir parameters. In Part I, we assessed model uncertainties by applying two multi-phase compositional simulators to the Sleipner Benchmark model for the uppermost layer (Layer 9) of the Utsira Sand and calibrated our model against the time-lapsed seismic monitoring data for the site from 1999 to 2010. Approximate match with the observed plume was achieved by introducing lateral permeability anisotropy, adding CH4 into the CO2 stream, and adjusting the reservoir temperatures. Model-predicted gas saturation, CO2 accumulation thickness, and CO2 solubility in brine—none were used as calibration metrics—were all comparable with the interpretations of the seismic data in the literature. In Part II & III, we evaluated the uncertainties of predicted long-term CO2 fate up to 10,000 years, due to uncertain reaction kinetics. Under four scenarios of the kinetic rate laws, the temporal and spatial evolution of CO2 partitioning into the four trapping mechanisms (hydrodynamic/structural, solubility, residual/capillary, and mineral) was simulated with ToughReact, taking into account the CO2-brine-rock reactions and the multi-phase reactive flow and mass transport. Modeling results show that different rate laws for mineral dissolution and precipitation reactions resulted in different predicted amounts of trapped CO2 by carbonate minerals, with scenarios of the conventional linear rate law for feldspar dissolution having twice as much mineral trapping (21% of the injected CO2) as scenarios with a Burch-type or Alekseyev et al.–type rate law for feldspar dissolution (11%). So far, most reactive transport modeling (RTM) studies for

  15. Verification of analysis methods for predicting the behaviour of seismically isolated nuclear structures. Final report of a co-ordinated research project 1996-1999

    International Nuclear Information System (INIS)

    2002-06-01

    This report is a summary of the work performed under a co-ordinated research project (CRP) entitled Verification of Analysis Methods for Predicting the Behaviour of Seismically isolated Nuclear Structures. The project was organized by the IAEA on the recommendation of the IAEA's Technical Working Group on Fast Reactors (TWGFR) and carried out from 1996 to 1999. One of the primary requirements for nuclear power plants and facilities is to ensure safety and the absence of damage under strong external dynamic loading from, for example, earthquakes. The designs of liquid metal cooled fast reactors (LMFRs) include systems which operate at low pressure and include components which are thin-walled and flexible. These systems and components could be considerably affected by earthquakes in seismic zones. Therefore, the IAEA through its advanced reactor technology development programme supports the activities of Member States to apply seismic isolation technology to LMFRs. The application of this technology to LMFRs and other nuclear plants and related facilities would offer the advantage that standard designs may be safely used in areas with a seismic risk. The technology may also provide a means of seismically upgrading nuclear facilities. Design analyses applied to such critical structures need to be firmly established, and the CRP provided a valuable tool in assessing their reliability. Ten organizations from India, Italy, Japan, the Republic of Korea, the Russian Federation, the United Kingdom, the United States of America and the European Commission co-operated in this CRP. This report documents the CRP activities, provides the main results and recommendations and includes the work carried out by the research groups at the participating institutes within the CRP on verification of their analysis methods for predicting the behaviour of seismically isolated nuclear structures

  16. Contact with Counter-Stereotypical Women Predicts Less Sexism, Less Rape Myth Acceptance, Less Intention to Rape (in Men) and Less Projected Enjoyment of Rape (in Women).

    Science.gov (United States)

    Taschler, Miriam; West, Keon

    2017-01-01

    Intergroup contact (positive interactions with people from different social groups) is a widely researched and strongly supported prejudice-reducing mechanism shown to reduce prejudice against a wide variety of outgroups. However, no known previous research has investigated whether intergroup contact can also reduce sexism against women. Sexism has an array of negative outcomes. One of the most detrimental and violent ones is rape, which is both justified and downplayed by rape myth acceptance. We hypothesised that more frequent, higher quality contact with counter-stereotypical women would predict lower levels of sexism and thus less rape myth acceptance (in men) and less sexualised projected responses to rape (in women). Two studies using online surveys with community samples supported these hypotheses. In Study 1, 170 male participants who experienced more positive contact with counter-stereotypical women reported less intention to rape. Similarly, in Study 2, 280 female participants who experienced more positive contact with counter-stereotypical women reported less projected sexual arousal at the thought of being raped. Thus, the present research is the first known to show that contact could be a potential tool to combat sexism, rape myth acceptance, intentions to rape in men, and sexualisation of rape by women.

  17. Automatic and strategic measures as predictors of mirror gazing among individuals with body dysmorphic disorder symptoms.

    Science.gov (United States)

    Clerkin, Elise M; Teachman, Bethany A

    2009-08-01

    The current study tests cognitive-behavioral models of body dysmorphic disorder (BDD) by examining the relationship between cognitive biases and correlates of mirror gazing. To provide a more comprehensive picture, we investigated both relatively strategic (i.e., available for conscious introspection) and automatic (i.e., outside conscious control) measures of cognitive biases in a sample with either high (n = 32) or low (n = 31) BDD symptoms. Specifically, we examined the extent that (1) explicit interpretations tied to appearance, as well as (2) automatic associations and (3) strategic evaluations of the importance of attractiveness predict anxiety and avoidance associated with mirror gazing. Results indicated that interpretations tied to appearance uniquely predicted self-reported desire to avoid, whereas strategic evaluations of appearance uniquely predicted peak anxiety associated with mirror gazing, and automatic appearance associations uniquely predicted behavioral avoidance. These results offer considerable support for cognitive models of BDD, and suggest a dissociation between automatic and strategic measures.

  19. Automatic sprinkler system performance and reliability in United States Department of Energy Facilities, 1952 to 1980

    International Nuclear Information System (INIS)

    1982-06-01

    The automatic sprinkler system experiences of the United States Department of Energy and its predecessor agencies are analyzed. Based on accident and incident files in the Office of Operational Safety and on supplementary responses, 587 incidents, including over 100 fires, are analyzed. Tables and figures, with supplementary narratives, discuss fire experience by various categories such as number of heads operating, type of system, dollar losses, failures, extinguished vs. controlled, and types of sprinkler heads. Use is made of extreme value projections and frequency-severity plots to compare past experience and predict future experience. Non-fire incidents are analyzed in a similar manner by cause, system types and failure types. Discussion of no-loss incidents and non-fire-protection water systems is included. The author's conclusions and recommendations, and appendices listing survey methodology, major incidents, and a bibliography, are included

  20. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings...... differential equations, but in this thesis, we describe how to use the methods for enclosing iterates of discrete mappings, and then later use them for discretizing solutions of ordinary differential equations. The theory of automatic differentiation is introduced, and three methods for obtaining derivatives...... are described: The forward, the backward, and the Taylor expansion methods. The three methods have been implemented in the C++ program packages FADBAD/TADIFF. Some examples showing how to use the three methods are presented. A feature of FADBAD/TADIFF not present in other automatic differentiation packages...
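
    The forward mode mentioned in this abstract can be illustrated with a minimal dual-number sketch: each value carries its derivative, and arithmetic propagates both (sum rule for +, product rule for *). This is a sketch of the general technique only, not the FADBAD/TADIFF C++ API; the `Dual` class and `derivative` helper are illustrative names.

```python
# Forward-mode automatic differentiation with dual numbers:
# each Dual carries a value and its derivative, and every operation
# propagates both parts.
class Dual:
    def __init__(self, val, der=0.0):
        self.val = val
        self.der = der

    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._lift(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = self._lift(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)  # product rule

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate f'(x) by seeding the dual part with 1."""
    return f(Dual(x, 1.0)).der


# f(x) = x^2 + 3x, so f'(2) = 2*2 + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```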

  1. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics on the automatic beta counting system, developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier, then is coupled to an amplifier. Amplifier output is discriminated and is the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data plus other information is punched out on a data card. The next sample to be counted is automatically selected. The beta counter uses the same electronics as the prior count did, the only difference being the sample identification number and sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card

  2. Automatic pitch detection for a computer game interface

    International Nuclear Information System (INIS)

    Fonseca Solis, Juan M.

    2015-01-01

    Software able to recognize notes played by musical instruments was created through automatic pitch recognition. A pitch-recognition algorithm is embedded into a software project, using the C implementation of SWIPEP. A memory game was chosen for the project: the user listens to a sequence of notes and plays it back to the computer on a soprano recorder flute. The basic concepts needed to understand the acoustic phenomena involved are explained. The paper is aimed at students who have basic programming knowledge and want to incorporate sound processing into their projects. (author) [es
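
    A much simpler pitch estimator than SWIPEP, a plain autocorrelation peak-picker, illustrates the basic idea of automatic pitch detection. This is not the SWIPEP algorithm; the function name, search band, and frame size below are illustrative assumptions.

```python
import math

def estimate_pitch(samples, sample_rate, fmin=80.0, fmax=1000.0):
    """Pick the lag in [sr/fmax, sr/fmin] that maximizes the frame's
    autocorrelation; return the frequency implied by that lag."""
    lo, hi = int(sample_rate / fmax), int(sample_rate / fmin)
    best_lag, best_corr = lo, float("-inf")
    for lag in range(lo, hi + 1):
        corr = sum(samples[i] * samples[i + lag]
                   for i in range(len(samples) - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag

# A pure 440 Hz tone sampled at 8 kHz: the nearest integer lag is 18,
# so the estimate is 8000/18, about 444 Hz.
sr = 8000
tone = [math.sin(2 * math.pi * 440 * n / sr) for n in range(1024)]
print(round(estimate_pitch(tone, sr)))  # 444
```

Real instruments need windowing, octave-error handling, and sub-lag interpolation, which is precisely the gap algorithms such as SWIPE/SWIPEP address.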

  3. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for the automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.

  4. Development of an automatic scaler

    International Nuclear Information System (INIS)

    He Yuehong

    2009-04-01

    A self-designed automatic scaler is introduced. A microcontroller LPC936 is used as the master chip in the scaler. A counter integrated with the microcontroller is configured to operate as an external pulse counter. Software employed in the scaler is based on an embedded real-time operating system kernel named Small RTOS. Data storage, calculation and some other functions are also provided. The scaler is designed for applications requiring low-cost, low-power-consumption solutions. By now, the automatic scaler has been applied in a surface contamination instrument. (authors)

  5. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  6. Grinding Parts For Automatic Welding

    Science.gov (United States)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  7. Automatic production of Iodine-123 with PLC 135/U

    International Nuclear Information System (INIS)

    Moghaddam-Banaem, L.; Afarideh, H.

    2004-01-01

    In this project, the automatic system for production of Iodine-123 with a Siemens PLC 135/U, designed and installed for the first time in Iran, is discussed. A PLC (Programmable Logic Controller) is used to control industrial processes; it is similar to a computer and consists of a central processing unit, memory, and input/output units. The PLC receives input information from auxiliary units such as sensors and switches, processes the data in memory, and then sends commands to output units such as relays and motors. The target section in iodine production consists of 8 stages. To ensure the automation works properly, the system can be operated both manually and automatically. First, the PLC checks the Manual/Automatic switch; in automatic mode, the PLC runs the program in memory and processing is done automatically. For this purpose, the PLC takes pressure and temperature values from the analog inputs and, after processing them, sends commands to the digital outputs to activate valves, vacuum pumps or heaters. In this paper the following subjects are discussed: 1) production of Iodine-123; 2) PLC structure and auxiliary boards; 3) sensors and actuators and their connection to the PLC; 4) software flowchart
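
    The scan cycle described here (check the Manual/Automatic switch, read analog inputs, drive digital outputs) can be sketched as a pure function. The thresholds, units, and actuator names below are hypothetical illustration values, not the actual PLC 135/U program.

```python
def plc_cycle(auto_mode, pressure_bar, temp_c,
              p_vent=2.5, t_target=110.0, p_vacuum=0.5):
    """One PLC scan: read inputs, compute outputs.
    All setpoints are invented for illustration."""
    if not auto_mode:
        # Manual mode: the operator drives the outputs directly.
        return {"valve": False, "heater": False, "pump": False}
    return {
        "valve": pressure_bar > p_vent,    # vent on over-pressure
        "heater": temp_c < t_target,       # heat until target temperature
        "pump": pressure_bar < p_vacuum,   # pull vacuum at low pressure
    }

print(plc_cycle(True, 3.0, 95.0))  # {'valve': True, 'heater': True, 'pump': False}
```

A real PLC repeats such a cycle continuously, latching outputs and stepping through the 8 production stages as conditions are met.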

  8. National Centers for Environmental Prediction-Department of Energy (NCEP-DOE) Atmospheric Model Intercomparison Project (AMIP)-II Reanalysis (Reanalysis-2)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NCEP-DOE Atmospheric Model Intercomparison Project (AMIP-II) reanalysis is a follow-on project to the "50-year" (1948-present) NCEP-NCAR Reanalysis Project....

  9. Automatic scanning of NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At the European Laboratory for Particle Physics CERN, personal neutron monitoring for over 4000 collaborators is performed with Kodak NTA film, one of the few suitable dosemeters in the stray radiation environment of a high energy accelerator. After development, films are scanned with a projection microscope. To overcome this lengthy and strenuous procedure an automated analysis system for the dosemeters has been developed. General purpose image recognition software, tailored to the specific needs with a macro language, analyses the digitised microscope image. This paper reports on the successful automatic scanning of NTA films irradiated with neutrons from a ²³⁸Pu-Be source (E ≈ 4 MeV), as well as on the extension of the method to neutrons of higher energies. The question of detection limits is discussed in the light of an application of the method in routine personal neutron monitoring. (9 refs).

  10. The automatic lumber planing mill

    Science.gov (United States)

    Peter Koch

    1957-01-01

    It is probable that a truly automatic planing operation could be devised if some of the variables commonly present in mill-run lumber were eliminated and the remaining variables kept under close control. This paper will deal with the more general situation faced by most lumber manufacturing plants. In other words, it will be assumed that the incoming lumber has...

  11. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  12. Automatically Preparing Safe SQL Queries

    Science.gov (United States)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
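
    The transformation this abstract describes, replacing string-built SQL with PREPARE-style parameterized statements, can be illustrated with a minimal Python/sqlite3 sketch. The table, data, and payload below are invented for illustration; the paper itself transforms legacy web application source code.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # classic injection payload

# Vulnerable pattern: the payload becomes part of the SQL text itself.
unsafe_sql = "SELECT role FROM users WHERE name = '%s'" % user_input
leaked = conn.execute(unsafe_sql).fetchall()

# Prepared/parameterized pattern: the driver binds the value as pure data.
safe = conn.execute("SELECT role FROM users WHERE name = ?",
                    (user_input,)).fetchall()

print(leaked)  # [('admin',)] -- the injected OR clause matched every row
print(safe)    # []           -- no user is literally named like the payload
```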

  13. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The strategy used is a combination of two methods: the maximum correlation coefficient and correlation in the subpixel range...... The interactive software is also part of a computer-assisted learning program on digital photogrammetry.
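
    The maximum-correlation-coefficient step can be sketched as a one-dimensional template match: slide the target template along the signal and keep the offset with the highest normalized correlation. This is a simplified illustration (the paper works on 2-D réseau imagery, and the function names here are hypothetical).

```python
import math

def ncc(a, b):
    """Normalized correlation coefficient of two equal-length patches."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    if da == 0 or db == 0:        # flat patch: correlation undefined
        return 0.0
    return num / (da * db)

def locate(signal, template):
    """Slide the template along the signal; return the offset with
    the maximum correlation coefficient."""
    scores = [ncc(signal[i:i + len(template)], template)
              for i in range(len(signal) - len(template) + 1)]
    return max(range(len(scores)), key=scores.__getitem__)

signal = [0, 0, 1, 3, 1, 0, 0, 0]   # a "target" peak at offset 2
template = [1, 3, 1]
print(locate(signal, template))  # 2
```

The subpixel refinement the abstract mentions would then interpolate the correlation peak (e.g., by fitting a parabola through the neighboring scores).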

  14. Automatic analysis of ultrasonic data

    International Nuclear Information System (INIS)

    Horteur, P.; Colin, J.; Benoist, P.; Bonis, M.; Paradis, L.

    1986-10-01

    This paper describes an automatic and self-contained data processing system, transportable on site, able to perform images such as ''A. Scan'', ''B. Scan'', ... to present very quickly the results of the control. It can be used in the case of pressure vessel inspection [fr

  15. The automaticity of vantage point shifts within a synaesthete's spatial calendar.

    Science.gov (United States)

    Jarick, Michelle; Jensen, Candice; Dixon, Michael J; Smilek, Daniel

    2011-09-01

    Time-space synaesthetes report that time units (e.g., months, days, hours) occupy idiosyncratic spatial locations. For the synaesthete (L), the months of the year are projected out in external space in the shape of a 'scoreboard 7', where January to July extend across the top from left to right and August to December make up the vertical segment from top to bottom. Interestingly, L can change the mental vantage point (MVP) from where she views her month-space depending on whether she sees or hears the month name. We used a spatial cueing task to demonstrate that L's attention could be directed to locations within her time-space and change vantage points automatically - from trial to trial. We also sought to eliminate any influence of strategy on L's performance by shortening the interval between the cue and target onset to only 150 ms, and have the targets fall in synaesthetically cued locations on only 15% of trials. If L's performance was attributable to intentionally using the cue to predict target location, these manipulations should eliminate any cueing effects. In two separate experiments, we found that L still showed an attentional bias consistent with her synaesthesia. Thus, we attribute L's rapid and resilient cueing effects to the automaticity of her spatial forms. ©2011 The British Psychological Society.

  16. Examination of imaging detectors for combined radiography procedures in the ACCIS joint project. Automatic cargo container inspection system. Final report; Untersuchung von bildgebenden Detektoren fuer kombinierte Radiographieverfahren im Verbundprojekt ACCIS. Automatisches Cargo-Container Inspektionssystem. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Dangendorf, Volker

    2014-06-09

    Currently used screening systems for air cargo are based on X-ray radiation from bremsstrahlung generators. With these, substances composed of light elements of approximately equal density are difficult to distinguish: the image contrast between explosives or drugs and harmless organic substances, such as plastic parts or foodstuffs, is low, requiring extensive follow-up investigations. The image contrast of X-ray methods is likewise low for heavy elements, e.g. Special Nuclear Materials (SNM) such as Pu and U, which may be transported in a lead container for camouflage and mixed with goods made of other heavy metals, making them very difficult to detect. Within the framework of the ACCIS Collaborative Project, a new inspection system for air freight based on neutron and gamma irradiation was researched. Within this framework, the PTB subproject covered the following tasks: 1. research and development of laboratory prototypes of imaging radiation detectors; 2. development of a measuring station for evaluating the screening method at the PTB accelerator facility; 3. cooperation in developing a concept for a pulsed radiation source, in particular design and investigation of the beam-producing target; 4. determination of the physical and dosimetric parameters relevant to radiation protection; 5. examination of the conditions of application, requirements for an operational facility, and end-user contacts; 6. coordination of the German partners, in particular organization of the project meetings of the German and Israeli partners.

  17. Long-term prediction of prostate cancer diagnosis and death using PSA and obesity related anthropometrics at early middle age: data from the malmö preventive project.

    Science.gov (United States)

    Assel, Melissa J; Gerdtsson, Axel; Thorek, Daniel L J; Carlsson, Sigrid V; Malm, Johan; Scardino, Peter T; Vickers, Andrew; Lilja, Hans; Ulmert, David

    2018-01-19

    To evaluate whether anthropometric parameters add to PSA measurements in middle-aged men for risk assessment of prostate cancer (PCa) diagnosis and death. From 1974 to 1986, 22,444 Swedish men aged 44 to 50 enrolled in the Malmö Preventive Project, Sweden, and provided blood samples and anthropometric data. Rates of PSA screening in the cohort were very low. Documentation of PCa diagnosis and disease-specific death up to 2014 was retrieved through national registries. Among men with anthropometric measurements available at baseline, a total of 1692 men diagnosed with PCa were matched to 4190 controls, and 464 men who died of disease were matched to 1390 controls. Multivariable conditional logistic regression was used to determine whether diagnosis or death from PCa were associated with weight and body mass index (BMI) at adulthood after adjusting for PSA. After adjusting for PSA, both BMI and weight were significantly associated with an increased risk of PCa death, with the odds of death corresponding to a 10 kg/m2 or 10 kg increase being 1.58 (95% CI 1.10, 2.28; p = 0.013) and 1.14 (95% CI 1.02, 1.26; p = 0.016) times greater, respectively. AUCs did not meaningfully increase with the addition of weight or BMI to prediction models including PSA. Men with higher BMI and weight at early middle age have an increased risk of PCa diagnosis and death after adjusting for PSA. However, in a multivariable numerical statistical model, BMI and weight do not importantly improve the predictive accuracy of PSA. Risk-stratification of screening should be based on PSA without reference to anthropometrics.
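
    A note on the reported effect sizes: an odds ratio quoted per 10-unit increase rescales to a per-unit odds ratio through the underlying logistic-regression coefficient. A sketch, assuming the reported OR of 1.58 per 10 kg/m2:

```python
import math

or_per_10 = 1.58                    # reported odds ratio per 10 kg/m^2 of BMI
beta = math.log(or_per_10) / 10.0   # log-odds change per 1 kg/m^2
or_per_1 = math.exp(beta)           # odds ratio per single kg/m^2
print(round(or_per_1, 3))           # 1.047
```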

  18. ONEMercury: Towards Automatic Annotation of Earth Science Metadata

    Science.gov (United States)

    Tuarob, S.; Pouchard, L. C.; Noy, N.; Horsburgh, J. S.; Palanisamy, G.

    2012-12-01

    Earth sciences have become more data-intensive, requiring access to heterogeneous data collected from multiple places, times, and thematic scales. For example, research on climate change may involve exploring and analyzing observational data such as the migration of animals and temperature shifts across the earth, as well as various model-observation inter-comparison studies. Recently, DataONE, a federated data network built to facilitate access to and preservation of environmental and ecological data, has been established. ONEMercury has recently been implemented as part of the DataONE project to serve as a portal for discovering and accessing environmental and observational data across the globe. ONEMercury harvests metadata from the data hosted by multiple data repositories and makes it searchable via a common search interface built upon cutting-edge search engine technology, allowing users to interact with the system, intelligently filter the search results on the fly, and fetch the data from distributed data sources. Linking data from heterogeneous sources always has a cost. A problem that ONEMercury faces is the different levels of annotation in the harvested metadata records. Poorly annotated records tend to be missed during the search process as they lack meaningful keywords. Furthermore, such records would not be compatible with the advanced search functionality offered by ONEMercury, as the interface requires a metadata record to be semantically annotated. The explosion of the number of metadata records harvested from an increasing number of data repositories makes it impossible to annotate the harvested records manually, underscoring the need for a tool capable of automatically annotating poorly curated metadata records. In this paper, we propose a topic-model (TM) based approach for automatic metadata annotation. Our approach mines topics in the set of well annotated records and suggests keywords for poorly annotated records based on topic similarity. We utilize the
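
    The similarity-based annotation idea, suggesting keywords for a poorly annotated record from its most similar well-annotated neighbors, can be sketched with plain bag-of-words cosine similarity. This is a deliberate simplification of the paper's topic-model approach, and the records and keywords below are invented.

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def suggest_keywords(poor_abstract, curated, k=3):
    """Rank well-annotated records by textual similarity to the poorly
    annotated one, then pool their keywords as suggestions."""
    query = vectorize(poor_abstract)
    ranked = sorted(curated, reverse=True,
                    key=lambda r: cosine(query, vectorize(r["abstract"])))
    suggestions = []
    for rec in ranked:
        for kw in rec["keywords"]:
            if kw not in suggestions:
                suggestions.append(kw)
    return suggestions[:k]

curated = [
    {"abstract": "soil moisture and temperature sensor network observations",
     "keywords": ["soil moisture", "in-situ sensors"]},
    {"abstract": "ocean salinity measurements from research cruises",
     "keywords": ["oceanography", "salinity"]},
]
print(suggest_keywords("temperature and soil moisture readings", curated))
```

A topic model replaces the raw term vectors with topic-proportion vectors, so records can match on shared themes even without shared words.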

  19. Early maladaptive schemas and social anxiety in adolescents: the mediating role of anxious automatic thoughts.

    Science.gov (United States)

    Calvete, Esther; Orue, Izaskun; Hankin, Benjamin L

    2013-04-01

    Cognitive models state that cognitions are organized hierarchically, so that the underlying schemas affect behavior via more automatic, superficial cognitive processes. This study aimed to demonstrate that early maladaptive schemas predict anxious automatic thoughts, and to show that such automatic thoughts act as mediators between schemas and prospective changes in social anxiety symptoms. The study also examined an alternative reverse model in which schemas acted as mediators between automatic thoughts and social anxiety. A total of 1052 adolescents (499 girls and 553 boys; M(age)=13.43; SD(age)=1.29) completed measures of early maladaptive schemas, socially anxious automatic thoughts, and social anxiety symptoms at Times 1, 2, and 3. The results revealed bidirectional longitudinal relationships among schemas and automatic thoughts that were consistent in content (e.g., the disconnection/rejection schemas and automatic thoughts of negative self-concept). Furthermore, the automatic thoughts of anticipatory negative evaluation by others at Time 2 mediated the relationship between the other-directedness schemas at Time 1 and social anxiety symptoms at Time 3. These findings are consistent with hierarchical cognitive models of social anxiety given that deeper schemas predict more surface-level thoughts. They also support that these more surface-level thoughts contribute to perpetuating schemas. Finally, results show that early maladaptive schemas of the other-directedness domain play a relevant role in the development and maintenance of social anxiety. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Acquisition of automatic imitation is sensitive to sensorimotor contingency.

    Science.gov (United States)

    Cook, Richard; Press, Clare; Dickinson, Anthony; Heyes, Cecilia

    2010-08-01

    The associative sequence learning model proposes that the development of the mirror system depends on the same mechanisms of associative learning that mediate Pavlovian and instrumental conditioning. To test this model, two experiments used the reduction of automatic imitation through incompatible sensorimotor training to assess whether mirror system plasticity is sensitive to contingency (i.e., the extent to which activation of one representation predicts activation of another). In Experiment 1, residual automatic imitation was measured following incompatible training in which the action stimulus was a perfect predictor of the response (contingent) or not at all predictive of the response (noncontingent). A contingency effect was observed: There was less automatic imitation indicative of more learning in the contingent group. Experiment 2 replicated this contingency effect and showed that, as predicted by associative learning theory, it can be abolished by signaling trials in which the response occurs in the absence of an action stimulus. These findings support the view that mirror system development depends on associative learning and indicate that this learning is not purely Hebbian. If this is correct, associative learning theory could be used to explain, predict, and intervene in mirror system development.
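
    The contingency manipulation maps naturally onto the Rescorla-Wagner update rule that underlies much of associative learning theory. A single-cue sketch follows; it is a simplification (the full model includes context cues, and this is not necessarily the authors' implementation), with illustrative names and parameters.

```python
def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """Update cue-outcome associative strength V on each trial where
    the cue is present: V += alpha * (lam * outcome - V)."""
    v = 0.0
    for cue_present, outcome_present in trials:
        if cue_present:
            target = lam if outcome_present else 0.0
            v += alpha * (target - v)
    return v

# Contingent training: the action stimulus always predicts the response.
contingent = [(1, 1)] * 20
# Noncontingent training: the response follows the cue only half the time.
noncontingent = [(1, 1), (1, 0)] * 10

print(rescorla_wagner(contingent) > rescorla_wagner(noncontingent))  # True
```

Under contingent training the associative strength approaches the asymptote lam, while the noncontingent schedule settles well below it, mirroring the greater learning observed in the contingent group.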

  1. Predictive models for suicidal thoughts and behaviors among Spanish University students: rationale and methods of the UNIVERSAL (University & mental health) project.

    Science.gov (United States)

    Blasco, Maria Jesús; Castellví, Pere; Almenara, José; Lagares, Carolina; Roca, Miquel; Sesé, Albert; Piqueras, José Antonio; Soto-Sanz, Victoria; Rodríguez-Marín, Jesús; Echeburúa, Enrique; Gabilondo, Andrea; Cebrià, Ana Isabel; Miranda-Mendizábal, Andrea; Vilagut, Gemma; Bruffaerts, Ronny; Auerbach, Randy P; Kessler, Ronald C; Alonso, Jordi

    2016-05-04

    Suicide is a leading cause of death among young people. While suicide prevention is considered a research and intervention priority, longitudinal data are needed to identify risk and protective factors associated with suicidal thoughts and behaviors. Here we describe the UNIVERSAL (University and Mental Health) project, whose aims are to: (1) test the prevalence and 36-month incidence of suicidal thoughts and behaviors; and (2) identify relevant risk and protective factors associated with the incidence of suicidal thoughts and behaviors among university students in Spain. It is an ongoing multicenter, observational, prospective cohort study of first-year university students in 5 Spanish universities. Students will be assessed annually during a 36-month follow-up. The surveys will be administered through a secure, online web-based platform, and a clinical reappraisal will be completed among a subsample of respondents. Suicidal thoughts and behaviors will be assessed with the Self-Injurious Thoughts and Behaviors Interview (SITBI) and the Columbia-Suicide Severity Rating Scale (C-SSRS). Risk and protective factors will include: mental disorders, measured with the Composite International Diagnostic Interview version 3.0 (CIDI 3.0) and Screening Scales (CIDI-SC), and the Epi-Q Screening Survey (EPI-Q-SS); socio-demographic variables; self-perceived health status; health behaviors; well-being; substance use disorders; and service use and treatment. The UNIVERSAL project is part of the International College Surveys initiative, which is a core project within the World Mental Health consortium. Lifetime and 12-month prevalence will be calculated for suicidal ideation, plans and attempts. Cumulative incidence of suicidal thoughts and behaviors, and of mental disorders, will be measured using the actuarial method. Risk and protective factors of suicidal thoughts and behaviors will be analyzed by Cox proportional hazard models.
The study will provide valid, innovative and useful data for developing

  2. Automatic differentiation algorithms in model analysis

    NARCIS (Netherlands)

    Huiskes, M.J.

    2002-01-01

    In this thesis automatic differentiation algorithms and derivative-based methods

  3. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  4. Implementation of automatic protection switching in an optical cross connect

    OpenAIRE

    Uy, Jason

    2005-01-01

    Having a reliable network is a hard requirement for telecommunication companies when deploying new networks: service providers and enterprise customers lose significant revenue whenever internet service is interrupted. The SONET/SDH specification defines several different topologies that support redundancy, and an Automatic Protection Switching (APS) mechanism is specified for each topology to dictate how the network behaves in a failure event. For this project, a software implementa...

  5. Automatic segmentation of mandible in panoramic x-ray

    OpenAIRE

    Abdi, Amir Hossein; Kasaei, Shohreh; Mehdizadeh, Mojdeh

    2015-01-01

    As the panoramic x-ray is the most common extraoral radiography in dentistry, segmentation of its anatomical structures facilitates diagnosis and registration of dental records. This study presents a fast and accurate method for automatic segmentation of mandible in panoramic x-rays. In the proposed four-step algorithm, a superior border is extracted through horizontal integral projections. A modified Canny edge detector accompanied by morphological operators extracts the inferior border of t...

  6. Application of the Min-Projection and the Model Predictive Strategies for Current Control of Three-Phase Grid-Connected Converters: a Comparative Study

    Directory of Open Access Journals (Sweden)

    M. Oloumi

    2015-06-01

    Full Text Available This paper provides a detailed comparative study of the performance of the min-projection strategy (MPS) and model predictive control (MPC) for controlling three-phase grid-connected converters. To do so, the converter is first modeled as a switched linear system. Then, the feasibility of the MPS technique is investigated and its stability criterion is derived as a lower limit on the DC link voltage. Next, the fundamental equations of the MPS for controlling a VSC are obtained in the stationary reference frame. The mathematical analysis reveals that the MPS is independent of the load, grid, filter and converter parameters. This feature is a great advantage of MPS over the MPC approach; the latter, a well-known model-based control technique, has already been developed for controlling the VSC in the stationary reference frame. To control the grid-connected VSC, both the MPS and MPC approaches are simulated in the PSCAD/EMTDC environment. Simulation results illustrate that the MPS functions well and is less sensitive to grid and filter inductances as well as the DC link voltage level, although the MPC approach delivers slightly better performance in steady-state conditions.

  7. Automatic Task Classification via Support Vector Machine and Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Hyungsik Shin

    2018-01-01

    Full Text Available Automatic task classification is a core part of personal assistant systems that are widely used in mobile devices such as smartphones and tablets. Even though many industry leaders provide their own personal assistant services, their proprietary internals and implementations are not well known to the public. In this work, we show through a real implementation and evaluation that automatic task classification can be implemented for mobile devices by using the support vector machine algorithm and crowdsourcing. To train our task classifier, we collected our training data set via crowdsourcing using the Amazon Mechanical Turk platform. Our classifier can classify a short English sentence into one of thirty-two predefined tasks that are frequently requested while using personal mobile devices. Evaluation results show that our classifier achieves high prediction accuracy, ranging from 82% to 99%. Using a large amount of crowdsourced data, we also illustrate the relationship between training data size and the prediction accuracy of our task classifier.
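
The pipeline this record describes (short sentence in, task label out, via an SVM) can be sketched with a hand-rolled linear SVM trained by Pegasos-style sub-gradient descent on bag-of-words features. The vocabulary, example sentences, and hyperparameters below are illustrative stand-ins for the crowdsourced data set and the thirty-two task classes, not the authors' implementation.

```python
import random

def featurize(sentence, vocab):
    # Bag-of-words vector over a fixed, assumed vocabulary.
    words = sentence.lower().split()
    return [1.0 if w in words else 0.0 for w in vocab]

def train_linear_svm(X, y, epochs=200, lam=0.01, seed=0):
    # Pegasos-style sub-gradient descent for a binary linear SVM.
    # Labels y must be in {-1, +1}; multi-class (32 tasks) would use
    # one-vs-rest over several such classifiers.
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(X)), len(X)):
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            w = [(1 - eta * lam) * wj for wj in w]   # shrink (regularizer)
            if margin < 1:                           # hinge-loss violation
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1
```

On a toy vocabulary such as `["set", "alarm", "timer", "call", "phone", "mom"]`, two "alarm/timer" sentences and two "call" sentences are linearly separable, and the trained weights classify them correctly.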

  8. Automatic design of magazine covers

    Science.gov (United States)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.

    2012-03-01

    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  9. Automatic schema evolution in Root

    International Nuclear Information System (INIS)

    Brun, R.; Rademakers, F.

    2001-01-01

    ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition, this version produces files that are self-describing: each file stores a record with the description of all the persistent classes it contains. Being self-describing guarantees that a file can always be read later, its structure browsed and its objects inspected, even when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case in which multiple data sets generated with many different class versions must be analyzed in the same session. ROOT also supports the automatic generation of C++ code describing the data objects in a file.

  10. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.

    1975-01-01

    Radioimmunoassay yields dose-response curves that become linear under the logistic (logit) transformation (Rodbard). This transformation should be useful for computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptides. A program for use in combination with the double antibody technique was written by Dr. Kawamata. This approach proved useful for automatic computation of data derived from double antibody assays of insulin and C-peptides. Automatic corrected calculation of insulin radioimmunoassay data was found to be satisfactory. (auth.)
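
The logit linearization mentioned above can be sketched in a few lines: the bound fraction is logit-transformed and fitted as a straight line in log dose, after which unknown doses are read off by inverting the fitted line. This is a minimal illustration of Rodbard's transformation, not the program described in the record; the function names and the synthetic calibration data are assumptions.

```python
import math

def logit(p):
    # logit(p) = ln(p / (1 - p)), defined for p in (0, 1)
    return math.log(p / (1.0 - p))

def fit_standard_curve(doses, responses):
    # Responses are bound fractions B/B0 in (0, 1); the logit of the
    # response is fitted by least squares as a straight line in log(dose).
    xs = [math.log(d) for d in doses]
    ys = [logit(r) for r in responses]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

def estimate_dose(response, slope, intercept):
    # Invert the fitted line to read a dose off the standard curve.
    return math.exp((logit(response) - intercept) / slope)
```

With calibration points generated exactly from a logistic curve, the fit recovers the slope and intercept, and `estimate_dose` returns the original dose.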

  11. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

    Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing and information extraction in the state-of-the-art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering and advanced signal processing techniques and scientific evaluation methodologies used in this multidisciplinary field are part of this exposition. The issues of modeling target signatures in various spectral modalities (LADAR, IR, SAR, high-resolution radar, acoustic, seismic, visible, hyperspectral) in diverse geometric aspects are addressed. The methods for signal processing and classification cover concepts such as sensor-adaptive and artificial neural networks, time reversal filt...

  12. Automatic Conflict Detection on Contracts

    Science.gov (United States)

    Fenech, Stephen; Pace, Gordon J.; Schneider, Gerardo

    Many software applications are based on collaborating, yet competing, agents or virtual organisations exchanging services. Contracts, expressing obligations, permissions and prohibitions of the different actors, can be used to protect the interests of the organisations engaged in such service exchange. However, the potentially dynamic composition of services with different contracts, and the combination of service contracts with local contracts, can give rise to unexpected conflicts, exposing the need for automatic techniques for contract analysis. In this paper we look at automatic analysis techniques for contracts written in the contract language CL. We present a trace semantics of CL suitable for conflict analysis, and a decision procedure for detecting conflicts (together with its proof of soundness, completeness and termination). We also discuss its implementation and look into the applications of the contract analysis approach we present. These techniques are applied to a small case study of an airline check-in desk.

  13. MOS voltage automatic tuning circuit

    OpenAIRE

    李, 田茂; 中田, 辰則; 松本, 寛樹

    2004-01-01

    Abstract: An automatic tuning circuit adjusts frequency performance to compensate for process variation. A phase-locked loop (PLL) is a suitable oscillator for an integrated circuit: it is a feedback system that compares the input phase with the output phase and can make the output frequency equal to the input frequency. In this paper, a PLL formed of MOSFETs is presented. The presented circuit consists of an XOR circuit, a low-pass filter and a relaxation oscillator. On PSPICE simulation...

  14. The Mark II Automatic Diflux

    Directory of Open Access Journals (Sweden)

    Jean L Rasson

    2011-07-01

    Full Text Available We report here on the new realization of an automatic fluxgate theodolite able to perform unattended absolute geomagnetic declination and inclination measurements: the AUTODIF MKII. The main changes of this version compared with the former one are presented as well as the better specifications we expect now. We also explain the absolute orientation procedure by means of a laser beam and a corner cube and the method for leveling the fluxgate sensor, which is different from a conventional DIflux theodolite.

  15. CLG for Automatic Image Segmentation

    OpenAIRE

    Christo Ananth; S.Santhana Priya; S.Manisha; T.Ezhil Jothi; M.S.Ramasubhaeswari

    2017-01-01

    This paper proposes an automatic segmentation method which effectively combines Active Contour Model, Live Wire method and Graph Cut approach (CLG). The aim of Live wire method is to provide control to the user on segmentation process during execution. Active Contour Model provides a statistical model of object shape and appearance to a new image which are built during a training phase. In the graph cut technique, each pixel is represented as a node and the distance between those nodes is rep...

  16. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of the Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; and ALGOL 68. Other chapters discuss a general-purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming languages is also given.

  17. Automatic Detection of Fake News

    OpenAIRE

    Pérez-Rosas, Verónica; Kleinberg, Bennett; Lefevre, Alexandra; Mihalcea, Rada

    2017-01-01

    The proliferation of misleading information in everyday media outlets such as social media feeds, news blogs, and online newspapers has made it challenging to identify trustworthy news sources, increasing the need for computational tools able to provide insights into the reliability of online content. In this paper, we focus on the automatic identification of fake content in online news. Our contribution is twofold. First, we introduce two novel datasets for the task of fake news...

  18. Automatic computation of transfer functions

    Science.gov (United States)

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
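
As a toy instance of computing a transfer function from element values, the sketch below derives H(s) for a first-order RC low-pass and evaluates its magnitude response. It illustrates the idea only, not the netlist-driven matrix method of the record above; the function names are assumptions.

```python
import math

def rc_lowpass_tf(R, C):
    # H(s) = 1 / (1 + s*R*C) for a first-order RC low-pass, obtained
    # from the voltage divider between R and the capacitor impedance
    # 1 / (s*C): H(s) = (1/(sC)) / (R + 1/(sC)).
    def H(s):
        return 1.0 / (1.0 + s * R * C)
    return H

def magnitude_db(H, f):
    # Evaluate |H(j*2*pi*f)| in decibels.
    s = 1j * 2.0 * math.pi * f
    return 20.0 * math.log10(abs(H(s)))
```

At the cutoff frequency fc = 1/(2*pi*R*C) the magnitude is 1/sqrt(2), i.e. about -3 dB, which gives a quick sanity check on any computed transfer function.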

  19. Automatic wipers with mist control

    OpenAIRE

    Ashik K.P; A.N.Basavaraju

    2016-01-01

    This paper illustrates automatic wipers with mist control. Nowadays, accidents are common in commercial vehicles, and one of their causes is the formation of mist inside the vehicle during heavy rain. In rainy seasons the driver himself has to control the wiper on the windshield, which distracts his concentration from driving. Also, when the rain lasts for a longer time (say about 15 minutes), the formation of mist on t...

  20. How CBO Estimates Automatic Stabilizers

    Science.gov (United States)

    2015-11-01

    the economy. Most types of revenues (mainly personal, corporate, and social insurance taxes) are sensitive to the business cycle and account for most of ... Medicare taxes for self-employed people, taxes on production and imports, and unemployment insurance taxes. Those six categories account for the bulk of ... federal tax revenues. Individual taxes account for most of the automatic stabilizers from revenues, followed by Social Security plus Medicare

  1. Automatic programming for critical applications

    Science.gov (United States)

    Loganantharaj, Raj L.

    1988-01-01

    The important phases of a software life cycle include verification and maintenance. Execution performance is usually an expected requirement in a software development process; unfortunately, verification and maintenance are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear plants. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. What is understood by automatic programming has changed along with our expectations. At present, the goal of automatic programming is the automation of the programming process: the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification; in the program synthesis phase, that specification is further transformed into a concrete, executable program.

  2. Towards Automatic Personalized Content Generation for Platform Games

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Togelius, Julian

    2010-01-01

    In this paper, we show that personalized levels can be automatically generated for platform games. We build on previous work, where models were derived that predicted player experience based on features of level design and on playing styles. These models are constructed using preference learning...... mechanism using both algorithmic and human players. The results indicate that the adaptation mechanism effectively optimizes level design parameters for particular players....

  3. Candidate Prediction Models and Methods

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Nielsen, Torben Skov; Madsen, Henrik

    2005-01-01

    This document lists candidate prediction models for Work Package 3 (WP3) of the PSO-project called ``Intelligent wind power prediction systems'' (FU4101). The main focus is on the models transforming numerical weather predictions into predictions of power production. The document also outlines...... the possibilities w.r.t. different numerical weather predictions actually available to the project....

  4. Automatic Migration from PARMACS to MPI in Parallel Fortran Applications

    Directory of Open Access Journals (Sweden)

    Rolf Hempel

    1999-01-01

    Full Text Available The PARMACS message passing interface has been in widespread use by application projects, especially in Europe. With the new MPI standard for message passing, many projects face the problem of replacing PARMACS with MPI. An automatic translation tool has been developed which replaces all PARMACS 6.0 calls in an application program with their corresponding MPI calls. In this paper we describe the mapping of the PARMACS programming model onto MPI. We then present some implementation details of the converter tool.

  5. Automatic welding and cladding in heavy fabrication

    International Nuclear Information System (INIS)

    Altamer, A. de

    1980-01-01

    A description is given of the automatic welding processes used by an Italian fabricator of pressure vessels for petrochemical and nuclear plant. The automatic submerged arc welding, submerged arc strip cladding, pulsed TIG, hot-wire TIG and MIG welding processes have proved satisfactory in terms of process reliability, metal deposition rate, and cost effectiveness for low-alloy and carbon steels. An example shows the sequences required during automatic butt welding, including heat treatments. Factors which govern satisfactory automatic welding include an automatic anti-drift rotator device, an electrode guidance and bead programming system, the capability of single- and dual-head operation, flux recovery and slag removal systems, the operator environment and controls, maintaining continuity of welding, and automatic reverse-side grinding. Automatic welding is used for: joining vessel sections; joining tubes to tubeplate; cladding of vessel rings and tubes, dished ends and extruded nozzles; and nozzle-to-shell and butt welds, including narrow-gap welding. (author)

  6. Real time automatic scene classification

    NARCIS (Netherlands)

    Verbrugge, R.; Israël, Menno; Taatgen, N.; van den Broek, Egon; van der Putten, Peter; Schomaker, L.; den Uyl, Marten J.

    2004-01-01

    This work was done as part of the EU VICAR (IST) project and the EU SCOFI project (IAP). The aim of the first project was to develop a real-time video indexing, classification, annotation and retrieval system. For our systems, we have adapted the approach of Picard and Minka [3], who categorized

  7. Automatic initialization and quality control of large-scale cardiac MRI segmentations.

    Science.gov (United States)

    Albà, Xènia; Lekadir, Karim; Pereañez, Marco; Medrano-Gracia, Pau; Young, Alistair A; Frangi, Alejandro F

    2018-01-01

    Continuous advances in imaging technologies enable ever more comprehensive phenotyping of human anatomy and physiology. Concomitant reduction of imaging costs has resulted in widespread use of imaging in large clinical trials and population imaging studies. Magnetic Resonance Imaging (MRI), in particular, offers one-stop-shop multidimensional biomarkers of cardiovascular physiology and pathology. A wide range of analysis methods offer sophisticated cardiac image assessment and quantification for clinical and research studies. However, most methods have only been evaluated on relatively small databases often not accessible for open and fair benchmarking. Consequently, published performance indices are not directly comparable across studies and their translation and scalability to large clinical trials or population imaging cohorts is uncertain. Most existing techniques still rely on considerable manual intervention for the initialization and quality control of the segmentation process, becoming prohibitive when dealing with thousands of images. The contributions of this paper are three-fold. First, we propose a fully automatic method for initializing cardiac MRI segmentation, by using image features and random forests regression to predict an initial position of the heart and key anatomical landmarks in an MRI volume. In processing a full imaging database, the technique predicts the optimal corrective displacements and positions in relation to the initial rough intersections of the long and short axis images. Second, we introduce for the first time a quality control measure capable of identifying incorrect cardiac segmentations with no visual assessment. The method uses statistical, pattern and fractal descriptors in a random forest classifier to detect failures to be corrected or removed from subsequent statistical analysis. Finally, we validate these new techniques within a full pipeline for cardiac segmentation applicable to large-scale cardiac MRI databases. 
The

  8. Schools water efficiency and awareness project

    African Journals Online (AJOL)

    driniev

    2002-04-23

    Apr 23, 2002 ... Schools Water Efficiency Project in February 2003, which supports several of the City's Water Demand ... of automatic flushing urinals (AFUs) alone in schools can save up .... to go back into the bag as the cistern is filling.

  9. Robust methods for automatic image-to-world registration in cone-beam CT interventional guidance

    International Nuclear Information System (INIS)

    Dang, H.; Otake, Y.; Schafer, S.; Stayman, J. W.; Kleinszig, G.; Siewerdsen, J. H.

    2012-01-01

    Purpose: Real-time surgical navigation relies on accurate image-to-world registration to align the coordinate systems of the image and patient. Conventional manual registration can present a workflow bottleneck and is prone to manual error and intraoperator variability. This work reports alternative means of automatic image-to-world registration, each method involving an automatic registration marker (ARM) used in conjunction with C-arm cone-beam CT (CBCT). The first involves a Known-Model registration method in which the ARM is a predefined tool, and the second is a Free-Form method in which the ARM is freely configurable. Methods: Studies were performed using a prototype C-arm for CBCT and a surgical tracking system. A simple ARM was designed with markers comprising a tungsten sphere within infrared reflectors to permit detection of markers in both x-ray projections and by an infrared tracker. The Known-Model method exercised a predefined specification of the ARM in combination with 3D-2D registration to estimate the transformation that yields the optimal match between forward projection of the ARM and the measured projection images. The Free-Form method localizes markers individually in projection data by a robust Hough transform approach extended from previous work, backprojected to 3D image coordinates based on C-arm geometric calibration. Image-domain point sets were transformed to world coordinates by rigid-body point-based registration. The robustness and registration accuracy of each method was tested in comparison to manual registration across a range of body sites (head, thorax, and abdomen) of interest in CBCT-guided surgery, including cases with interventional tools in the radiographic scene. Results: The automatic methods exhibited similar target registration error (TRE) and were comparable or superior to manual registration for placement of the ARM within ∼200 mm of C-arm isocenter. Marker localization in projection data was robust across all
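
The final step in the Free-Form method, rigid-body point-based registration of corresponding marker positions, admits a closed form. The 2D sketch below (the paper works with 3D marker coordinates) recovers the rotation and translation that best align two point sets; the function names are illustrative, not the authors' code.

```python
import math

def rigid_register_2d(src, dst):
    # Closed-form 2D rigid-body point-based registration: find the
    # rotation theta and translation (tx, ty) minimizing
    # sum ||R p + t - q||^2 over corresponding pairs (p, q).
    n = len(src)
    cxs, cys = sum(p[0] for p in src) / n, sum(p[1] for p in src) / n
    cxd, cyd = sum(q[0] for q in dst) / n, sum(q[1] for q in dst) / n
    num = den = 0.0
    for (x, y), (u, v) in zip(src, dst):
        xs, ys = x - cxs, y - cys        # centered source point
        ud, vd = u - cxd, v - cyd        # centered destination point
        num += xs * vd - ys * ud         # cross terms -> sin(theta)
        den += xs * ud + ys * vd         # dot terms  -> cos(theta)
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    tx = cxd - (c * cxs - s * cys)
    ty = cyd - (s * cxs + c * cys)
    return theta, tx, ty

def apply_rigid(p, theta, tx, ty):
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)
```

On noiseless correspondences the transform is recovered exactly; with noisy marker localizations it returns the least-squares fit, whose residual is exactly the kind of target registration error reported above.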

  10. Fully automatic time-window selection using machine learning for global adjoint tomography

    Science.gov (United States)

    Chen, Y.; Hill, J.; Lei, W.; Lefebvre, M. P.; Bozdag, E.; Komatitsch, D.; Tromp, J.

    2017-12-01

    Selecting time windows from seismograms such that the synthetic measurements (from simulations) and measured observations are sufficiently close is indispensable in a global adjoint tomography framework. The increasing amount of seismic data collected everyday around the world demands "intelligent" algorithms for seismic window selection. While the traditional FLEXWIN algorithm can be "automatic" to some extent, it still requires both human input and human knowledge or experience, and thus is not deemed to be fully automatic. The goal of intelligent window selection is to automatically select windows based on a learnt engine that is built upon a huge number of existing windows generated through the adjoint tomography project. We have formulated the automatic window selection problem as a classification problem. All possible misfit calculation windows are classified as either usable or unusable. Given a large number of windows with a known selection mode (select or not select), we train a neural network to predict the selection mode of an arbitrary input window. Currently, the five features we extract from the windows are its cross-correlation value, cross-correlation time lag, amplitude ratio between observed and synthetic data, window length, and minimum STA/LTA value. More features can be included in the future. We use these features to characterize each window for training a multilayer perceptron neural network (MPNN). Training the MPNN is equivalent to solve a non-linear optimization problem. We use backward propagation to derive the gradient of the loss function with respect to the weighting matrices and bias vectors and use the mini-batch stochastic gradient method to iteratively optimize the MPNN. 
Numerical tests show that with a careful selection of the training data and a sufficient amount of training data, we are able to train a robust neural network that is capable of detecting the waveforms in an arbitrary earthquake data with negligible detection error
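
The training loop described above (a multilayer perceptron optimized by backpropagation with mini-batch stochastic gradient descent) can be sketched in pure Python. The window features, labels, and network size below are toy stand-ins for the project's five features and large training set, not the authors' implementation.

```python
import math, random

def train_mlp(data, labels, hidden=4, epochs=500, lr=0.5, batch=2, seed=1):
    # One-hidden-layer perceptron with sigmoid units, trained by
    # backpropagation with mini-batch SGD on a binary select /
    # do-not-select label. Features are assumed normalized to [0, 1].
    # Returns (predict, losses): losses holds the MSE after each epoch.
    rng = random.Random(seed)
    d = len(data[0])
    W1 = [[rng.uniform(-1, 1) for _ in range(d)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    W2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))

    def forward(x):
        h = [sig(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(W1, b1)]
        return h, sig(sum(w * hi for w, hi in zip(W2, h)) + b2)

    losses = []
    for _ in range(epochs):
        order = list(range(len(data)))
        rng.shuffle(order)
        for start in range(0, len(order), batch):
            chunk = order[start:start + batch]
            # accumulate gradients over the mini-batch
            gW1 = [[0.0] * d for _ in range(hidden)]
            gb1, gW2, gb2 = [0.0] * hidden, [0.0] * hidden, 0.0
            for i in chunk:
                x, y = data[i], labels[i]
                h, out = forward(x)
                dout = (out - y) * out * (1 - out)  # dMSE/dz at output
                for j in range(hidden):
                    dh = dout * W2[j] * h[j] * (1 - h[j])
                    gW2[j] += dout * h[j]
                    gb1[j] += dh
                    for k in range(d):
                        gW1[j][k] += dh * x[k]
                gb2 += dout
            m = len(chunk)
            for j in range(hidden):
                W2[j] -= lr * gW2[j] / m
                b1[j] -= lr * gb1[j] / m
                for k in range(d):
                    W1[j][k] -= lr * gW1[j][k] / m
            b2 -= lr * gb2 / m
        losses.append(sum((forward(x)[1] - y) ** 2
                          for x, y in zip(data, labels)) / len(data))
    return (lambda x: forward(x)[1]), losses
```

Trained on a handful of toy windows (high cross-correlation and small lag labeled "select"), the per-epoch loss curve decreases, which is the basic convergence check one would run before scaling to real window data.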

  11. Controlled versus automatic processes: which is dominant to safety? The moderating effect of inhibitory control.

    Directory of Open Access Journals (Sweden)

    Yaoshan Xu

    Full Text Available This study explores the precursors of employees' safety behaviors based on a dual-process model, which suggests that human behaviors are determined by both controlled and automatic cognitive processes. Employees' responses to a self-reported survey on safety attitudes capture their controlled cognitive process, while the automatic association concerning safety measured by an Implicit Association Test (IAT reflects employees' automatic cognitive processes about safety. In addition, this study investigates the moderating effects of inhibition on the relationship between self-reported safety attitude and safety behavior, and that between automatic associations towards safety and safety behavior. The results suggest significant main effects of self-reported safety attitude and automatic association on safety behaviors. Further, the interaction between self-reported safety attitude and inhibition and that between automatic association and inhibition each predict unique variances in safety behavior. Specifically, the safety behaviors of employees with lower level of inhibitory control are influenced more by automatic association, whereas those of employees with higher level of inhibitory control are guided more by self-reported safety attitudes. These results suggest that safety behavior is the joint outcome of both controlled and automatic cognitive processes, and the relative importance of these cognitive processes depends on employees' individual differences in inhibitory control. The implications of these findings for theoretical and practical issues are discussed at the end.

  12. Automatic positioning control device for automatic control rod exchanger

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

Purpose: To attain accurate positioning for a control rod exchanger. Constitution: The present position of the automatic control rod exchanger is detected by a synchro generator. The following are set in advance: an aimed stopping position for the exchanger; a stop-instruction range, sized to account for the operational delay of the control system and the inertial coasting distance of the mechanical system; and a coincidence-confirmation range reflecting the required positioning accuracy. If there is a difference between the present position and the aimed stopping position, the exchanger is caused to run toward the aimed stopping position. A stop instruction is generated upon arrival within the stop-instruction range, and a coincidence-confirmation signal is generated upon arrival within the coincidence-confirmation range. Since the uncertain factors that influence positioning accuracy, such as the operational delay of the control system and the inertial coasting distance of the mechanical system, are quantified by actual measurement or similar methods, and the stop-instruction and coincidence-confirmation ranges are set based on the measured data, positioning accuracy can be improved. (Ikeda, J.)
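A minimal sketch of the two-range positioning logic the abstract describes; the function name, arguments, and range values are illustrative, not taken from the original system:

```python
def positioning_signals(present, target, stop_range, coincide_range):
    """Return (stop_instruction, coincidence) flags for an exchanger
    approaching `target` from `present` (all in the same length unit).

    stop_range     - distance at which the stop instruction is issued,
                     sized to cover the control-system delay plus the
                     mechanism's inertial coasting distance.
    coincide_range - tolerance within which positioning is confirmed.
    """
    error = abs(target - present)
    return error <= stop_range, error <= coincide_range

# Approach phase: still outside the stop-instruction range.
approaching = positioning_signals(0.0, 100.0, 5.0, 0.5)
# Inside the stop-instruction range but not yet settled.
stopping = positioning_signals(96.0, 100.0, 5.0, 0.5)
# Settled within the coincidence-confirmation range.
settled = positioning_signals(99.8, 100.0, 5.0, 0.5)
```

Because the stop range is derived from measured delay and coasting data, the stop instruction fires early enough that the mechanism coasts to rest inside the tighter coincidence range.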

  13. On stylistic automatization of lexical units in various types of contexts

    Directory of Open Access Journals (Sweden)

    В В Зуева

    2009-12-01

Full Text Available Stylistic automatization of lexical units in various types of contexts is investigated in this article. Following the works of Bohuslav Havránek and other linguists of the Prague Linguistic School, automatization is treated as a contextual narrowing of the meaning of a lexical unit to the level of its complete predictability in situational contexts and the lack of stylistic contradiction with other lexical units in speech.

  14. Automatic calculations of electroweak processes

    International Nuclear Information System (INIS)

    Ishikawa, T.; Kawabata, S.; Kurihara, Y.; Shimizu, Y.; Kaneko, T.; Kato, K.; Tanaka, H.

    1996-01-01

The GRACE system is an excellent tool for automatically calculating the cross section of, and generating events for, an elementary process. However, it is not always easy for beginners to use. An interactive version of GRACE is being developed to make the system user-friendly. Since it works in exactly the same environment as PAW, all functions of PAW are available for handling any histogram information produced by GRACE. As an application, the cross sections of all elementary processes with up to 5-body final states induced by the e+e- interaction are to be calculated and summarized as a catalogue. (author)

  15. Automatic Strain-Rate Controller,

    Science.gov (United States)

    1976-12-01

Rome Air Development Center, Griffiss AFB, NY. Automatic Strain-Rate Controller. R. L. Huntsinger and J. A. Adamski, Dec 76. ... goes to zero. Controller: Leeds and Northrup Series 80 CAT with proportional band, rate, reset, and approach controls. Input from deviation output ... (8) through (16). (8) Move the set-point slowly up to 3 or 4. (9) If the recorder pointer hunts, adjust the function controls on the ...

  16. Commutated automatic gain control system

    Science.gov (United States)

    Yost, S. R.

    1982-01-01

A commutated automatic gain control (AGC) system was designed and built for a prototype Loran-C receiver. The receiver uses a microcomputer to control a memory-aided phase-locked loop (MAPLL). The microcomputer also controls the input/output, latitude/longitude conversion, and the recently added AGC system. The circuit designed for the AGC is described, and bench and flight test results are presented. The AGC circuit samples starting at a point 40 microseconds after a zero crossing, as determined by the software lock pulse, which is ultimately generated by a 30-microsecond delay-and-add network in the receiver front-end envelope detector.
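The core of any AGC loop, commutated or not, is a feedback update that drives the sampled envelope toward a reference level. A hedged sketch of that idea (the gain constant and reference are illustrative, not from the Loran-C receiver):

```python
def agc_step(gain, sample, reference=1.0, k=0.2):
    """One AGC update: nudge the gain so that the gain-scaled sampled
    envelope approaches the reference level (integral-style control)."""
    return gain + k * (reference - gain * sample)

# Drive the gain from an arbitrary start toward reference / amplitude.
g = 0.1
for _ in range(200):
    g = agc_step(g, sample=4.0)   # constant input amplitude of 4.0
# g converges near 0.25, i.e. reference / amplitude
```

In the receiver described above, the sample fed to such an update would be the envelope value taken 40 microseconds after the software-selected zero crossing.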

  17. Automatic liquid nitrogen feeding device

    International Nuclear Information System (INIS)

    Gillardeau, J.; Bona, F.; Dejachy, G.

    1963-01-01

An automatic liquid nitrogen feeding device has been developed and used in the framework of corrosion tests performed with constantly renewed uranium hexafluoride. The task was to feed liquid nitrogen to a large-capacity metallic trap in order to condense uranium hexafluoride at the exit of the corrosion chambers. After a study of the various available devices, a feeding device was specifically designed to be robust, safe, and autonomous, while ensuring a high liquid nitrogen flow rate and a high feeding frequency. The device, made of standard materials, has been used for 4000 hours without any problem [fr]

  18. Automatic alignment of radionuclide images

    International Nuclear Information System (INIS)

    Barber, D.C.

    1982-01-01

    The variability of the position, dimensions and orientation of a radionuclide image within the field of view of a gamma camera hampers attempts to analyse the image numerically. This paper describes a method of using a set of training images of a particular type, in this case right lateral brain images, to define the likely variations in the position, dimensions and orientation for that type of image and to provide alignment data for a program that automatically aligns new images of the specified type to a standard position, size and orientation. Examples are given of the use of this method on three types of radionuclide image. (author)

  19. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

Annual Review in Automatic Programming, Volume 2 is a collection of papers that discusses the controversy about the suitability of COBOL as a common business-oriented language, and the development of different common languages for scientific computation. A couple of papers describe the use of the Genie system in numerical calculation and analyze Mercury Autocode in terms of a phrase-structure language, covering the source language, the target language, the order structure of ATLAS, and the meta-syntactical language of the assembly program. Other papers explain interference or an "intermediate

  20. Coordinated hybrid automatic repeat request

    KAUST Repository

    Makki, Behrooz

    2014-11-01

We develop a coordinated hybrid automatic repeat request (HARQ) approach. With the proposed scheme, if a user's message is correctly decoded in the first HARQ rounds, its spectrum is allocated to other users, to improve the network outage probability and the users' fairness. The results, which are obtained for single- and multiple-antenna setups, demonstrate the efficiency of the proposed approach under different conditions. For instance, with a maximum of M retransmissions and single transmit/receive antennas, the diversity gain of a user increases from M to (J+1)(M-1)+1, where J is the number of users helping that user.
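The closing claim is easy to check numerically; a small helper (the function name is illustrative) encoding the two diversity-gain figures quoted in the abstract:

```python
def harq_diversity_gain(M, J, coordinated=True):
    """Diversity gain of a user with at most M (re)transmissions and
    single transmit/receive antennas: M without coordination, and
    (J+1)*(M-1)+1 when J other users help under the coordinated scheme,
    per the figures quoted in the abstract."""
    return (J + 1) * (M - 1) + 1 if coordinated else M

# Sanity check: with no helpers (J = 0) the coordinated gain reduces to M.
baseline = harq_diversity_gain(3, 0)           # -> 3, same as uncoordinated
coordinated = harq_diversity_gain(3, 2)        # -> 7 with two helpers
```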

  1. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

Annual Review in Automatic Programming, Volume 4 is a collection of papers that deals with the GIER ALGOL compiler, a parameterized compiler based on mechanical linguistics, and the JOVIAL language. A couple of papers describe a commercial use of stacks, an IBM system, and what an ideal computer program support system should be. One paper reviews the system of compilation, the development of a more advanced language, programming techniques, machine independence, and program transfer to other machines. Another paper describes the ALGOL 60 system for the GIER machine, including running ALGOL pro

  2. Motor automaticity in Parkinson’s disease

    Science.gov (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

Bradykinesia is the most important feature contributing to motor difficulties in Parkinson's disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor deficits in PD associated with impaired automaticity, such as reduced arm swing, decreased stride length, freezing of gait, micrographia, and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty acquiring new automatic skills or restoring lost ones. Further investigation of the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of measures of automaticity in the early diagnosis of PD would be valuable. PMID:26102020

  3. Leveraging Quality Prediction Models for Automatic Writing Feedback

    OpenAIRE

    Nilforoshan, Hamed; Wu, Eugene

    2017-01-01

User-generated, multi-paragraph writing is pervasive and important on many social media platforms (e.g., Amazon reviews, Airbnb host profiles). Ensuring high-quality content is important. Unfortunately, content submitted by users is often not of high quality. Moreover, the characteristics that constitute high quality may vary between domains in ways that users are unaware of. Automated writing feedback has the potential to immediately point out and suggest improvements during the wri...

  4. Subjective fear, interference by threat, and fear associations independently predict fear-related behavior in children

    NARCIS (Netherlands)

    Klein, A.M.; Kleinherenbrink, A.V.; Simons, C.; de Gier, E.; Klein, S.; Allart, E.; Bögels, S.M.; Becker, E.S.; Rinck, M.

    2012-01-01

    Background and objectives: Several information-processing models highlight the independent roles of controlled and automatic processes in explaining fearful behavior. Therefore, we investigated whether direct measures of controlled processes and indirect measures of automatic processes predict

  5. SRV-automatic handling device

    International Nuclear Information System (INIS)

    Yamada, Koji

    1987-01-01

An automatic handling device for the steam relief valves (SRVs) is developed in order to achieve a decrease in worker exposure, an increase in availability factor, improvements in reliability and operational safety, and labor saving. A survey was made during a periodical inspection to examine the actual SRV handling operation. The SRV automatic handling device consists of four components: a conveyor, an armed conveyor, a lifting machine, and a control/monitoring system. The conveyor is designed so that the existing I-rail installed in the containment vessel can be used without any modification; it is employed for conveying an SRV along the rail. The armed conveyor, designed for a box rail, is used for an SRV installed away from the rail. By using the lifting machine, an SRV installed away from the I-rail is brought to a spot just below the rail so that it can be transferred by the conveyor. The control/monitoring system consists of a control computer, operation panel, TV monitor, and annunciator. The SRV handling device is operated by remote control from a control room. A trial equipment was constructed and performance/function testing was carried out using actual SRVs. As a result, it is shown that the SRV handling device requires only two operators to serve satisfactorily. The required time for removal and replacement of one SRV is about 10 minutes. (Nogami, K.)

  6. A new uranium automatic analyzer

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyuan; Zhang Lan

    1993-01-01

A new automatic uranium analyzer based on the flow injection analysis (FIA) principle has been developed. It consists of a multichannel peristaltic pump, an injection valve, a photometric detector, a single-chip microprocessor system, and electronic circuitry. The newly designed multifunctional auto-injection valve can automatically change the injection volume of the sample and the channels, so that the determination ranges and items can easily be changed; it can also switch among FIA operation modes, giving the instrument the versatility of a universal instrument. A chromatographic column with extractant-containing resin was installed in the manifold of the analyzer for the concentration and separation of trace uranium. 2-(5-bromo-2-pyridylazo)-5-diethylaminophenol (Br-PADAP) was used as the colour reagent. Uranium was determined in aqueous solution with the addition of cetylpyridinium bromide (CPB). Uranium in solution in the range 0.02-500 mg·L⁻¹ can be directly determined without any pretreatment. A sample throughput of 30-90 h⁻¹ and a reproducibility of 1-2% were obtained. The analyzer has been satisfactorily applied in the laboratory and the plant

  7. An automatic holographic adaptive phoropter

    Science.gov (United States)

    Amirsolaimani, Babak; Peyghambarian, N.; Schwiegerling, Jim; Bablumyan, Arkady; Savidis, Nickolaos; Peyman, Gholam

    2017-08-01

Phoropters are the most common instrument used to detect refractive errors. During a refractive exam, lenses are flipped in front of the patient, who looks at the eye chart and tries to read the symbols. The procedure is fully dependent on the cooperation of the patient in reading the eye chart, provides only a subjective measurement of visual acuity, and can at best provide a rough estimate of the patient's vision. Phoropters are difficult to use for mass screenings, since they require a skilled examiner, and it is hard to screen young children, the elderly, and other patients who cannot cooperate. We have developed a simplified, lightweight automatic phoropter that can measure the optical error of the eye objectively without requiring the patient's input. The automatic holographic adaptive phoropter is based on a Shack-Hartmann wavefront sensor and three computer-controlled fluidic lenses. The fluidic lens system is designed to provide power and astigmatic corrections over a large range without the need for verbal feedback from the patient, in less than 20 seconds.

  8. Automatic welding machine for piping

    International Nuclear Information System (INIS)

    Yoshida, Kazuhiro; Koyama, Takaichi; Iizuka, Tomio; Ito, Yoshitoshi; Takami, Katsumi.

    1978-01-01

    A remotely controlled automatic special welding machine for piping was developed. This machine is utilized for long distance pipe lines, chemical plants, thermal power generating plants and nuclear power plants effectively from the viewpoint of good quality control, reduction of labor and good controllability. The function of this welding machine is to inspect the shape and dimensions of edge preparation before welding work by the sense of touch, to detect the temperature of melt pool, inspect the bead form by the sense of touch, and check the welding state by ITV during welding work, and to grind the bead surface and inspect the weld metal by ultrasonic test automatically after welding work. The construction of this welding system, the main specification of the apparatus, the welding procedure in detail, the electrical source of this welding machine, the cooling system, the structure and handling of guide ring, the central control system and the operating characteristics are explained. The working procedure and the effect by using this welding machine, and the application to nuclear power plants and the other industrial field are outlined. The HIDIC 08 is used as the controlling computer. This welding machine is useful for welding SUS piping as well as carbon steel piping. (Nakai, Y.)

  9. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael

    2014-05-01

    We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization. © 2014 The Author(s) Computer Graphics Forum © 2014 The Eurographics Association and John Wiley & Sons Ltd. Published by John Wiley & Sons Ltd.
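The route-aware simplification can be approximated with a much simpler rule: keep only street segments that lie on some shortest path between a pair of POIs. This is a hedged stand-in for the paper's energy-minimization approach (which also penalizes clutter and detours); all names below are illustrative:

```python
import heapq
from itertools import combinations

def dijkstra(adj, src):
    """Shortest-path distances and predecessors from src in a weighted
    undirected graph given as {node: {neighbour: length}}."""
    dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in adj[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, prev

def route_aware_subgraph(adj, pois):
    """Keep only street segments lying on a shortest path between some
    pair of POIs, so travel between POIs needs no considerable detour."""
    kept = set()
    for a, b in combinations(pois, 2):
        _, prev = dijkstra(adj, a)
        node = b
        while node in prev:               # walk the path back b -> a
            kept.add(frozenset((node, prev[node])))
            node = prev[node]
    return kept

# Toy street graph: the direct chain A-B-C beats the detour via D.
streets = {"A": {"B": 1, "D": 5}, "B": {"A": 1, "C": 1},
           "C": {"B": 1, "D": 5}, "D": {"A": 5, "C": 5}}
kept = route_aware_subgraph(streets, ["A", "C"])
# kept holds only the two segments on the shortest A-C route
```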

  10. Automatic Computer Mapping of Terrain

    Science.gov (United States)

    Smedes, H. W.

    1971-01-01

Computer processing of 17 wavelength bands of visible, reflective-infrared, and thermal-infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film, has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost and improve accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, but were greatly reduced by using preprocessing techniques.
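Supervised multispectral classification of this era typically assigned each pixel's band vector to the class with the nearest training mean. A minimal sketch of that minimum-distance-to-means idea (not necessarily the exact programs used in the study; class names and reflectance values are made up):

```python
def train_means(samples):
    """samples: {class_name: [band_vector, ...]} -> per-class mean vector."""
    return {c: [sum(col) / len(vecs) for col in zip(*vecs)]
            for c, vecs in samples.items()}

def classify(pixel, means):
    """Assign a pixel (one reflectance value per band) to the class
    whose mean is nearest in squared Euclidean distance."""
    return min(means, key=lambda c: sum((p - m) ** 2
                                        for p, m in zip(pixel, means[c])))

# Two toy classes trained on two-band reflectance vectors.
training = {"water":  [[0.1, 0.2], [0.2, 0.1]],
            "forest": [[0.8, 0.9], [0.9, 0.8]]}
means = train_means(training)
# classify([0.2, 0.2], means) -> "water"
```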

  11. ACIR: automatic cochlea image registration

    Science.gov (United States)

    Al-Dhamari, Ibraheem; Bauer, Sabine; Paulus, Dietrich; Lissek, Friedrich; Jacob, Roland

    2017-02-01

Efficient Cochlear Implant (CI) surgery requires prior knowledge of the cochlea's size and its characteristics. This information helps in selecting suitable implants for different patients. To obtain these measurements, a method for segmenting cochlea medical images is needed. An important pre-processing step for good cochlea segmentation is efficient image registration. The cochlea's small size and complex structure, together with the different resolutions and head positions during imaging, pose a considerable challenge for the automated registration of the different image modalities. In this paper, an Automatic Cochlea Image Registration (ACIR) method for multimodal human cochlea images is proposed. The method is based on using small areas that have clear structures in both input images, instead of registering the complete image. It uses the Adaptive Stochastic Gradient Descent Optimizer (ASGD) and Mattes's Mutual Information metric (MMI) to estimate 3D rigid transform parameters. State-of-the-art medical image registration optimizers published over the last two years are studied and compared quantitatively using the standard Dice Similarity Coefficient (DSC). ACIR requires only 4.86 seconds on average to align cochlea images automatically and to put all the modalities in the same spatial locations without human interference. The source code is based on the tool elastix and is provided for free as a 3D Slicer plugin. Another contribution of this work is a proposed public cochlea standard dataset, which can be downloaded for free from a public XNAT server.
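The Dice Similarity Coefficient used for the quantitative comparison is straightforward to compute: DSC = 2|A∩B| / (|A| + |B|) for two binary masks. A minimal sketch over flat label sequences (the mask encoding is an assumption for illustration):

```python
def dice_coefficient(mask_a, mask_b):
    """Dice Similarity Coefficient between two binary masks given as
    flat sequences of 0/1 labels: DSC = 2|A∩B| / (|A| + |B|)."""
    a = {i for i, v in enumerate(mask_a) if v}
    b = {i for i, v in enumerate(mask_b) if v}
    if not a and not b:
        return 1.0            # two empty masks agree perfectly
    return 2.0 * len(a & b) / (len(a) + len(b))

# Half-overlapping masks score 0.5; identical masks score 1.0.
half = dice_coefficient([1, 1, 0, 0], [1, 0, 1, 0])
```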

  12. Automatic referral to cardiac rehabilitation.

    Science.gov (United States)

    Fischer, Jane P

    2008-01-01

    The pervasive negative impact of cardiovascular disease in the United States is well documented. Although advances have been made, the campaign to reduce the occurrence, progression, and mortality continues. Determining evidence-based data is only half the battle. Implementing new and updated clinical guidelines into daily practice is a challenging task. Cardiac rehabilitation is an example of a proven intervention whose benefit is hindered through erratic implementation. The American Association of Cardiovascular and Pulmonary Rehabilitation (AACVPR), the American College of Cardiology (ACC), and the American Heart Association (AHA) have responded to this problem by publishing the AACVPR/ACC/AHA 2007 Performance Measures on Cardiac Rehabilitation for Referral to and Delivery of Cardiac Rehabilitation/Secondary Prevention Services. This new national guideline recommends automatic referral to cardiac rehabilitation for every eligible patient (performance measure A-1). This article offers guidance for the initiation of an automatic referral system, including individualizing your protocol with regard to electronic or paper-based order entry structures.

  13. Diagnosis - Using automatic test equipment and artificial intelligence expert systems

    Science.gov (United States)

    Ramsey, J. E., Jr.

    Three expert systems (ATEOPS, ATEFEXPERS, and ATEFATLAS), which were created to direct automatic test equipment (ATE), are reviewed. The purpose of the project was to develop an expert system to troubleshoot the converter-programmer power supply card for the F-15 aircraft and have that expert system direct the automatic test equipment. Each expert system uses a different knowledge base or inference engine, basing the testing on the circuit schematic, test requirements document, or ATLAS code. Implementing generalized modules allows the expert systems to be used for any different unit under test. Using converted ATLAS to LISP code allows the expert system to direct any ATE using ATLAS. The constraint propagated frame system allows for the expansion of control by creating the ATLAS code, checking the code for good software engineering techniques, directing the ATE, and changing the test sequence as needed (planning).

  14. Interactivity in automatic control: foundations and experiences

    OpenAIRE

    Dormido Bencomo, Sebastián; Guzmán Sánchez, José Luis; Costa Castelló, Ramon; Berenguel, M

    2012-01-01

The first part of this paper presents the concepts of interactivity and visualization and their essential role in learning the fundamentals and techniques of automatic control. More than 10 years of the authors' experience in the development and design of interactive tools dedicated to the study of automatic control concepts are also presented. The second part of the paper summarizes the main features of the "Automatic Control with Interactive Tools" text that has recently been published by Pea...

  15. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

Inheritance allows a class to be specialized and its attributes refined, but implementation specialization can only take place by overriding with manually implemented methods. Automatic program specialization can generate a specialized, efficient implementation. However, specialization of programs ... with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented ...

  16. Natural language processing techniques for automatic test ...

    African Journals Online (AJOL)

Natural language processing techniques for automatic test question generation using discourse connectives. Journal of Computer Science and Its Application.

  17. Automatic Thermal Infrared Panoramic Imaging Sensor

    National Research Council Canada - National Science Library

    Gutin, Mikhail; Tsui, Eddy K; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey

    2006-01-01

.... Automatic detection, location, and tracking of targets outside the protected area ensures maximum protection and at the same time reduces the workload on personnel and increases reliability and confidence...

  18. Evaluation of CASP8 model quality predictions

    KAUST Repository

    Cozzetto, Domenico; Kryshtafovych, Andriy; Tramontano, Anna

    2009-01-01

    established a prediction category to evaluate their performance in 2006. In 2008 the experiment was repeated and its results are reported here. Participants were invited to infer the correctness of the protein models submitted by the registered automatic

  19. Automatic Tortuosity-Based Retinopathy of Prematurity Screening System

    Science.gov (United States)

    Sukkaew, Lassada; Uyyanonvara, Bunyarit; Makhanov, Stanislav S.; Barman, Sarah; Pangputhipong, Pannet

    Retinopathy of Prematurity (ROP) is an infant disease characterized by increased dilation and tortuosity of the retinal blood vessels. Automatic tortuosity evaluation from retinal digital images is very useful to facilitate an ophthalmologist in the ROP screening and to prevent childhood blindness. This paper proposes a method to automatically classify the image into tortuous and non-tortuous. The process imitates expert ophthalmologists' screening by searching for clearly tortuous vessel segments. First, a skeleton of the retinal blood vessels is extracted from the original infant retinal image using a series of morphological operators. Next, we propose to partition the blood vessels recursively using an adaptive linear interpolation scheme. Finally, the tortuosity is calculated based on the curvature of the resulting vessel segments. The retinal images are then classified into two classes using segments characterized by the highest tortuosity. For an optimal set of training parameters the prediction is as high as 100%.
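A common simple proxy for the curvature-based tortuosity index computed from the partitioned vessel segments is the arc-length to chord-length ratio of each sampled segment; this sketch uses that proxy rather than the paper's exact measure:

```python
import math

def tortuosity(points):
    """Arc-length / chord-length ratio of a sampled vessel centerline:
    exactly 1.0 for a straight segment, larger for winding ones."""
    arc = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    chord = math.dist(points[0], points[-1])
    return arc / chord

# A straight segment versus a sinusoidally wavy one.
straight = [(float(x), 0.0) for x in range(11)]
wavy = [(x, math.sin(x)) for x in (i * 0.5 for i in range(21))]
# tortuosity(straight) == 1.0, while tortuosity(wavy) is well above 1
```

Classifying an image as tortuous then amounts to thresholding the highest-scoring segments, mirroring how the method searches for clearly tortuous vessel segments.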

  20. Automatic Emotional State Detection using Facial Expression Dynamic in Videos

    Directory of Open Access Journals (Sweden)

    Hongying Meng

    2014-11-01

Full Text Available In this paper, an automatic emotion detection system is built for a computer or machine to detect the emotional state from facial expressions in human-computer communication. First, dynamic motion features are extracted from facial expression videos, and then advanced machine learning methods for classification and regression are used to predict the emotional states. The system is evaluated on two publicly available datasets, GEMEP_FERA and AVEC2013, and satisfactory performance is achieved in comparison with the provided baseline results. With this emotional state detection capability, a machine can read the facial expression of its user automatically. This technique can be integrated into applications such as smart robots, interactive games, and smart surveillance systems.

  1. CERN automatic audio-conference service

    International Nuclear Information System (INIS)

    Sierra Moral, Rodrigo

    2010-01-01

Scientists from all over the world need to collaborate with CERN on a daily basis. They must be able to communicate effectively on their joint projects at any time, so telephone conferences have become indispensable and widely used. Managed by six operators, CERN already handles more than 20,000 hours and 5,700 audio-conferences per year. However, the traditional telephone-based audio-conference system needed to be modernized in three ways: first, to give participants more autonomy in the organization of their conferences; second, to eliminate the constraints of manual intervention by operators; and third, to integrate the audio-conferences into a collaborative working framework. The large number, and hence cost, of the conferences prohibited externalization, so the CERN telecommunications team drew up a specification to implement a new system. It was decided to use a new commercial collaborative audio-conference solution based on the SIP protocol. The system was tested as the first European pilot, and several improvements (such as billing, security, and redundancy) were implemented based on CERN's recommendations. The new automatic conference system has been operational since the second half of 2006. It is very popular with users and has doubled the number of conferences in the past two years.

  4. Automatic Deduction in Dynamic Geometry using Sage

    Directory of Open Access Journals (Sweden)

    Francisco Botana

    2012-02-01

    We present a symbolic tool that provides robust algebraic methods to handle automatic deduction tasks for dynamic geometry constructions. The main prototype has been developed as two different worksheets for the open source computer algebra system Sage, corresponding to two different ways of coding a geometric construction. The first worksheet accepts diagrams constructed with the open source dynamic geometry system GeoGebra; there, Groebner bases are used either to compute the equation of a geometric locus, in the case of a locus construction, or to determine the truth of a general geometric statement included in the GeoGebra construction as a boolean variable. The second worksheet accepts locus constructions coded in the common file format for dynamic geometry developed by the Intergeo project. The prototype and several examples are provided for testing. Moreover, a third Sage worksheet is presented, implementing a novel algorithm to eliminate extraneous parts in symbolically computed loci. The algorithm, based on recent work on the Groebner cover of parametric systems, identifies degenerate components and extraneous adherence points in loci, both natural byproducts of general polynomial algebraic methods. Detailed examples are discussed.

  5. Antares automatic beam alignment system

    International Nuclear Information System (INIS)

    Appert, Q.; Swann, T.; Sweatt, W.; Saxman, A.

    1980-01-01

    Antares is a 24-beamline CO2 laser system for controlled fusion research under construction at Los Alamos Scientific Laboratory (LASL). Rapid automatic alignment of this system is required prior to each experiment shot. The alignment requirements, operational constraints, and a developed prototype system are discussed. A visible-wavelength alignment technique is employed that uses a telescope/TV system to view point light sources appropriately located down the beamline. Auto-alignment is accomplished by means of a video centroid tracker, which determines the off-axis error of the point sources. The error is nulled by computer-driven, movable mirrors in a closed-loop system. The light sources are fiber-optic terminations located at key points in the optics path, primarily at the centers of large copper mirrors, and remotely illuminated to reduce heating effects.

  6. Computerized automatic tip scanning operation

    International Nuclear Information System (INIS)

    Nishikawa, K.; Fukushima, T.; Nakai, H.; Yanagisawa, A.

    1984-01-01

    In BWR nuclear power stations the Traversing Incore Probe (TIP) system is one of the most important components in reactor monitoring and control. In previous TIP systems, however, operators have suffered from the complexity of operation and the long operation time required. The system presented in this paper realizes automatic operation of the TIP system by monitoring and driving it with a process computer. It significantly reduces the burden on plant operators and improves plant efficiency by simplifying the operating procedure, augmenting the accuracy of the measured data, and shortening operating time. The process computer is one of the PODIA (Plant Operation by Displayed Information Automation) systems. This computer transfers control signals to the TIP control panel, which in turn drives the equipment under microprocessor control. The process computer contains such components as the CRT/KB unit, the printer-plotter, the hard copier, and the message typers required for efficient man-machine communication. Its operation and interface properties are described.

  7. Automatic Differentiation and Deep Learning

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Statistical learning has been attracting growing interest from the particle-physics community in recent times, with neural networks and gradient-based optimization being a focus. In this talk we shall discuss three things: automatic differentiation tools, i.e. tools to quickly build DAGs of computation that are fully differentiable, focusing on one such tool, PyTorch; easy deployment of trained neural networks into large systems with many constraints, for example deploying a model at the reconstruction phase, where the neural network has to be integrated into CERN's bulk data-processing, C++-only environment; and some recent models in deep learning for segmentation and generation that might be useful for particle-physics problems.
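    The core idea of the autodiff tools mentioned in the abstract — recording a DAG of operations in the forward pass and propagating gradients backward along it — can be sketched in a few lines. This is an illustrative toy in pure Python, not PyTorch's actual API:

    ```python
    # Minimal reverse-mode automatic differentiation: each operation records
    # itself as a node in a DAG, and backward() propagates gradients along
    # the recorded edges using the chain rule.

    class Var:
        def __init__(self, value, parents=()):
            self.value = value        # forward value
            self.parents = parents    # (parent, local_gradient) DAG edges
            self.grad = 0.0           # accumulated d(output)/d(self)

        def __add__(self, other):
            return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

        def __mul__(self, other):
            return Var(self.value * other.value,
                       [(self, other.value), (other, self.value)])

        def backward(self, seed=1.0):
            # Chain rule: accumulate the incoming gradient, push it to parents.
            self.grad += seed
            for parent, local in self.parents:
                parent.backward(seed * local)

    x = Var(3.0)
    y = Var(4.0)
    z = x * y + x          # z = x*y + x, so dz/dx = y + 1, dz/dy = x
    z.backward()
    print(x.grad, y.grad)  # 5.0 3.0
    ```

    Real frameworks add tensors, a topological-order backward pass, and compiled kernels, but the DAG-of-computation structure is the same.
    
    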

  8. Automatic Detection of Terminology Evolution

    Science.gov (United States)

    Tahmasebi, Nina

    As archives contain documents that span a long period of time, the language used to create these documents and the language used for querying the archive can differ. This difference is due to evolution in both terminology and semantics, and causes a significant number of relevant documents to be omitted. A static solution is to use query expansion based on explicit knowledge banks such as thesauri or ontologies. However, as we archive resources with ever more varied terminology, it becomes infeasible to rely only on explicit knowledge for this purpose: few or no thesauri cover very domain-specific terminologies or the slang used in blogs. In this Ph.D. thesis we focus on automatically detecting terminology evolution in a completely unsupervised manner, as described in this technical paper.

  9. Automatic gamma spectrometry analytical apparatus

    International Nuclear Information System (INIS)

    Lamargot, J.-P.; Wanin, Maurice.

    1980-01-01

    This invention falls within the area of quantitative or semi-quantitative analysis by gamma spectrometry and particularly refers to a device for bringing samples into the counting position. The purpose of this invention is precisely to provide an automatic apparatus specifically adapted to the analysis of hard gamma radiation. To this effect, the invention relates to a gamma spectrometry analytical device comprising a lead containment, a detector whose sensitive part is located inside the containment, and a transfer system for bringing the analyzed samples in succession to a counting position inside the containment, above the detector. A feed compartment enables the samples to be brought in turn, one by one, onto the transfer system through a duct connecting the compartment to the transfer system. Sequential systems for the coordinated forward feed of the samples in the compartment and the transfer system complete this device. [fr]

  10. Automatic creation of simulation configuration

    International Nuclear Information System (INIS)

    Oudot, G.; Poizat, F.

    1993-01-01

    SIPA, which stands for 'Simulator for Post Accident', includes: 1) a sophisticated software-oriented workshop, SWORD ('Software Workshop Oriented towards Research and Development'), written in the Ada language, which includes an integrated CAD system and software tools for the automatic generation of simulation software and of the man-machine interface needed to operate run-time simulations; 2) a 'simulator structure' based on hardware equipment and software for supervision and communications; 3) simulation configurations generated by SWORD, operated under the control of the 'simulator structure' and run on a target computer. SWORD has already been used to generate two simulation configurations (French 900 MW and 1300 MW nuclear power plants), which are now fully operational on the SIPA training simulator. (Z.S.) 1 ref

  11. Automatic Regulation of Wastewater Discharge

    Directory of Open Access Journals (Sweden)

    Bolea Yolanda

    2017-01-01

    Wastewater plants, mainly those with secondary treatment, discharge polluted water to the environment that cannot be used in any human activity. When those discharges are made at sea, it is expected that most of the biological pollutants die or almost disappear before the water comes within reach of humans. This natural die-off of bacteria, viruses and other pathogens is due to conditions such as the salinity of the sea and the effect of sunlight, and the dumping areas are calculated taking these conditions into account. However, under certain meteorological phenomena water arrives at the coast without the full disappearance of pollutant elements. In the Mediterranean Sea there are periods of adverse climatic conditions that pollute the coast near the wastewater outfalls. In this paper, the authors present an automatic control that prevents such pollution episodes using two mathematical models, one for pollutant transport and the other for pollutant removal in wastewater spills.
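    The bacterial die-off described here is commonly modelled as first-order decay characterized by T90, the time for a 90% reduction in concentration. A minimal sketch of such a removal model (the parameter values are illustrative assumptions, not taken from the paper):

    ```python
    # First-order bacterial die-off: C(t) = C0 * 10**(-t / T90),
    # where T90 is the time (hours) needed for a 90% reduction.
    # Values below are illustrative assumptions only.

    def concentration(c0, t90_hours, t_hours):
        return c0 * 10 ** (-t_hours / t90_hours)

    c0 = 1e6    # initial coliform count (CFU / 100 ml), assumed
    t90 = 2.0   # fast die-off under strong sunlight, assumed
    print(concentration(c0, t90, 6.0))   # three T90 periods -> 1000.0
    ```

    Adverse weather (cloud cover, cold water) lengthens T90, which is exactly the situation the paper's automatic control is designed to detect.
    
    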

  12. Archaeological predictive model set.

    Science.gov (United States)

    2015-03-01

    This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...

  13. A Method for the Seamlines Network Automatic Selection Based on Building Vector

    Science.gov (United States)

    Li, P.; Dong, Y.; Hu, Y.; Li, X.; Tan, P.

    2018-04-01

    In order to improve the efficiency of large-scale orthophoto production for cities, this paper presents a method for the automatic selection of the seamline network in large-scale orthophotos based on building vectors. Firstly, a simple model of each building is built by combining the building's vector, its height and the DEM, and the imaging area of the building on a single DOM is obtained. Then, the initial Voronoi network of the survey area is automatically generated based on the positions of the bottoms of all images. Finally, the final seamline network is obtained by automatically optimizing all nodes and seamlines in the network based on the imaging areas of the buildings. The experimental results show that the proposed method not only routes the seamline network around buildings quickly, but also retains the Voronoi network's property of minimizing projection distortion, effectively solving the problem of automatic seamline network selection in image mosaicking.
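    The initial network described above partitions the mosaic area so that each point belongs to the nearest image center, i.e. a Voronoi diagram. A small sketch of that starting point (grid size and centers are made-up illustrative values, and real pipelines use geometric rather than per-pixel algorithms):

    ```python
    # Initial seamline network as a Voronoi partition: assign every cell
    # of the output mosaic to the nearest image (footprint) center.

    def voronoi_labels(centers, width, height):
        labels = []
        for y in range(height):
            row = []
            for x in range(width):
                d2 = [(x - cx) ** 2 + (y - cy) ** 2 for cx, cy in centers]
                row.append(d2.index(min(d2)))   # index of the nearest center
            labels.append(row)
        return labels

    centers = [(2, 2), (7, 2), (4, 7)]          # assumed image centers
    grid = voronoi_labels(centers, 10, 10)
    print(grid[2][0], grid[2][9], grid[9][4])   # 0 1 2
    ```

    The paper's contribution is then to displace the boundaries of this partition so they avoid crossing building imaging areas.
    
    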

  14. Equipment for fully automatic radiographic pipe inspection

    International Nuclear Information System (INIS)

    Basler, G.; Sperl, H.; Weinschenk, K.

    1977-01-01

    The patent describes a device for fully automatic radiographic testing of large pipes with longitudinal welds. Furthermore, the invention enables automatic marking of films in radiographic inspection, identifying the test piece and the part of it where testing took place. (RW) [de]

  15. An introduction to automatic radioactive sample counters

    International Nuclear Information System (INIS)

    1980-01-01

    The subject is covered in chapters entitled: the detection of radiation in sample counters; nucleonic equipment; liquid scintillation counting; basic features of automatic sample counters; statistics of counting; data analysis; and purchase, installation, calibration and maintenance of automatic sample counters. (U.K.)

  16. Precision about the automatic emotional brain.

    Science.gov (United States)

    Vuilleumier, Patrik

    2015-01-01

    The question of automaticity in emotion processing has been debated under different perspectives in recent years. Satisfying answers to this issue will require a better definition of automaticity in terms of relevant behavioral phenomena, ecological conditions of occurrence, and a more precise mechanistic account of the underlying neural circuits.

  17. Automatic control of nuclear power plants

    International Nuclear Information System (INIS)

    Jover, P.

    1976-01-01

    The fundamental concepts in automatic control are surveyed, and the purpose of the automatic control of pressurized water reactors is given. The response characteristics of the main components are then studied, and block diagrams are given for the main control loops (turbine, steam generator, and nuclear reactor). [fr]

  18. Automatic Cobb Angle Determination From Radiographic Images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H. F.; Veldhuizen, Albert G.; van Ooijen, Peter M. A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.

    2013-01-01

    Study Design. Automatic measurement of the Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method on frontal radiographic images. Summary of Background Data. Thirty-six frontal radiographic images of patients with scoliosis. Methods.

  19. Automatic face morphing for transferring facial animation

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Bui, T.D.; Poel, Mannes; Heylen, Dirk K.J.; Nijholt, Antinus; Hamza, H.M.

    2003-01-01

    In this paper, we introduce a novel method of automatically finding the training set of RBF networks for morphing a prototype face to represent a new face. This is done by automatically specifying and adjusting corresponding feature points on a target face. The RBF networks are then used to transfer

  20. Prediction of Pseudo relative velocity response spectra at Yucca Mountain for underground nuclear explosions conducted in the Pahute Mesa testing area at the Nevada testing site; Yucca Mountain Site Characterization Project

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, J.S.

    1991-12-01

    The Yucca Mountain Site Characterization Project (YMP), managed by the Office of Geologic Disposal of the Office of Civilian Radioactive Waste Management of the US Department of Energy, is examining the feasibility of siting a repository for commercial high-level nuclear waste at Yucca Mountain, on and adjacent to the Nevada Test Site (NTS). This work, intended to extend our understanding of the ground motion at Yucca Mountain resulting from the testing of nuclear weapons on the NTS, was funded by the Yucca Mountain Project and the Military Applications Weapons Test Program. This report summarizes one aspect of the weapons-test seismic investigations conducted in FY88. Pseudo relative velocity response spectra (PSRVs) have been calculated for a large body of surface ground motions generated by underground nuclear explosions. These spectra have been analyzed and fit using multiple linear regression techniques to develop a credible prediction technique for surface PSRVs. In addition, a technique for estimating downhole PSRVs at specific stations is included. A data summary, the data analysis, the prediction development and evaluation, a software summary and a FORTRAN listing of the prediction technique are included in this report.
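    A multiple linear regression of the kind used to fit the spectra can be sketched with the normal equations (X^T X)b = (X^T y); the predictor names and data below are illustrative assumptions, not the report's actual regression variables:

    ```python
    # Multiple linear regression by solving the normal equations
    # (X^T X) b = X^T y with Gaussian elimination -- the kind of fit used
    # to predict a response (e.g. log PSRV) from several predictors
    # (e.g. log yield, log range). Data values are illustrative only.

    def fit(X, y):
        n, p = len(X), len(X[0])
        A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)]
             for i in range(p)]                 # X^T X
        b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]  # X^T y
        for i in range(p):                      # forward elimination
            for j in range(i + 1, p):
                f = A[j][i] / A[i][i]
                for k in range(i, p):
                    A[j][k] -= f * A[i][k]
                b[j] -= f * b[i]
        coef = [0.0] * p                        # back substitution
        for i in reversed(range(p)):
            s = sum(A[i][k] * coef[k] for k in range(i + 1, p))
            coef[i] = (b[i] - s) / A[i][i]
        return coef

    # Noise-free data generated from y = 1 + 2*x1 - 0.5*x2 is recovered:
    X = [[1, 0, 0], [1, 1, 0], [1, 0, 1], [1, 2, 3]]
    y = [1.0, 3.0, 0.5, 3.5]
    print([round(c, 6) for c in fit(X, y)])   # [1.0, 2.0, -0.5]
    ```

    In practice one would use a library solver with pivoting, but the normal-equation structure is the essence of the technique.
    
    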

  1. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 3: A stochastic rain fade control algorithm for satellite link power via nonlinear Markov filtering theory

    Science.gov (United States)

    Manning, Robert M.

    1991-01-01

    The dynamic and composite nature of the propagation impairments incurred on Earth-space communications links at frequencies in and above the 30/20 GHz Ka band (rain attenuation, cloud and/or clear-air scintillation, etc.), combined with the need to counter such degradations after the small link margins have been exceeded, necessitates dynamic statistical identification and prediction processing of the fading signal in order to optimally estimate and predict the levels of each of the deleterious attenuation components. Such requirements are being met in NASA's Advanced Communications Technology Satellite (ACTS) Project by the implementation of optimal processing schemes derived through the use of the Rain Attenuation Prediction Model and nonlinear Markov filtering theory.
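    The ACTS scheme itself uses a nonlinear Markov filter derived from the rain-attenuation model; as a rough illustration of the underlying recursive estimate-and-predict idea, here is a minimal scalar Kalman filter tracking a noisy fade level. All model choices and numbers are assumptions for illustration, not the paper's algorithm:

    ```python
    # Scalar Kalman filter tracking a slowly varying rain-fade level (dB)
    # from noisy beacon measurements. q is the assumed process noise of a
    # random-walk fade model, r the assumed measurement-noise variance.

    def kalman_track(measurements, q=0.01, r=0.25):
        x, p = measurements[0], 1.0      # state estimate and its variance
        estimates = [x]
        for z in measurements[1:]:
            p += q                       # predict step (random-walk model)
            k = p / (p + r)              # Kalman gain
            x += k * (z - x)             # update with the new measurement
            p *= (1 - k)
            estimates.append(x)
        return estimates

    fades = [0.0, 0.2, 0.1, 1.5, 2.9, 3.1, 3.0]   # onset of a rain fade
    est = kalman_track(fades)
    print(round(est[-1], 2))    # smoothed estimate of the current fade
    ```

    A fade-control loop would compare such an estimate (and its short-term prediction) against the link margin to decide when to raise uplink power.
    
    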

  2. An automatic granular structure generation and finite element analysis of heterogeneous semi-solid materials

    International Nuclear Information System (INIS)

    Sharifi, Hamid; Larouche, Daniel

    2015-01-01

    The quality of cast metal products depends on the capacity of the semi-solid metal to sustain the stresses generated during casting. Predicting the evolution of these stresses with accuracy over the solidification interval would be highly helpful to avoid the formation of defects like hot tearing. This task is, however, very difficult because of the heterogeneous nature of the material. In this paper, we propose to evaluate the mechanical behaviour of a metal during solidification using a mesh generation technique for the heterogeneous semi-solid material, suitable for finite element analysis at the microscopic level. This is done on a two-dimensional (2D) domain in which the granular structure of the solid phase is generated, surrounded by an intergranular and interdendritic liquid phase. Some basic solid grains are first constructed and projected into the 2D domain with random orientations and scale factors. Depending on their orientation, the basic grains are combined to produce larger grains or separated by a liquid film. Different basic grain shapes can produce different granular structures of the mushy zone. As a result, using this automatic grain generation procedure, we can investigate the effect of grain shapes and sizes on the thermo-mechanical behaviour of the semi-solid material. The granular models are automatically converted to finite element meshes, in which the solid grains and the liquid phase are meshed with quadrilateral elements. This method has been used to simulate the microstructure of a binary aluminium–copper alloy (Al–5.8 wt% Cu) at a solid fraction of 0.92. Using the finite element method and the Mie–Grüneisen equation of state for the liquid phase, the transient mechanical behaviour of the mushy zone under tensile loading has been investigated, and the stress distribution and the bridges that form during tensile loading have been identified. (paper)

  3. Finite element fatigue analysis of rectangular clutch spring of automatic slack adjuster

    Science.gov (United States)

    Xu, Chen-jie; Luo, Zai; Hu, Xiao-feng; Jiang, Wen-song

    2015-02-01

    The failure of the rectangular clutch spring of an automatic slack adjuster directly affects the adjuster's operation. We establish a structural mechanics model of the automatic slack adjuster's rectangular clutch spring based on its working principle and mechanical structure. In addition, we load this model into the ANSYS Workbench FEA system to predict the fatigue life of the rectangular clutch spring. FEA results show that the fatigue life of the rectangular clutch spring is 2.0403×10^5 cycles under braking loads. In the meantime, fatigue tests of 20 automatic slack adjusters were carried out on a fatigue test bench to verify the conclusions of the structural mechanics model. The experimental results show that the mean fatigue life of the rectangular clutch spring is 1.9101×10^5 cycles, which agrees with the results of the finite element analysis in ANSYS Workbench.
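    Fatigue lives of this order are commonly estimated from a Basquin-type S-N relation, sigma_a = sigma_f' * (2N)^b, solved for the number of cycles N at a given stress amplitude. The material constants and stress level below are made-up illustrative values, not the clutch spring's measured properties:

    ```python
    # Basquin S-N relation solved for cycles to failure.
    # sigma_f (fatigue strength coefficient, MPa) and b (fatigue strength
    # exponent) are assumed illustrative constants.

    def cycles_to_failure(sigma_a, sigma_f=900.0, b=-0.09):
        two_n = (sigma_a / sigma_f) ** (1.0 / b)   # reversals to failure
        return two_n / 2.0                         # cycles to failure

    n = cycles_to_failure(282.0)   # assumed stress amplitude, MPa
    print(f"{n:.3e}")              # roughly 2e5 cycles, the order reported
    ```

    An FEA-based fatigue prediction applies the same kind of S-N relation to the stress amplitudes computed at each element rather than to a single nominal stress.
    
    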

  4. Control System Design for Automatic Cavity Tuning Machines

    Energy Technology Data Exchange (ETDEWEB)

    Carcagno, R.; Khabiboulline, T.; Kotelnikov, S.; Makulski, A.; Nehring, R.; Nogiec, J.; Ross, M.; Schappert, W.; /Fermilab; Goessel, A.; Iversen, J.; Klinke, D.; /DESY

    2009-05-01

    A series of four automatic tuning machines for 9-cell TESLA-type cavities are being developed and fabricated in a collaborative effort among DESY, FNAL, and KEK. These machines are intended to support high-throughput cavity fabrication for the construction of large SRF-based accelerator projects. Two of these machines will be delivered to cavity vendors for the tuning of XFEL cavities. The control system for these machines must support a level of automation adequate for industrial use by non-expert operators. This paper describes the control system hardware and software design for these machines.

  5. Control System Design for Automatic Cavity Tuning Machines

    International Nuclear Information System (INIS)

    Carcagno, R.; Khabiboulline, T.; Kotelnikov, S.; Makulski, A.; Nehring, R.; Nogiec, J.; Ross, M.; Schappert, W.; Goessel, A.; Iversen, J.; Klinke, D.

    2009-01-01

    A series of four automatic tuning machines for 9-cell TESLA-type cavities are being developed and fabricated in a collaborative effort among DESY, FNAL, and KEK. These machines are intended to support high-throughput cavity fabrication for the construction of large SRF-based accelerator projects. Two of these machines will be delivered to cavity vendors for the tuning of XFEL cavities. The control system for these machines must support a level of automation adequate for industrial use by non-expert operators. This paper describes the control system hardware and software design for these machines.

  6. Device for the automatic evaluation of pencil dosimeters

    International Nuclear Information System (INIS)

    Schallopp, B.

    1976-01-01

    In connection with the automation of radiation protection in nuclear power plants, an automatic reading device has been developed for the direct input of pencil dosimeter readings into a computer. A voltage measurement would be simple but is excluded because, for operational reasons, the internal electrode of the dosimeter may not be touched. This paper describes an optical/electronic conversion device in which the reading of the dosimeter is projected onto a vidicon, scanned, and converted into a digital signal for output to the computer. (orig.) [de]

  7. Implementation of a microcontroller-based semi-automatic coagulator.

    Science.gov (United States)

    Chan, K; Kirumira, A; Elkateeb, A

    2001-01-01

    The coagulator is an instrument used in hospitals to detect clot formation as a function of time. Generally, these coagulators are very expensive and therefore not affordable for doctors' offices and small clinics. The objective of this project is to design and implement a low-cost semi-automatic coagulator (SAC) prototype. The SAC is capable of assaying up to 12 samples and can perform the following tests: prothrombin time (PT), activated partial thromboplastin time (APTT), and PT/APTT combination. The prototype has been tested successfully.

  8. Automatic Adviser on Mobile Objects Status Identification and Classification

    Science.gov (United States)

    Shabelnikov, A. N.; Liabakh, N. N.; Gibner, Ya M.; Saryan, A. S.

    2018-05-01

    A mobile object status identification task is defined within image discrimination theory. It is proposed to classify objects into three classes: the object is in operating condition; its maintenance is required; and the object should be removed from the production process. Two methods were developed to construct the separating boundaries between the designated classes: a) using statistical information on the observed movement of the research objects, and b) based on regulatory documents and expert commentary. A simulation of Automatic Adviser operation and a complex for analyzing the operation results were synthesized. Research results are illustrated using a specific example of cuts rolling from the hump yard. The work was supported by the Russian Fundamental Research Fund, project No. 17-20-01040.

  9. A web based semi automatic frame work for astrobiological researches

    Directory of Open Access Journals (Sweden)

    P.V. Arun

    2013-12-01

    Astrobiology addresses the possibility of extraterrestrial life and explores measures towards its recognition. Research in this context is founded upon the premise that indicators of life encountered in space will be recognizable. However, effective recognition can be accomplished through a universal adaptation of life signatures, without restricting ourselves solely to those attributes that represent local solutions to the challenges of survival. The life indicators should be modelled with reference to the temporal and environmental variations specific to each planet and time. In this paper, we investigate a semi-automatic open source framework for the accurate detection and interpretation of life signatures that facilitates public participation, in a way similar to that adopted by the SETI@home project. Involving the public in identifying patterns can give the mission a thrust, and is implemented using a semi-automatic framework. Various advanced intelligent methodologies may augment the integration of this human-machine analysis. Automatic and manual evaluations, along with a dynamic learning strategy, have been adopted to provide accurate results. The system also helps to build a deeper public understanding of space agencies' work and facilitates mass involvement in astrobiological studies. It will surely help to motivate young, eager minds to pursue a career in this field.

  10. Semi Automatic Ontology Instantiation in the domain of Risk Management

    Science.gov (United States)

    Makki, Jawad; Alquier, Anne-Marie; Prince, Violaine

    One of the challenging tasks in the context of Ontological Engineering is to automatically or semi-automatically support the process of Ontology Learning and Ontology Population from semi-structured documents (texts). In this paper we describe a semi-automatic ontology instantiation method from natural language text in the domain of Risk Management. The method is composed of three steps: 1) annotation with part-of-speech tags, 2) semantic relation instance extraction, and 3) ontology instantiation. It is based on combined NLP techniques, with human intervention between steps 2 and 3 for control and validation. Since it relies heavily on linguistic knowledge, it is not domain dependent, which is a good feature for portability between the different fields of risk management application. The proposed methodology uses the ontology of the PRIMA project (supported by the European Community) as a generic domain ontology and populates it via an available corpus. A first validation of the approach is done through an experiment with Chemical Fact Sheets from the Environmental Protection Agency.
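    The three-step pipeline can be sketched end to end with toy stand-ins: a tiny lookup tagger, a Noun-Verb-Noun extraction pattern, and a dictionary as the instance store. The lexicon, pattern, and "Risk" class are illustrative assumptions, not the PRIMA ontology or the paper's actual NLP tools:

    ```python
    # Toy sketch of the three-step pipeline:
    # 1) POS-tag a sentence, 2) extract a (subject, relation, object)
    # instance from a Noun-Verb-Noun pattern, 3) add it to an ontology
    # instance store after (simulated) human validation.

    LEXICON = {"leak": "NOUN", "causes": "VERB", "contamination": "NOUN",
               "a": "DET", "chemical": "ADJ"}

    def tag(tokens):                                  # step 1
        return [(t, LEXICON.get(t, "NOUN")) for t in tokens]

    def extract(tagged):                              # step 2
        for i in range(len(tagged) - 2):
            (w1, t1), (w2, t2), (w3, t3) = tagged[i:i + 3]
            if (t1, t2, t3) == ("NOUN", "VERB", "NOUN"):
                return (w1, w2, w3)
        return None

    ontology = {"Risk": []}                           # step 3: instance base
    triple = extract(tag("leak causes contamination".split()))
    if triple:                # human validation would happen here
        ontology["Risk"].append(triple)
    print(ontology)   # {'Risk': [('leak', 'causes', 'contamination')]}
    ```

    A real system replaces the lookup tagger with a trained POS tagger and the single pattern with a set of learned or hand-written relation patterns; the human-in-the-loop gate between extraction and instantiation is the method's defining feature.
    
    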

  11. Climate prediction and predictability

    Science.gov (United States)

    Allen, Myles

    2010-05-01

    Climate prediction is generally accepted to be one of the grand challenges of the geophysical sciences. What is less widely acknowledged is that fundamental issues have yet to be resolved concerning the nature of the challenge, even after decades of research in this area. How do we verify or falsify a probabilistic forecast of a singular event such as anthropogenic warming over the 21st century? How do we determine the information content of a climate forecast? What does it mean for a modelling system to be "good enough" to forecast a particular variable? How will we know when models and forecasting systems are "good enough" to provide detailed forecasts of weather at specific locations or, for example, of the risks associated with global geo-engineering schemes? This talk will provide an overview of these questions in the light of recent developments in multi-decade climate forecasting, drawing on concepts from information theory, machine learning and statistics. I will draw extensively, but not exclusively, on the experience of the climateprediction.net project, which runs multiple versions of climate models on personal computers.

  12. Efficient Semi-Automatic 3D Segmentation for Neuron Tracing in Electron Microscopy Images

    Science.gov (United States)

    Jones, Cory; Liu, Ting; Cohan, Nathaniel Wood; Ellisman, Mark; Tasdizen, Tolga

    2015-01-01

    Background: In the area of connectomics, there is a significant gap between the time required for data acquisition and that required for dense reconstruction of the neural processes contained in the same dataset. Automatic methods are able to eliminate this timing gap, but the state-of-the-art accuracy so far is insufficient for use without user corrections. If completed naively, this process of correction can be tedious and time consuming. New Method: We present a new semi-automatic method that can be used to perform 3D segmentation of neurites in EM image stacks. It utilizes an automatic method that creates a hierarchical structure of recommended merges of superpixels. The user is then guided through each predicted region to quickly identify errors and establish correct links. Results: We tested our method on three datasets with both novice and expert users. Accuracy and timing were compared with published automatic, semi-automatic, and manual results. Comparison with Existing Methods: Post-automatic correction methods have also been used in [1] and [2]; these methods do not provide navigation or suggestions in the manner we present. Other semi-automatic methods require user input prior to the automatic segmentation, such as [3] and [4], and are inherently different from our method. Conclusion: Using this method on the three datasets, novice users achieved accuracy exceeding state-of-the-art automatic results, and expert users achieved accuracy on par with full manual labeling but with a 70% time improvement compared with other examples in publication. PMID:25769273
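    The guided-proofreading loop — walk a ranked list of suggested superpixel merges and let the user accept or reject each link — can be sketched with a union-find structure. The scores, superpixel IDs, and decision rule below are illustrative assumptions, not the paper's data or interface:

    ```python
    # Guided proofreading over suggested merges: the automatic stage
    # proposes superpixel merges with confidence scores; the user confirms
    # or rejects each link, and union-find tracks the resulting segments.

    class UnionFind:
        def __init__(self, n):
            self.parent = list(range(n))
        def find(self, a):
            while self.parent[a] != a:
                self.parent[a] = self.parent[self.parent[a]]  # path halving
                a = self.parent[a]
            return a
        def union(self, a, b):
            self.parent[self.find(a)] = self.find(b)

    def proofread(n_superpixels, suggestions, user_decision):
        uf = UnionFind(n_superpixels)
        # Visit high-confidence suggestions first, as a guided UI would.
        for score, a, b in sorted(suggestions, reverse=True):
            if user_decision(a, b):          # expert confirms the link
                uf.union(a, b)
        return [uf.find(i) for i in range(n_superpixels)]

    suggestions = [(0.95, 0, 1), (0.90, 1, 2), (0.40, 2, 3)]
    labels = proofread(4, suggestions, lambda a, b: a != 2)  # reject 2-3
    print(labels)   # superpixels 0,1,2 share a label; 3 stays separate
    ```

    Ordering the queue by confidence is what lets novice users reach high accuracy quickly: the cheap, almost-certain decisions come first.
    
    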

  13. Automatic calibration of gamma spectrometers

    International Nuclear Information System (INIS)

    Tluchor, D.; Jiranek, V.

    1989-01-01

    The principle of energy calibration of a spectrometric path is described, based on measurement of a standard of one radionuclide or a set of them. The entire computer-aided process is divided into three main steps: insertion of the calibration standard by the operator; start of the calibration program; and energy calibration by the computer. The program was designed so that spectrum identification does not depend on the adjustment of the digital or analog elements of the gamma spectrometric measuring path. The ECL program for automatic energy calibration is described, as are its control, the organization of the data file ECL.DAT, and the necessary hardware support. Communication between the computer and the multichannel analyzer was provided by an interface pair of Canberra 8673V and Canberra 8573 operating in the RS-422 standard. All subroutines for communication with the multichannel analyzer were written in MACRO-11, while the main program and the other subroutines were written in FORTRAN-77. (E.J.). 1 tab., 4 refs
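    The heart of such a calibration is a linear map from channel number to energy, fixed by known peaks of the standard. A minimal two-point sketch (the channel positions are made-up illustrative values; the 661.7 keV Cs-137 and 1332.5 keV Co-60 line energies are standard):

    ```python
    # Linear energy calibration E = a * channel + b from two known peaks.
    # Channel positions are illustrative; line energies are the standard
    # Cs-137 (661.7 keV) and Co-60 (1332.5 keV) values.

    def calibrate(ch1, e1, ch2, e2):
        a = (e2 - e1) / (ch2 - ch1)      # keV per channel
        b = e1 - a * ch1                 # energy offset
        return a, b

    a, b = calibrate(662, 661.7, 1333, 1332.5)
    energy = lambda ch: a * ch + b
    print(round(energy(1000), 1))        # keV at channel 1000 -> 999.6
    ```

    An automatic program locates the peak centroids in the measured spectrum first, then fits this line (or a low-order polynomial when more peaks are available).
    
    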

  14. Automatic locking orthotic knee device

    Science.gov (United States)

    Weddendorf, Bruce C. (Inventor)

    1993-01-01

    An articulated tang-in-clevis joint for incorporation in newly manufactured conventional strap-on orthotic knee devices, or for replacing such joints in conventional strap-on orthotic knee devices, is discussed. The instant tang-in-clevis joint allows the user the freedom to extend and bend the knee normally when no load (weight) is applied to the knee, and automatically locks the knee when the user transfers weight to it, thus preventing a damaged knee from bending uncontrollably when weight is applied. The tang-in-clevis joint of the present invention includes first and second clevis plates, a tang assembly, and a spacer plate secured between the clevis plates. Each clevis plate includes a bevelled serrated upper section. A bevelled shoe is secured to the tang in close proximity to the bevelled serrated upper section of the clevis plates. A coiled spring mounted within an oblong bore of the tang normally urges the shoes secured to the tang out of engagement with the serrated upper section of each clevis plate to allow rotation of the tang relative to the clevis plates. When weight is applied to the joint, the load compresses the coiled spring and the serrations on each clevis plate dig into the bevelled shoes secured to the tang to prevent relative movement between the tang and clevis plates. A shoulder is provided on the tang and the spacer plate to prevent overextension of the joint.

  15. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.

    2002-01-01

    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of MBS. But for industrial applications an easy-to-handle tool for fast data analysis is also required. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. In practice, the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster by using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific, and the data analysis can be performed using these algorithms. In combination with automatic analysis, the Moessbauer spectrometer can be used as a probe instrument which covers the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications are discussed.

  16. Automatic segmentation of psoriasis lesions

    Science.gov (United States)

    Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang

    2014-10-01

    The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the estimation of lesions. Current algorithms can only handle single erythema or only deal with scaling segmentation, whereas in practice scaling and erythema are often mixed together. In order to segment the whole lesion area, this paper proposes an algorithm based on Random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied, exploiting the skin's Tyndall effect, to eliminate reflections in the imaging, and the Lab color space is used to fit human perception. In the second step, a sliding window and its sub-windows are used to extract textural and color features. In this step, a feature of image roughness is defined so that scaling can easily be separated from normal skin. Finally, Random forests are used to ensure the generalization ability of the algorithm. The algorithm gives reliable segmentation results even when images have different lighting conditions and skin types. On the data set offered by Union Hospital, more than 90% of images can be segmented accurately.
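
The sliding-window roughness feature mentioned above can be illustrated with a toy sketch. The concrete definition used here (mean absolute deviation of intensities within a window) and all pixel values are assumptions for illustration, not the paper's exact formula.

```python
# Sliding-window "roughness" texture feature: flaky scaling regions show high
# local intensity variation, smooth skin shows almost none.

def roughness(gray, y, x, w=5):
    """Mean absolute deviation of pixel intensities in a w x w window."""
    vals = [gray[i][j] for i in range(y, y + w) for j in range(x, x + w)]
    mean = sum(vals) / len(vals)
    return sum(abs(v - mean) for v in vals) / len(vals)

# Toy 5x10 grayscale image: left half smooth skin, right half flaky scaling.
img = [[100] * 5 + [60, 200, 40, 220, 30] for _ in range(5)]
smooth = roughness(img, 0, 0)   # uniform region -> roughness 0
flaky = roughness(img, 0, 5)    # high local variation
```

In the paper's pipeline such per-window features, together with Lab color statistics, would feed the Random forest classifier.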

  17. A neurocomputational model of automatic sequence production.

    Science.gov (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory

    2015-07-01

    Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the BG in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.

  18. Automatic imitation: A meta-analysis.

    Science.gov (United States)

    Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel

    2018-05-01

    Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of g_z = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
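
The reported effect size g_z is a within-subject standardized effect: the mean compatibility effect divided by the standard deviation of the per-participant difference scores, typically with Hedges' small-sample correction. The sketch below illustrates the computation with invented difference scores; it is not data from the meta-analysis.

```python
# Within-subject standardized effect size with Hedges' correction.
import statistics

def g_z(diffs):
    """diffs: per-participant difference scores (e.g. incompatible - compatible RT)."""
    n = len(diffs)
    d_z = statistics.mean(diffs) / statistics.stdev(diffs)
    j = 1 - 3 / (4 * (n - 1) - 1)    # Hedges' small-sample correction, df = n - 1
    return j * d_z

# Invented compatibility effects (ms) for eight hypothetical participants:
effects = [30, 25, 40, 35, 20, 45, 28, 33]
g = g_z(effects)
```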

  19. Scanner OPC signatures: automatic vendor-to-vendor OPE matching

    Science.gov (United States)

    Renwick, Stephen P.

    2009-03-01

    As 193nm lithography continues to be stretched and the k1 factor decreases, optical proximity correction (OPC) has become a vital part of the lithographer's tool kit. Unfortunately, as is now well known, the design variations of lithographic scanners from different vendors cause them to have slightly different optical-proximity effect (OPE) behavior, meaning that they print features through pitch in distinct ways. This in turn means that their response to OPC is not the same, and that an OPC solution designed for a scanner from Company 1 may or may not work properly on a scanner from Company 2. Since OPC is not inexpensive, that causes trouble for chipmakers using more than one brand of scanner. Clearly a scanner-matching procedure is needed to meet this challenge. Previously, automatic matching has only been reported for scanners of different tool generations from the same manufacturer. In contrast, scanners from different companies have been matched using expert tuning and adjustment techniques, frequently requiring laborious test exposures. Automatic matching between scanners from Company 1 and Company 2 has remained an unsettled problem. We have recently solved this problem and introduce a novel method to perform the automatic matching. The success in meeting this challenge required three enabling factors. First, we recognized the strongest drivers of OPE mismatch and are thereby able to reduce the information needed about a tool from another supplier to that information readily available from all modern scanners. Second, we developed a means of reliably identifying the scanners' optical signatures, minimizing dependence on process parameters that can cloud the issue. Third, we carefully employed standard statistical techniques, checking for robustness of the algorithms used and maximizing efficiency. 
The result is an automatic software system that can predict an OPC matching solution for scanners from different suppliers without requiring expert intervention.

  20. Automatic Operation For A Robot Lawn Mower

    Science.gov (United States)

    Huang, Y. Y.; Cao, Z. L.; Oh, S. J.; Kattan, E. U.; Hall, E. L.

    1987-02-01

    A domestic mobile robot lawn mower, which performs in automatic operation mode, has been built at the Center for Robotics Research, University of Cincinnati. The robot lawn mower automatically completes its work using region filling, a new kind of path planning for mobile robots. Several region-filling path planning strategies have been developed for a partly known or an unknown environment. An advanced omnidirectional navigation system and a multisensor-based control system are also used in the automatic operation. Research on the robot lawn mower, especially on region-filling path planning, is significant for industrial and agricultural applications.
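
Region filling can be sketched in its simplest form as a boustrophedon (back-and-forth) sweep over a grid. This assumes a known rectangular region with no obstacles, which is only the easiest case of the partly-known and unknown environments the paper addresses.

```python
# Boustrophedon region filling: visit every cell of a grid, reversing the
# sweep direction on each row so consecutive cells are always adjacent.

def region_fill_path(rows, cols):
    path = []
    for r in range(rows):
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        path.extend((r, c) for c in cs)
    return path

path = region_fill_path(3, 4)
# Covers all 12 cells exactly once with only single-step moves.
```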

  1. Automatic plasma control in magnetic traps

    International Nuclear Information System (INIS)

    Samojlenko, Y.; Chuyanov, V.

    1984-01-01

    Hot plasma is essentially in thermodynamic non-steady state. Automatic plasma control basically means monitoring deviations from steady state and producing a suitable magnetic or electric field which brings the plasma back to its original state. Briefly described are two systems of automatic plasma control: control with a magnetic field using a negative impedance circuit, and control using an electric field. It appears that systems of automatic plasma stabilization will be an indispensable component of the fusion reactor and its possibilities will in many ways determine the reactor economy. (Ha)

  2. Techniques for Automatic Creation of Terrain Databases for Training and Mission Preparation

    NARCIS (Netherlands)

    Kuijper, F.; Son, R. van; Meurs, F. van; Smelik, R.M.; Kraker, J.K. de

    2010-01-01

    In the support of defense agencies and civil authorities TNO runs a research program that strives after automatic generation of terrain databases for a variety of simulation applications. Earlier papers by TNO at the IMAGE conference have reported in-depth on specific projects within this program.

  3. Automatized distribution systems in IBERDROLA. Sistemas de automatizacion de distribucion en Iberdrola

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez Madariaga, J.A.

    1994-01-01

    This article presents the automated distribution systems at IBERDROLA. These systems make it possible to improve the management of energy demand. The optimized distribution system is used both by the industrial sector and by small users. Iberdrola has developed a project to offer telemanagement to energy users.

  4. Automatic Processing of Metallurgical Abstracts for the Purpose of Information Retrieval. Final Report.

    Science.gov (United States)

    Melton, Jessica S.

    Objectives of this project were to develop and test a method for automatically processing the text of abstracts for a document retrieval system. The test corpus consisted of 768 abstracts from the metallurgical section of Chemical Abstracts (CA). The system, based on a subject indexing rationale, had two components: (1) a stored dictionary of words…

  5. Understanding Applications of Project Planning and Scheduling in Construction Projects

    OpenAIRE

    AlNasseri, Hammad Abdullah

    2015-01-01

    Construction project life-cycle processes must be managed in a more effective and predictable way to meet project stakeholders’ needs. However, there is increasing concern about whether know-how effectively improves understanding of underlying theories of project management processes for construction organizations and their project managers. Project planning and scheduling are considered as key and challenging tools in controlling and monitoring project performance, but many worldwide constru...

  6. Realization in 2007 of the complex project to create a thematic atlas of current and predicted consequences of the Chernobyl NPP accident

    International Nuclear Information System (INIS)

    Poplyko, I.Ya.; Nikolaenko, E.V.

    2008-01-01

    The thematic atlas of current and predicted consequences of the Chernobyl NPP accident in the affected territories of Russia and Belarus will contain materials on the radioactive contamination of territories, forest-fund lands and agricultural lands, as well as a reference section including analytical and reference materials. (authors)

  7. Computer vision for automatic inspection of agricultural produce

    Science.gov (United States)

    Molto, Enrique; Blasco, Jose; Benlloch, Jose V.

    1999-01-01

    Fruit and vegetables undergo various manipulations from the field to the final consumer, basically oriented towards cleaning and sorting the product into homogeneous categories. For this reason, several research projects aimed at fast, accurate produce sorting and quality control are currently under development around the world. Moreover, it is possible to find manual and semi-automatic commercial systems capable of reasonably performing these tasks. However, in many cases their accuracy is incompatible with current European market demands, which are constantly increasing. IVIA, the Valencian Research Institute of Agriculture, located in Spain, has been involved in several European projects related to machine vision for real-time inspection of various agricultural produce. This paper focuses on work related to two products that have different requirements: fruit and olives. In the case of fruit, the Institute has developed a vision system capable of providing an assessment of the external quality of single fruit to a robot that also receives information from other sensors. The system uses four different views of each fruit and has been tested on peaches, apples and citrus. Processing time of each image is under 500 ms using a conventional PC. The system provides information about primary and secondary color, blemishes and their extension, and stem presence and position, which allows further automatic orientation of the fruit in the final box using a robotic manipulator. Work carried out on olives was devoted to fast sorting of olives for consumption at table. A prototype has been developed to demonstrate the feasibility of a machine vision system capable of automatically sorting 2500 kg/h of olives using low-cost conventional hardware.

  8. Comparison of machine learning techniques to predict all-cause mortality using fitness data: the Henry ford exercIse testing (FIT) project.

    Science.gov (United States)

    Sakr, Sherif; Elshawi, Radwa; Ahmed, Amjad M; Qureshi, Waqas T; Brawner, Clinton A; Keteyian, Steven J; Blaha, Michael J; Al-Mallah, Mouaz H

    2017-12-19

    Prior studies have demonstrated that cardiorespiratory fitness (CRF) is a strong marker of cardiovascular health. Machine learning (ML) can enhance the prediction of outcomes through classification techniques that classify the data into predetermined categories. The aim of this study is to present an evaluation and comparison of how machine learning techniques can be applied on medical records of cardiorespiratory fitness and how the various techniques differ in their ability to predict medical outcomes (e.g. mortality). We use data of 34,212 patients free of known coronary artery disease or heart failure who underwent clinician-referred exercise treadmill stress testing at Henry Ford Health Systems between 1991 and 2009 and had a complete 10-year follow-up. Seven machine learning classification techniques were evaluated: Decision Tree (DT), Support Vector Machine (SVM), Artificial Neural Networks (ANN), Naïve Bayesian Classifier (BC), Bayesian Network (BN), K-Nearest Neighbor (KNN) and Random Forest (RF). To handle the imbalanced dataset, the Synthetic Minority Over-Sampling Technique (SMOTE) was used, and two sets of experiments were conducted with and without SMOTE. On average over different evaluation metrics, the SVM classifier showed the lowest performance, while other models such as BN, BC and DT performed better. The RF classifier showed the best performance (AUC = 0.97) among all models trained using SMOTE sampling. The results show that ML techniques can vary significantly in performance across the different evaluation metrics, and that a more complex ML model does not necessarily achieve higher prediction accuracy. The prediction performance of all models trained with SMOTE is much better than that of models trained without SMOTE. The study shows the potential of machine learning methods for predicting all-cause mortality using cardiorespiratory fitness data.
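
The core idea of SMOTE, as used in the study, is to synthesize minority-class samples by interpolating between a minority point and one of its k nearest minority neighbours. The pure-Python sketch below illustrates that idea only; real work would use a library implementation such as the one in imbalanced-learn, and all data points here are invented.

```python
# Minimal SMOTE-style oversampling: each synthetic point lies on the segment
# between a random minority sample and one of its k nearest minority neighbours.
import math
import random

def smote(minority, n_new, k=3, seed=0):
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest minority neighbours of x (excluding x itself)
        neighbours = sorted((p for p in minority if p is not x),
                            key=lambda p: math.dist(x, p))[:k]
        nb = rng.choice(neighbours)
        lam = rng.random()                      # interpolation factor in [0, 1)
        synthetic.append(tuple(a + lam * (b - a) for a, b in zip(x, nb)))
    return synthetic

# Four invented minority samples in the unit square:
minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
new_points = smote(minority, n_new=8)
```

Because each synthetic point is a convex combination of two minority samples, the oversampled class stays inside the region the minority data already occupies.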

  9. Automatic Blood Pressure Measurements During Exercise

    Science.gov (United States)

    Weaver, Charles S.

    1985-01-01

    Microprocessor circuits and a computer algorithm for automatically measuring blood pressure during ambulatory monitoring and exercise stress testing have been under development at SRI International. A system that records ECG, Korotkov sound, and arm cuff pressure for off-line calculation of blood pressure has been delivered to NASA, and an LSLE physiological monitoring system that performs the algorithm calculations in real-time is being constructed. The algorithm measures the time between the R-wave peaks and the corresponding Korotkov sound onset (RK-interval). Since the curve of RK-interval versus cuff pressure during deflation is predictable and slowly varying, windows can be set around the curve to eliminate false Korotkov sound detections that result from noise. The slope of this curve, which will generally decrease during exercise, is the inverse of the systolic slope of the brachial artery pulse. In measurements taken during treadmill stress testing, the changes in slopes of subjects with coronary artery disease were markedly different from the changes in slopes of healthy subjects. Measurements of slope and O2 consumption were also made before and after ten days of bed rest during NASA/Ames Research Center bed rest studies. Typically, the maximum rate of O2 consumption during the post-bed rest test is less than the maximum rate during the pre-bed rest test. The post-bed rest slope changes differ from the pre-bed rest slope changes, and the differences are highly correlated with the drop in the maximum rate of O2 consumption. We speculate that the differences between pre- and post-bed rest slopes are due to a drop in heart contractility.
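
The windowing idea above can be sketched as a least-squares fit of the RK-interval versus cuff-pressure curve, with detections far from the fit rejected as noise. The data values, the choice to fit on the first four (assumed clean) detections, and the ±30 ms window width are all invented for illustration.

```python
# Fit RK-interval vs. cuff pressure, then reject Korotkov detections that
# fall outside a window around the fitted curve.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx            # slope, intercept

# (cuff pressure mmHg, RK-interval ms) during deflation; last point is noise.
samples = [(180, 150), (160, 170), (140, 190), (120, 210), (100, 350)]
slope, icpt = fit_line([p for p, _ in samples[:4]], [r for _, r in samples[:4]])
window = 30                          # accept detections within +/-30 ms of fit
valid = [(p, r) for p, r in samples if abs(r - (slope * p + icpt)) <= window]
```

The fitted slope is the quantity the abstract relates to the systolic slope of the brachial artery pulse.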

  10. Development of a chain limber and its measuring automatics; Karsimakoneen ja sen mittausautomatiikan kehittaeminen

    Energy Technology Data Exchange (ETDEWEB)

    Poeytaesaari, E [Eskon Paja, Kinnula (Finland)

    1997-12-01

    A new control system and measuring automatics are developed for a patented chain limber mountable to a farm tractor. The chain limber produces pulp wood and also limbed fuel logs. The project will be carried out in three stages: definition of the control system, development of the control system, and operational testing of the control system and the chain limber. The final stage of the project will be carried out in co-operation with the Work Efficiency Association. (orig.)

  11. An application of artificial intelligence to automatic telescopes

    Science.gov (United States)

    Swanson, Keith; Drummond, Mark; Bresina, John

    1992-01-01

    Automatic Photoelectric Telescopes (APT's) allow an astronomer to be removed from the telescope site in both time and space. APT's 'execute' an observation program (a set of observation requests) expressed in an ASCII-based language (ATIS) and collect observation results expressed in this same language. The observation program is currently constructed by a Principal Astronomer from the requests of multiple users; the execution is currently controlled by a simple heuristic dispatch scheduler. Research aimed at improving the use of APT's is being carried out by the Entropy Reduction Engine (ERE) project at NASA Ames. The overall goal of the ERE project is the study and construction of systems that integrate planning, scheduling, and control. This paper discusses the application of some ERE technical results to the improvement of both the scheduling and the operation of APT's.

  12. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  13. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  14. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  15. Automatic coding of online collaboration protocols

    NARCIS (Netherlands)

    Erkens, Gijsbert; Janssen, J.J.H.M.

    2006-01-01

    An automatic coding procedure is described to determine the communicative functions of messages in chat discussions. Five main communicative functions are distinguished: argumentative (indicating a line of argumentation or reasoning), responsive (e.g., confirmations, denials, and answers),

  16. Automatic Amharic text news classification: A neural networks ...

    African Journals Online (AJOL)

    School of Computing and Electrical Engineering, Institute of Technology, Bahir Dar University, Bahir Dar ... The study is on the automatic classification of Amharic news using a neural networks approach. Learning Vector ...

  17. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  18. 2011 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2011 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  19. Automatic shadowing device for electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, F W; Bogitch, S

    1960-01-01

    For the past ten years in the laboratory of the Department of Nuclear Medicine and Radiation Biology at the University of California, and before that at Rochester, New York, every evaporation was done with the aid of an automatic shadowing device. For several months the automatic shadowing device has been available at the Atomic Bomb Casualty Commission (ABCC) Hiroshima, Japan with the modifications described. 1 reference.

  20. Automatic control of commercial computer programs

    International Nuclear Information System (INIS)

    Rezvov, B.A.; Artem'ev, A.N.; Maevskij, A.G.; Demkiv, A.A.; Kirillov, B.F.; Belyaev, A.D.; Artem'ev, N.A.

    2010-01-01

    A method for the automatic control of commercial computer programs is presented. A connection was developed between the automation system of the EXAFS spectrometer (managed by a PC under DOS) and the commercial program for CCD detector control (managed by a PC under Windows). The resulting complex system is used to automate the processing of intermediate amplitude spectra in EXAFS spectrum measurements at the Kurchatov SR source

  1. Automatic Control of Silicon Melt Level

    Science.gov (United States)

    Duncan, C. S.; Stickel, W. B.

    1982-01-01

    A new circuit, when combined with a melt-replenishment system and melt-level sensor, offers continuous closed-loop automatic control of melt level during web growth. Installed on a silicon-web furnace, the circuit controls melt level to within 0.1 mm for as long as 8 hours. The circuit affords a greater area growth rate and higher web quality. Automatic melt-level control also allows semiautomatic growth of web over long periods, which can greatly reduce costs.
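
A closed-loop melt-level controller of this kind can be sketched as a simple proportional feedback law: the replenishment rate is driven by the sensed level error while web growth steadily draws the melt down. The gain and rates below are invented for illustration, not parameters of the actual circuit.

```python
# Proportional closed-loop control of melt level (units: mm per time step).

def simulate(steps=200, target=0.0, gain=0.5, draw_rate=0.02):
    level, history = -0.5, []        # start 0.5 mm below target
    for _ in range(steps):
        error = target - level
        feed = gain * error          # replenishment commanded by controller
        level += feed - draw_rate    # web growth draws melt down each step
        history.append(level)
    return history

levels = simulate()
# Settles near the target with a small steady-state offset of -draw_rate/gain,
# here 0.04 mm -- within the 0.1 mm band reported in the abstract.
```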

  2. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...
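
A toy version of the verification idea: model the ladder logic program as a boolean transition function, enumerate every reachable state exhaustively, and check a safety requirement in each one. The seal-in rung and the requirement below are invented examples, not the program verified in the paper.

```python
# Exhaustive state-space check of a one-rung ladder logic model.

def step(state):
    start, stop, motor = state
    # Rung: motor runs if (start OR motor) AND NOT stop (seal-in circuit).
    return (start, stop, (start or motor) and not stop)

def reachable(initial):
    seen, frontier = {initial}, [initial]
    while frontier:
        s = frontier.pop()
        for start in (False, True):          # inputs may change on each scan
            for stop in (False, True):
                nxt = step((start, stop, s[2]))
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return seen

states = reachable((False, False, False))
# Requirement: stop pressed implies motor off in every reachable state.
violations = [s for s in states if s[1] and s[2]]
```

Because the state space is finite, the check is exhaustive, unlike testing; real tools (model checkers) do the same thing with far more compact state representations.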

  3. The problem of automatic identification of concepts

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre

    1975-11-01

    This paper deals with the problem of the automatic recognition of concepts and describes an important language tool, the ''linguistic filter'', which facilitates the construction of statistical algorithms. Certain special filters, of prepositions, conjunctions, negatives, logical implication, compound words, are presented. This is followed by a detailed description of a statistical algorithm allowing recognition of pronoun referents, and finally the problem of the automatic treatment of negatives in French is discussed [fr

  4. Automatic Control of Freeboard and Turbine Operation

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Frigaard, Peter Bak; Friis-Madsen, Erik

    The report deals with the modules for automatic control of freeboard and turbine operation on board the Wave Dragon, Nissum Bredning (WD-NB) prototype, and covers what has been going on up to ultimo 2003.

  5. Automatic Vetting for Malice in Android Platforms

    Science.gov (United States)

    2016-05-01

    Final technical report, Iowa State University, May 2016. Period covered: Dec 2013 - Dec 2015. Contract number FA8750-14-2... Cited reference: Android Apps from Play Store Infected with Brain Test Malware, http://www.ibtimes.co.uk/google-removes-13-android-apps-play-store-infected-brain-test...

  6. Automatic penalty continuation in structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

    this issue is addressed. We propose an automatic continuation method, where the material penalization parameter is included as a new variable in the problem and a constraint guarantees that the requested penalty is eventually reached. The numerical results suggest that this approach is an appealing alternative to continuation methods. Automatic continuation also generally obtains better designs than the classical formulation using a reduced number of iterations.

  7. Automatic correspondence detection in mammogram and breast tomosynthesis images

    Science.gov (United States)

    Ehrhardt, Jan; Krüger, Julia; Bischof, Arpad; Barkhausen, Jörg; Handels, Heinz

    2012-02-01

    Two-dimensional mammography is the major imaging modality in breast cancer detection. A disadvantage of mammography is the projective nature of this imaging technique. Tomosynthesis is an attractive modality with the potential to combine the high contrast and high resolution of digital mammography with the advantages of 3D imaging. In order to facilitate diagnostics and treatment in the current clinical work-flow, correspondences between tomosynthesis images and previous mammographic exams of the same women have to be determined. In this paper, we propose a method to detect correspondences in 2D mammograms and 3D tomosynthesis images automatically. In general, this 2D/3D correspondence problem is ill-posed, because a point in the 2D mammogram corresponds to a line in the 3D tomosynthesis image. The goal of our method is to detect the "most probable" 3D position in the tomosynthesis images corresponding to a selected point in the 2D mammogram. We present two alternative approaches to solve this 2D/3D correspondence problem: a 2D/3D registration method and a 2D/2D mapping between mammogram and tomosynthesis projection images with a following back projection. The advantages and limitations of both approaches are discussed and the performance of the methods is evaluated qualitatively and quantitatively using a software phantom and clinical breast image data. Although the proposed 2D/3D registration method can compensate for moderate breast deformations caused by different breast compressions, this approach is not suitable for clinical tomosynthesis data due to the limited resolution and blurring effects perpendicular to the direction of projection. The quantitative results show that the proposed 2D/2D mapping method is capable of detecting corresponding positions in mammograms and tomosynthesis images automatically for 61 out of 65 landmarks. 
The proposed method can facilitate diagnosis, visual inspection and comparison of 2D mammograms and 3D tomosynthesis images for

  8. Study of an automatic dosing of neptunium in the industrial process of separation neptunium 237-plutonium 238

    International Nuclear Information System (INIS)

    Ros, Pierre

    1973-01-01

    The objective is to study and adapt a method for the automatic dosing of neptunium in the industrial process of separation and purification of plutonium 238, while taking information quality and economic aspects into account. After a recall of some generalities on the production of plutonium 238 and the plutonium-neptunium separation process, the author addresses the dosing of neptunium. The adopted measurement technique is spectrophotometry (of neptunium, of neptunium peroxide), which is the most flexible and economical to adapt to automatic control. The author proposes a design for an automatic chemical machine, and discusses the complex (stoichiometry, form) and some aspects of neptunium dosing (redox reactions, process control) [fr]

  9. Automatic and strategic effects in the guidance of attention by working memory representations.

    Science.gov (United States)

    Carlisle, Nancy B; Woodman, Geoffrey F

    2011-06-01

    Theories of visual attention suggest that working memory representations automatically guide attention toward memory-matching objects. Some empirical tests of this prediction have produced results consistent with working memory automatically guiding attention. However, others have shown that individuals can strategically control whether working memory representations guide visual attention. Previous studies have not independently measured automatic and strategic contributions to the interactions between working memory and attention. In this study, we used a classic manipulation of the probability of valid, neutral, and invalid cues to tease apart the nature of such interactions. This framework utilizes measures of reaction time (RT) to quantify the costs and benefits of attending to memory-matching items and infer the relative magnitudes of automatic and strategic effects. We found both costs and benefits even when the memory-matching item was no more likely to be the target than other items, indicating an automatic component of attentional guidance. However, the costs and benefits essentially doubled as the probability of a trial with a valid cue increased from 20% to 80%, demonstrating a potent strategic effect. We also show that the instructions given to participants led to a significant change in guidance distinct from the actual probability of events during the experiment. Together, these findings demonstrate that the influence of working memory representations on attention is driven by both automatic and strategic interactions. Copyright © 2010 Elsevier B.V. All rights reserved.
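
The costs and benefits in the valid/neutral/invalid design above are simple contrasts of condition mean RTs: the benefit is the speed-up on valid trials relative to neutral, and the cost is the slow-down on invalid trials. The RT values below are invented to mirror the reported pattern (costs and benefits roughly doubling from 20% to 80% valid cues), not the study's data.

```python
# Attentional costs and benefits from condition mean reaction times (ms).

def costs_benefits(rt_valid, rt_neutral, rt_invalid):
    """Benefit: neutral - valid; cost: invalid - neutral."""
    return rt_neutral - rt_valid, rt_invalid - rt_neutral

# Hypothetical condition means at 20% vs. 80% valid-cue probability:
b20, c20 = costs_benefits(480, 500, 525)    # benefit 20 ms, cost 25 ms
b80, c80 = costs_benefits(455, 500, 550)    # roughly doubled at 80% validity
```

Nonzero costs and benefits at 20% validity index the automatic component; their growth with validity indexes the strategic component.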

  10. Test of a potential link between analytic and nonanalytic category learning and automatic, effortful processing.

    Science.gov (United States)

    Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J

    2001-08-01

    The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that, contrary to prediction, strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.

  11. Automatic measurement for solid state track detectors

    International Nuclear Information System (INIS)

    Ogura, Koichi

    1982-01-01

    Since tracks in solid state track detectors are measured with a microscope, observers are forced into hard work that consumes time and labour, which leads to poor statistical accuracy or personal error. Much research has therefore been done with the aim of simplifying and automating track measurement. Automation falls into two categories: simple counting of the number of tracks, and cases where geometrical elements such as the size of tracks or their coordinates are required as well as the number of tracks. The former is called automatic counting and the latter automatic analysis. The general approach in automatic counting is to estimate the total number of tracks over the whole detector area or in a field of view of a microscope; it is suitable when the track density is high. Methods that count tracks one by one include spark counting and the scanning microdensitometer. Automatic analysis includes video image analysis, in which high-quality images obtained with a high-resolution video camera are processed with a micro-computer, and the tracks are automatically recognized and measured by feature extraction. This method is described in detail. Among the many kinds of automatic measurement reported so far, the most frequently used are ''spark counting'' and ''video image analysis''. (Wakatsuki, Y.)
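
The "automatic counting" step described above can be approximated by thresholding a digitized frame and counting connected regions. A minimal pure-Python sketch (the tiny frame and threshold are invented for illustration; real systems operate on high-resolution video images):

```python
from collections import deque

def count_tracks(image, threshold):
    """Count connected above-threshold regions (candidate tracks) in a
    grayscale frame; a minimal stand-in for automatic track counting."""
    h, w = len(image), len(image[0])
    binary = [[px >= threshold for px in row] for row in image]
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                count += 1                      # new track found
                queue = deque([(y, x)])
                seen[y][x] = True
                while queue:                    # flood-fill its pixels
                    cy, cx = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

frame = [
    [0, 9, 0, 0],
    [0, 9, 0, 8],
    [0, 0, 0, 8],
]
print(count_tracks(frame, threshold=5))  # two separate tracks
```

Automatic analysis would additionally record each region's size and coordinates during the flood-fill, rather than only incrementing a counter.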

  12. Automatic three-dimensional model for protontherapy of the eye: Preliminary results

    International Nuclear Information System (INIS)

    Bondiau, Pierre-Yves; Malandain, Gregoire; Chauvel, Pierre; Peyrade, Frederique; Courdi, Adel; Iborra, Nicole; Caujolle, Jean-Pierre; Gastaud, Pierre

    2003-01-01

    Recently, radiotherapy possibilities have been dramatically increased by software and hardware developments. Improvements in medical imaging devices have increased the importance of three-dimensional (3D) images, as the complete examination of these data by a physician is not possible. Computer techniques are needed to present only the pertinent information for clinical applications. We describe a technique for automatic 3D reconstruction of the eye and CT scan merging with fundus photographs (retinography). The final result is a 'virtual eye' to guide ocular tumor protontherapy. First, we developed specific software to automatically detect the position of the eyeball, the optic nerve, and the lens in the CT scan, and obtained a 3D eye reconstruction using this automatic method. Second, we describe the retinography and demonstrate the projection of this modality. We then combine the retinography with the reconstructed eye, using the CT scan, to obtain a virtual eye. The result is a computer 3D scene rendering a virtual eye into a skull reconstruction. The virtual eye can be useful for the simulation, planning, and control of ocular tumor protontherapy, and can be adapted to treatment planning to automatically detect the eye and the positions of organs at risk. It should be highlighted that all the image processing is fully automatic, which allows results to be reproduced; this is a useful property for conducting a consistent clinical validation. Automatic localization of organs at risk in a CT scan or an MRI could be of great interest for radiotherapy in the future: comparison of one patient at different times, comparison of different treatment centers, pooling of results across treatment centers, automatic generation of dose-volume histograms, comparison between different treatment plans for the same patient, and comparison between different patients at the same time. It will also be less time-consuming

  13. First Steps Towards the Automatic Construction of Argument-Diagrams from Real Discussions

    NARCIS (Netherlands)

    Verbree, Daan; Rienks, R.J.; Heylen, Dirk K.J.; Dunne, P.; Bench-Capon, T.J.E.

    This paper presents our efforts to create argument structures from meeting transcripts automatically. We show that unit labels of argument diagrams can be learnt and predicted by a computer with an accuracy of 78.52% and 51.43% on an unbalanced and balanced set respectively. We used a corpus of over

  14. Identification with video game characters as automatic shift of self-perceptions

    NARCIS (Netherlands)

    Klimmt, C.; Hefner, D.; Vorderer, P.A.; Roth, C.; Blake, C.

    2010-01-01

    Two experiments tested the prediction that video game players identify with the character or role they are assigned, which leads to automatic shifts in implicit self-perceptions. Video game identification, thus, is considered as a kind of altered self-experience. In Study 1 (N = 61), participants

  15. Projected Applications of a "Weather in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, Gary J.; Molthan, Andrew; Zavodsky, Bradley T.; Case, Jonathan L.; LaFontaine, Frank J.; Srikishen, Jayanthi

    2010-01-01

    The NASA Short-term Prediction Research and Transition Center (SPoRT)'s new "Weather in a Box" resources will provide weather research and forecast modeling capabilities for real-time application. Model output will provide additional forecast guidance and research into the impacts of new NASA satellite data sets and software capabilities. By combining several research tools and satellite products, SPoRT can generate model guidance that is strongly influenced by unique NASA contributions.

  16. Sensible organizations: technology and methodology for automatically measuring organizational behavior.

    Science.gov (United States)

    Olguin Olguin, Daniel; Waber, Benjamin N; Kim, Taemie; Mohan, Akshay; Ara, Koji; Pentland, Alex

    2009-02-01

    We present the design, implementation, and deployment of a wearable computing platform for measuring and analyzing human behavior in organizational settings. We propose the use of wearable electronic badges capable of automatically measuring the amount of face-to-face interaction, conversational time, physical proximity to other people, and physical activity levels in order to capture individual and collective patterns of behavior. Our goal is to be able to understand how patterns of behavior shape individuals and organizations. By using on-body sensors in large groups of people for extended periods of time in naturalistic settings, we have been able to identify, measure, and quantify social interactions, group behavior, and organizational dynamics. We deployed this wearable computing platform in a group of 22 employees working in a real organization over a period of one month. Using these automatic measurements, we were able to predict employees' self-assessments of job satisfaction and their own perceptions of group interaction quality by combining data collected with our platform and e-mail communication data. In particular, the total amount of communication was predictive of both of these assessments, and betweenness in the social network exhibited a high negative correlation with group interaction satisfaction. We also found that physical proximity and e-mail exchange had a negative correlation of r = -0.55 (p < 0.01), which has far-reaching implications for past and future research on social networks.
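
The reported proximity/e-mail relationship is an ordinary Pearson correlation. A minimal sketch with invented badge data (the numbers are illustrative only and do not reproduce the study's r = -0.55):

```python
import math

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient, the statistic used above
    to relate physical proximity to e-mail volume."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up badge data: hours of face-to-face proximity vs. e-mails exchanged
proximity = [5.0, 3.5, 2.0, 1.0, 0.5]
emails    = [2,   6,   9,   12,  15]
print(round(pearson_r(proximity, emails), 2))
```

A strongly negative value, as here, would mean the pairs who meet in person the most exchange the fewest e-mails.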

  17. SISCAL project

    Science.gov (United States)

    Santer, Richard P.; Fell, Frank

    2003-05-01

    The first "ocean colour" sensor, Coastal Zone Color Scanner (CZCS), was launched in 1978. Oceanographers learnt a lot from CZCS but it remained a purely scientific sensor. In recent years, a new generation of satellite-borne earth observation (EO) instruments has been brought into space. These instruments combine high spectral and spatial resolution with revisiting rates of the order of one per day. More instruments with further increased spatial, spectral and temporal resolution will be available within the next years. In the meantime, evaluation procedures taking advantage of the capabilities of the new instruments were derived, allowing the retrieval of ecologically important parameters with higher accuracy than before. Space agencies are now able to collect and to process satellite data in real time and to disseminate them via the Internet. It is therefore meanwhile possible to envisage using EO operationally. In principle, a significant demand for EO data products on terrestrial or marine ecosystems exists both with public authorities (environmental protection, emergency management, natural resources management, national parks, regional planning, etc) and private companies (tourist industry, insurance companies, water suppliers, etc). However, for a number of reasons, many data products that can be derived from the new instruments and methods have not yet left the scientific community towards public or private end users. It is the intention of the proposed SISCAL (Satellite-based Information System on Coastal Areas and Lakes) project to contribute to the closure of the existing gap between space agencies and research institutions on one side and end users on the other side. To do so, we intend to create a data processor that automatically derives and subsequently delivers over the Internet, in Near-Real-Time (NRT), a number of data products tailored to individual end user needs. 
The data products will be generated using a Geographical Information System (GIS).

  18. Clinical performance of a new hepatitis B surface antigen quantitative assay with automatic dilution

    Directory of Open Access Journals (Sweden)

    Ta-Wei Liu

    2015-01-01

    Hepatitis B virus surface antigen (HBsAg) levels reflect disease status and can predict the clinical response to antiviral treatment; however, the emergence of HBsAg mutant strains has become a challenge. The Abbott HBsAg quantification assay provides enhanced detection of HBsAg and HBsAg mutants. We aimed to evaluate the performance of the Abbott HBsAg quantification assay with automatic sample dilutions (shortened as automatic Architect assay), compared with the Abbott HBsAg quantification assay with manual sample dilutions (shortened as manual Architect assay) and the Roche HBsAg quantification assay with automatic sample dilutions (shortened as Elecsys). A total of 130 sera samples obtained from 87 hepatitis B virus (HBV)-infected patients were collected to assess the correlation between the automatic and manual Architect assays. Among the 87 patients, 41 provided 42 sera samples to confirm the linearity and reproducibility of the automatic Architect assay, and to find the correlation among the Elecsys and the two Architect assays. The coefficients of variation (0.44–9.53%) and R² = 0.996–1, both determined using values obtained from the automatic Architect assay, showed good reproducibility and linearity. Results of the two Architect assays demonstrated a feasible correlation (n = 130 samples; R = 0.898, p < 0.001), and the correlations among the three assays were also high (R > 0.93 in all cases). In conclusion, the correlation between the automatic and manual dilution Architect assays was feasible, particularly in the HBeAg-negative and low DNA groups. With lower labor costs and less human error than the manual version, the Abbott automatic dilution Architect assay provided a good clinical performance with regard to the HBsAg levels.
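
Reproducibility figures like the 0.44–9.53% coefficients of variation quoted above come from replicate measurements. A minimal sketch with hypothetical readings (not data from the study):

```python
import math

def coefficient_of_variation(replicates):
    """CV (%) of replicate measurements: sample standard deviation
    divided by the mean. This is the reproducibility metric quoted
    above for the automatic Architect assay."""
    n = len(replicates)
    mean = sum(replicates) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in replicates) / (n - 1))
    return 100.0 * sd / mean

# Hypothetical triplicate HBsAg readings (IU/mL) at one dilution:
print(round(coefficient_of_variation([1020.0, 1005.0, 998.0]), 2))
```

A CV near 1% like this would fall comfortably inside the 0.44–9.53% range the authors report.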

  19. Development of mechanical-hydraulic models for the prediction of the long-term sealing capacity of concrete based sealing materials in rock salt. Project Titel LASA

    Energy Technology Data Exchange (ETDEWEB)

    Czaikowski, Oliver; Dittrich, Juergen; Hertes, Uwe; Jantschik, Kyra; Wieczorek, Klaus; Zehle, Bernd

    2016-08-15

    The research work leading to these results has received funding from the German Federal Ministry of Economic Affairs and Energy (BMWi) under contract no. 02E11132. This report presents the current state of laboratory investigations and modelling activities related to the LASA project. The work is related to the research and development of plugging and sealing for repositories in salt rock and is of fundamental importance for the salt option which represents one of the three European repository options in addition to the clay rock and the crystalline rock options.

  20. Use of Cumulative Degradation Factor Prediction and Life Test Result of the Thruster Gimbal Assembly Actuator for the Dawn Flight Project

    Science.gov (United States)

    Lo, C. John; Brophy, John R.; Etters, M. Andy; Ramesham, Rajeshuni; Jones, William R., Jr.; Jansen, Mark J.

    2009-01-01

    The Dawn Ion Propulsion System is the ninth project in NASA's Discovery Program. The Dawn spacecraft is being developed to enable the scientific investigation of the two heaviest main-belt asteroids, Vesta and Ceres. Dawn is the first mission to orbit two extraterrestrial bodies, and the first to orbit a main-belt asteroid. The mission is enabled by the onboard Ion Propulsion System (IPS), which provides the post-launch delta-V. The three Ion Engines of the IPS are mounted on Thruster Gimbal Assemblies (TGAs), with only one engine operating at a time for this 10-year mission. The three TGAs weigh 14.6 kg.

  1. Automatic generation of groundwater model hydrostratigraphy from AEM resistivity and boreholes

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; Christiansen, A. V.

    2014-01-01

    Regional hydrological models are important tools in water resources management. Model prediction uncertainty is primarily due to structural (geological) non-uniqueness which makes sampling of the structural model space necessary to estimate prediction uncertainties. Geological structures and hete...... and discharge observations. The method was applied to field data collected at a Danish field site. Our results show that a competitive hydrological model can be constructed from the AEM dataset using the automatic procedure outlined above....

  2. Automatic Peak Selection by a Benjamini-Hochberg-Based Algorithm

    KAUST Repository

    Abbas, Ahmed; Kong, Xin-Bing; Liu, Zhi; Jing, Bing-Yi; Gao, Xin

    2013-01-01

    A common issue in bioinformatics is that computational methods often generate a large number of predictions sorted according to certain confidence scores. A key problem is then determining how many predictions must be selected to include most of the true predictions while maintaining reasonably high precision. In nuclear magnetic resonance (NMR)-based protein structure determination, for instance, computational peak picking methods are becoming more and more common, although expert-knowledge remains the method of choice to determine how many peaks among thousands of candidate peaks should be taken into consideration to capture the true peaks. Here, we propose a Benjamini-Hochberg (B-H)-based approach that automatically selects the number of peaks. We formulate the peak selection problem as a multiple testing problem. Given a candidate peak list sorted by either volumes or intensities, we first convert the peaks into p-values and then apply the B-H-based algorithm to automatically select the number of peaks. The proposed approach is tested on the state-of-the-art peak picking methods, including WaVPeak [1] and PICKY [2]. Compared with the traditional fixed number-based approach, our approach returns significantly more true peaks. For instance, by combining WaVPeak or PICKY with the proposed method, the missing peak rates are on average reduced by 20% and 26%, respectively, in a benchmark set of 32 spectra extracted from eight proteins. The consensus of the B-H-selected peaks from both WaVPeak and PICKY achieves 88% recall and 83% precision, which significantly outperforms each individual method and the consensus method without using the B-H algorithm. The proposed method can be used as a standard procedure for any peak picking method and straightforwardly applied to some other prediction selection problems in bioinformatics. The source code, documentation and example data of the proposed method is available at http://sfb.kaust.edu.sa/pages/software.aspx. © 2013
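
The selection step described above can be sketched generically. Given a candidate list already converted to p-values, the Benjamini-Hochberg procedure keeps the largest prefix whose p-values stay under a stepped threshold (the paper's volume/intensity-to-p-value conversion is omitted here, and the example p-values are invented):

```python
def benjamini_hochberg_select(p_values, alpha=0.05):
    """Return the indices of predictions kept under the Benjamini-Hochberg
    procedure; a generic sketch of the selection idea described above."""
    m = len(p_values)
    ranked = sorted(enumerate(p_values), key=lambda kv: kv[1])
    keep = 0
    for rank, (_, p) in enumerate(ranked, start=1):
        # B-H stepped threshold: alpha * rank / m
        if p <= alpha * rank / m:
            keep = rank          # largest rank that passes
    return {idx for idx, _ in ranked[:keep]}

pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
print(sorted(benjamini_hochberg_select(pvals, alpha=0.05)))  # [0, 1]
```

Unlike a fixed cutoff of, say, the top k peaks, the number kept adapts to how strong the candidate list actually is.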

  4. Automatically rating trainee skill at a pediatric laparoscopic suturing task.

    Science.gov (United States)

    Oquendo, Yousi A; Riddle, Elijah W; Hiller, Dennis; Blinman, Thane A; Kuchenbecker, Katherine J

    2018-04-01

    Minimally invasive surgeons must acquire complex technical skills while minimizing patient risk, a challenge that is magnified in pediatric surgery. Trainees need realistic practice with frequent detailed feedback, but human grading is tedious and subjective. We aim to validate a novel motion-tracking system and algorithms that automatically evaluate trainee performance of a pediatric laparoscopic suturing task. Subjects (n = 32) ranging from medical students to fellows performed two trials of intracorporeal suturing in a custom pediatric laparoscopic box trainer after watching a video of ideal performance. The motions of the tools and endoscope were recorded over time using a magnetic sensing system, and both tool grip angles were recorded using handle-mounted flex sensors. An expert rated the 63 trial videos on five domains from the Objective Structured Assessment of Technical Skill (OSATS), yielding summed scores from 5 to 20. Motion data from each trial were processed to calculate 280 features. We used regularized least squares regression to identify the most predictive features from different subsets of the motion data and then built six regression tree models that predict summed OSATS score. Model accuracy was evaluated via leave-one-subject-out cross-validation. The model that used all sensor data streams performed best, achieving 71% accuracy at predicting summed scores within 2 points, 89% accuracy within 4, and a correlation of 0.85 with human ratings. 59% of the rounded average OSATS score predictions were perfect, and 100% were within 1 point. This model employed 87 features, including none based on completion time, 77 from tool tip motion, 3 from tool tip visibility, and 7 from grip angle. Our novel hardware and software automatically rated previously unseen trials with summed OSATS scores that closely match human expert ratings. Such a system facilitates more feedback-intensive surgical training and may yield insights into the fundamental
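
The paper's modeling step (regularized least squares over motion features) can be sketched with a ridge fit on synthetic data. Everything below (feature matrix, weights, noise level) is invented for illustration and is not the authors' 280-feature pipeline:

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Regularized least squares (ridge) fit, the model family the
    authors used to relate motion features to OSATS scores.
    Closed form: w = (X^T X + lam * I)^-1 X^T y."""
    n_feat = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_feat), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))                 # 40 trials x 5 motion features
true_w = np.array([2.0, 0.0, -1.0, 0.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=40)   # noisy score-like target
w = ridge_fit(X, y, lam=0.5)
print(np.round(w, 1))
```

Features whose fitted weights shrink toward zero (here the second and fourth) are the ones a feature-selection pass would discard before building the final regression trees.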

  5. Automatic learning-based beam angle selection for thoracic IMRT

    International Nuclear Information System (INIS)

    Amit, Guy; Marshall, Andrea; Purdie, Thomas G.; Jaffray, David A.; Levinshtein, Alex; Hope, Andrew J.; Lindsay, Patricia; Pekar, Vladimir

    2015-01-01

    Purpose: The treatment of thoracic cancer using external beam radiation requires an optimal selection of the radiation beam directions to ensure effective coverage of the target volume and to avoid unnecessary treatment of normal healthy tissues. Intensity modulated radiation therapy (IMRT) planning is a lengthy process, which requires the planner to iterate between choosing beam angles, specifying dose–volume objectives and executing IMRT optimization. In thorax treatment planning, where there are no class solutions for beam placement, beam angle selection is performed manually, based on the planner’s clinical experience. The purpose of this work is to propose and study a computationally efficient framework that utilizes machine learning to automatically select treatment beam angles. Such a framework may be helpful for reducing the overall planning workload. Methods: The authors introduce an automated beam selection method, based on learning the relationships between beam angles and anatomical features. Using a large set of clinically approved IMRT plans, a random forest regression algorithm is trained to map a multitude of anatomical features into an individual beam score. An optimization scheme is then built to select and adjust the beam angles, considering the learned interbeam dependencies. The validity and quality of the automatically selected beams were evaluated using the manually selected beams from the corresponding clinical plans as the ground truth. Results: The analysis included 149 clinically approved thoracic IMRT plans. For a randomly selected test subset of 27 plans, IMRT plans were generated using automatically selected beams and compared to the clinical plans. The comparison of the predicted and the clinical beam angles demonstrated a good average correspondence between the two (angular distance 16.8° ± 10°, correlation 0.75 ± 0.2). The dose distributions of the semiautomatic and clinical plans were equivalent in terms of primary target volume
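
A toy version of the beam-selection idea: once a per-angle score is available (here a hand-written table standing in for the learned random-forest scores), beams can be chosen greedily while keeping them angularly spread. The score table and the 30-degree separation rule are assumptions for illustration, not the paper's actual optimization scheme:

```python
def select_beams(angle_scores, n_beams, min_sep=30):
    """Greedily pick the highest-scoring beam angles while enforcing a
    minimum circular angular separation between chosen beams."""
    chosen = []
    for angle, _ in sorted(angle_scores.items(), key=lambda kv: -kv[1]):
        # circular distance between two angles in degrees
        circ = lambda a, b: min(abs(a - b), 360 - abs(a - b))
        if all(circ(angle, c) >= min_sep for c in chosen):
            chosen.append(angle)
        if len(chosen) == n_beams:
            break
    return sorted(chosen)

# Hypothetical learned scores per candidate gantry angle (degrees):
scores = {0: 0.9, 20: 0.85, 40: 0.7, 180: 0.8, 200: 0.75, 300: 0.6}
print(select_beams(scores, n_beams=3))  # [0, 40, 180]
```

Note that 20° scores second-highest but is skipped for sitting too close to 0°; this is the kind of interbeam dependency the paper's scheme accounts for.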

  6. Automatic figure ranking and user interfacing for intelligent figure search.

    Directory of Open Access Journals (Sweden)

    Hong Yu

    2010-10-01

    Figures are important experimental results that are typically reported in full-text bioscience articles. Bioscience researchers need to access figures to validate research facts and to formulate or to test novel research hypotheses. On the other hand, the sheer volume of bioscience literature has made it difficult to access figures. Therefore, we are developing an intelligent figure search engine (http://figuresearch.askhermes.org). Existing research in figure search treats each figure equally, but we introduce a novel concept of "figure ranking": figures appearing in a full-text biomedical article can be ranked by their contribution to the knowledge discovery. We empirically validated the hypothesis of figure ranking with over 100 bioscience researchers, and then developed unsupervised natural language processing (NLP) approaches to automatically rank figures. Evaluating on a collection of 202 full-text articles in which authors have ranked the figures based on importance, our best system achieved a weighted error rate of 0.2, which is significantly better than several other baseline systems we explored. We further explored a user interfacing application in which we built novel user interfaces (UIs) incorporating figure ranking, allowing bioscience researchers to efficiently access important figures. Our evaluation results show that 92% of the bioscience researchers chose, as their top two preferences, the user interfaces in which the most important figures are enlarged. Bioscience researchers also preferred the UIs in which the most important figures were predicted by our NLP system over the UIs in which the most important figures were randomly assigned. In addition, our results show that there was no statistical difference in bioscience researchers' preference between the UIs generated by automatic figure ranking and the UIs based on human ranking annotation. The evaluation results conclude that automatic figure ranking and user

  7. 30 CFR 77.314 - Automatic temperature control instruments.

    Science.gov (United States)

    2010-07-01

    30 CFR, Mineral Resources, UNDERGROUND COAL MINES, Thermal Dryers. § 77.314 Automatic temperature control instruments. (a) Automatic temperature control instruments for thermal dryer system shall be of the recording type. (b) Automatic...

  8. 46 CFR 52.01-10 - Automatic controls.

    Science.gov (United States)

    2010-10-01

    46 CFR, Shipping, Requirements. § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  9. 46 CFR 63.25-1 - Small automatic auxiliary boilers.

    Science.gov (United States)

    2010-10-01

    46 CFR, Shipping, AUXILIARY BOILERS, Requirements for Specific Types of Automatic Auxiliary Boilers. § 63.25-1 Small automatic auxiliary boilers. Small automatic auxiliary boilers defined as having heat-input ratings of 400,000 Btu/hr...

  10. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    14 CFR, Aeronautics and Space, Installation. § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can— (1) Be quickly and positively...

  11. AD Model Builder: using automatic differentiation for statistical inference of highly parameterized complex nonlinear models

    DEFF Research Database (Denmark)

    Fournier, David A.; Skaug, Hans J.; Ancheta, Johnoel

    2011-01-01

    Many criteria for statistical parameter estimation, such as maximum likelihood, are formulated as a nonlinear optimization problem. Automatic Differentiation Model Builder (ADMB) is a programming framework based on automatic differentiation, aimed at highly nonlinear models with a large number...... of such a feature is the generic implementation of Laplace approximation of high-dimensional integrals for use in latent variable models. We also review the literature in which ADMB has been used, and discuss future development of ADMB as an open source project. Overall, the main advantages of ADMB are flexibility...
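
The core idea behind ADMB, automatic differentiation, can be illustrated with forward-mode dual numbers (ADMB itself is a C++ framework; this toy class only illustrates the principle and is not its API):

```python
class Dual:
    """Minimal forward-mode automatic differentiation: each value carries
    its derivative, and arithmetic propagates both exactly (no finite
    differences, no symbolic algebra)."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate df/dx at x by seeding the derivative slot with 1."""
    return f(Dual(x, 1.0)).der

# d/dx (3x^2 + 2x) at x = 2 is 6x + 2 = 14
print(derivative(lambda x: 3 * x * x + 2 * x, 2.0))
```

An optimizer built on such machinery gets exact gradients of the objective for free, which is what makes highly parameterized nonlinear models tractable.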

  12. The Automatic Test Features of the IDiPS Reactor Protection System

    International Nuclear Information System (INIS)

    Hur, Seop; Kim, Dong-Hoon; Hwang, In-Koo; Lee, Cheol-Kwon; Lee, Dong-Young

    2007-01-01

    The reactor protection system (RPS) is designed to minimize the propagation of abnormal or accident conditions in nuclear power plants. A digital RPS (the Integrated Digital Protection System (IDiPS) RPS) is being developed in the Korea Nuclear Instrumentation and Control System (KNICS) R&D project. To make good use of the advantages of digital technology, it is necessary to improve the reliability and availability of the system through automatic test features including on-line testing, self-diagnostics, auto-calibration, etc. This paper summarizes the system test strategy and the automatic test features of the IDiPS RPS

  13. Predictive Values of the New Sarcopenia Index by the Foundation for the National Institutes of Health Sarcopenia Project for Mortality among Older Korean Adults

    Science.gov (United States)

    Kim, Jung Hee; Moon, Jae Hoon; Choi, Sung Hee; Lim, Soo; Lim, Jae-Young; Kim, Ki Woong; Park, Kyong Soo; Jang, Hak Chul

    2016-01-01

    Objective We evaluated the association of the Foundation for the National Institutes of Health (FNIH) Sarcopenia Project’s recommended criteria for sarcopenia with mortality among older Korean adults. Methods We conducted a community-based prospective cohort study which included 560 older Korean adults (285 men and 275 women) aged ≥65 years. Muscle mass (appendicular skeletal muscle mass-to-body mass index ratio (ASM/BMI)), handgrip strength, and walking velocity were evaluated in association with all-cause mortality during 6-year follow-up. Both the lowest quintile for each parameter (ethnic-specific cutoff) and the FNIH-recommended values were used as cutoffs. Results Forty men (14.0%) and 21 women (7.6%) died during the 6-year follow-up. The deceased subjects were older and had lower ASM, handgrip strength, and walking velocity. In men, sarcopenia defined by both low lean mass and weakness carried a 4.13 (95% CI, 1.69–10.11) times higher risk of death, and sarcopenia defined by a combination of low lean mass, weakness, and slowness carried a 9.56 (3.16–28.90) times higher risk of death after adjusting for covariates. However, these significant associations were not observed in women. In terms of cutoffs for each parameter, the lowest quintile showed better predictive value for mortality than the FNIH-recommended values. Moreover, the new muscle mass index, ASM/BMI, provided better prognostic value than ASM/height² in all associations. Conclusions The new FNIH sarcopenia definition was better able to predict 6-year mortality among Korean men. Moreover, the ethnic-specific cutoffs (the lowest quintile for each parameter) identified a higher mortality risk than the FNIH-recommended values. PMID:27832145
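
The ethnic-specific cutoff used above, the lowest quintile, is simply the 20th percentile of the cohort's own distribution. A minimal sketch with hypothetical ASM/BMI values (not the study's data):

```python
def lowest_quintile_cutoff(values):
    """20th percentile of a sample via linear interpolation between
    order statistics; the 'ethnic-specific cutoff' strategy above."""
    s = sorted(values)
    pos = 0.20 * (len(s) - 1)
    lo = int(pos)
    frac = pos - lo
    return s[lo] + frac * (s[lo + 1] - s[lo]) if frac else s[lo]

# Hypothetical ASM/BMI values (m^2) for a small male subsample:
asm_bmi = [0.65, 0.70, 0.72, 0.78, 0.80, 0.84, 0.88, 0.90, 0.95, 1.02, 1.10]
cutoff = lowest_quintile_cutoff(asm_bmi)
print(cutoff, [v for v in asm_bmi if v < cutoff])  # subjects flagged as low lean mass
```

Because the cutoff is derived from the cohort itself rather than fixed externally, it adapts to population-specific body composition, which is why it can outperform the FNIH-recommended fixed values.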

  14. Automatic mapping of monitoring data

    DEFF Research Database (Denmark)

    Lophaven, Søren; Nielsen, Hans Bruun; Søndergaard, Jacob

    2005-01-01

    ...). In the second dataset an accidental release of radioactivity in the environment was simulated in the South-Western corner of the monitored area. The approach has a tendency to smooth the actual data values, and therefore it underestimates extreme values, as seen in the second dataset. However, it is capable of identifying a release of radioactivity provided that the number of sampling locations is sufficiently high. Consequently, we believe that a combination of applying the presented mapping approach and the physical knowledge of the transport processes of radioactivity should be used to predict the extreme values.
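    The smoothing behaviour described in this abstract (an interpolated map underestimating a simulated release) can be illustrated with a minimal inverse-distance-weighting sketch. This is not the paper's actual geostatistical method, and all coordinates and dose values below are invented:

    ```python
    import numpy as np

    def idw_predict(xy_obs, z_obs, xy_new, power=2.0):
        """Inverse-distance-weighted interpolation: a weighted average of the
        observations, so predictions can never exceed the observed range."""
        d = np.linalg.norm(xy_obs - xy_new, axis=1)
        if np.any(d == 0):                  # exact hit on a monitoring station
            return z_obs[d == 0][0]
        w = 1.0 / d**power
        return np.sum(w * z_obs) / np.sum(w)

    # Hypothetical monitoring network with one simulated radioactivity spike
    rng = np.random.default_rng(1)
    xy = rng.uniform(0, 10, size=(50, 2))   # station coordinates
    z = rng.normal(0.1, 0.02, 50)           # background dose rate
    z[0] = 5.0                              # simulated release at station 0

    # Mapping the spike location from the *other* stations smooths the extreme away
    z_hat = idw_predict(np.delete(xy, 0, axis=0), np.delete(z, 0), xy[0])
    print(z_hat)  # far below the true extreme of 5.0
    ```

    Because any weighted average stays within the range of its inputs, a denser sampling network is the only way such a map can resolve a localized extreme, which matches the abstract's conclusion.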

  15. The DWARF project

    Science.gov (United States)

    Christopoulou, P. E.

    2013-09-01

    In the era of staggering volumes of Kepler data and sophisticated automatic analysis, how obsolete are the traditional object-by-object multiwavelength photometric observations? Can we apply the new tools of classification, light curve modeling and timing analysis to study the newly detected and/or most interesting Eclipsing Binaries, or to detect circumbinary bodies? In this talk, I will discuss developments in this area in the light of the recent DWARF project, which promises additional useful science of binary stars within an extensive network of small to medium-size telescopes with apertures of ~20-200 cm.

  16. Methane prediction in collieries

    CSIR Research Space (South Africa)

    Creedy, DP

    1999-06-01

    Full Text Available The primary aim of the project was to assess the current status of research on methane emission prediction for collieries in South Africa in comparison with methods used and advances achieved elsewhere in the world....

  17. System for automatic crate recognition

    Directory of Open Access Journals (Sweden)

    Radovan Kukla

    2012-01-01

    Full Text Available This contribution describes the application of computer vision and artificial intelligence methods. The method addresses abuse of a reverse vending machine. The topic was solved as an innovation voucher for the South Moravian Region. It was developed by Mendel University in Brno (Department of Informatics, Faculty of Business and Economics, and Department of Agricultural, Food and Environmental Engineering, Faculty of Agronomy) together with the Czech subsidiary of Tomra. The project focuses on integrating industrial cameras and computers to recognize crates in the reverse vending machine. The aim was an effective security system able to prevent financial losses in the hundreds of thousands. The products ControlWeb and VisionLab, developed by Moravian Instruments Inc., were chosen as the development and runtime platform.

  18. Microprocessor controlled system for automatic and semi-automatic syntheses of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Ruth, T.J.; Adam, M.J.; Morris, D.; Jivan, S.

    1986-01-01

    A computer-based system has been constructed to control the automatic synthesis of 2-deoxy-2-(¹⁸F)fluoro-D-glucose and is also being used in the development of an automatic synthesis of L-6-(¹⁸F)fluorodopa. (author)

  19. Managing Returnable Containers Logistics - A Case Study Part II - Improving Visibility through Using Automatic Identification Technologies

    Directory of Open Access Journals (Sweden)

    Gretchen Meiser

    2011-05-01

    Full Text Available This case study is the result of a project conducted on behalf of a company that uses its own returnable containers to transport purchased parts from suppliers. The objective of this project was to develop a proposal to enable the company to more effectively track and manage its returnable containers. The research activities in support of this project included (1) the analysis and documentation of the physical flow and the information flow associated with the containers and (2) the investigation of new technologies to improve the automatic identification and tracking of containers. This paper explains the automatic identification technologies and important criteria for selection. A companion paper details the flow of information and containers within the logistics chain, and it identifies areas for improving the management of the containers.

  20. Development and Implications of a Predictive Cost Methodology for Modular Pumped Storage Hydropower (m-PSH) Projects in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Witt, Adam [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Chalise, Dol Raj [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hadjerioua, Boualem [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Manwaring, Michael [MWH, Broomfield, CO (United States); Bishop, Norm [Knight Piesold, Denver, CO (United States)

    2016-10-01

    The slow pace of Pumped Storage Hydropower (PSH) development in the US over the past twenty years has led to widespread interest in the feasibility and viability of alternative PSH designs, development schemes, and technologies. Since 2011, Oak Ridge National Laboratory has been exploring the economic viability of modular Pumped Storage Hydropower (m-PSH) development through targeted case studies, revenue simulations, and analysis of innovative configurations and designs. This paper outlines the development and supporting analysis of a scalable, comprehensive cost modeling tool designed to simulate the initial capital costs for a variety of potential m-PSH projects and deployment scenarios. The tool is used to explore and determine innovative research strategies that can improve the economic viability of m-PSH in US markets.
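    A parametric capital-cost model of the kind this record describes might be sketched as follows. The function, coefficients, and scaling exponent below are hypothetical placeholders for illustration and do not come from the ORNL tool:

    ```python
    def mpsh_capital_cost(power_mw, head_m, storage_hours,
                          base_cost_per_kw=2500.0, scale_exp=0.8):
        """Illustrative parametric initial-capital-cost estimate (USD) for a
        modular PSH project. All coefficients are invented placeholders."""
        # Economies of scale: cost per kW falls as rated power grows
        powerhouse = base_cost_per_kw * 1e3 * power_mw**scale_exp
        # Reservoir/storage cost grows with energy capacity, cheaper at high head
        storage = 150.0 * power_mw * storage_hours * 1e3 * (100.0 / head_m)
        return powerhouse + storage

    cost = mpsh_capital_cost(power_mw=10, head_m=100, storage_hours=8)
    print(f"estimated initial capital cost: ${cost/1e6:.1f}M")
    ```

    A tool like the one described would sum many such parameterized line items and sweep them over candidate sites and deployment scenarios to compare project economics.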