WorldWideScience

Sample records for automatic vol analysis

  1. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of input. We describe a system to derive such time bounds automatically using abstract interpretation…
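
    The abstract's central object, a time bound function, can be made concrete with a small worked example. The sketch below is a toy illustration only (it is not Rosendahl's abstract-interpretation system): a step-counting variant of a recursive program is run against a hand-derived bound T(n) = n + 1.

```python
# Toy illustration of a time bound: count primitive steps of a recursive
# program and check them against a closed-form worst-case bound T(n).

def reverse_counted(xs, steps=0):
    """Naive head-recursive list reversal; returns (result, step count)."""
    if not xs:
        return [], steps + 1                          # one step, base case
    rest, steps = reverse_counted(xs[1:], steps + 1)  # one step per call
    return rest + [xs[0]], steps

# Hand-derived bound for this definition, counting one step per call:
#   T(n) = n + 1
result, cost = reverse_counted(list(range(10)))
assert cost == 10 + 1    # the bound is met exactly here
```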

  2. Automatic Error Analysis Using Intervals

    Science.gov (United States)

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
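
    A minimal Python sketch of the idea (our own illustration; INTLAB, mentioned above, is a MATLAB toolbox): every arithmetic operation returns an interval guaranteed to enclose the true result, so a formula evaluated on uncertain inputs carries its error bound along. A real implementation would also round endpoints outward to absorb floating-point error.

```python
# Minimal interval arithmetic for error analysis (no outward rounding).

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)

    def __sub__(self, o):
        return Interval(self.lo - o.hi, self.hi - o.lo)

    def __mul__(self, o):
        p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo:.4g}, {self.hi:.4g}]"

# A measured x = 2.0 +/- 0.1 propagated through f(x) = x*(x - 1):
x = Interval(1.9, 2.1)
print(x * (x - Interval(1.0, 1.0)))   # encloses every possible f(x)
```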

  3. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.

    2002-01-01

    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of Moessbauer spectroscopy. For industrial use, however, a tool for fast data analysis is also required, and it should be easy to handle. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. Experience shows that the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster by using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific, and their data analysis can be performed using these algorithms. In combination with such automatic analysis, the Moessbauer spectrometer can be used as a probe instrument covering the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications will be discussed.

  4. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    We describe the AMI (Augmented Multi-party Interaction) meeting corpus, the development of a meeting speech recognition system, and systems for the automatic segmentation, summarization and social processing of meetings, together with some example applications based on these systems.

  5. A background to risk analysis. Vol. 1

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work, and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 1 contains a short history of risk analysis, and chapters on risk, failures, errors and accidents, and general procedures for risk analysis. (BP)

  6. A background to risk analysis. Vol. 3

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work, and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 3 contains chapters on quantification of risk, failure and accident probability, risk analysis and design, and examples of risk analysis for process plant. (BP)

  7. A background to risk analysis. Vol. 2

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work, and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 2 treats generic methods of qualitative failure analysis. (BP)

  8. A background to risk analysis. Vol. 4

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work, and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 4 treats human error in plant operation. (BP)

  9. Automatic emotional expression analysis from eye area

    Science.gov (United States)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed through artificial intelligence techniques. As a result of the experimental studies, 6 universal emotions consisting of expressions of happiness, sadness, surprise, disgust, anger and fear were classified with a success rate of 84% using artificial neural networks.

  10. Microprocessors in automatic chemical analysis

    International Nuclear Information System (INIS)

    Goujon de Beauvivier, M.; Perez, J.-J.

    1979-01-01

    The application of microprocessors to the programming and computation of chemical analysis of solutions by a sequential technique is examined. Safety, performance and reliability are compared with those of other methods. An example is given of uranium titration by spectrophotometry. [fr]

  11. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the paediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). Content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between the computerized and manual (traditional) methods. The development of these tools contributes to filling existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.

  12. Sensitivity analysis and design optimization through automatic differentiation

    International Nuclear Information System (INIS)

    Hovland, Paul D; Norris, Boyana; Strout, Michelle Mills; Bhowmick, Sanjukta; Utke, Jean

    2005-01-01

    Automatic differentiation is a technique for transforming a program or subprogram that computes a function, including arbitrarily complex simulation codes, into one that computes the derivatives of that function. We describe the implementation and application of automatic differentiation tools. We highlight recent advances in the combinatorial algorithms and compiler technology that underlie successful implementation of automatic differentiation tools. We discuss applications of automatic differentiation in design optimization and sensitivity analysis. We also describe ongoing research in the design of language-independent source transformation infrastructures for automatic differentiation algorithms
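
    As a concrete illustration of the transformation described, here is a minimal forward-mode automatic differentiation sketch using dual numbers; the tools surveyed in the paper rely on far more sophisticated source transformation and combinatorial techniques.

```python
# Forward-mode AD sketch: a dual number carries (value, derivative) and
# every operation applies the corresponding differentiation rule.

import math

class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(o)

    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, o):
        o = self._lift(o)
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)  # product rule
    __rmul__ = __mul__

def sin(x):
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)  # chain rule

# d/dx [x*sin(x) + 2x] at x = 1, seeded with dx/dx = 1:
x = Dual(1.0, 1.0)
y = x * sin(x) + 2 * x
print(y.val, y.der)   # derivative equals sin(1) + cos(1) + 2
```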

  13. Automatic Video-based Analysis of Human Motion

    DEFF Research Database (Denmark)

    Fihl, Preben

    The human motion contains valuable information in many situations and people frequently perform an unconscious analysis of the motion of other people to understand their actions, intentions, and state of mind. An automatic analysis of human motion will facilitate many applications and thus has...... bring the solution of fully automatic analysis and understanding of human motion closer....

  14. Advances in automatic data analysis capabilities

    International Nuclear Information System (INIS)

    Benson, J.; Bipes, T.; Udpa, L.

    2009-01-01

    Utilities perform eddy current tests on nuclear power plant steam generator (SG) tubes to detect degradation. This paper summarizes the Electric Power Research Institute (EPRI) research to develop signal-processing algorithms that automate the analysis of eddy current test data. The research focuses on analyzing rotating probe and array probe data for detecting, classifying, and characterizing degradation in SG tubes. Automated eddy current data analysis systems for bobbin coil probe data have been available for more than a decade. However, automated data analysis systems for rotating and array probes have developed slowly because of the complexities of the inspection parameters associated with the data. Manual analysis of rotating probe data has been shown to be inconsistent and time consuming when flaw depth profiles are generated. Algorithms have been developed for detection of almost all common steam generator degradation mechanisms. Included in the latest version of the developed software is the ability to perform automated defect profiling, which is useful in tube integrity determinations. Profiling performed manually can be time consuming, whereas automated profiling is performed in a fraction of the time and is much more repeatable. Recent advances in eddy current probe development have resulted in an array probe design capable of high-speed data acquisition over the full length of SG tubes. Probe qualification programs have demonstrated that array probes are capable of providing degradation detection capabilities similar to those of the rotating probe technology. However, to date, utilities have not used the array probe in the field on a large-scale basis due to the large amount of data analyst resources and time required to process the vast quantity of data generated by the probe. To address this obstacle, EPRI initiated a program to develop automatic data analysis algorithms for rotating and array probes. During the development process for both rotating and array…

  15. QSFIT: automatic analysis of optical AGN spectra

    Science.gov (United States)

    Calderone, G.; Nicastro, L.; Ghisellini, G.; Dotti, M.; Sbarrato, T.; Shankar, F.; Colpi, M.

    2017-12-01

    We present QSFIT (Quasar Spectral Fitting package), a new software package to automatically perform the analysis of active galactic nuclei (AGNs) optical spectra. The software provides luminosity estimates for the AGN continuum, the Balmer continuum, both optical and ultraviolet iron blended complexes, host galaxy and emission lines, as well as the width, velocity offset and equivalent width of 20 emission lines. Improving on a number of previous studies on AGN spectral analysis, QSFIT fits all the components simultaneously, using an AGN continuum model which extends over the entire available spectrum and is thus a probe of the actual AGN continuum whose estimates are scarcely influenced by localized features (e.g. emission lines) in the spectrum. We used QSFIT to analyse 71 251 optical spectra of Type 1 AGN at z < 2 (obtained by the Sloan Digital Sky Survey, SDSS) and to produce a publicly available catalogue of AGN spectral properties. Such a catalogue allowed us (for the first time) to estimate the AGN continuum slope and the Balmer continuum luminosity on a very large sample, and to show that there is no evident correlation between these quantities and the redshift. All data in the catalogue, the plots with best-fitting models and residuals, and the IDL code we used to perform the analysis are available on a dedicated website. The whole fitting process is customizable for specific needs and can be extended to analyse spectra from other data sources. The ultimate purpose of QSFIT is to allow astronomers to run standardized recipes to analyse AGN data in a simple, replicable and shareable way.

  16. Automatic Soccer Video Analysis and Summarization

    Science.gov (United States)

    Ekin, Ahmet; Tekalp, A. Murat

    2003-01-01

    We propose a fully automatic and computationally efficient framework for analysis and summarization of soccer videos using cinematic and object-based features. The proposed framework includes some novel low-level soccer video processing algorithms, such as dominant color region detection, robust shot boundary detection, and shot classification, as well as some higher-level algorithms for goal detection, referee detection, and penalty-box detection. The system can output three types of summaries: i) all slow-motion segments in a game, ii) all goals in a game, and iii) slow-motion segments classified according to object-based features. The first two types of summaries are based on cinematic features only for speedy processing, while the summaries of the last type contain higher-level semantics. The proposed framework is efficient, effective, and robust for soccer video processing. It is efficient in the sense that there is no need to compute object-based features when cinematic features are sufficient for the detection of certain events, e.g. goals in soccer. It is effective in the sense that the framework can also employ object-based features when needed to increase accuracy (at the expense of more computation). The efficiency, effectiveness, and the robustness of the proposed framework are demonstrated over a large data set, consisting of more than 13 hours of soccer video, captured in different countries and conditions.
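
    One of the named low-level building blocks, dominant color (grass) region detection, can be sketched in a few lines of OpenCV; the HSV thresholds below are illustrative assumptions, not values from the paper.

```python
# Sketch: fraction of "grass-colored" pixels in a frame. A long shot of the
# field scores high; close-ups and crowd shots score low, which gives a
# simple cue for shot classification.

import cv2
import numpy as np

def grass_ratio(frame_bgr,
                lo=(35, 40, 40), hi=(85, 255, 255)):  # assumed green range
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
    return float(mask.mean()) / 255.0
```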

  17. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.

  18. Dynamic Analysis of a Pendulum Dynamic Automatic Balancer

    Directory of Open Access Journals (Sweden)

    Jin-Seung Sohn

    2007-01-01

    The automatic dynamic balancer is a device to reduce the vibration from the unbalanced mass of rotors. Instead of the prevailing ball-type automatic dynamic balancer, a pendulum automatic dynamic balancer is analyzed here. For the analysis of dynamic stability and behavior, the nonlinear equations of motion for the system are derived in polar coordinates by means of Lagrange's equations. The perturbation method is applied to investigate the dynamic behavior of the system around the equilibrium position. Based on the linearized equations, the dynamic stability of the system around the equilibrium positions is investigated by eigenvalue analysis.

  19. Describing Old Czech Declension Patterns for Automatic Text Analysis

    Czech Academy of Sciences Publication Activity Database

    Jínová, P.; Lehečka, Boris; Oliva jr., Karel

    -, č. 13 (2014), s. 7-17 ISSN 1579-8372 Institutional support: RVO:68378092 Keywords : Old Czech morphology * declension patterns * automatic text analysis * i-stems * ja-stems Subject RIV: AI - Linguistics

  20. Neural network for automatic analysis of motility data

    DEFF Research Database (Denmark)

    Jakobsen, Erik; Kruse-Andersen, S; Kolberg, Jens Godsk

    1994-01-01

    comparable. However, the neural network recognized pressure peaks clearly generated by muscular activity that had escaped detection by the conventional program. In conclusion, we believe that neurocomputing has potential advantages for automatic analysis of gastrointestinal motility data....

  1. Dental flossing and automaticity: a longitudinal moderated mediation analysis.

    Science.gov (United States)

    Hamilton, Kyra; Orbell, Sheina; Bonham, Mikaela; Kroon, Jeroen; Schwarzer, Ralf

    2018-06-01

    We investigated the role of normative support, behavioural automaticity, and action control in predicting dental flossing behaviour. Between May and October 2015, 629 Australian young adults completed a questionnaire assessing constructs of normative support and automaticity, and a 2-week follow-up of dental flossing behaviour and action control, resulting in n = 241 persons for longitudinal analysis. Findings supported the hypotheses that the effect of normative support on behaviour would be mediated via automaticity, and the effect of automaticity would be moderated by action control. Current results extend previous research to elucidate the mechanisms that help to understand predictors of oral hygiene behaviours and contribute to the cumulative evidence concerning self-regulatory and automatic components of health behaviour.

  2. Accuracy analysis of automatic distortion correction

    Directory of Open Access Journals (Sweden)

    Kolecki Jakub

    2015-06-01

    The paper addresses the problem of automatic distortion removal from images acquired with a non-metric SLR camera equipped with prime lenses. From the photogrammetric point of view, the following question arises: is the accuracy of the distortion control data provided by the manufacturer for a certain lens model (not an individual item) sufficient to achieve the demanded accuracy? To obtain a reliable answer to this question, two kinds of tests were carried out for three lens models.

  3. Project Report: Automatic Sequence Processor Software Analysis

    Science.gov (United States)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. The Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system composed of scripts written in Perl, C shell and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary and then sends the resultant information to be radiated to the spacecraft.

  4. Automatic Analysis of Critical Incident Reports: Requirements and Use Cases.

    Science.gov (United States)

    Denecke, Kerstin

    2016-01-01

    Increasingly, critical incident reports are used as a means to increase patient safety and quality of care. The full potential of these sources of experiential knowledge often remains unexploited, since retrieval and analysis are difficult and time-consuming, and the reporting systems often do not provide support for these tasks. The objective of this paper is to identify potential use cases for automatic methods that analyse critical incident reports. In more detail, we describe how faceted search could offer intuitive retrieval of critical incident reports and how text mining could support the analysis of relations among events. To realise an automated analysis, natural language processing needs to be applied. We therefore analyse the language of critical incident reports and derive requirements for automatic processing methods. We learned that there is huge potential for automatic analysis of incident reports, but there are still challenges to be solved.

  5. Automatic analysis of signals during Eddy currents controls

    International Nuclear Information System (INIS)

    Chiron, D.

    1983-06-01

    A method and the corresponding instrument have been developed for the automatic analysis of eddy current testing signals. The apparatus enables the simultaneous analysis, every 2 milliseconds, of two signals at two different frequencies. It can be used either on-line with an eddy current testing instrument or with a magnetic tape recorder. [fr]

  6. Profiling School Shooters: Automatic Text-Based Analysis

    Directory of Open Access Journals (Sweden)

    Yair Neuman

    2015-06-01

    School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their varied characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by six school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters' texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology.

  7. Automatic measurement system for light element isotope analysis

    International Nuclear Information System (INIS)

    Satake, Hiroshi; Ikegami, Kouichi.

    1990-01-01

    The automatic measurement system for light element isotope analysis was developed by installing a specially designed inlet system controlled by a computer. The microcomputer system contains specific interface boards for the inlet system and the mass spectrometer, a Micromass 602 E. All the components of the inlet and computer system installed are easily available in Japan. A maximum of ten samples can be measured automatically. About 160 minutes are required for 10 measurements of δ18O values of CO2; thus four samples can be measured per hour using this system, compared with about three per hour by manual operation. The automated analysis system clearly has an advantage over the conventional method. This paper describes the details of the automated system, such as the apparatus used, the control procedure and the corrections applied for reliable measurement. (author)

  8. Trends of Science Education Research: An Automatic Content Analysis

    Science.gov (United States)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-01-01

    This study used scientometric methods to conduct an automatic content analysis on the development trends of science education research from the published articles in the four journals of "International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education" from 1990 to 2007. The…

  9. Social Signals, their function, and automatic analysis: A survey

    NARCIS (Netherlands)

    Vinciarelli, Alessandro; Pantic, Maja; Bourlard, Hervé; Pentland, Alex

    2008-01-01

    Social Signal Processing (SSP) aims at the analysis of social behaviour in both Human-Human and Human-Computer interactions. SSP revolves around automatic sensing and interpretation of social signals, complex aggregates of nonverbal behaviours through which individuals express their attitudes

  10. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  11. A computer program for automatic gamma-ray spectra analysis

    International Nuclear Information System (INIS)

    Hiromura, Kazuyuki

    1975-01-01

    A computer program for the automatic analysis of gamma-ray spectra obtained with a Ge(Li) detector is presented. The program includes a peak-finding method based on comparing successive values of the experimental data, and a least-squares method for peak fitting. The peak shape in the fitting routine is a 'modified Gaussian', which consists of two different Gaussians with the same height joined at the centroid. A quadratic form is chosen as the function representing the background. A maximum of four peaks can be treated in the fitting routine. Some possible improvements are described. (auth.)
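
    The described peak shape is easy to state in code. A sketch with our own parameter names: two half-Gaussians of equal height joined at the centroid, on a quadratic background; such a model can then be fitted with any least-squares routine.

```python
# "Modified Gaussian" peak on a quadratic background (parameter names are
# ours, not the paper's): left and right halves share height h and
# centroid c but have different widths.

import numpy as np

def modified_gaussian(x, h, c, sig_left, sig_right, a0, a1, a2):
    sigma = np.where(x < c, sig_left, sig_right)
    peak = h * np.exp(-0.5 * ((x - c) / sigma) ** 2)
    background = a0 + a1 * x + a2 * x ** 2
    return peak + background
```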

  12. Automatic quantitative morphological analysis of interacting galaxies

    Science.gov (United States)

    Shamir, Lior; Holincheck, Anthony; Wallin, John

    2013-08-01

    The large number of galaxies imaged by digital sky surveys reinforces the need for computational methods for analyzing galaxy morphology. While the morphology of most galaxies can be associated with a stage on the Hubble sequence, the morphology of galaxy mergers is far more complex due to the combination of two or more galaxies with different morphologies and the interaction between them. Here we propose a computational method based on unsupervised machine learning that can quantitatively analyze morphologies of galaxy mergers and associate galaxies by their morphology. The method works by first generating multiple synthetic galaxy models for each galaxy merger, and then extracting a large set of numerical image content descriptors for each galaxy model. These numbers are weighted using Fisher discriminant scores, and then the similarities between the galaxy mergers are deduced using a variation of Weighted Nearest Neighbor analysis such that the Fisher scores are used as weights. The similarities between the galaxy mergers are visualized using phylogenies to provide a graph that reflects the morphological similarities between the different galaxy mergers, and thus quantitatively profile the morphology of galaxy mergers.

  13. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software package developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi (KLT) feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and correct the position of the cursor when the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented can be used as a valid and useful tool for underwater motion analysis.
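
    The core tracking step, Kanade-Lucas-Tomasi optical flow, is available in OpenCV; the sketch below is our own minimal illustration (DVP is a separate, purpose-built package with underwater-specific handling not shown here).

```python
# KLT sketch: propagate marker positions from one grayscale frame to the
# next; a zero status flags a lost marker, which is where an operator
# would be asked to intervene.

import cv2
import numpy as np

def track_markers(prev_gray, next_gray, points):
    """points: float32 array of shape (N, 1, 2) with marker centers."""
    new_pts, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, points, None,
        winSize=(21, 21), maxLevel=3)
    return new_pts, status
```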

  14. Coherence measures in automatic time-migration velocity analysis

    International Nuclear Information System (INIS)

    Maciel, Jonathas S; Costa, Jessé C; Schleicher, Jörg

    2012-01-01

    Time-migration velocity analysis can be carried out automatically by evaluating the coherence of migrated seismic events in common-image gathers (CIGs). The performance of gradient methods for automatic time-migration velocity analysis depends on the coherence measures used as the objective function. We compare the results of four different coherence measures: conventional semblance, differential semblance, an extended differential semblance using differences of more distant image traces, and the product of the latter with conventional semblance. In our numerical experiments, the objective functions based on conventional semblance and on the product of conventional semblance with extended differential semblance provided the best velocity models, as evaluated by the flatness of the resulting CIGs. The method can be easily extended to anisotropic media. (paper)
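
    For reference, conventional semblance, the first of the four measures, is commonly defined as shown below (a generic sketch; the paper's exact discretization and windowing may differ).

```python
# Conventional semblance of N image traces a_i(t) in a CIG window:
#   S = sum_t [sum_i a_i(t)]^2 / (N * sum_t sum_i a_i(t)^2)
# S reaches 1 for perfectly flat (coherent) events.

import numpy as np

def semblance(gather):
    """gather: array of shape (n_traces, n_samples)."""
    num = (gather.sum(axis=0) ** 2).sum()
    den = gather.shape[0] * (gather ** 2).sum()
    return num / den

flat = np.tile(np.sin(np.linspace(0.0, 6.0, 200)), (12, 1))
print(semblance(flat))   # -> 1.0 for identical, flat traces
```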

  15. Radiation dosimetry by automatic image analysis of dicentric chromosomes

    International Nuclear Information System (INIS)

    Bayley, R.; Carothers, A.; Farrow, S.; Gordon, J.; Ji, L.; Piper, J.; Rutovitz, D.; Stark, M.; Chen, X.; Wald, N.; Pittsburgh Univ., PA

    1991-01-01

    A system for scoring dicentric chromosomes by image analysis comprised fully automatic location of mitotic cells; automatic retrieval, focusing and digitisation at high resolution; automatic rejection of nuclei and debris, and detection and segmentation of chromosome clusters; automatic centromere location; and subsequent rapid interactive visual review of potential dicentric chromosomes to confirm positives and reject false positives. A calibration set of about 15000 cells was used to establish the quadratic dose response for 60Co γ-irradiation. The dose-response function parameters were established by a maximum likelihood technique, and confidence limits on the dose response, and on the corresponding inverse curve of estimated dose for an observed dicentric frequency, were established by Monte Carlo techniques. The system was validated in a blind trial by analysing a test set comprising a total of about 8000 cells irradiated to 1 of 10 dose levels and estimating the doses from the observed dicentric frequencies. There was a close correspondence between the estimated and true doses. The overall sensitivity of the system, in terms of the proportion of the total population of dicentrics present in the cells analysed that were detected by the system, was measured to be about 40%. This implies that about 2.5 times more cells must be analysed by machine than by visual analysis. Taking this factor into account, the measured review time and false positive rates imply that analysis by the system of sufficient cells to provide the equivalent of a visual analysis of 500 cells would require about 1 h for operator review. (author). 20 refs.; 4 figs.; 5 tabs
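
    The final dose estimation step amounts to inverting the calibrated quadratic dose response Y = c + αD + βD². A sketch with placeholder coefficients (not the paper's 60Co calibration values):

```python
# Invert Y = c + alpha*D + beta*D^2 for dose D given an observed dicentric
# frequency y_obs; coefficients here are illustrative placeholders.

import numpy as np

def estimate_dose(y_obs, c=0.001, alpha=0.02, beta=0.06):
    """Smallest non-negative root of beta*D^2 + alpha*D + (c - y_obs) = 0."""
    roots = np.roots([beta, alpha, c - y_obs])
    real = roots[np.isreal(roots)].real
    positive = real[real >= 0.0]
    return positive.min() if positive.size else 0.0

print(estimate_dose(0.25))   # dose (Gy) for 0.25 dicentrics per cell
```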

  16. Automatic differential analysis of NMR experiments in complex samples.

    Science.gov (United States)

    Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André

    2017-11-20

    Liquid-state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches: metabolomics, identification of active compounds in natural extracts, and characterization of species; such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires resorting to automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal-processing steps, as well as peak picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful for deciphering complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but also in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Detection of tuberculosis by automatic cough sound analysis.

    Science.gov (United States)

    Botha, G H Renier; Theron, Grant; Warren, Rob; Klopper, Marisa; Dheda, Kheertan; van Helden, Paul; Niesler, Thomas R

    2018-03-15

    Globally, tuberculosis (TB) remains one of the most deadly diseases. Although several effective diagnosis methods exist, in lower income countries clinics may not be in a position to afford expensive equipment and employ the trained experts needed to interpret results. In these situations, symptoms including cough are commonly used to identify patients for testing. However, self-reported cough has suboptimal sensitivity and specificity, which may be improved by digital detection. This study investigates a simple and easily applied method for TB screening based on the automatic analysis of coughing sounds. A database of cough audio recordings was collected and used to develop statistical classifiers. These classifiers use short-term spectral information to automatically distinguish between the coughs of TB positive and TB negative patients with an accuracy of 78% and an AUC of 0.95. When a set of five clinical measurements is available in addition to the audio, this accuracy improves to 82%. By choosing an appropriate decision threshold, the system can achieve a sensitivity of 95% at a specificity of approximately 72%. The experiments suggest that the classifiers are using some spectral information that is not perceivable by the human auditory system, and that certain frequencies are more useful for classification than others. We conclude that automatic classification of coughing sounds may represent a viable low-cost and low-complexity screening method for TB. © 2018 Institute of Physics and Engineering in Medicine.
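
    A hedged sketch of the general recipe, short-term spectral features feeding a statistical classifier; the features, classifier and file name below are illustrative assumptions, not the paper's exact pipeline.

```python
# Spectral-feature cough classification sketch: summarize each recording as
# a fixed-length vector of mean MFCCs, then train any probabilistic model.

import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def spectral_features(wav_path):
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)  # short-term spectra
    return mfcc.mean(axis=1)

# With X (feature vectors) and labels y (1 = TB positive, 0 = negative):
#   clf = LogisticRegression(max_iter=1000).fit(X, y)
#   p = clf.predict_proba(spectral_features("cough.wav").reshape(1, -1))[0, 1]
# Moving the decision threshold on p trades sensitivity against specificity.
```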

  18. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Background: Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results: We developed DMET-Analyzer, a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different response to drugs. The proposed system allows: (i) automation of the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNPs with pathways through the search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer is demonstrated through different…

  19. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer, a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different response to drugs. The proposed system allows: (i) automation of the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNPs with pathways through the search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer is demonstrated through different case studies regarding the analysis of…
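
    A minimal sketch of the kind of SNP/response association test such a tool automates, here a Fisher exact test on an invented 2×2 contingency table (DMET-Analyzer itself operates on Affymetrix DMET files and queries dbSNP and PharmaGKB):

```python
# Does carrying a SNP variant co-occur with drug response? Illustrative
# counts only.

from scipy.stats import fisher_exact

#                  responder   non-responder
# variant carrier      12            3
# wild type             5           14
table = [[12, 3], [5, 14]]

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")  # small p: association
```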

  20. Automatic selection of region of interest for radiographic texture analysis

    Science.gov (United States)

    Lan, Li; Giger, Maryellen L.; Wilkie, Joel R.; Vokes, Tamara J.; Chen, Weijie; Li, Hui; Lyons, Tracy; Chinander, Michael R.; Pham, Ann

    2007-03-01

    We have been developing radiographic texture analysis (RTA) for assessing osteoporosis and the related risk of fracture. Currently, analyses are performed on heel images obtained from a digital imaging device, the GE/Lunar PIXI, that yields both the bone mineral density (BMD) and digital images (0.2-mm pixels; 12-bit quantization). RTA is performed on the image data in a region of interest (ROI) placed just below the talus in order to include the trabecular structure in the analysis. We have found that variations arise from manually selecting this ROI for RTA. To reduce these variations, we present an automatic method, involving an optimized Canny edge detection technique and parameterized bone segmentation, to define bone edges for the placement of an ROI within the predominantly calcaneus portion of the radiographic heel image. The technique was developed using 1158 heel images and then tested on an independent set of 176 heel images. Results from a subjective analysis noted that 87.5% of ROI placements were rated as "good". In addition, an objective overlap measure showed that 98.3% of images had successful ROI placements as compared to placement by an experienced observer at an overlap threshold of 0.4. In conclusion, our proposed method for automatic ROI selection on radiographic heel images yields promising results, and the method has the potential to reduce intra- and inter-observer variations in selecting ROIs for radiographic texture analysis.
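
    A rough sketch of the first stage, edge detection ahead of ROI placement. The thresholds, the offset standing in for "just below the talus", and the ROI size are illustrative assumptions; the paper uses an optimized Canny detector and parameterized bone segmentation.

```python
# Locate bone edges on a (here assumed 8-bit) heel radiograph and cut a
# square ROI at a fixed offset from the topmost edge point.

import cv2
import numpy as np

def place_roi(heel_img_8bit, roi_size=64, offset=(10, 0)):
    blurred = cv2.GaussianBlur(heel_img_8bit, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)       # illustrative thresholds
    ys, xs = np.nonzero(edges)
    top_y, top_x = ys.min(), xs[ys.argmin()]  # crude "below the talus" proxy
    y0, x0 = top_y + offset[0], top_x + offset[1]
    return heel_img_8bit[y0:y0 + roi_size, x0:x0 + roi_size]
```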

  1. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J [Richland, WA]; Cowley, Wendy E [Richland, WA]; Crow, Vernon L [Richland, WA]; Cramer, Nicholas O [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
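
    The scoring scheme lends itself to a compact sketch: candidate phrases are runs of words between stop words, each word is scored by co-occurrence degree divided by frequency, and a phrase scores the sum of its word scores. This simplified version (tiny stop-word list, no punctuation delimiters) only illustrates the idea in the patent.

```python
# Minimal RAKE-style keyword scoring sketch.

import re
from collections import defaultdict

STOP = {"a", "an", "the", "of", "for", "and", "or", "is", "are", "in", "to"}

def rake(text, top_n=3):
    words = re.findall(r"[a-z]+", text.lower())
    phrases, cur = [], []
    for w in words:                      # split candidates at stop words
        if w in STOP:
            if cur:
                phrases.append(cur)
            cur = []
        else:
            cur.append(w)
    if cur:
        phrases.append(cur)
    freq, degree = defaultdict(int), defaultdict(int)
    for ph in phrases:
        for w in ph:
            freq[w] += 1
            degree[w] += len(ph)         # co-occurrence degree in phrase
    def score(ph):
        return sum(degree[w] / freq[w] for w in ph)
    return sorted((" ".join(p) for p in phrases),
                  key=lambda p: -score(p.split()))[:top_n]

print(rake("Rapid automatic keyword extraction supports information "
           "retrieval and analysis of large document collections."))
```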

  2. Improving Automatic Text Classification by Integrated Feature Analysis

    Science.gov (United States)

    Busagala, Lazaro S. P.; Ohyama, Wataru; Wakabayashi, Tetsushi; Kimura, Fumitaka

    Feature transformation in automatic text classification (ATC) can lead to better classification performance. Furthermore dimensionality reduction is important in ATC. Hence, feature transformation and dimensionality reduction are performed to obtain lower computational costs with improved classification performance. However, feature transformation and dimension reduction techniques have been conventionally considered in isolation. In such cases classification performance can be lower than when integrated. Therefore, we propose an integrated feature analysis approach which improves the classification performance at lower dimensionality. Moreover, we propose a multiple feature integration technique which also improves classification effectiveness.

  3. Analysis of Phonetic Transcriptions for Danish Automatic Speech Recognition

    DEFF Research Database (Denmark)

    Kirkedal, Andreas Søeborg

    2013-01-01

    Automatic speech recognition (ASR) relies on three resources: audio, orthographic transcriptions and a pronunciation dictionary. The dictionary or lexicon maps orthographic words to sequences of phones or phonemes that represent the pronunciation of the corresponding word. The quality of a speech recognition system depends heavily on the dictionary and the transcriptions therein. This paper presents an analysis of phonetic/phonemic features that are salient for current Danish ASR systems. This preliminary study consists of a series of experiments using an ASR system trained on the DK-PAROLE corpus…

  4. Corpus analysis and automatic detection of emotion-inducing keywords

    Science.gov (United States)

    Yuan, Bo; He, Xiangqing; Liu, Ying

    2013-12-01

    Emotion words play a vital role in many sentiment analysis tasks. Previous research uses sentiment dictionaries to detect the subjectivity or polarity of words. In this paper, we dive into Emotion-Inducing Keywords (EIK), the words in use that convey emotion. We first analyze an emotion corpus to explore the pragmatic aspects of EIK. Then we design an effective framework for automatically detecting EIK in sentences by utilizing linguistic features and context information. Our system dramatically outperforms traditional dictionary-based methods in Precision, Recall and F1-score.

  5. Automatic analysis of trabecular bone structure from knee MRI

    DEFF Research Database (Denmark)

    Marques, Joselene; Granlund, Rabia; Lillholm, Martin

    2012-01-01

    We investigated the feasibility of quantifying osteoarthritis (OA) by analysis of the trabecular bone structure in low-field knee MRI. Generic texture features were extracted from the images and subsequently selected by sequential floating forward selection (SFFS), following a fully automatic, uncommitted machine-learning based framework. Six different classifiers were evaluated in cross-validation schemes and the results showed that the presence of OA can be quantified by a bone structure marker. The performance of the developed marker reached a generalization area-under-the-ROC (AUC) of 0…

  6. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots developed with a strong orientation towards monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, together with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  7. Automatic particle-size analysis of HTGR nuclear fuel microspheres

    International Nuclear Information System (INIS)

    Mack, J.E.

    1977-01-01

    An automatic particle-size analyzer (PSA) has been developed at ORNL for measuring and counting samples of nuclear fuel microspheres in the diameter range of 300 to 1000 μm at rates in excess of 2000 particles per minute, requiring no sample preparation. A light blockage technique is used in conjunction with a particle singularizer. Each particle in the sample is sized, and the information is accumulated by a multi-channel pulse height analyzer. The data are then transferred automatically to a computer for calculation of mean diameter, standard deviation, kurtosis, and skewness of the distribution. Entering the sample weight and pre-coating data permits calculation of particle density and the mean coating thickness and density. Following this nondestructive analysis, the sample is collected and returned to the process line or used for further analysis. The device has potential as an on-line quality control device in processes dealing with spherical or near-spherical particles where rapid analysis is required for process control
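
    The distribution statistics the PSA reports are straightforward to compute once per-particle diameters are in hand; a short sketch on synthetic data (the instrument itself derives diameters from light-blockage pulse heights):

```python
# Summary statistics of a particle-size distribution (synthetic diameters).

import numpy as np
from scipy.stats import kurtosis, skew

rng = np.random.default_rng(1)
diameters_um = rng.normal(500.0, 25.0, 2000)   # 2000 sized microspheres

print(f"mean diameter : {diameters_um.mean():8.2f} um")
print(f"std deviation : {diameters_um.std(ddof=1):8.2f} um")
print(f"skewness      : {skew(diameters_um):8.3f}")
print(f"kurtosis      : {kurtosis(diameters_um):8.3f}")   # excess kurtosis
```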

  8. Automatic Data Logging and Quality Analysis System for Mobile Devices

    Directory of Open Access Journals (Sweden)

    Yong-Yi Fanjiang

    2017-01-01

    The testing phase of mobile device products includes two important test projects that must be completed before shipment: the field trial and the beta user trial. During the field trial, the product is certified based on its integration and stability with the local operator's system, and, during the beta user trial, the product is certified by multiple users regarding its daily use, where the goal is to detect and solve problems early. In the traditional approach to reporting problems, testers must log into a web site, fill out a problem form, and then go through a browser or FTP to upload logs; this is inconvenient, and problems are reported slowly. Therefore, we propose an "automatic logging analysis system" (ALAS) to construct a convenient test environment and, using a record analysis (log parser) program, automate the parsing of log files and have issues automatically sent to the database by the system. Finally, the mean time between failures (MTBF) is used to establish measurement indicators for the beta user trial.

  9. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In a previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), which is derived from the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute during a period of five frames of a hepatogram after injecting 37 MBq of 99mTc-Sn-colloid. To analyze the information simply, rapidly and accurately, we developed an automatic quantitative analysis of liver functions. Information was obtained every quarter minute during a period of 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow as the disappearance rate multiplied by the percentage of hepatic uptake, computed as (liver counts)/(total counts of the field). Our method of analysis automatically recorded the disappearance curve and the uptake curve on the basis of the heart and the whole liver, respectively, and was computed using the BASIC language. This method makes it possible to obtain an image of the initial uptake of 99mTc-Sn-colloid into the liver with a small dose. (author)

  10. Automatic visual tracking and social behaviour analysis with multiple mice.

    Directory of Open Access Journals (Sweden)

    Luca Giancardo

    Social interactions are made of complex behavioural actions that might be found in all mammals, including humans and rodents. Recently, mouse models are increasingly being used in preclinical research to understand the biological basis of social-related pathologies or abnormalities. However, reliable and flexible automatic systems able to precisely quantify the social behavioural interactions of multiple mice are still missing. Here, we present a system built on two components: a module able to accurately track the position of multiple interacting mice from videos, regardless of their fur colour or light settings, and a module that automatically characterises social and non-social behaviours. The behavioural analysis is obtained by deriving a new set of specialised spatio-temporal features from the tracker output. These features are further employed by a learning-by-example classifier, which predicts, for each frame and for each mouse in the cage, one of the behaviours learnt from the examples given by the experimenters. The system is validated on an extensive set of experimental trials involving multiple mice in an open arena. In a first evaluation we compare the classifier output with the independent evaluation of two human graders, obtaining comparable results. Then, we show the applicability of our technique to multiple-mice settings, using up to four interacting mice. The system is also compared with a solution recently proposed in the literature that, similarly to ours, addresses the problem with a learning-by-examples approach. Finally, we further validated our automatic system to differentiate between C57B/6J (a commonly used reference inbred strain) and BTBR T+tf/J (a mouse model for autism spectrum disorders). Overall, these data demonstrate the validity and effectiveness of this new machine learning system in the detection of social and non-social behaviours in multiple (>2) interacting mice, and its versatility to deal with different…

  11. Neural network for automatic analysis of motility data

    DEFF Research Database (Denmark)

    Jakobsen, Erik; Kruse-Andersen, S; Kolberg, Jens Godsk

    1994-01-01

    …events. Due to great variation in events, this method often fails to detect biologically relevant pressure variations. We have tried to develop a new concept for recognition of pressure events based on a neural network. Pressures were recorded for over 23 hours in 29 normal volunteers by means of a portable data recording system. A number of pressure events and non-events were selected from 9 recordings and used for training the network. The performance of the trained network was then verified on recordings from the remaining 20 volunteers. The accuracy and sensitivity of the two systems were comparable. However, the neural network recognized pressure peaks clearly generated by muscular activity that had escaped detection by the conventional program. In conclusion, we believe that neurocomputing has potential advantages for automatic analysis of gastrointestinal motility data.

  12. Automatic computer analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Phillips, G.W.

    1979-01-01

    Techniques for the automatic computer analysis of high-resolution gamma-ray spectra for peak area and position are discussed. The computer program HYPERMET is reviewed. The importance of keeping user input simple and short is emphasized. Peak-search methods are discussed and compared for efficiency. A semiempirical peak-shape function is presented which gives a good fit to the variety of peak shapes and intensities that may be found in a spectrum. The importance of a residual search in locating and fitting multiple peaks is demonstrated. Finally, it is shown that a severe bias may be encountered when the usual least-squares fitting methods are applied to peaks with very low statistics, and methods for alleviating this are presented. 7 figures
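
    As a hedged illustration of the fitting step discussed above: the sketch below fits a plain Gaussian on a linear background to a synthetic peak with scipy. HYPERMET's semiempirical shape adds tailing and step terms, which are omitted here; all numbers are invented.

        # Sketch: least-squares fit of one gamma-ray peak (Gaussian + linear background).
        import numpy as np
        from scipy.optimize import curve_fit

        def peak(x, area, centroid, sigma, b0, b1):
            g = area / (sigma * np.sqrt(2.0 * np.pi)) * np.exp(-0.5 * ((x - centroid) / sigma) ** 2)
            return g + b0 + b1 * x

        ch = np.arange(480, 560, dtype=float)            # channel numbers around the peak
        rng = np.random.default_rng(0)
        counts = rng.poisson(peak(ch, 5000, 521.3, 2.1, 40.0, -0.02)).astype(float)

        p0 = [counts.sum(), ch[np.argmax(counts)], 2.0, counts.min(), 0.0]  # rough start values
        popt, _ = curve_fit(peak, ch, counts, p0=p0)
        print("area %.0f  centroid %.2f  FWHM %.2f ch" % (popt[0], popt[1], 2.3548 * popt[2]))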

  13. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high-energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and finally analyzing temporally dynamic features based on the derived information. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  14. Automatic Feature Interaction Analysis in PacoSuite

    Directory of Open Access Journals (Sweden)

    Wim Vanderperren

    2004-10-01

    Full Text Available In this paper, we build upon previous work that aims at recuperating aspect-oriented ideas into component-based software development. In that research, a composition adapter was proposed in order to capture crosscutting concerns in the PacoSuite component-based methodology. A composition adapter is visually applied onto a given component composition and the changes it describes are automatically applied. Stacking multiple composition adapters onto the same component composition can, however, lead to unpredictable and undesired side-effects. In this paper, we propose a solution for this issue, widely known as the feature interaction problem. We present a classification of different interaction levels among composition adapters and the algorithms required to verify them. The proposed algorithms are, however, of exponential complexity and depend on both the composition adapters and the component composition as a whole. In order to enhance the performance of our feature interaction analysis, we present a set of theorems that define the interaction levels solely in terms of the properties of the composition adapters themselves.

  15. Automatic analysis of gamma spectra using a desk computer

    International Nuclear Information System (INIS)

    Rocca, H.C.

    1976-10-01

    A code for the analysis of gamma spectra obtained with a Ge(Li) detector was developed for use with a desk computer (Hewlett-Packard Model 9810 A). The process is performed in a totally automatic way: the data are conveniently smoothed and the background is generated by a convolutive equation. A calibration of the equipment with well-known standard sources gives the data necessary for fitting a third-degree polynomial by least squares, relating the energy to the peak position. Criteria are given for deciding whether a given group of values constitutes a peak and whether it is a double line. All peaks are fitted to a Gaussian curve and, if necessary, decomposed into their components. Data entry is by punched tape, ASCII code. An alphanumeric printer provides (a) the position of the peak and its energy, (b) its resolution if it is larger than expected, (c) the area of the peak with its statistical error determined by the method of Wasson. Optionally, the complete spectrum with the determined background can be plotted. (author) [es
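
    The calibration step lends itself to a short sketch: a third-degree polynomial fitted by least squares, relating peak position to energy. The line energies below are approximately those of a 152Eu standard; the channel positions are hypothetical.

        # Sketch: third-degree least-squares energy calibration E(channel).
        import numpy as np

        channel = np.array([243.5, 688.4, 1557.7, 1928.3, 2816.1])     # fitted peak positions (assumed)
        energy = np.array([121.78, 344.28, 778.90, 964.06, 1408.01])   # keV, approx. 152Eu lines

        coeffs = np.polyfit(channel, energy, deg=3)   # E(ch) = a3*ch^3 + a2*ch^2 + a1*ch + a0
        print("E(1000.0 ch) = %.2f keV" % np.polyval(coeffs, 1000.0))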

  16. Trends of Science Education Research: An Automatic Content Analysis

    Science.gov (United States)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-08-01

    This study used scientometric methods to conduct an automatic content analysis of the development trends of science education research, based on the articles published from 1990 to 2007 in four journals: International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education. A multi-stage clustering technique was employed to investigate the topics, development trends, and contributors through which the journal publications constructed science education as a research field. This study found that Conceptual Change & Concept Mapping was the most studied research topic, although the number of publications on it declined slightly in the 2000s. Studies in the themes of Professional Development, Nature of Science and Socio-Scientific Issues, and Conceptual Change and Analogy were found to be gaining attention over the years. This study also found that, embedded in the most cited references, the supporting disciplines and theories of science education research are constructivist learning, cognitive psychology, pedagogy, and philosophy of science.

  17. Automatic comic page image understanding based on edge segment analysis

    Science.gov (United States)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of the comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique to produce the digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is performed on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.

  18. Handbook of nuclear engineering: vol 1: nuclear engineering fundamentals; vol 2: reactor design; vol 3: reactor analysis; vol 4: reactors of waste disposal and safeguards

    CERN Document Server

    2013-01-01

    The Handbook of Nuclear Engineering is an authoritative compilation of information regarding methods and data used in all phases of nuclear engineering. Addressing nuclear engineers and scientists at all academic levels, this five volume set provides the latest findings in nuclear data and experimental techniques, reactor physics, kinetics, dynamics and control. Readers will also find a detailed description of data assimilation, model validation and calibration, sensitivity and uncertainty analysis, fuel management and cycles, nuclear reactor types and radiation shielding. A discussion of radioactive waste disposal, safeguards and non-proliferation, and fuel processing with partitioning and transmutation is also included. As nuclear technology becomes an important resource of non-polluting sustainable energy in the future, The Handbook of Nuclear Engineering is an excellent reference for practicing engineers, researchers and professionals.

  19. Automatic adventitious respiratory sound analysis: A systematic review.

    Directory of Open Access Journals (Sweden)

    Renard Xaviero Adhi Pramono

    Full Text Available Automatic detection or classification of adventitious sounds is useful to assist physicians in diagnosing or monitoring diseases such as asthma, Chronic Obstructive Pulmonary Disease (COPD), and pneumonia. While computerised respiratory sound analysis, specifically for the detection or classification of adventitious sounds, has recently been the focus of an increasing number of studies, a standardised approach and comparison have not been well established. To provide a review of existing algorithms for the detection or classification of adventitious respiratory sounds, this systematic review provides a complete summary of methods used in the literature to give a baseline for future works. A systematic review of English articles published between 1938 and 2016, searched using the Scopus (1938-2016) and IEEExplore (1984-2016) databases. Additional articles were further obtained from references listed in the articles found. Search terms included adventitious sound detection, adventitious sound classification, abnormal respiratory sound detection, abnormal respiratory sound classification, wheeze detection, wheeze classification, crackle detection, crackle classification, rhonchi detection, rhonchi classification, stridor detection, stridor classification, pleural rub detection, pleural rub classification, squawk detection, and squawk classification. Only articles that focused on adventitious sound detection or classification based on respiratory sounds, with performance reported and sufficient information provided to be approximately repeated, were included. Investigators extracted data about the adventitious sound type analysed, approach and level of analysis, instrumentation or data source, location of sensor, amount of data obtained, data management, features, methods, and performance achieved. A total of 77 reports from the literature were included in this review. 55 (71.43%) of the studies focused on wheeze, 40 (51.95%) on crackle, 9 (11.69%) on stridor, 9

  20. FluxFix: automatic isotopologue normalization for metabolic tracer analysis.

    Science.gov (United States)

    Trefely, Sophie; Ashwell, Peter; Snyder, Nathaniel W

    2016-11-25

    Isotopic tracer analysis by mass spectrometry is a core technique for the study of metabolism. Isotopically labeled atoms from substrates, such as [13C]-labeled glucose, can be traced by their incorporation over time into specific metabolic products. Mass spectrometry is often used for the detection and differentiation of the isotopologues of each metabolite of interest. For meaningful interpretation, mass spectrometry data from metabolic tracer experiments must be corrected to account for the naturally occurring isotopologue distribution. The calculations required for this correction are time-consuming and error-prone, and existing programs are often platform specific, non-intuitive, commercially licensed and/or limited in accuracy by using theoretical isotopologue distributions, which are prone to artifacts from noise or unresolved interfering signals. Here we present FluxFix (http://fluxfix.science), an application freely available on the internet that quickly and reliably transforms signal intensity values into percent mole enrichment for each isotopologue measured. 'Unlabeled' data, representing the measured natural isotopologue distribution for a chosen analyte, is entered by the user. This data is used to generate a correction matrix according to a well-established algorithm. The correction matrix is applied to labeled data, also entered by the user, thus generating the corrected output data. FluxFix is compatible with direct copy and paste from spreadsheet applications including Excel (Microsoft) and Google Sheets and automatically adjusts to account for input data dimensions. The program is simple, easy to use, agnostic to the mass spectrometry platform, generalizable to known or unknown metabolites, and can take input data from either a theoretical natural isotopologue distribution or an experimentally measured one. Our freely available web-based calculator, FluxFix (http://fluxfix.science), quickly and reliably corrects metabolic tracer data for
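
    A rough sketch of the correction idea (not FluxFix's actual code): the measured unlabeled distribution defines a lower-triangular correction matrix that is solved against the labeled data. All intensity values below are invented.

        # Sketch: natural-abundance correction of isotopologue intensities.
        import numpy as np

        unlabeled = np.array([0.893, 0.083, 0.020, 0.004])   # measured M+0..M+3 fractions (assumed)
        labeled = np.array([0.412, 0.151, 0.310, 0.127])     # tracer-experiment intensities (assumed)

        n = unlabeled.size
        C = np.zeros((n, n))
        for j in range(n):              # column j: natural distribution shifted by j labeled atoms
            C[j:, j] = unlabeled[: n - j]

        corrected = np.clip(np.linalg.solve(C, labeled), 0.0, None)
        corrected /= corrected.sum()    # percent mole enrichment per isotopologue
        print((100.0 * corrected).round(2))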

  1. Automatic analysis of silver-stained comets by CellProfiler software.

    Science.gov (United States)

    González, J E; Romero, I; Barquinero, J F; García, O

    2012-10-09

    The comet assay is one of the most widely used methods to evaluate DNA damage and repair in eukaryotic cells. The comets can be measured by software, in a semi-automatic or automatic process. In this paper, we apply the CellProfiler open-source software for automatic analysis of comets from digitized images, reporting the percentage of tail DNA. A side-by-side comparison of CellProfiler with CASP software demonstrated good agreement between the two packages. Our work demonstrates that automatic measurement of silver-stained comets with open-source software is possible, providing significant time savings. © 2012 Elsevier B.V. All rights reserved.
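
    The reported metric, percent tail DNA, reduces to a ratio of integrated intensities once head and tail are segmented. The toy sketch below assumes a hypothetical rectangular head/tail split rather than CellProfiler's object segmentation.

        # Sketch: percent tail DNA from integrated head and tail intensities.
        import numpy as np

        comet = np.zeros((40, 120))
        comet[15:25, 10:30] = 200.0     # hypothetical head-region intensity
        comet[17:23, 30:90] = 40.0      # hypothetical tail-region intensity

        head = comet[:, :30].sum()
        tail = comet[:, 30:].sum()
        print(f"% tail DNA = {100.0 * tail / (head + tail):.1f}")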

  2. CURRENT STATE ANALYSIS OF AUTOMATIC BLOCK SYSTEM DEVICES, METHODS OF ITS SERVICE AND MONITORING

    Directory of Open Access Journals (Sweden)

    A. M. Beznarytnyy

    2014-01-01

    Full Text Available Purpose. To develop a formalized description of the numerical-code automatic block system based on an analysis of its characteristic failures and of its maintenance procedures. Methodology. Theoretical and analytical methods were used in this research. Findings. Typical failures of automatic block systems were analyzed and their main causes identified. It was determined that the majority of failures occur due to defects in the maintenance system. The advantages and disadvantages of the current service technology for the automatic block system were analyzed, and the tasks that can be automated by means of technical diagnostics were identified. A formal description of the numerical-code automatic block system as a graph in the state space of the system was carried out. Originality. A state graph of the numerical-code automatic block system is offered that takes into account the gradual transition from the serviceable condition to the loss of efficiency, which allows selecting diagnostic information according to attributes and increasing the effectiveness of recovery operations in the case of a malfunction. Practical value. The obtained results of the analysis and the proposed state graph can be used as the basis for the development of new diagnostic facilities for automatic block system devices, which in turn will improve the efficiency and service of automatic block system devices in general.

  3. Background approximation in automatic qualitative X-ray-fluorescent analysis

    International Nuclear Information System (INIS)

    Jordanov, J.; Tsanov, T.; Stefanov, R.; Jordanov, N.; Paunov, M.

    1982-01-01

    An empirical method is proposed for finding the dependence of the background intensity I_bg on the wavelength, based on approximating the experimentally found background values in the course of an automatic qualitative X-ray fluorescence analysis with a pre-set curve. It is assumed that the dependence I_bg(λ) is well approximated by a curve of the type I_bg(λ) = (λ - λ_0)^(f_1(λ)) · exp[f_2(λ)], where f_1(λ) and f_2(λ) are linear functions with respect to the sought parameters. This assumption was checked on a 'pure' starch background, in which it is not known beforehand which points belong to the background. It was assumed that the dependence I_bg(λ) can be found from all minima in the spectrum. Three types of minima were distinguished: 1. the lowest point between two well-resolved X-ray lines; 2. a minimum obtained as a result of statistical fluctuations of the measured signal; 3. the lowest point between two overlapping lines. The minima deviating strongly from the background are removed from the obtained set, and the remaining minima serve as the basis for approximating the dependence I_bg(λ). The unknown parameters are determined by the least-squares method. The approximated curve obtained by this method is closer to the real background than the background determined by the method described by Rigaku Denki, since the effect of all recorded minima is taken into account. As an example, the PbTe spectrum recorded with a LiF 220 crystal is shown graphically. The curve describes the background of the spectrum well, even in the regions in which there are no minima belonging to the background. (authors)
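
    Because f_1 and f_2 are linear, taking logarithms makes the model linear in all four parameters, so the fit reduces to one least-squares step: ln I_bg = (a + b·λ)·ln(λ - λ_0) + c + d·λ. The sketch below demonstrates this on synthetic data; λ_0 and all coefficients are invented.

        # Sketch: linearized least-squares fit of I_bg = (lam - lam0)^(a + b*lam) * exp(c + d*lam).
        import numpy as np

        lam0 = 0.04                                  # assumed short-wavelength limit, nm
        lam = np.linspace(0.06, 0.30, 40)            # wavelengths of background minima (assumed)
        a, b, c, d = 1.4, -2.0, 6.0, -8.0            # "true" parameters for the synthetic data
        I = (lam - lam0) ** (a + b * lam) * np.exp(c + d * lam)

        L = np.log(lam - lam0)                       # design matrix columns: L, lam*L, 1, lam
        A = np.column_stack([L, lam * L, np.ones_like(lam), lam])
        params, *_ = np.linalg.lstsq(A, np.log(I), rcond=None)
        print("a, b, c, d =", params.round(3))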

  4. Automation of chromosomes analysis. Automatic system for image processing

    International Nuclear Information System (INIS)

    Le Go, R.; Cosnac, B. de; Spiwack, A.

    1975-01-01

    The A.S.T.I. is an automatic system for fast conversational processing of all kinds of images (cells, chromosomes) converted to a numerical data set (120,000 points, 16 grey levels stored in a MOS memory) through a fast D.O. analyzer. The system automatically performs the isolation of any individual image, whose area and weighted area are computed. These results are directly displayed on the command panel and can be transferred to a mini-computer for further computations. A bright spot allows parts of an image to be picked out and the results to be displayed. This study is particularly directed towards automatic karyotyping [fr

  5. Automatic acquisition and shape analysis of metastable peaks

    International Nuclear Information System (INIS)

    Maendli, H.; Robbiani, R.; Kuster, Th.; Seibl, J.

    1979-01-01

    A method for the automatic acquisition and evaluation of metastable peaks due to transitions in the first field-free region of a double-focussing mass spectrometer is presented. The data are acquired by computer-controlled repetitive scanning of the accelerating voltage and concomitant accumulation; the evaluation is made by mathematical differentiation of the resulting curve. Examples of the application of the method are given. (Auth.)

  6. Automatic Expansion of a Social Judgment Lexicon for Sentiment Analysis

    OpenAIRE

    Silva, Mário J.; Carvalho, Paula; Costa, Carlos; Sarmento, Luís

    2010-01-01

    Reviewed by Francisco Couto We present a new method for automatically enlarging a sentiment lexicon for mining social judgments from text, i.e., extracting opinions about human subjects. We use a two-step approach: first, we find which adjectives can be used as human modifiers, and then we assign their polarity attribute. To identify the human modifiers, we developed a set of hand-crafted lexico-syntactic rules representing elementary copular and adnominal constructions where such predicat...

  7. A semantically annotated verbal autopsy corpus for automatic analysis of cause of death

    OpenAIRE

    Danso, S; Atwell, ES; Johnson, O; ten Asbroek, A; Soromekun, S; Edmond, K; Hurt, C; Hurt, L; Zandoh, C; Tawiah, C; Fenty, J; Etego, S; Agyei, S; Kirkwood, B

    2013-01-01

    An annotated corpus is essential to the development and evaluation of automatic approaches in corpus linguistics research. The biomedical domain is one area that is witnessing a high growth of corpus based approaches to the development of automatic systems. This paper presents a method employed in building a semantically annotated corpus of 11,741 Verbal Autopsy documents based on verbal records of deaths of mothers, stillbirths, and infants up to 1 year of age, captured for analysis in Ghana...

  8. A new approach for automatic control modeling, analysis and design in fully fuzzy environment

    OpenAIRE

    Gabr, Walaa Ibrahim

    2015-01-01

    The paper presents a new approach for the modeling, analysis and design of automatic control systems in fully fuzzy environment based on the normalized fuzzy matrices. The approach is also suitable for determining the propagation of fuzziness in automatic control and dynamical systems where all system coefficients are expressed as fuzzy parameters. A new consolidity chart is suggested based on the recently newly developed system consolidity index for testing the susceptibility of the system t...

  9. An automatized frequency analysis for vine plot detection and delineation in remote sensing

    OpenAIRE

    Delenne , Carole; Rabatel , G.; Deshayes , M.

    2008-01-01

    The availability of an automatic tool for vine plot detection, delineation, and characterization would be very useful for management purposes. An automatic and recursive process using frequency analysis (with Fourier transform and Gabor filters) has been developed to meet this need. This results in the determination of vine plot boundary and accurate estimation of interrow width and row orientation. To foster large-scale applications, tests and validation have been carried out on standard ver...

  10. Analysis of measurements on wind turbine arrays. Vol. 1

    International Nuclear Information System (INIS)

    Hoeg, E.

    1990-12-01

    In 1989 a Danish electric power company initiated an analysis of eight wind turbine arrays. Data from this project is presented together with the explained results of the analyses and the output variations for individual arrays and for systems within the arrays. The models for prognosis are compared and evaluated in order to find that which is most effective. (AB)

  11. ANALYSIS OF SOFTWARE THREATS TO THE AUTOMATIC IDENTIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Marijan Gržan

    2017-01-01

    Full Text Available The Automatic Identification System (AIS) represents an important improvement in the fields of maritime security and vessel tracking. It is used by the signatory countries to the SOLAS Convention and by private and public providers. Its main advantage is that it can be used as an additional navigation aid, especially in avoiding collisions at sea and in search and rescue operations. The present work analyses the functioning of the AIS system and the ways of exchanging data among its users. We also study one of the vulnerabilities of the system that can be abused by malicious users. The threat itself is analysed in detail in order to provide insight into the whole process, from the creation of a program to its implementation.

  12. Finite element fatigue analysis of rectangular clutch spring of automatic slack adjuster

    Science.gov (United States)

    Xu, Chen-jie; Luo, Zai; Hu, Xiao-feng; Jiang, Wen-song

    2015-02-01

    The failure of the rectangular clutch spring of an automatic slack adjuster directly affects the operation of the automatic slack adjuster. We establish the structural mechanics model of the automatic slack adjuster's rectangular clutch spring based on its working principle and mechanical structure. In addition, we load this structural mechanics model into the ANSYS Workbench FEA system to predict the fatigue life of the rectangular clutch spring. FEA results show that the fatigue life of the rectangular clutch spring is 2.0403×10^5 cycles under the effect of braking loads. In the meantime, fatigue tests of 20 automatic slack adjusters were carried out on a fatigue test bench to verify the conclusions of the structural mechanics model. The experimental results show that the mean fatigue life of the rectangular clutch spring is 1.9101×10^5 cycles, which agrees with the results of the finite element analysis using the ANSYS Workbench FEA system.

  13. Analysis of automobile’s automatic control systems for the hill climbing start

    Directory of Open Access Journals (Sweden)

    Valeriy I. Klimenko

    2014-12-01

    Full Text Available To improve road safety when driving uphill and to ease the driver's workload, leading automobile manufacturers are introducing automatic hill-hold control systems into car designs. The purpose of this study is to analyze the design of existing automatic hill-start control systems. The existing design developments of automatic hill-start assist control systems, applied when pulling away on a gradient, are analyzed. The research carried out makes it possible to select a scheme for the further development of automatic hill-start control systems. Further improvement of driving control systems, primarily driver-assistance hill-hold control systems, is necessary to increase both driving comfort and traffic safety.

  14. Automatic computer aided analysis algorithms and system for adrenal tumors on CT images.

    Science.gov (United States)

    Chai, Hanchao; Guo, Yi; Wang, Yuanyuan; Zhou, Guohui

    2017-12-04

    An adrenal tumor disturbs the secreting function of adrenocortical cells, leading to many diseases, and different kinds of adrenal tumors require different therapeutic schedules. In practical diagnosis, judging the tumor type relies heavily on the doctor's experience in reading hundreds of CT images. This paper proposes an automatic computer-aided analysis method for adrenal tumor detection and classification. It consists of automatic segmentation algorithms, feature extraction and classification algorithms. These algorithms were integrated into a system operated through a graphical interface built with the MATLAB graphical user interface (GUI) tools. The accuracy of the automatic computer-aided segmentation and classification reached 90% on 436 CT images. The experiments proved the stability and reliability of this automatic computer-aided analytic system.

  15. Automatic analysis of macerals and reflectance; Analisis Automatico de Macerales y Reflectancia

    Energy Technology Data Exchange (ETDEWEB)

    Catalina, J.C.; Alarcon, D.; Gonzalez Prado, J.

    1998-12-01

    A new system has been developed to perform automatically macerals and reflectance analysis of single-seam bituminous coals, improving the interlaboratory accuracy of these types of analyses. The system follows the same steps as the manual method, requiring a human operator for preparation of coal samples and system startup; then, sample scanning, microscope focusing and field centre analysis are fully automatic. The main and most innovative idea of this approach is to coordinate an expert system with an image processing system, using both reflectance and morphological information. In this way, the system tries to reproduce the analysis procedure followed by a human expert in petrography. (Author)

  16. ANALYSIS OF EXISTING AND PROSPECTIVE TECHNICAL CONTROL SYSTEMS OF NUMERIC CODES AUTOMATIC BLOCKING

    Directory of Open Access Journals (Sweden)

    A. M. Beznarytnyy

    2013-09-01

    Full Text Available Purpose. To identify the characteristic features of the engineering control systems for numeric-code automatic blocking, to determine their advantages and disadvantages, to analyze the possibility of their use for diagnosing the status of automatic block devices, and to set targets for the development of new diagnostic systems. Methodology. To achieve these objectives, the theoretical-analytical method and the method of functional analysis have been used. Findings. The analysis of existing and prospective facilities for remote control and diagnostics of automatic block devices showed that the existing diagnostic systems are not sufficiently informative, being designed primarily to monitor discrete parameters, which in turn does not allow a decision-support subsystem to be constructed. For the development of new technical diagnostic systems it was proposed to use the principle of centralized distributed processing of diagnostic data and to include a decision-support subsystem in the diagnostic system; this will reduce the amount of maintenance work on the blocking devices and reduce the recovery time after a failure occurs. Originality. The currently existing engineering control facilities for automatic blocking cannot provide a full assessment of the state of line-section signalling devices and interlocks. Criteria for the development of new technical diagnostic systems with increased amounts of diagnostic information and its automatic analysis were proposed. Practical value. The results of the analysis can be used in practice to select technical controls for automatic block devices, as well as for the further development of automatic block diagnostic systems, allowing a gradual transition from a planned preventive maintenance model to servicing based on the actual state of the monitored devices.

  17. Statistical language analysis for automatic exfiltration event detection.

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.

  18. Automatic Content Analysis; Part I of Scientific Report No. ISR-18, Information Storage and Retrieval...

    Science.gov (United States)

    Cornell Univ., Ithaca, NY. Dept. of Computer Science.

    Four papers are included in Part One of the eighteenth report on Salton's Magical Automatic Retriever of Texts (SMART) project. The first paper: "Content Analysis in Information Retrieval" by S. F. Weiss presents the results of experiments aimed at determining the conditions under which content analysis improves retrieval results as well…

  19. On the Application of Syntactic Methodologies in Automatic Text Analysis.

    Science.gov (United States)

    Salton, Gerard; And Others

    1990-01-01

    Summarizes various linguistic approaches proposed for document analysis in information retrieval environments. Topics discussed include syntactic analysis; use of machine-readable dictionary information; knowledge base construction; the PLNLP English Grammar (PEG) system; phrase normalization; and statistical and syntactic phrase evaluation used…

  20. A full automatic system controlled with IBM-PC/XT micro-computer for neutron activation analysis

    International Nuclear Information System (INIS)

    Song Quanxun

    1992-01-01

    A fully automatic system for NAA controlled by micro-computers is described. All processes are completed automatically with an IBM-PC/XT micro-computer. The device is stable, reliable, flexible and convenient to use, and has many functions and applications in the automatic analysis of long-, medium- and short-lived nuclides. Due to the high working efficiency of the instrument and the micro-computers, both time and power are saved. The method can also be applied in other nuclear analysis techniques

  1. Porosity determination on pyrocarbon using automatic quantitative image analysis

    International Nuclear Information System (INIS)

    Koizlik, K.; Uhlenbruck, U.; Delle, W.; Nickel, H.

    Methods of porosity determination are reviewed and applied to the measurement of the porosity of pyrocarbon. Specifically, the mathematical basis of stereology and the procedures involved in quantitative image analysis are detailed

  2. ATC Operations Analysis via Automatic Recognition of Clearances, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Recent advances in airport surface surveillance have motivated the creation of new tools for analysis of Air Traffic Control (ATC) operations, such as the Surface...

  3. ATC Operations Analysis via Automatic Recognition of Clearances, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Recent advances in airport surface surveillance have motivated the creation of new tools and data sources for analysis of Air Traffic Control (ATC) operations. The...

  4. Automatic flow analysis of digital subtraction angiography using independent component analysis in patients with carotid stenosis.

    Directory of Open Access Journals (Sweden)

    Han-Jui Lee

    Full Text Available Current time-density curve analysis of digital subtraction angiography (DSA) provides intravascular flow information but requires manual vasculature selection. We developed an angiographic marker that represents cerebral perfusion by using automatic independent component analysis. We retrospectively analyzed the data of 44 patients with unilateral carotid stenosis higher than 70% according to North American Symptomatic Carotid Endarterectomy Trial criteria. For all patients, magnetic resonance perfusion (MRP) was performed one day before DSA. Fixed contrast injection protocols and DSA acquisition parameters were used before stenting. The cerebral circulation time (CCT) was defined as the difference in the time to peak between the parietal vein and the cavernous internal carotid artery in a lateral angiogram. Both anterior-posterior and lateral DSA views were processed using independent component analysis, and the capillary angiogram was extracted automatically. The full width at half maximum of the time-density curve in the capillary phase in the anterior-posterior and lateral DSA views was defined as the angiographic mean transit time (aMTT), i.e., aMTTAP and aMTTLat. The correlations between the degree of stenosis, CCT, aMTTAP and aMTTLat, and MRP parameters were evaluated. The degree of stenosis showed no correlation with CCT, aMTTAP, aMTTLat, or any MRP parameter. CCT showed a strong correlation with aMTTAP (r = 0.67) and aMTTLat (r = 0.72). Among the MRP parameters, CCT showed only a moderate correlation with MTT (r = 0.67) and Tmax (r = 0.40). aMTTAP showed a moderate correlation with Tmax (r = 0.42) and a strong correlation with MTT (r = 0.77). aMTTLat also showed similar correlations with Tmax (r = 0.59) and MTT (r = 0.73). Apart from vascular anatomy, aMTT estimates brain parenchyma hemodynamics from DSA and is concordant with MRP. This process is completely automatic and provides immediate measurement of quantitative peritherapeutic brain parenchyma
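
    The aMTT metric itself is straightforward to reproduce: the full width at half maximum of a capillary-phase time-density curve. The sketch below uses a synthetic gamma-variate-like curve; in the actual pipeline the capillary phase is first isolated by independent component analysis.

        # Sketch: full width at half maximum (FWHM) of a time-density curve.
        import numpy as np

        t = np.linspace(0.0, 12.0, 600)              # seconds
        tdc = t ** 2 * np.exp(-t / 1.5)              # synthetic capillary time-density curve

        half = tdc.max() / 2.0
        above = np.where(tdc >= half)[0]
        fwhm = t[above[-1]] - t[above[0]]            # aMTT estimate
        print(f"aMTT (FWHM) = {fwhm:.2f} s")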

  5. Attacking Automatic Video Analysis Algorithms: A Case Study of Google Cloud Video Intelligence API

    OpenAIRE

    Hosseini, Hossein; Xiao, Baicen; Clark, Andrew; Poovendran, Radha

    2017-01-01

    Due to the growth of video data on Internet, automatic video analysis has gained a lot of attention from academia as well as companies such as Facebook, Twitter and Google. In this paper, we examine the robustness of video analysis algorithms in adversarial settings. Specifically, we propose targeted attacks on two fundamental classes of video analysis algorithms, namely video classification and shot detection. We show that an adversary can subtly manipulate a video in such a way that a human...

  6. Alleviating Search Uncertainty through Concept Associations: Automatic Indexing, Co-Occurrence Analysis, and Parallel Computing.

    Science.gov (United States)

    Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.

    1998-01-01

    Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…

  7. Automatic morphometry of synaptic boutons of cultured cells using granulometric analysis of digital images

    NARCIS (Netherlands)

    Prodanov, D.P.; Heeroma, Joost; Marani, Enrico

    2006-01-01

    Numbers, linear density, and surface area of synaptic boutons can be important parameters in studies on synaptic plasticity in cultured neurons. We present a method for automatic identification and morphometry of boutons based on filtering of digital images using granulometric analysis. Cultures of

  8. System for automatic x-ray-image analysis, measurement, and sorting of laser fusion targets

    International Nuclear Information System (INIS)

    Singleton, R.M.; Perkins, D.E.; Willenborg, D.L.

    1980-01-01

    This paper describes the Automatic X-Ray Image Analysis and Sorting (AXIAS) system which is designed to analyze and measure x-ray images of opaque hollow microspheres used as laser fusion targets. The x-ray images are first recorded on a high resolution film plate. The AXIAS system then digitizes and processes the images to accurately measure the target parameters and defects. The primary goals of the AXIAS system are: to provide extremely accurate and rapid measurements, to engineer a practical system for a routine production environment and to furnish the capability of automatically measuring an array of images for sorting and selection

  9. Automatic snoring sounds detection from sleep sounds via multi-features analysis.

    Science.gov (United States)

    Wang, Can; Peng, Jianxin; Song, Lijuan; Zhang, Xiaowen

    2017-03-01

    Obstructive sleep apnea hypopnea syndrome (OSAHS) is a serious respiratory disorder. Snoring is the most intuitively characteristic symptom of OSAHS. Recently, many studies have attempted to develop snore analysis technology for diagnosing OSAHS. The preliminary and essential step in such diagnosis is to automatically segment snoring sounds from original sleep sounds. This study presents an automatic snoring detection algorithm that detects potential snoring episodes using an adaptive effective-value threshold method, linear and nonlinear feature extraction using maximum power ratio, sum of positive/negative amplitudes, 500 Hz power ratio, spectral entropy (SE) and sample entropy (SampEn), and automatic snore/nonsnore classification using a support vector machine. The results show that SampEn provides higher classification accuracy than SE. Furthermore, the proposed automatic detection method achieved over 94.0% accuracy when identifying snoring and nonsnoring sounds despite using small training sets. The sensitivity and accuracy of the results demonstrate that the proposed snoring detection method can effectively classify snoring and nonsnoring sounds, thus enabling the automatic detection of snoring.
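
    Of the listed features, sample entropy is the least standard to implement. Below is a minimal SampEn sketch on a synthetic audio frame; the embedding dimension and tolerance follow common defaults (m = 2, r = 0.2·SD), which are not necessarily the paper's settings.

        # Sketch: sample entropy (SampEn) of a short signal frame.
        import numpy as np

        def sampen(x, m=2, r_factor=0.2):
            x = np.asarray(x, dtype=float)
            r = r_factor * x.std()
            def pair_count(mm):
                # all templates of length mm, pairwise Chebyshev distances
                tmpl = np.array([x[i:i + mm] for i in range(x.size - mm)])
                d = np.abs(tmpl[:, None, :] - tmpl[None, :, :]).max(axis=2)
                return (d <= r).sum() - tmpl.shape[0]   # matching pairs, self-matches removed
            B, A = pair_count(m), pair_count(m + 1)
            return -np.log(A / B)

        rng = np.random.default_rng(1)
        n = np.arange(800)
        frame = np.sin(2.0 * np.pi * 60.0 * n / 8000.0) + 0.2 * rng.standard_normal(n.size)
        print(f"SampEn = {sampen(frame):.3f}")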

  10. A microprocessor based picture analysis system for automatic track measurements

    International Nuclear Information System (INIS)

    Heinrich, W.; Trakowski, W.; Beer, J.; Schucht, R.

    1982-01-01

    In the last few years picture analysis became a powerful technique for measurements of nuclear tracks in plastic detectors. For this purpose rather expensive commercial systems are available. Two inexpensive microprocessor based systems with different resolution were developed. The video pictures of particles seen through a microscope are digitized in real time and the picture analysis is done by software. The microscopes are equipped with stages driven by stepping motors, which are controlled by separate microprocessors. A PDP 11/03 supervises the operation of all microprocessors and stores the measured data on its mass storage devices. (author)

  11. Automatic quantitative analysis of cardiac MR perfusion images

    NARCIS (Netherlands)

    Breeuwer, Marcel; Spreeuwers, Lieuwe Jan; Quist, Marcel

    2001-01-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the

  12. Rapid Automatized Naming and Reading Performance: A Meta-Analysis

    Science.gov (United States)

    Araújo, Susana; Reis, Alexandra; Petersson, Karl Magnus; Faísca, Luís

    2015-01-01

    Evidence that rapid naming skill is associated with reading ability has become increasingly prevalent in recent years. However, there is considerable variation in the literature concerning the magnitude of this relationship. The objective of the present study was to provide a comprehensive analysis of the evidence on the relationship between rapid…

  13. Automatic settlement analysis of single-layer armour layers

    NARCIS (Netherlands)

    Hofland, B.; van gent, Marcel

    2016-01-01

    A method to quantify, analyse, and present the settlement of single-layer concrete armour layers of coastal structures is presented. The use of the image processing technique for settlement analysis is discussed based on various modelling studies performed over the years. The accuracy of the

  14. the evaluation of a discrete automatic analysis system

    African Journals Online (AJOL)

    1971-05-29

    May 29, 1971 ... Through the courtesy of the manufacturers we have recently carried out a limited evaluation of the Analmatic discrete analysis system. The present study presents some of the factual data acquired during this evaluation, together with a consideration of the relative advantages and disadvantages of the ...

  15. AntDAS: Automatic Data Analysis Strategy for UPLC-QTOF-Based Nontargeted Metabolic Profiling Analysis.

    Science.gov (United States)

    Fu, Hai-Yan; Guo, Xiao-Ming; Zhang, Yue-Ming; Song, Jing-Jing; Zheng, Qing-Xia; Liu, Ping-Ping; Lu, Peng; Chen, Qian-Si; Yu, Yong-Jie; She, Yuanbin

    2017-10-17

    High-quality data analysis methodology remains a bottleneck for metabolic profiling analysis based on ultraperformance liquid chromatography-quadrupole time-of-flight mass spectrometry. The present work aims to address this problem by proposing a novel data analysis strategy wherein (1) chromatographic peaks in the UPLC-QTOF data set are automatically extracted by using an advanced multiscale Gaussian smoothing-based peak extraction strategy; (2) a peak annotation stage is used to cluster fragment ions that belong to the same compound; with the aid of the high-resolution mass spectrometer, (3) a time-shift correction across the samples is efficiently performed by a new peak alignment method; (4) components are registered by using a newly developed adaptive network searching algorithm; (5) statistical methods, such as analysis of variance and hierarchical cluster analysis, are then used to identify the underlying marker compounds; finally, (6) compound identification is performed by matching the extracted peak information, involving high-precision m/z and retention time, against our compound library containing more than 500 plant metabolites. A manually designed mixture of 18 compounds is used to evaluate the performance of the method, and all compounds are detected at various concentration levels. The developed method is comprehensively evaluated on an extremely complex plant data set containing more than 2000 components. Results indicate that the performance of the developed method is comparable with XCMS. The MATLAB GUI code is available from http://software.tobaccodb.org/software/antdas.
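
    As a stand-in for stage (1), the sketch below smooths a synthetic ion chromatogram at a few Gaussian scales and keeps the maxima that persist across all of them. This is a simplification of AntDAS's multiscale strategy, not its code; the scales, prominence threshold and matching tolerance are assumptions.

        # Sketch: peaks that persist across several Gaussian smoothing scales.
        import numpy as np
        from scipy.ndimage import gaussian_filter1d
        from scipy.signal import find_peaks

        rng = np.random.default_rng(2)
        t = np.arange(2000)
        xic = (900.0 * np.exp(-0.5 * ((t - 600) / 8.0) ** 2)
               + 400.0 * np.exp(-0.5 * ((t - 1300) / 12.0) ** 2)
               + 20.0 * rng.standard_normal(t.size))

        candidate_sets = []
        for sigma in (2, 4, 8):                      # assumed scale set
            peaks, _ = find_peaks(gaussian_filter1d(xic, sigma), prominence=100.0)
            candidate_sets.append(set(np.round(peaks, -1)))   # crude 10-scan matching tolerance
        print(sorted(set.intersection(*candidate_sets)))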

  16. Learning Enterprise Malware Triage from Automatic Dynamic Analysis

    Science.gov (United States)

    2013-03-01

    variety of responses to mitigate the threat of malware. To counteract signature-based malware detection, such as antivirus products and static analysis...services, such as antivirus software, anti-spyware software or host-based intrusion detection system software. Another source of executable files on...classification process involves building a model on two or more distinct classes of training samples, and the model then attempts to predict the

  17. A framework for automatic heart sound analysis without segmentation.

    Science.gov (United States)

    Yuenyong, Sumeth; Nishihara, Akinori; Kongprawechnon, Waree; Tungpimolrut, Kanokvate

    2011-02-09

    A new framework for heart sound analysis is proposed. One of the most difficult processes in heart sound analysis is segmentation, due to interference from murmurs. Equal numbers of cardiac cycles were extracted from heart sounds with different heart rates using information from the envelopes of autocorrelation functions, without the need to label individual fundamental heart sounds (FHS). The complete method consists of envelope detection, calculation of cardiac cycle lengths using autocorrelation of envelope signals, feature extraction using the discrete wavelet transform, principal component analysis, and classification using neural network bagging predictors. The proposed method was tested on a set of heart sounds obtained from several on-line databases and recorded with an electronic stethoscope. The geometric mean was used as the performance index. Average classification performance using ten-fold cross-validation was 0.92 for the noise-free case, 0.90 under white noise with 10 dB signal-to-noise ratio (SNR), and 0.90 under impulse noise of up to 0.3 s duration. The proposed method showed promising results and high noise robustness for a wide range of heart sounds. However, more tests are needed to address any bias that may have been introduced by the different sources of heart sounds in the current training set, and to concretely validate the method. Further work includes building a new training set recorded from actual patients, then further evaluating the method on this new training set.

  18. A framework for automatic heart sound analysis without segmentation

    Directory of Open Access Journals (Sweden)

    Tungpimolrut Kanokvate

    2011-02-01

    Full Text Available Abstract Background A new framework for heart sound analysis is proposed. One of the most difficult processes in heart sound analysis is segmentation, due to interference from murmurs. Method Equal numbers of cardiac cycles were extracted from heart sounds with different heart rates using information from the envelopes of autocorrelation functions, without the need to label individual fundamental heart sounds (FHS). The complete method consists of envelope detection, calculation of cardiac cycle lengths using autocorrelation of envelope signals, feature extraction using the discrete wavelet transform, principal component analysis, and classification using neural network bagging predictors. Result The proposed method was tested on a set of heart sounds obtained from several on-line databases and recorded with an electronic stethoscope. The geometric mean was used as the performance index. Average classification performance using ten-fold cross-validation was 0.92 for the noise-free case, 0.90 under white noise with 10 dB signal-to-noise ratio (SNR), and 0.90 under impulse noise of up to 0.3 s duration. Conclusion The proposed method showed promising results and high noise robustness for a wide range of heart sounds. However, more tests are needed to address any bias that may have been introduced by the different sources of heart sounds in the current training set, and to concretely validate the method. Further work includes building a new training set recorded from actual patients, then further evaluating the method on this new training set.
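
    The segmentation-free step shared by both versions of this record is estimating the cardiac cycle length from the autocorrelation of the envelope; below is a sketch under assumed signal parameters (synthetic ~0.83 s cycles).

        # Sketch: cardiac cycle length from the autocorrelation of a heart-sound envelope.
        import numpy as np
        from scipy.signal import hilbert, find_peaks

        fs = 1000                                           # Hz, assumed sampling rate
        t = np.arange(0.0, 8.0, 1.0 / fs)
        s = np.sin(2.0 * np.pi * 30.0 * t) * np.sin(np.pi * 1.2 * t) ** 10   # ~0.83 s bursts

        env = np.abs(hilbert(s))                            # envelope detection
        env -= env.mean()
        ac = np.correlate(env, env, mode="full")[env.size - 1:]

        lags, _ = find_peaks(ac, height=0.3 * ac[0])        # first strong peak = cycle length
        print(f"estimated cardiac cycle = {lags[0] / fs:.2f} s")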

  19. Image analysis in automatic system of pollen recognition

    Directory of Open Access Journals (Sweden)

    Piotr Rapiejko

    2012-12-01

    Full Text Available In allergology practice and research, it would be convenient to receive pollen identification and monitoring results in a much shorter time than human identification allows. Image-based analysis is one of the approaches to an automated identification scheme for pollen grains, and pattern recognition on such images is widely used as a powerful tool. The goal of such an attempt is to provide accurate, fast recognition, classification and counting of pollen grains by a computer system for monitoring. The isolated pollen grains are objects extracted from the microscopic image by a CCD camera and a PC under proper conditions for further analysis. The algorithms are based on knowledge from feature-vector analysis of estimated parameters calculated from grain characteristics, including morphological features, surface features and other applicable estimated characteristics. Segmentation algorithms specially tailored to pollen object characteristics provide exact descriptions of the pollen characteristics (border and internal features) already used by human experts. The specific characteristics and their measures are statistically estimated for each object. Low-level statistics for the estimated local and global measures of the features establish the feature space. Special care should be paid to choosing these features and to constructing the feature space, to optimize the number of subspaces for higher recognition rates in low-level classification for type differentiation of pollen grains. The results of estimated parameters of the feature vector in a low-dimensional space for some typical pollen types are presented, as well as some effective and fast recognition results of experiments performed for different pollens. The findings show the evidence of using properly chosen estimators of central and invariant moments (M21, NM2, NM3, NM8, NM9) of tailored characteristics as good enough classification measures (efficiency > 95%), even for low-dimensional classifiers
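
    The moment features cited at the end (central and invariant moments) can be computed directly from a segmented grain image; the sketch below uses a synthetic elliptical blob and the standard moment definitions, not the authors' exact estimators.

        # Sketch: central and normalized central moments of one segmented object.
        import numpy as np

        yy, xx = np.mgrid[:64, :64]
        img = (((yy - 32.0) ** 2 + ((xx - 32.0) / 1.4) ** 2) < 200.0).astype(float)  # "grain"

        m00 = img.sum()
        xbar = (xx * img).sum() / m00
        ybar = (yy * img).sum() / m00

        def mu(p, q):                     # central moment mu_pq
            return (((xx - xbar) ** p) * ((yy - ybar) ** q) * img).sum()

        def eta(p, q):                    # normalized central moment (scale invariant)
            return mu(p, q) / m00 ** (1.0 + (p + q) / 2.0)

        print(f"eta20 = {eta(2, 0):.4f}  eta02 = {eta(0, 2):.4f}  eta21 = {eta(2, 1):.2e}")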

  20. Rhodium in car exhaust tips by total automatic activation analysis

    International Nuclear Information System (INIS)

    Grass, F.; Westphal, G.P.; Lemmel, H.; Sterba, J.

    2007-01-01

    Exhaust systems of modern cars contain catalysts for the reduction of CO, NOx and hydrocarbons. These catalysts are made of ceramic materials with a large surface on which platinum metals catalyse the oxidation. The catalysts contain approximately 2 g of platinum and 0.4 g of rhodium. Recently, platinum is being replaced by palladium. During driving, the platinum-group elements (PGEs) are expelled from the tip in fine particles and are deposited in the environment. For a projected study of emissions from cars driven on streets and highways, it is important to know which elements can be measured by short-time activation analysis without any chemical procedure. (author)

  1. Automatic forensic analysis of automotive paints using optical microscopy.

    Science.gov (United States)

    Thoonen, Guy; Nys, Bart; Vander Haeghen, Yves; De Roy, Gilbert; Scheunders, Paul

    2016-02-01

    The timely identification of vehicles involved in an accident, such as a hit-and-run situation, bears great importance in forensics. To this end, procedures have been defined for analyzing car paint samples that combine techniques such as visual analysis and Fourier transform infrared spectroscopy. This work proposes a new methodology in order to automate the visual analysis using image retrieval. Specifically, color and texture information is extracted from a microscopic image of a recovered paint sample, and this information is then compared with the same features for a database of paint types, resulting in a shortlist of candidate paints. In order to demonstrate the operation of the methodology, a test database has been set up and two retrieval experiments have been performed. The first experiment quantifies the performance of the procedure for retrieving exact matches, while the second experiment emulates the real-life situation of paint samples that experience changes in color and texture over time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  2. Microsoft Academic Automatic Document Searches: Accuracy for Journal Articles and Suitability for Citation Analysis

    OpenAIRE

    Thelwall, Mike

    2017-01-01

    Microsoft Academic is a free academic search engine and citation index that is similar to Google Scholar but can be automatically queried. Its data is potentially useful for bibliometric analysis if it is possible to search effectively for individual journal articles. This article compares different methods to find journal articles in its index by searching for a combination of title, authors, publication year and journal name and uses the results for the widest published correlation analysis...

  3. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications, positioning, navigation and timing, missile warning, and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry information should be valuable for achieving Space Situational Awareness (SSA), these new satellite telemetry data consumers will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure to process satellite telemetry into higher abstraction level symbolic space situational awareness and to initially populate that infrastructure with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of whom control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design which has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach which combines symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current Machine Learning approaches. BTNs are used to represent the process and associated formulas to check telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that represent an investigative process that should be applied to the telemetry in certain circumstances

  4. Automatic localization of cerebral cortical malformations using fractal analysis

    Science.gov (United States)

    De Luca, A.; Arrigoni, F.; Romaniello, R.; Triulzi, F. M.; Peruzzo, D.; Bertoldo, A.

    2016-08-01

    Malformations of cortical development (MCDs) encompass a variety of brain disorders affecting the normal development and organization of the brain cortex. The relatively low incidence and the extreme heterogeneity of these disorders hamper the application of classical group-level approaches for the detection of lesions. Here, we present a geometrical descriptor for a voxel-level analysis based on fractal geometry, then define two similarity measures to detect the lesions at the single-subject level. The pipeline was applied to 15 normal children and nine pediatric patients affected by MCDs following two criteria, maximum accuracy (WACC) and minimization of false positives (FPR), and proved that our lesion detection algorithm is able to detect and locate abnormalities of the brain cortex with high specificity (WACC = 85%, FPR = 96%), sensitivity (WACC = 83%, FPR = 63%) and accuracy (WACC = 85%, FPR = 90%). The combination of global and local features proves to be effective, making the algorithm suitable for the detection of both focal and diffused malformations. Compared to other existing algorithms, this method shows higher accuracy and sensitivity.
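
    The descriptor is built on fractal geometry; as a hedged illustration of the family of measures involved, the sketch below estimates a box-counting dimension for a synthetic wiggly contour. The paper's voxel-level descriptor and similarity measures are more elaborate.

        # Sketch: box-counting dimension of a 2D binary contour.
        import numpy as np

        rng = np.random.default_rng(3)
        img = np.zeros((256, 256), dtype=bool)
        t = np.linspace(0.0, 2.0 * np.pi, 4000)
        r = 90.0 + 15.0 * np.sin(9.0 * t) + 5.0 * rng.standard_normal(t.size)   # wiggly contour
        img[(128 + r * np.sin(t)).astype(int) % 256, (128 + r * np.cos(t)).astype(int) % 256] = True

        sizes = [2, 4, 8, 16, 32]
        counts = [img.reshape(256 // s, s, 256 // s, s).any(axis=(1, 3)).sum() for s in sizes]

        slope = np.polyfit(np.log(sizes), np.log(counts), 1)[0]
        print(f"box-counting dimension ~ {-slope:.2f}")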

  5. Towards automatic analysis of dynamic radionuclide studies using principal-components factor analysis

    International Nuclear Information System (INIS)

    Nigran, K.S.; Barber, D.C.

    1985-01-01

    A method is proposed for the automatic analysis of dynamic radionuclide studies using the mathematical technique of principal-components factor analysis. This method is considered a possible alternative to the conventional manual regions-of-interest method in wide use. The method emphasises the importance of introducing a priori information into the analysis about the physiology of at least one of the functional structures in a study. Information is added by using suitable mathematical models to describe the underlying physiological processes. A single physiological factor is extracted representing the particular dynamic structure of interest. Two spaces, the 'study space' S and the 'theory space' T, are defined in forming the concept of the intersection of spaces. A one-dimensional intersection space is computed. An example from a dynamic 99mTc-DTPA kidney study is used to demonstrate the principle inherent in the proposed method. The method requires no correction for blood background activity, which is necessary when processing by the manual method. Careful isolation of the kidney by means of a region of interest is not required. The method is therefore less prone to operator influence and can be automated. (author)

  6. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    International Nuclear Information System (INIS)

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software requiring high reliability. In SFTA, experts predict failures of a system through HAZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees for those failures. Quality and cost of the software fault tree therefore depend on the knowledge and experience of the experts. This paper proposes a CASE tool, NuFTA, to assist experts in safety analysis. NuFTA automatically generates software fault trees from a NuSCR formal requirements specification. NuSCR is a formal specification language used for specifying the software requirements of the KNICS RPS (Reactor Protection System) in Korea. We used previously proposed SFTA templates to generate the fault trees automatically. NuFTA also generates logical formulae summarizing each failure's cause, and we plan to make use of these formulae through formal verification techniques

  7. Four-Channel Biosignal Analysis and Feature Extraction for Automatic Emotion Recognition

    Science.gov (United States)

    Kim, Jonghwa; André, Elisabeth

    This paper investigates the potential of physiological signals as a reliable channel for automatic recognition of a user's emotional state. Compared to audio-visual emotion channels such as facial expression or speech, little attention has been paid so far to physiological signals for emotion recognition. All essential stages of an automatic recognition system using biosignals are discussed, from recording a physiological dataset up to feature-based multiclass classification. Four-channel biosensors are used to measure electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, multiscale entropy, etc., is proposed in order to search for the best emotion-relevant features and to correlate them with emotional states. The best extracted features are specified in detail and their effectiveness is proven by emotion recognition results.
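
    The record lists the feature families but not their formulas; purely as an illustration, here is a small sketch of time-domain and entropy features of the kind described, computed over one window of a single biosignal channel with NumPy/SciPy. The feature names, window length and sampling rate are assumptions.

    import numpy as np
    from scipy.stats import entropy

    def biosignal_features(x, fs=256):
        """A few illustrative features for one biosignal window (e.g. EMG)."""
        x = np.asarray(x, dtype=float)
        feats = {
            "mean": x.mean(),
            "std": x.std(),
            "rms": np.sqrt(np.mean(x ** 2)),
            # Dominant frequency from the magnitude spectrum.
            "peak_freq_hz": np.fft.rfftfreq(x.size, 1 / fs)[
                np.argmax(np.abs(np.fft.rfft(x - x.mean())))
            ],
        }
        # Shannon entropy of the amplitude histogram (a simple entropy feature).
        hist, _ = np.histogram(x, bins=32, density=True)
        feats["amp_entropy"] = entropy(hist + 1e-12)
        return feats

    # Illustrative use on a synthetic 10 Hz signal with noise.
    t = np.arange(0, 2, 1 / 256)
    signal = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)
    print(biosignal_features(signal))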

  8. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    Science.gov (United States)

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique for understanding the dynamics of complex biochemical systems. To promote such modeling, we developed the CADLIVE dynamic simulator, which automatically converts a biochemical map into its associated mathematical model, simulates its dynamic behaviors and analyzes its robustness. To enhance the feasibility of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.

  9. Technical characterization by image analysis: an automatic method of mineralogical studies

    International Nuclear Information System (INIS)

    Oliveira, J.F. de

    1988-01-01

    The application of a modern, fully automated method of image analysis to the study of grain size distribution, modal assays, degree of liberation and mineralogical associations is discussed. The image analyser is interfaced with a scanning electron microscope and an energy dispersive X-ray analyser. The image generated by backscattered electrons is analysed automatically, and the system has been used in assessment studies of applied mineralogy as well as in process control in the mining industry. (author) [pt

  10. Sleep-monitoring, experiment M133. [electronic recording system for automatic analysis of human sleep patterns

    Science.gov (United States)

    Frost, J. D., Jr.; Salamy, J. G.

    1973-01-01

    The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all-night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time awake during the first third of the night, and decreased total sleep time) during the mission.

  11. Text analysis and automatic creation of English reading activities using corpora

    Directory of Open Access Journals (Sweden)

    José Lopes Moreira Filho

    2015-07-01

    The use of corpora in language teaching is an important topic, since practice aims to ensure that teaching material focuses on the language in use. The process of creating activities for language teaching can be improved by using corpus data and computational tools for linguistic analysis. This study describes a system for text analysis and the automatic creation of English reading activities. The results show that the system allows the development of teaching materials that focus on language in use, and that it brings varied linguistic analysis, with less human effort, to the task of developing reading activities.

  12. Toward automatic regional analysis of pulmonary function using inspiration and expiration thoracic CT.

    Science.gov (United States)

    Murphy, Keelin; Pluim, Josien P W; van Rikxoort, Eva M; de Jong, Pim A; de Hoop, Bartjan; Gietema, Hester A; Mets, Onno; de Bruijne, Marleen; Lo, Pechin; Prokop, Mathias; van Ginneken, Bram

    2012-03-01

    To analyze pulmonary function using a fully automatic technique which processes pairs of thoracic CT scans acquired at breath-hold inspiration and expiration, respectively. The following research objectives are identified: (a) to describe and systematically analyze the processing pipeline and its results; (b) to verify that the quantitative, regional ventilation measurements acquired through CT are meaningful for pulmonary function analysis; (c) to identify the most effective of the calculated measurements in predicting pulmonary function; and (d) to demonstrate the potential of the system to deliver clinically important information not available through conventional spirometry. A pipeline of automatic segmentation and registration techniques is presented and demonstrated on a database of 216 subjects well distributed over the various stages of COPD (chronic obstructive pulmonary disease). Lungs, fissures, airways, lobes, and vessels are automatically segmented in both scans, and the expiration scan is registered with the inspiration scan using a fully automatic nonrigid registration algorithm. Segmentations and registrations are examined and scored by expert observers to analyze the accuracy of the automatic methods. Quantitative measures representing ventilation are computed at every image voxel and analyzed to provide information about pulmonary function, both globally and on a regional basis. These CT-derived measurements are correlated with results from spirometry tests and used as features in a kNN classifier to assign COPD Global Initiative for Chronic Obstructive Lung Disease (GOLD) stage. The steps of anatomical segmentation (of lungs, lobes, and vessels) and registration in the workflow were shown to perform very well on an individual basis. All CT-derived measures were found to have good correlation with spirometry results, with several having correlation coefficients, r, in the range of 0.85-0.90. The best performing kNN classifier succeeded in classifying 67% of
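
    As a rough illustration of the final classification step only (not the authors' code), the sketch below trains a k-nearest-neighbour classifier on synthetic "ventilation feature" vectors to predict a GOLD-stage label with scikit-learn; the feature names and data are invented stand-ins.

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Invented stand-in data: 216 subjects, 3 CT-derived ventilation features,
    # labels 0-4 standing in for GOLD stage (0 = no COPD).
    n = 216
    stage = rng.integers(0, 5, size=n)
    features = np.column_stack([
        1.0 - 0.15 * stage + 0.1 * rng.standard_normal(n),  # mock density change
        0.8 - 0.10 * stage + 0.1 * rng.standard_normal(n),  # mock local expansion
        0.5 + 0.05 * stage + 0.1 * rng.standard_normal(n),  # mock air trapping
    ])

    # Scale the features, then classify with kNN as in the record's last step.
    clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
    scores = cross_val_score(clf, features, stage, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f}")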

  13. Automatic analysis of 2D polyacrylamide gels in the diagnosis of DNA polymorphisms

    Science.gov (United States)

    2013-01-01

    Introduction: The analysis of polyacrylamide gels is currently carried out manually or automatically. In the automatic method, there are limitations related to the acceptable degree of distortion of lane and band continuity. The available software cannot deal satisfactorily with this type of situation. Therefore, the paper presents an original image analysis method devoid of the aforementioned drawbacks. Material: This paper examines polyacrylamide gel images from a Li-Cor DNA Sequencer 4300S resulting from the electrophoretic separation of DNA fragments. The acquired images have a resolution dependent on the length of the analysed DNA fragments; typically it is MG×NG = 3806×1027 pixels. The images are saved in TIFF format with a grayscale resolution of 16 bits/pixel. The presented image analysis method was applied to gel images resulting from the analysis of DNA methylome profiling in plants exposed to drought stress, carried out with the MSAP (Methylation Sensitive Amplification Polymorphism) technique. Results: The results of DNA polymorphism analysis were obtained in less than one second on an Intel Core™ 2 Quad CPU Q9300 @ 2.5 GHz with 8 GB RAM. In comparison with other known methods, specificity was 0.95, sensitivity 0.94 and AUC (Area Under Curve) 0.98. Conclusions: It is possible to carry out this method of DNA polymorphism analysis on distorted images of polyacrylamide gels. The method is fully automatic and does not require any operator intervention. Compared with other methods, it produces the best results and the resulting image is easy to interpret. The presented method of measurement is used in the practical analysis of polyacrylamide gels in the Department of Genetics at the University of Silesia in Katowice, Poland. PMID:23835039

  14. Automatic analysis of image quality control for Image Guided Radiation Therapy (IGRT) devices in external radiotherapy

    International Nuclear Information System (INIS)

    Torfeh, Tarraf

    2009-01-01

    On-board imagers mounted on a radiotherapy treatment machine are very effective devices that improve the geometric accuracy of radiation delivery. However, a precise and regular quality control program is required in order to achieve this objective. Our purpose was to develop software tools dedicated to the automatic image quality control of IGRT devices used in external radiotherapy: the 2D MV mode (high energy images), used for measuring patient position during treatment; the 2D kV mode (low energy images); and the 3D Cone Beam Computed Tomography (CBCT) MV or kV mode, used for patient positioning before treatment. Automated analysis of the Winston-Lutz test was also proposed. This test evaluates the mechanical aspects of treatment machines, on which additional constraints are imposed by the extra weight of the on-board imagers. Finally, a technique for generating digital phantoms in order to assess the performance of the proposed software tools is described. The software tools dedicated to automatic quality control of IGRT devices reduce by a factor of 100 the time spent by the medical physics team analyzing the results of controls, while improving accuracy through objective and reproducible analysis and offering traceability through automatically generated monitoring reports and statistical studies. (author) [fr

  15. Automatic progressive damage detection of rotor bar in induction motor using vibration analysis and multiple classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Cruz-Vega, Israel; Rangel-Magdaleno, Jose; Ramirez-Cortes, Juan; Peregrina-Barreto, Hayde [Santa María Tonantzintla, Puebla (Mexico)

    2017-06-15

    There is an increased interest in developing reliable condition monitoring and fault diagnosis systems for machines like induction motors; such interest lies not only in the final phase of the failure but also in the early stages. In this paper, several levels of damage to rotor bars under different load conditions are identified by means of vibration signals. The importance of this work relies on a simple but effective automatic algorithm that detects the damage before a break occurs. The feature extraction is based on discrete wavelet analysis and an autocorrelation process. The automatic classification of the fault degree is then carried out by a binary classification tree. In each node, comparing the learned levels of breakage correctly identifies the fault degree. The best classification results are obtained employing computational intelligence techniques like support vector machines, multilayer perceptron, and the k-NN algorithm, with a proper selection of their optimal parameters.

  16. Automatic progressive damage detection of rotor bar in induction motor using vibration analysis and multiple classifiers

    International Nuclear Information System (INIS)

    Cruz-Vega, Israel; Rangel-Magdaleno, Jose; Ramirez-Cortes, Juan; Peregrina-Barreto, Hayde

    2017-01-01

    There is an increased interest in developing reliable condition monitoring and fault diagnosis systems for machines like induction motors; such interest lies not only in the final phase of the failure but also in the early stages. In this paper, several levels of damage to rotor bars under different load conditions are identified by means of vibration signals. The importance of this work relies on a simple but effective automatic algorithm that detects the damage before a break occurs. The feature extraction is based on discrete wavelet analysis and an autocorrelation process. The automatic classification of the fault degree is then carried out by a binary classification tree. In each node, comparing the learned levels of breakage correctly identifies the fault degree. The best classification results are obtained employing computational intelligence techniques like support vector machines, multilayer perceptron, and the k-NN algorithm, with a proper selection of their optimal parameters.

  17. Automatic Evaluation for E-Learning Using Latent Semantic Analysis: A Use Case

    Directory of Open Access Journals (Sweden)

    Mireia Farrús

    2013-03-01

    Assessment in education allows for obtaining, organizing, and presenting information about how much and how well the student is learning. The current paper first analyses and discusses some state-of-the-art assessment systems in education. It then presents a specific use case developed for the Universitat Oberta de Catalunya, an online university. An automatic evaluation tool is proposed that allows students to evaluate themselves at any time and receive instant feedback. This tool is a web-based platform, and it has been designed for engineering subjects (i.e., with math symbols and formulas) in Catalan and Spanish. In particular, the technique used for automatic assessment is latent semantic analysis. Although the experimental framework of the use case is quite challenging, results are promising.
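
    The record names latent semantic analysis but gives no detail; as a generic illustration (not the Universitat Oberta de Catalunya implementation), the sketch below scores a student answer against a reference answer by cosine similarity in an LSA space built with scikit-learn. The tiny corpus is an invented placeholder.

    from sklearn.decomposition import TruncatedSVD
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Invented placeholder corpus: reference material plus a student answer.
    docs = [
        "ohm's law relates voltage current and resistance",
        "the voltage across a resistor equals current times resistance",
        "kirchhoff's voltage law sums voltages around a loop to zero",
        "current through a resistor is voltage divided by resistance",  # student
    ]

    # Build the term-document matrix, then reduce it to a low-rank LSA space.
    tfidf = TfidfVectorizer().fit_transform(docs)
    lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

    # Score the student answer (last row) against the reference (second row).
    score = cosine_similarity(lsa[3:4], lsa[1:2])[0, 0]
    print(f"LSA similarity score: {score:.2f}")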

  18. Automaticity in acute ischemia: Bifurcation analysis of a human ventricular model

    Science.gov (United States)

    Bouchard, Sylvain; Jacquemet, Vincent; Vinet, Alain

    2011-01-01

    Acute ischemia (restriction in blood supply to part of the heart as a result of myocardial infarction) induces major changes in the electrophysiological properties of the ventricular tissue. The extracellular potassium concentration [K+]o increases in the ischemic zone, leading to an elevation of the resting membrane potential that creates an "injury current" (IS) between the infarcted and the healthy zone. In addition, the lack of oxygen impairs the metabolic activity of the myocytes and decreases ATP production, thereby affecting the ATP-sensitive potassium channels (IKatp). Frequent complications of myocardial infarction are tachycardia, fibrillation, and sudden cardiac death, but the mechanisms underlying their initiation are still debated. One hypothesis is that these arrhythmias may be triggered by abnormal automaticity. We investigated the effect of ischemia on myocyte automaticity by performing a comprehensive bifurcation analysis (fixed points, cycles, and their stability) of a human ventricular myocyte model [K. H. W. J. ten Tusscher and A. V. Panfilov, Am. J. Physiol. Heart Circ. Physiol. 291, H1088 (2006)] as a function of three ischemia-relevant parameters: [K+]o, IS, and IKatp. In this single-cell model, we found that automatic activity was possible only in the presence of an injury current. Changes in [K+]o and IKatp significantly altered the bifurcation structure with respect to IS, including the occurrence of early afterdepolarizations. The results provide a sound basis for studying higher-dimensional tissue structures representing an ischemic heart.
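
    The ten Tusscher-Panfilov model has dozens of state variables; purely as a stand-in illustration of the fixed-point side of such a bifurcation analysis under an injected "injury" current, the sketch below performs the same procedure on the two-variable FitzHugh-Nagumo model (an assumption, not the paper's model), using NumPy/SciPy: find the fixed point at each current level and test its stability from the Jacobian's eigenvalues.

    import numpy as np
    from scipy.optimize import fsolve

    # Stand-in excitable cell: FitzHugh-Nagumo with injected "injury" current I.
    #   dv/dt = v - v**3 / 3 - w + I
    #   dw/dt = eps * (v + a - b * w)
    a, b, eps = 0.7, 0.8, 0.08

    def rhs(y, I):
        v, w = y
        return [v - v**3 / 3 - w + I, eps * (v + a - b * w)]

    def jacobian(v):
        return np.array([[1 - v**2, -1.0], [eps, -eps * b]])

    # Sweep the injected current and test fixed-point stability via eigenvalues;
    # loss of stability marks the onset of automatic (self-oscillatory) activity.
    for I in np.linspace(0.0, 1.0, 11):
        v_star, w_star = fsolve(rhs, x0=[-1.0, -0.5], args=(I,))
        stable = np.all(np.linalg.eigvals(jacobian(v_star)).real < 0)
        print(f"I={I:.1f}  v*={v_star:+.2f}  "
              + ("stable" if stable else "oscillatory (automatic)"))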

  19. Automatic evaluation and data generation for analytical chemistry instrumental analysis exercises

    Directory of Open Access Journals (Sweden)

    Arsenio Muñoz de la Peña

    2014-01-01

    In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and evaluated immediately. The proposed system provides each student with a different set of experimental data, generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests: a comparison test of the mean and a paired t-test.
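
    As an illustration of the first exercise's subject matter only (the Goodle GMS internals are not described in the record), here is a sketch of a linear calibration regression with typical quality parameters, sensitivity (slope), R-squared, and a 3-sigma/slope detection limit, using NumPy/SciPy; the calibration data are invented.

    import numpy as np
    from scipy import stats

    # Invented calibration data: analyte concentration vs instrument signal.
    conc = np.array([0.0, 1.0, 2.0, 4.0, 8.0])          # e.g. mg/L
    signal = np.array([0.02, 0.21, 0.39, 0.82, 1.58])   # e.g. absorbance

    fit = stats.linregress(conc, signal)

    # Quality parameters commonly asked for in instrumental analysis exercises.
    residuals = signal - (fit.intercept + fit.slope * conc)
    s_y = np.sqrt(np.sum(residuals**2) / (conc.size - 2))  # std error of the fit
    lod = 3 * s_y / fit.slope                              # 3-sigma detection limit

    print(f"sensitivity (slope): {fit.slope:.3f} per mg/L")
    print(f"intercept:           {fit.intercept:.3f}")
    print(f"R^2:                 {fit.rvalue**2:.4f}")
    print(f"LOD (3s/slope):      {lod:.2f} mg/L")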

  20. A new approach for automatic control modeling, analysis and design in fully fuzzy environment

    Directory of Open Access Journals (Sweden)

    Walaa Ibrahim Gabr

    2015-09-01

    The paper presents a new approach to the modeling, analysis and design of automatic control systems in a fully fuzzy environment, based on normalized fuzzy matrices. The approach is also suitable for determining the propagation of fuzziness in automatic control and dynamical systems where all system coefficients are expressed as fuzzy parameters. A new consolidity chart is suggested, based on the recently developed system consolidity index, for testing the ability of the system to withstand the effects of changes in any system or input parameters. Implementation procedures are elaborated for the consolidity analysis of existing control systems and the design of new ones, including system comparisons based on the resulting consolidity values. Application of the proposed methodology is demonstrated through illustrative examples covering the fuzzy impulse response of systems, fuzzy Routh-Hurwitz stability criteria, and fuzzy controllability and observability. Moreover, the use of the consolidity chart for appropriate control system design is elaborated through the stabilization of an inverted pendulum by the pole placement technique. It is also shown that the comparison of regions in the consolidity chart is based on the shape of the consolidity region (such as elliptical or circular), the slope or angle in degrees of the centerline of the geometric shape, the centroid of the geometric shape, the area of the geometric shape, the length of the principal diagonals of the shape, and the diversity ratio of consolidity points for each region. Finally, it is recommended that the proposed consolidity chart approach be extended as a unified theory for the modeling, analysis and design of continuous and digital automatic control systems operating in a fully fuzzy environment.

  1. Automatic generation of stop word lists for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
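
    A loose sketch of the described ratio test, assuming simple whitespace tokenization and an invented toy corpus; the threshold and truncation size are placeholders for the "predetermined" values in the record, and the exact counting conventions are this sketch's own simplifications.

    from collections import Counter

    def generate_stop_words(documents, keywords, min_ratio=1, max_size=25):
        """Keep terms that occur next to keywords far more often than they
        occur inside keywords, then truncate the list."""
        keyword_tokens = {t for kw in keywords for t in kw.lower().split()}
        kw_freq, adj_freq = Counter(), Counter()
        for doc in documents:
            tokens = doc.lower().split()
            for i, tok in enumerate(tokens):
                if tok in keyword_tokens:
                    kw_freq[tok] += 1          # keyword frequency
                    continue
                prev_kw = i > 0 and tokens[i - 1] in keyword_tokens
                next_kw = i + 1 < len(tokens) and tokens[i + 1] in keyword_tokens
                if prev_kw or next_kw:
                    adj_freq[tok] += 1         # keyword adjacency frequency
        # Exclude terms whose adjacency/keyword-frequency ratio is too low,
        # then truncate (the record's "predetermined criteria").
        stop = [t for t in adj_freq
                if adj_freq[t] / max(kw_freq[t], 1) >= min_ratio]
        stop.sort(key=lambda t: -adj_freq[t])
        return stop[:max_size]

    docs = ["the neutron flux rises as the control rods withdraw",
            "control rods limit the neutron flux in the core"]
    print(generate_stop_words(docs, keywords=["neutron flux", "control rods"]))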

  2. Porosity determination on pyrocarbon by means of automatic quantitative image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koizlik, K.; Uhlenbruck, U.; Delle, W.; Hoven, H.; Nickel, H.

    1976-05-01

    Quantitative image analysis has long been known as a method for quantifying the results of materials investigations based on ceramography. The development of automatic image analyzers has made it a fast and elegant evaluation procedure. Since 1975, it has been used at IRW to determine easily and routinely the macroporosity, and thereby the density, of the pyrocarbon coatings of nuclear fuel particles. This report describes the definition of the measuring parameters, the measuring procedure, the mathematical calculations, and first experimental and mathematical results.
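
    The report's own measuring parameters are not reproduced in this record; purely as an illustration of the underlying measurement, the sketch below estimates macroporosity as the pore-pixel area fraction of a thresholded ceramographic image, using NumPy on a synthetic section. The threshold and pore geometry are invented.

    import numpy as np

    def porosity_fraction(image, pore_threshold):
        """Fraction of pixels darker than the threshold, taken as pores."""
        return (image < pore_threshold).mean()

    # Synthetic ceramographic section: bright matrix with dark circular pores.
    rng = np.random.default_rng(1)
    img = np.full((512, 512), 200.0) + 5 * rng.standard_normal((512, 512))
    yy, xx = np.mgrid[0:512, 0:512]
    for cx, cy, r in rng.uniform([0, 0, 5], [512, 512, 15], size=(40, 3)):
        img[(xx - cx) ** 2 + (yy - cy) ** 2 < r ** 2] = 40  # dark pore

    phi = porosity_fraction(img, pore_threshold=120)
    print(f"macroporosity: {phi:.1%}")
    # Coating density then follows as rho = (1 - phi) * rho_solid.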

  3. Automatic analysis of digitized TV-images by a computer-driven optical microscope

    International Nuclear Information System (INIS)

    Rosa, G.; Di Bartolomeo, A.; Grella, G.; Romano, G.

    1997-01-01

    New methods of image analysis and three-dimensional pattern recognition were developed in order to perform the automatic scan of nuclear emulsion pellicles. An optical microscope, with a motorized stage, was equipped with a CCD camera and an image digitizer, and interfaced to a personal computer. Selected software routines inspired the design of a dedicated hardware processor. Fast operation, high efficiency and accuracy were achieved. First applications to high-energy physics experiments are reported. Further improvements are in progress, based on a high-resolution fast CCD camera and on programmable digital signal processors. Applications to other research fields are envisaged. (orig.)

  4. Automatic complex building reconstruction from LIDAR based on hierarchical structure analysis

    Science.gov (United States)

    Li, Lelin; Zhang, Jing; Jiang, Wangshou

    2009-10-01

    Since manual surface reconstruction is very costly and time-consuming, the development of automatic algorithms is of great importance. In this paper, a fully automated technique for extracting urban building models from LIDAR data, based on hierarchical structure analysis of the building, is presented. Existing automatic algorithms can handle some simple building reconstructions, such as flat or gabled roofs. For complex buildings, many researchers resort to external information or manual interaction because of the complexity of the reconstruction and the uncertainty of the building models, especially in urban areas. Contours have useful properties, they form closed, non-intersecting loops with deterministic topological relationships, which can be used to extract the building ROI (region of interest). A contour tree is constructed and the topological relationships between the different contours, extracted by a TIN from the LIDAR data, are established; the relationships among the hierarchical models can then be determined by analysis of the topological relationships among contour clusters, and a component tree corresponding to the building can be constructed by tracing the contour tree. Accurate edges of each hierarchical model can be obtained by a "polarized cornerity index"-based polygonal approximation of the contour. In particular, 3D model recognition based on 2D shape recognition is employed. According to the characteristics of the contours, the category of the primitive parts can be classified. We assemble the hierarchical models using the topological relationships among layers, and the complete model of the building can then be obtained. Experimental results show that the proposed algorithm is suitable for automatically producing building models, including most complex buildings, from LIDAR data in urban areas.

  5. The Avenging Females: A Comparative Analysis of Kill Bill Vol.1-2, Death Proof and Sympathy for Lady Vengeance

    Directory of Open Access Journals (Sweden)

    Basak Göksel Demiray

    2012-04-01

    This paper provides a comparative analysis of Quentin Tarantino's Kill Bill Vol.1-2 (2003, 2004) and Death Proof (2007) and Park Chan Wook's Sympathy for Lady Vengeance (Chinjeolhan Geumjassi, 2005). The primary objectives of this study are: (1) to reveal the gender biases inherent in the fundamental discursive structures of the foregoing films; (2) to compare and contrast the films through an analysis of the 'gaze(s)' and possible 'pleasures' inherent in their narratives, in relation to Laura Mulvey's and Carol Clover's approaches; and (3) to distinguish Kill Bill Vol.1-2 from the foregoing two and from the 'avenging female' clichés of other horror/violence movies in the context of the replaced positionings of its protagonist and antagonist inherent in its distinct narrative style.

  6. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    Science.gov (United States)

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

    Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, of the neutral atmospheric gas and of the thermal plasma of planetary atmospheres, mass spectrometers making use of time-of-flight mass analysers are widely used. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially more robust automated data analysis with enhanced accuracy is required. In this contribution, an automatic data analysis software package, which allows fast and precise quantitative analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields isotope ratios with an accuracy of up to 100 ppm for a signal-to-noise ratio (SNR) of 10^4. We show that the accuracy of isotope ratios is in fact proportional to SNR^-1. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution. Additionally, we show that the accuracy of isotope ratios depends on the sample width Ts as Ts^0.5. Copyright © 2017 John Wiley & Sons, Ltd.
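
    The record does not disclose the algorithm itself; as a generic stand-in for the two steps the abstract names, the sketch below finds peaks in a synthetic TOF spectrum with scipy.signal.find_peaks and integrates each peak numerically. The spectrum and prominence threshold are invented.

    import numpy as np
    from scipy.signal import find_peaks
    from scipy.integrate import trapezoid

    # Synthetic TOF spectrum: two Gaussian isotope peaks on a noisy baseline.
    t = np.linspace(0, 10, 5000)                       # flight time (a.u.)
    spectrum = (1.0 * np.exp(-((t - 4.0) / 0.02) ** 2)
                + 0.1 * np.exp(-((t - 4.3) / 0.02) ** 2)
                + 0.001 * np.random.randn(t.size))

    # Peak finding: require prominence well above the noise floor.
    peaks, props = find_peaks(spectrum, prominence=0.05)

    # Consecutive numerical integration over each peak's base.
    areas = []
    for p, lo, hi in zip(peaks, props["left_bases"], props["right_bases"]):
        areas.append(trapezoid(spectrum[lo:hi], t[lo:hi]))
        print(f"peak at t={t[p]:.2f}: area={areas[-1]:.4f}")

    # An isotope ratio estimate is then the ratio of the two peak areas.
    print(f"isotope ratio estimate: {areas[1] / areas[0]:.3f}")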

  7. Multidirectional channeling analysis of epitaxial CdTe layers using an automatic RBS/channeling system

    Energy Technology Data Exchange (ETDEWEB)

    Wielunski, L.S.; Kenny, M.J. [CSIRO, Lindfield, NSW (Australia). Applied Physics Div.

    1993-12-31

    Rutherford Backscattering Spectrometry (RBS) is an ion beam analysis technique used in many fields. The high depth and mass resolution of RBS make this technique very useful in semiconductor material analysis [1]. The use of ion channeling in combination with RBS creates a powerful technique which can provide information about crystal quality and structure in addition to mass and depth resolution [2]. The presence of crystal defects such as interstitial atoms, dislocations or dislocation loops can be detected and profiled [3,4]. Semiconductor materials such as CdTe, HgTe and Hg(x)Cd(1-x)Te generate considerable interest due to applications as infrared detectors in many technological areas. The present paper demonstrates how automatic RBS and multidirectional channeling analysis can be used to evaluate crystal quality and near-surface defects. 6 refs., 1 fig.

  8. AUTOMATIC PEDESTRIAN CROSSING DETECTION AND IMPAIRMENT ANALYSIS BASED ON MOBILE MAPPING SYSTEM

    Directory of Open Access Journals (Sweden)

    X. Liu

    2017-09-01

    Pedestrian crossings, as an important part of transportation infrastructure, serve to secure pedestrians' lives and possessions and keep traffic flow in order. As a prominent feature in the street scene, detection of pedestrian crossings contributes to 3D road marking reconstruction and diminishes the adverse impact of outliers in 3D street scene reconstruction. Since pedestrian crossings are subject to wear and tear from heavy traffic flow, it is imperative to monitor their condition. On this account, an approach to automatic pedestrian crossing detection using images from a vehicle-based Mobile Mapping System is put forward, and crossing defilement and impairment are analyzed in this paper. Firstly, a pedestrian crossing classifier is trained with a low recall rate. Initial detections are then refined by utilizing projection filtering, contour information analysis, and monocular vision. Finally, a pedestrian crossing detection and analysis system with high recall rate, precision and robustness is achieved. This system works for pedestrian crossing detection under different situations and light conditions. It can also recognize defiled and impaired crossings automatically, which facilitates monitoring and maintenance of traffic facilities, so as to reduce potential traffic safety problems and secure lives and property.

  9. In-air PIXE set-up for automatic analysis of historical document inks

    Science.gov (United States)

    Budnar, Miloš; Simčič, Jure; Rupnik, Zdravko; Uršič, Mitja; Pelicon, Primož; Kolar, Jana; Strlič, Matija

    2004-06-01

    Iron gall inks were among the writing materials most widely applied in historical documents of western civilization. Due to the corrosive character of the ink, the documents are in danger of being seriously, and in some cases irreversibly, changed. The elemental composition of the inks is important information for taking adequate conservation action [Project InkCor, http://www.infosrvr.nuk.uni-lj.si/jana/Inkcor/index.htm, and references within]. Here, in-air PIXE analysis offers an indispensable tool due to its sensitivity and almost non-destructive character. An experimental approach developed for the precise and automatic analysis of documents at the Jožef Stefan Institute Tandetron accelerator is presented. The selected documents were mounted, one at a time, on the positioning board, and the chosen ink spots on the sample were irradiated by 1.7 MeV protons. Data acquisition on the selected ink spots is done automatically, following the measuring pattern determined prior to the measurement. The chemical elements identified in the documents ranged from Si to Pb, and among them the significant iron gall ink components Fe, S, K, Cu, Zn, Co, Mn and Ni were determined with a precision of ±10%. The measurements were done non-destructively and no visible damage was observed on the irradiated documents.

  10. Analysis and Development of FACE Automatic Apparatus for Rapid Identification of Transuranium Isotopes

    Energy Technology Data Exchange (ETDEWEB)

    Sebesta, Edward Henri [Univ. of California, Berkeley, CA (United States)

    1978-09-01

    A description of and operating manual for the FACE Automatic Apparatus has been written, along with documentation of the FACE machine operating program, to provide a user manual for the FACE Automatic Apparatus. In addition, FACE machine performance was investigated in order to improve transuranium throughput. An analysis of the causes of transuranium isotope loss, both chemical and radioactive, was undertaken. To lower radioactive loss, the dynamics of the most time-consuming step of the FACE machine, the drying and flaming of the chromatographic column output droplets in preparation of the sample for alpha spectroscopy and counting, was investigated. A series of droplets was dried in an experimental apparatus, demonstrating that droplets could be dried significantly faster through more intensive heating, enabling the FACE machine cycle to be shortened by 30-60 seconds. Proposals incorporating these ideas were provided for FACE machine development. The 66% chemical loss of product was analyzed and changes were proposed to reduce the radioisotope product loss. An analysis of the chromatographic column was also provided. All operating steps in the FACE machine are described and analyzed to provide a complete guide, along with the proposals for machine improvement.

  11. Computer program for analysis of impedance cardiography signals enabling manual correction of points detected automatically

    Science.gov (United States)

    Oleksiak, Justyna; Cybulski, Gerard

    2014-11-01

    The aim of this work was to create a computer program, written in LabVIEW, which enables the visualization and analysis of hemodynamic parameters. It allows the user to import data collected using ReoMonitor, an ambulatory impedance cardiography (AICG) monitoring device. The data include one channel of the ECG and one channel of the first derivative of the impedance signal (dz/dt), both sampled at 200 Hz, and the base impedance signal (Z0), sampled every 8 s. The program consists of two parts: a bioscope allowing the presentation of traces (ECG, AICG, Z0) and an analytical portion enabling the detection of characteristic points on the signals and the automatic calculation of hemodynamic parameters. The detection of characteristic points in both signals is done automatically, with the option to make manual corrections, which may be necessary to avoid "false positive" recognitions. This application is used to determine the values of basic hemodynamic variables: pre-ejection period (PEP), left ventricular ejection time (LVET), stroke volume (SV), cardiac output (CO), and heart rate (HR). It leaves room for further development of additional features, for both the analysis panel and the data acquisition function.
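
    The program's internal formulas are not given in the record; a common textbook choice for deriving stroke volume from these very signals is the Kubicek equation, sketched below under that assumption. The constants and input values are illustrative, not taken from the paper.

    import numpy as np

    def stroke_volume_kubicek(dzdt_max, lvet, z0, rho=135.0, length=30.0):
        """Kubicek estimate of stroke volume (mL), assumed here for illustration.

        rho      : blood resistivity (ohm*cm), a typical assumed constant
        length   : thorax electrode distance (cm)
        z0       : base impedance (ohm)
        dzdt_max : peak of the dZ/dt signal (ohm/s)
        lvet     : left ventricular ejection time (s)
        """
        return rho * (length / z0) ** 2 * dzdt_max * lvet

    # Illustrative values within physiological ranges.
    sv = stroke_volume_kubicek(dzdt_max=1.2, lvet=0.30, z0=25.0)
    hr = 70.0                          # beats/min, e.g. from the ECG channel
    co = sv * hr / 1000.0              # cardiac output in L/min
    print(f"SV = {sv:.0f} mL, CO = {co:.1f} L/min")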

  12. Automatic analysis of image of surface structure of cell wall-deficient EVC.

    Science.gov (United States)

    Li, S; Hu, K; Cai, N; Su, W; Xiong, H; Lou, Z; Lin, T; Hu, Y

    2001-01-01

    Some computer applications for cell characterization in medicine and biology, such as the analysis of the surface structure of cell wall-deficient EVC (El Tor Vibrio of Cholera), operate with cell samples taken from very small areas of interest. In order to perform texture characterization in such an application, only a few texture operators can be employed: the operators should be insensitive to noise and image distortion and be reliable in order to estimate texture quality from the images. Therefore, we introduce wavelet theory and mathematical morphology to analyse cellular surface micro-area images obtained by SEM (Scanning Electron Microscope). In order to describe the quality of the surface structure of cell wall-deficient EVC, we propose a fully automatic computerized method. The image analysis process is carried out in two steps. In the first, we decompose the given image by the dyadic wavelet transform and form an image approximation with higher resolution; in doing so, we perform edge detection of the given images efficiently. In the second, we apply operations of mathematical morphology to obtain quantitative morphological parameters of the surface structure of cell wall-deficient EVC. The obtained results prove that the method can eliminate noise, detect edges and extract the feature parameters reliably. In this work, we have built automatic analysis software named "EVC.CELL".

  13. Development and installation of an automatic sample changer for neutron activation analysis

    International Nuclear Information System (INIS)

    Domienikan, Claudio; Lapolli, Andre L.; Schoueri, Roberto M.; Moreira, Edson G.; Vasconcellos, Marina B.A.

    2013-01-01

    A Programmable and Automatic Sample Changer was built and installed at the Neutron Activation Analysis Laboratory of the Nuclear and Energy Research Institute - IPEN-CNEN/SP, Brazil. This Automatic Sample Changer allows the fully automated measurement of up to 25 samples in one run. Basically, it consists of an electronic circuit and a C++ program that controls the positioning of a sample holder along two axes of motion (X and Y). Each sample is transported and positioned, one by one, inside the shielding coupled to a high-purity germanium (HPGe) radiation detector. A Canberra DSA-1000 Multichannel Analyzer coupled to the Genie 2000 software performs the data acquisition for analysis of the samples. When the counting is finished, the results are saved to the hard disk of a PC. The sample holder then returns the sample to its initial position, and the next sample is carried to the shielding. The Sample Changer was designed and constructed at IPEN-CNEN/SP using national components and expertise. (author)

  14. Automatic Pedestrian Crossing Detection and Impairment Analysis Based on Mobile Mapping System

    Science.gov (United States)

    Liu, X.; Zhang, Y.; Li, Q.

    2017-09-01

    Pedestrian crossings, as an important part of transportation infrastructure, serve to secure pedestrians' lives and possessions and keep traffic flow in order. As a prominent feature in the street scene, detection of pedestrian crossings contributes to 3D road marking reconstruction and diminishes the adverse impact of outliers in 3D street scene reconstruction. Since pedestrian crossings are subject to wear and tear from heavy traffic flow, it is imperative to monitor their condition. On this account, an approach to automatic pedestrian crossing detection using images from a vehicle-based Mobile Mapping System is put forward, and crossing defilement and impairment are analyzed in this paper. Firstly, a pedestrian crossing classifier is trained with a low recall rate. Initial detections are then refined by utilizing projection filtering, contour information analysis, and monocular vision. Finally, a pedestrian crossing detection and analysis system with high recall rate, precision and robustness is achieved. This system works for pedestrian crossing detection under different situations and light conditions. It can also recognize defiled and impaired crossings automatically, which facilitates monitoring and maintenance of traffic facilities, so as to reduce potential traffic safety problems and secure lives and property.

  15. The FAST-DATA System: Fully Automatic Stochastic Technology for Data Acquisition, Transmission, and Analysis

    International Nuclear Information System (INIS)

    Albrecht, R.W.; Crowe, R.D.; McGuire, J.J.

    1978-01-01

    The potential to automatically collect, classify, and report on stochastic data (signals with random, time-varying components) from power plants has long been discussed by utilities, government, industry, national laboratories and universities. It has become clear to all concerned that such signals often contain information about plant conditions which may provide the basis for increased plant availability through early detection and warning of developing malfunctions. Maintenance can then be scheduled at opportune times. Inopportune failures of major and minor power plant components are a major cause of down-time and detract significantly from the availability of the plant. A complete system to realize automatic stochastic data processing has been conceptually designed. Development of the FAST-DATA system has been initiated through a program of periodic measurements performed on the vibration and loose parts monitoring system of the Trojan reactor (1130-MW(e) PWR) operated by Portland General Electric Company. The development plan for the system consists of a six-step procedure. The initial steps depend on a significant level of human involvement. In the course of development of the system, the routine duties of operators and analysts are gradually replaced by computerized automatic data handling procedures. In the final configuration, the operators and analysts are completely freed of routine chores by logical machinery. The results achieved to date from actual application of the proof-of-principle system are discussed. The early developmental phases have concentrated on system organization and examination of a representative data base. Preliminary results from the signature analysis program using Trojan data indicate that the performance specifications predicted for the FAST-DATA system are achievable in practice. (author)

  16. Automatic method of analysis and measurement of additional parameters of corneal deformation in the Corvis tonometer.

    Science.gov (United States)

    Koprowski, Robert

    2014-11-19

    The method for measuring intraocular pressure using the Corvis tonometer provides a sequence of images of corneal deformation. Deformations of the cornea are recorded using an ultra-high-speed Scheimpflug camera. This paper presents a new and reproducible method of analysis of corneal deformation images that allows for the automatic measurement of new features, namely three new parameters unavailable in the original software. The images subjected to processing had a resolution of 200 × 576 × 140 pixels. They were acquired from the Corvis tonometer and from simulation. In total, 14,000 2D images were analysed. The image analysis method proposed by the author automatically detects the edges of the cornea and sclera fragments. For this purpose, new methods of image analysis and processing proposed by the author, as well as well-known ones such as the Canny filter, binarization, median filtering etc., have been used. The presented algorithms were implemented in Matlab (version 7.11.0.584-R2010b) with the Image Processing Toolbox (version 7.1-R2010b). Owing to the proposed algorithm it is possible to determine three parameters: (1) the degree of the corneal reaction relative to the static position; (2) the change in corneal length; (3) the ratio of amplitude changes to the corneal deformation length. The corneal reaction is smaller by about 30.40% compared to its static position. The change in corneal length during deformation is very small, approximately 1% of its original length. Parameter (3) makes it possible to determine the applanation points with a correlation of 92% compared to the conventional method for calculating corneal flattening areas. The proposed algorithm provides reproducible results fully automatically within a few seconds per patient on a Core i7 processor. Using the proposed algorithm, it is possible to measure new, additional parameters of corneal deformation, which
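
    The author's Matlab pipeline is not reproduced here; as a rough Python analogue of the edge-detection step only (median filtering, binarization, and edge extraction on one frame), the sketch below operates on a synthetic frame standing in for Corvis data. All values are assumptions.

    import numpy as np
    from scipy.ndimage import median_filter

    # Synthetic stand-in for one 200x576 Corvis frame: a bright corneal arc.
    rng = np.random.default_rng(0)
    frame = 0.1 * rng.random((200, 576))
    rows = (80 + 20 * np.sin(np.linspace(0, np.pi, 576))).astype(int)
    for col, row in enumerate(rows):
        frame[row:row + 6, col] += 0.8              # corneal cross-section

    # Median filtering then binarization, as in the described pipeline.
    smooth = median_filter(frame, size=5)
    binary = smooth > 0.5

    # Upper corneal edge: the first foreground pixel in each column.
    edge = np.argmax(binary, axis=0).astype(float)
    edge[~binary.any(axis=0)] = np.nan              # columns with no cornea
    print(f"mean edge row: {np.nanmean(edge):.1f}")
    # Tracking this edge across all 140 frames would give the deformation
    # amplitude and length changes the record's parameters are built from.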

  17. Automatic mesh generation for structural analysis of pressure vessels using fuzzy knowledge processing

    International Nuclear Information System (INIS)

    Kado, Kenichiro; Sato, Takuya; Yoshimura, Shinobu; Yagawa, Genki.

    1994-01-01

    This paper describes an automatic mesh generation system for 2D axisymmetric and 3D shell structures based on fuzzy knowledge processing. In this system, an analysis model, i.e. a geometric model, is first defined using a conventional method for 2D structures and a commercial CAD system, Auto-CAD, for 3D shell structures. Nodes are then generated based on the fuzzy knowledge processing technique, with the node density distribution well controlled over the whole analysis domain. Triangular elements are generated using the Delaunay triangulation technique, and the triangular elements are then converted to quadrilateral elements. The fundamental performance of the system is demonstrated through its application to typical components of a pressure vessel. (author)
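
    A minimal sketch of the triangulation step only (the fuzzy node-generation rules are specific to the paper), using scipy.spatial.Delaunay on a graded point set; the grading rule here is an invented stand-in for the fuzzy density control.

    import numpy as np
    from scipy.spatial import Delaunay

    # Invented stand-in for fuzzy density control: nodes denser near x = 0,
    # where a stress concentration might be expected.
    x = np.concatenate([np.linspace(0.0, 0.2, 15), np.linspace(0.25, 1.0, 10)])
    y = np.linspace(0.0, 0.5, 8)
    nodes = np.array([(xi, yi) for xi in x for yi in y])

    # Delaunay triangulation of the node set, as in the described system.
    tri = Delaunay(nodes)
    print(f"{len(nodes)} nodes -> {len(tri.simplices)} triangular elements")

    # Each row of tri.simplices holds the three node indices of one element;
    # pairs of adjacent triangles could then be merged into quadrilaterals.
    print("first element's node indices:", tri.simplices[0])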

  18. Automatic simplification of systems of reaction-diffusion equations by a posteriori analysis.

    Science.gov (United States)

    Maybank, Philip J; Whiteley, Jonathan P

    2014-02-01

    Many mathematical models in biology and physiology are represented by systems of nonlinear differential equations. In recent years these models have become increasingly complex in order to explain the enormous volume of data now available. A key role of modellers is to determine which components of the model have the greatest effect on a given observed behaviour. An approach for automatically fulfilling this role, based on a posteriori analysis, has recently been developed for nonlinear initial value ordinary differential equations [J.P. Whiteley, Model reduction using a posteriori analysis, Math. Biosci. 225 (2010) 44-52]. In this paper we extend this model reduction technique for application to both steady-state and time-dependent nonlinear reaction-diffusion systems. Exemplar problems drawn from biology are used to demonstrate the applicability of the technique. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Automatic scatter detection in fluorescence landscapes by means of spherical principal component analysis

    DEFF Research Database (Denmark)

    Kotwa, Ewelina Katarzyna; Jørgensen, Bo Munk; Brockhoff, Per B.

    2013-01-01

    In this paper, we introduce a new method, based on spherical principal component analysis (S-PCA), for the identification of Rayleigh and Raman scatters in fluorescence excitation-emission data. These scatters should be found and eliminated as a prestep before fitting parallel factor analysis models to the data, in order to avoid model degeneracies. The work is inspired by and based on previous research, in which scatter removal was automatic (based on a robust version of PCA called ROBPCA) and required no visual data inspection, but appeared to be computationally intensive. To overcome this drawback, we implement the fast S-PCA in the scatter identification routine. Moreover, an additional pattern interpolation step that complements the method, based on robust regression, is applied. In this way, substantial time savings are gained, and the user's engagement is restricted to a minimum...

  20. Automatic yield-line analysis of slabs using discontinuity layout optimization.

    Science.gov (United States)

    Gilbert, Matthew; He, Linwei; Smith, Colin C; Le, Canh V

    2014-08-08

    The yield-line method of analysis is a long-established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes interconnected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably identifying yield-line patterns automatically. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented.

  1. Semi-automatic system for UV images analysis of historical musical instruments

    Science.gov (United States)

    Dondi, Piercarlo; Invernizzi, Claudia; Licchelli, Maurizio; Lombardi, Luca; Malagodi, Marco; Rovetta, Tommaso

    2015-06-01

    The selection of representative areas to be analyzed is a common problem in the study of Cultural Heritage items. UV fluorescence photography is an extensively used technique to highlight specific surface features which cannot be observed in visible light (e.g. restored parts or areas treated with different materials), and it proves to be very effective in the study of historical musical instruments. In this work we propose a new semi-automatic solution for selecting areas with the same perceived color (a simple clue of similar materials) on UV photos, using a specifically designed interactive tool. The proposed method works in two steps: (i) the user selects a small rectangular area of the image; (ii) the program automatically highlights all the areas that have the same color as the selected input. The identification is made by analysis of the image in the HSV color model, the one closest to human perception. The achievable result is more accurate than a manual selection, because the method can also detect points that users do not recognize as similar due to perceptual illusions. The application has been developed following the rules of usability, and its Human Computer Interface has been improved after a series of tests performed by expert and non-expert users. All the experiments were performed on UV imagery of the Stradivari violin collection held by the "Museo del Violino" in Cremona.
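
    A toy sketch of the two-step selection as the record describes it, expanding a user rectangle into a mask of similarly colored pixels in HSV space; the tolerances are illustrative assumptions, not the tool's values.

    import numpy as np
    from matplotlib.colors import rgb_to_hsv

    def select_similar(image_rgb, rect, hue_tol=0.03, sat_tol=0.15):
        """Mask pixels whose HSV hue/saturation match a reference rectangle.

        rect = (row0, row1, col0, col1) is the user-selected patch.
        """
        hsv = rgb_to_hsv(image_rgb)
        r0, r1, c0, c1 = rect
        ref = hsv[r0:r1, c0:c1].reshape(-1, 3).mean(axis=0)
        hue_diff = np.abs(hsv[..., 0] - ref[0])
        hue_diff = np.minimum(hue_diff, 1.0 - hue_diff)   # hue wraps around
        return (hue_diff < hue_tol) & (np.abs(hsv[..., 1] - ref[1]) < sat_tol)

    # Illustrative use on a synthetic two-color "UV photo".
    img = np.zeros((100, 100, 3))
    img[:, :50] = (0.2, 0.4, 0.8)    # one fluorescence color
    img[:, 50:] = (0.8, 0.5, 0.2)    # another
    mask = select_similar(img, rect=(10, 20, 10, 20))
    print(f"selected pixels: {mask.sum()} (expected 5000)")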

  2. Automatic Fault Recognition of Photovoltaic Modules Based on Statistical Analysis of Uav Thermography

    Science.gov (United States)

    Kim, D.; Youn, J.; Kim, C.

    2017-08-01

    As a malfunctioning PV (Photovoltaic) cell has a higher temperature than adjacent normal cells, we can detect it easily with a thermal infrared sensor. However, inspecting large-scale PV power plants with a hand-held thermal infrared sensor is time-consuming. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm uses statistical analysis of the thermal intensity (surface temperature) characteristics of each PV module, taking the mean intensity and standard deviation of each panel as parameters for fault diagnosis. One of the characteristics of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels in the fault detection algorithm is not applicable. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules from individual arrays automatically. The performance of the proposed algorithm was tested on three sample images; this verified a detection accuracy for defective panels of 97% or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection compared to a global detection rule.
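
    A toy sketch of the local detection rule as the record describes it (per-array statistics rather than a single global threshold); the array layout, threshold multiplier k, and data are invented.

    import numpy as np

    def faulty_panels(panel_means, k=2.0):
        """Flag panels whose mean thermal intensity exceeds the local
        (per-array) mean by more than k standard deviations."""
        mu, sigma = panel_means.mean(), panel_means.std()
        return np.where(panel_means > mu + k * sigma)[0]

    rng = np.random.default_rng(3)
    # Two arrays imaged at different distances, so their baselines differ;
    # panel 7 of array A and panel 2 of array B are simulated hot spots.
    array_a = 30.0 + rng.normal(0, 0.3, 12); array_a[7] += 4.0
    array_b = 26.5 + rng.normal(0, 0.3, 12); array_b[2] += 4.0

    # A local rule per array works; a single global threshold would not,
    # because array B runs cooler overall than array A.
    for name, arr in (("A", array_a), ("B", array_b)):
        print(f"array {name}: defective panels at indices {faulty_panels(arr)}")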

  3. AUTOMATIC FAULT RECOGNITION OF PHOTOVOLTAIC MODULES BASED ON STATISTICAL ANALYSIS OF UAV THERMOGRAPHY

    Directory of Open Access Journals (Sweden)

    D. Kim

    2017-08-01

    As a malfunctioning PV (Photovoltaic) cell has a higher temperature than adjacent normal cells, we can detect it easily with a thermal infrared sensor. However, inspecting large-scale PV power plants with a hand-held thermal infrared sensor is time-consuming. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm uses statistical analysis of the thermal intensity (surface temperature) characteristics of each PV module, taking the mean intensity and standard deviation of each panel as parameters for fault diagnosis. One of the characteristics of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels in the fault detection algorithm is not applicable. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules from individual arrays automatically. The performance of the proposed algorithm was tested on three sample images; this verified a detection accuracy for defective panels of 97% or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection compared to a global detection rule.

  4. Automatic Segmenting Structures in MRI's Based on Texture Analysis and Fuzzy Logic

    Science.gov (United States)

    Kaur, Mandeep; Rattan, Munish; Singh, Pushpinder

    2017-12-01

    The purpose of this paper is to present a variational method for geometric contours which helps the level set function remain close to the signed distance function, thereby removing the need for the expensive re-initialization procedure. The level set method is applied to magnetic resonance images (MRI) to track irregularities in them, as medical imaging plays a substantial part in the treatment, therapy and diagnosis of various organs, tumors and abnormalities. It offers the patient speedier and more decisive disease control with fewer side effects. The geometrical shape, the size of the tumor and abnormal tissue growth can be quantified by segmentation of the image in question. Fully automatic segmentation remains a great challenge for researchers in medical imaging. Based on texture analysis, different images are processed by optimization of the level set segmentation. Traditionally, this optimization was manual for every image, with each parameter selected one after another. By applying fuzzy logic, the segmentation of the image is correlated with texture features, making it automatic and more effective. There is no initialization of parameters, and it works like an intelligent system. It segments different MRI images without tuning the level set parameters and gives optimized results for all MRIs.

  5. Feasibility and reproducibility of fetal lung texture analysis by Automatic Quantitative Ultrasound Analysis and correlation with gestational age.

    Science.gov (United States)

    Cobo, Teresa; Bonet-Carne, Elisenda; Martínez-Terrón, Mónica; Perez-Moreno, Alvaro; Elías, Núria; Luque, Jordi; Amat-Roldan, Ivan; Palacio, Montse

    2012-01-01

    To evaluate the feasibility and reproducibility of fetal lung texture analysis using a novel automatic quantitative ultrasound analysis, and to assess its correlation with gestational age. Prospective cross-sectional observational study. To evaluate texture features, 957 left and right lung images in a 2D four-cardiac-chamber view plane were delineated from fetuses between 20 and 41 weeks of gestation. Quantification of lung texture was performed with the Automatic Quantitative Ultrasound Analysis (AQUA) software to extract image features. A standard learning approach composed of feature transformation and a regression model was used to evaluate the association between texture features and gestational age. The association between weeks of gestation and fetal lung texture quantified by the AQUA software presented a Pearson correlation of 0.97. The association was not influenced by delineation parameters such as region of interest (ROI) localization, ROI size, or right/left lung selected, or by sonographic parameters such as the ultrasound equipment or transducer used. Fetal lung texture analysis measured by the AQUA software demonstrated a strong correlation with gestational age. This supports further research to explore the use of this technology for the noninvasive prediction of fetal lung maturity. Copyright © 2012 S. Karger AG, Basel.

  6. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    Science.gov (United States)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. AutoBayes' schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems

  7. Automatic denoising of functional MRI data: combining independent component analysis and hierarchical fusion of classifiers.

    Science.gov (United States)

    Salimi-Khorshidi, Gholamreza; Douaud, Gwenaëlle; Beckmann, Christian F; Glasser, Matthew F; Griffanti, Ludovica; Smith, Stephen M

    2014-04-15

    Many sources of fluctuation contribute to the fMRI signal, and this makes identifying the effects that are truly related to the underlying neuronal activity difficult. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has been shown to be a powerful technique in identifying various sources of neuronally-related and artefactual fluctuation in fMRI data (both with the application of external stimuli and with the subject "at rest"). ICA decomposes fMRI data into patterns of activity (a set of spatial maps and their corresponding time series) that are statistically independent and add linearly to explain voxel-wise time series. Given the set of ICA components, if the components representing "signal" (brain activity) can be distinguished from the "noise" components (effects of motion, non-neuronal physiology, scanner artefacts and other nuisance sources), the latter can then be removed from the data, providing an effective cleanup of structured noise. Manual classification of components is labour intensive and requires expertise; hence, a fully automatic noise detection algorithm that can reliably detect various types of noise sources (in both task and resting fMRI) is desirable. In this paper, we introduce FIX ("FMRIB's ICA-based X-noiseifier"), which provides an automatic solution for denoising fMRI data via accurate classification of ICA components. For each ICA component FIX generates a large number of distinct spatial and temporal features, each describing a different aspect of the data (e.g., what proportion of temporal fluctuations are at high frequencies). The set of features is then fed into a multi-level classifier (built around several different classifiers). Once trained through the hand-classification of a sufficient number of training datasets, the classifier can then automatically classify new datasets. The noise components can then be subtracted from (or regressed out of) the original
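
    The final clean-up step, removing the identified noise components from the data, is essentially a regression. The sketch below illustrates only that step (not FIX's feature extraction or hierarchical classifier); the array shapes and the toy data are our own assumptions.

    import numpy as np

    def regress_out_noise(data, mixing, noise_idx):
        """Remove ICA noise components from fMRI data by regressing their
        time courses out of every voxel's time series.

        data      : (n_timepoints, n_voxels) data matrix
        mixing    : (n_timepoints, n_components) ICA time courses
        noise_idx : indices of components classified as noise
        """
        noise_tc = mixing[:, noise_idx]
        beta, *_ = np.linalg.lstsq(noise_tc, data, rcond=None)
        return data - noise_tc @ beta          # residual = cleaned data

    # Toy example: 200 timepoints, 50 voxels, component 2 labelled as noise
    rng = np.random.default_rng(0)
    mix = rng.normal(size=(200, 3))
    data = mix @ rng.normal(size=(3, 50))
    clean = regress_out_noise(data, mix, [2])
    print(np.abs(clean.T @ mix[:, 2]).max())   # ~0: the noise component is gone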

  8. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process that has been established in Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s were performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time-consuming and inefficient, especially the sample counting and measurement process: the sample needs to be changed and the measurement software needs to be set up for every one-hour counting time, and both procedures are performed manually for every sample. Hence, an automatic sample changer system (ASC) consisting of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and call the GammaVision software for sample measurement. The software was developed using the National Instruments LabVIEW development package.

  9. Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders

    CERN Document Server

    Baghai-Ravary, Ladan

    2013-01-01

    Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders provides a survey of methods designed to aid clinicians in the diagnosis and monitoring of speech disorders such as dysarthria and dyspraxia, with an emphasis on the signal processing techniques, statistical validity of the results presented in the literature, and the appropriateness of methods that do not require specialized equipment, rigorously controlled recording procedures or highly skilled personnel to interpret results. Such techniques offer the promise of a simple and cost-effective, yet objective, assessment of a range of medical conditions, which would be of great value to clinicians. The ideal scenario would begin with the collection of examples of the clients’ speech, either over the phone or using portable recording devices operated by non-specialist nursing staff. The recordings could then be analyzed initially to aid diagnosis of conditions, and subsequently to monitor the clients’ progress and res...

  10. Evaluating the effectiveness of treatment of corneal ulcers via computer-based automatic image analysis

    Science.gov (United States)

    Otoum, Nesreen A.; Edirisinghe, Eran A.; Dua, Harminder; Faraj, Lana

    2012-06-01

    Corneal ulcers are a common eye disease that requires prompt treatment. Recently a number of treatment approaches have been introduced that have proven to be very effective. Unfortunately, the monitoring process of the treatment procedure remains manual and hence time consuming and prone to human error. In this research we propose an automatic image-analysis-based approach to measure the size of an ulcer, and its subsequent investigation to determine the effectiveness of any treatment process followed. In ophthalmology an ulcer area is detected for further inspection via luminous excitation of a dye. Usually in the imaging systems utilised for this purpose (i.e. a slit lamp with an appropriate dye) the ulcer area is excited to be luminous green in colour, as compared to the rest of the cornea, which appears blue/brown. In the proposed approach we analyse the image in the HSV colour space. Initially a pre-processing stage that carries out local histogram equalisation is used to bring back detail in any over- or under-exposed areas. Secondly we deal with the removal of potential reflections from the affected areas by making use of image registration of two candidate corneal images based on the detected corneal areas. Thirdly the exact corneal boundary is detected by initially registering an ellipse to the candidate corneal boundary detected via edge detection and subsequently allowing the user to modify the boundary to overlap with the boundary of the ulcer being observed. Although this step makes the approach semi-automatic, it removes the impact of breakages of the corneal boundary due to occlusion, noise, or image quality degradation. The ratio of the ulcer area confined within the corneal area to the corneal area itself is used as the measure of comparison. We demonstrate the use of the proposed tool in analysing the effectiveness of a treatment procedure adopted for corneal ulcers in patients by comparing the variation of this measure over time.

  11. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 2, Analysis Methods

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets (CP-nets). The second volume contains a detailed presentation of the analysis methods for CP-nets. They allow the modeller to investigate dynamic properties of CP-nets. The main … the formal analysis methods, which does not require a deep understanding of the underlying mathematical theory.

  12. Evaluation of ventricular dysfunction using semi-automatic longitudinal strain analysis of four-chamber cine MR imaging.

    Science.gov (United States)

    Kawakubo, Masateru; Nagao, Michinobu; Kumazawa, Seiji; Yamasaki, Yuzo; Chishaki, Akiko S; Nakamura, Yasuhiko; Honda, Hiroshi; Morishita, Junji

    2016-02-01

    The aim of this study was to evaluate ventricular dysfunction using longitudinal strain analysis in 4-chamber (4CH) cine MR imaging, and to investigate the agreement between the semi-automatic and manual measurements in the analysis. Fifty-two consecutive patients with ischemic or non-ischemic cardiomyopathy and repaired tetralogy of Fallot who underwent cardiac MR examination incorporating cine MR imaging were retrospectively enrolled. The LV and RV longitudinal strain values were obtained both semi-automatically and manually. Receiver operating characteristic (ROC) analysis was performed to determine the optimal cutoff of the minimum longitudinal strain value for the detection of patients with cardiac dysfunction. The correlations between manual and semi-automatic measurements for the LV and RV walls were analyzed by Pearson coefficient analysis. ROC analysis demonstrated that the optimal cut-offs of the minimum longitudinal strain values (εL_min) diagnosed LV and RV dysfunction with high accuracy (LV εL_min = -7.8%: area under the curve, 0.89; sensitivity, 83%; specificity, 91%; RV εL_min = -15.7%: area under the curve, 0.82; sensitivity, 92%; specificity, 68%). Excellent correlations between manual and semi-automatic measurements for the LV and RV free wall were observed (LV, r = 0.97). Longitudinal strain analysis of 4CH cine MR imaging can evaluate LV and RV dysfunction with simple and easy measurements. The strain analysis could have extensive application in cardiac imaging for various clinical cases.

  13. Automatic facial pore analysis system using multi-scale pore detection.

    Science.gov (United States)

    Sun, J Y; Kim, S W; Lee, S H; Choi, J E; Ko, S J

    2017-08-01

    As facial pore widening and its treatments have become common concerns in the beauty care field, the necessity for an objective pore-analyzing system has increased. Conventional apparatuses lack usability, requiring strong light sources and a cumbersome photographing process, and they often yield unsatisfactory analysis results. This study was conducted to develop an image processing technique for automatic facial pore analysis. The proposed method detects facial pores using multi-scale detection and an optimal scale selection scheme and then extracts pore-related features such as total area, average size, depth, and the number of pores. Facial photographs of 50 subjects were graded by two expert dermatologists, and correlation analyses between the features and clinical grading were conducted. We also compared our analysis results with those of conventional pore-analyzing devices. The number of large pores and the average pore size were highly correlated with the severity of pore enlargement. In comparison with the conventional devices, the proposed analysis system achieved better performance, showing stronger correlation with the clinical grading. The proposed system is highly accurate and reliable for measuring the severity of skin pore enlargement. It can be suitably used for objective assessment of pore tightening treatments. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
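
    The multi-scale detection step can be prototyped with a Laplacian-of-Gaussian blob detector, where the scale at which a pore responds most strongly yields its size. A hedged sketch using scikit-image; the parameter values and the extracted feature set are illustrative, not the authors' tuned implementation.

    import numpy as np
    from skimage.feature import blob_log

    def pore_features(gray, max_sigma=8, threshold=0.05):
        """LoG blob detection over several scales; blob radius ~ sqrt(2)*sigma
        gives a per-pore size estimate (pores are dark, so the image is inverted)."""
        blobs = blob_log(1.0 - gray, min_sigma=1, max_sigma=max_sigma,
                         num_sigma=8, threshold=threshold)
        areas = np.pi * (np.sqrt(2) * blobs[:, 2]) ** 2
        return {"count": len(blobs),
                "avg_size": float(areas.mean()) if len(blobs) else 0.0,
                "total_area": float(areas.sum())}

    # Toy test image: three dark circular "pores" on bright skin
    img = np.ones((100, 100))
    yy, xx = np.ogrid[:100, :100]
    for cy, cx in [(20, 20), (50, 60), (80, 30)]:
        img[(yy - cy) ** 2 + (xx - cx) ** 2 <= 9] = 0.2
    print(pore_features(img))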

  14. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data mining approach. The AutoMealRecord data were examined to determine if they could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses; in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord system. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
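
    The analysis chain reported here (principal component analysis to extract dietary patterns, then a regression against BMI validated by leave-one-out) maps directly onto standard tooling. The sketch below runs on synthetic data; the sizes, feature meanings and toy outcome are our assumptions, not the study's data.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut

    rng = np.random.default_rng(0)
    X = rng.random((100, 20))                        # 100 diners x 20 menu-category frequencies
    bmi = 21 + 3 * X[:, 0] + rng.normal(0, 1, 100)   # toy outcome

    patterns = PCA(n_components=5).fit_transform(X)  # five "dietary patterns"

    # Leave-one-out check of whether the regression finds "would-be obese" (BMI >= 23)
    correct = 0
    for train, test in LeaveOneOut().split(patterns):
        model = LinearRegression().fit(patterns[train], bmi[train])
        pred = model.predict(patterns[test])[0]
        correct += (pred >= 23) == (bmi[test][0] >= 23)
    print(f"leave-one-out accuracy: {correct / len(bmi):.1%}")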

  15. Automatic T1 bladder tumor detection by using wavelet analysis in cystoscopy images

    Science.gov (United States)

    Freitas, Nuno R.; Vieira, Pedro M.; Lima, Estevão; Lima, Carlos S.

    2018-02-01

    Correct classification of cystoscopy images depends on the interpreter's experience. Bladder cancer is a common lesion that can only be confirmed by biopsying the tissue; therefore, the automatic identification of tumors plays a significant role in early stage diagnosis and its accuracy. To the best of our knowledge, the use of white light cystoscopy images for bladder tumor diagnosis has not been reported so far. In this paper, a texture analysis based approach is proposed for bladder tumor diagnosis, presuming that tumors change the tissue texture. As is well accepted by the scientific community, texture information is mostly present in the medium to high frequency range, which can be selected by using a discrete wavelet transform (DWT). Tumor enhancement can be improved by using automatic segmentation, since mixing with normal tissue is avoided under ideal conditions. The segmentation module proposed in this paper takes advantage of the wavelet decomposition tree to discard poor texture information, in such a way that both steps of the proposed algorithm, segmentation and classification, share the same focus on texture. A multilayer perceptron and a support vector machine with a stratified ten-fold cross-validation procedure were used for classification purposes, using the hue-saturation-value (HSV), red-green-blue, and CIELab color spaces. A sensitivity of 91% and a specificity of 92.9% were obtained in the HSV color space using both preprocessing and classification steps based on the DWT. The proposed method can achieve good performance in identifying bladder tumor frames. These promising results open the path towards a deeper study of the applicability of this algorithm in computer aided diagnosis.
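
    The two texture-focused steps, DWT sub-band selection and classification with stratified ten-fold cross-validation, can be prototyped as follows. The wavelet, the energy feature and the smooth/rough toy patches are our illustrative assumptions; the paper's tuned pipeline will differ.

    import numpy as np
    import pywt
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def dwt_texture_features(gray_patch, wavelet="db2", level=2):
        """Energy of each detail sub-band of a 2-D DWT, a common texture descriptor."""
        coeffs = pywt.wavedec2(gray_patch, wavelet, level=level)
        feats = [np.mean(band ** 2)            # sub-band energy
                 for detail in coeffs[1:]      # skip the approximation band
                 for band in detail]           # (horizontal, vertical, diagonal)
        return np.array(feats)

    # Toy data: low-contrast vs. high-contrast 64x64 patches as class stand-ins
    rng = np.random.default_rng(1)
    patches = [rng.random((64, 64)) * 0.1 for _ in range(20)] + \
              [rng.random((64, 64)) for _ in range(20)]
    X = np.array([dwt_texture_features(p) for p in patches])
    y = np.array([0] * 20 + [1] * 20)
    print(cross_val_score(SVC(), X, y, cv=10).mean())   # stratified 10-fold by default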

  16. Automatic detection of noisy channels in fNIRS signal based on correlation analysis.

    Science.gov (United States)

    Guerrero-Mosquera, Carlos; Borragán, Guillermo; Peigneux, Philippe

    2016-09-15

    fNIRS signals can be contaminated by distinct sources of noise. While most of the noise can be corrected using digital filters, optimized experimental paradigms or pre-processing methods, few approaches focus on the automatic detection of noisy channels. In the present study, we propose a new method that automatically detects noisy fNIRS channels by combining the global correlations of the signal, obtained from sliding windows (Cui et al., 2010), with correlation coefficients extracted from the experimental conditions defined by triggers. The validity of the method was evaluated on test data from 17 participants, with a total of 16 NIRS channels per subject, positioned over frontal, dorsolateral prefrontal, parietal and occipital areas. Additionally, the detection of noisy channels was tested in the context of different levels of cognitive requirement in a working memory N-back paradigm. Bad-channel detection accuracy, defined as the proportion of bad NIRS channels correctly detected among the total number of channels examined, was close to 91%. Under different cognitive conditions, the area under the Receiver Operating Characteristic curve (AUC) increased from 60.5% (global correlations) to 91.2% (local correlations). Our results show that global correlations are insufficient for detecting potentially noisy channels when the whole data signal is included in the analysis. In contrast, adding specific local information inherent to the experimental paradigm (e.g., cognitive conditions in a block or event-related design) improved detection performance for noisy channels. Also, we show that automated fNIRS channel detection can be achieved with high accuracy at low computational cost. Copyright © 2016 Elsevier B.V. All rights reserved.
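
    One way to operationalize the correlation idea is to score each channel by its average windowed correlation with the mean of the remaining channels and flag low scorers. This sketch uses our own window length, threshold and synthetic data; the published method additionally exploits condition-specific (trigger-defined) correlations.

    import numpy as np

    def channel_quality(signals, win=256, step=128):
        """Mean sliding-window correlation of each channel with the average
        of all other channels; low scores suggest a noisy channel.
        signals: (n_channels, n_samples) array."""
        n_ch, n_s = signals.shape
        scores = np.zeros(n_ch)
        starts = range(0, n_s - win + 1, step)
        for ch in range(n_ch):
            others = np.delete(signals, ch, axis=0).mean(axis=0)
            r = [np.corrcoef(signals[ch, s:s + win], others[s:s + win])[0, 1]
                 for s in starts]
            scores[ch] = np.nanmean(r)
        return scores

    # Toy data: 16 channels sharing a slow wave, channel 7 replaced by pure noise
    rng = np.random.default_rng(2)
    t = np.linspace(0, 60, 3000)
    sig = np.sin(2 * np.pi * 0.1 * t) + 0.3 * rng.normal(size=(16, t.size))
    sig[7] = rng.normal(size=t.size)
    q = channel_quality(sig)
    print("suspect channels:", np.flatnonzero(q < 0.5 * np.median(q)))   # -> [7]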

  17. Identification and quantification of coffee volatile components through high resolution gas chromatography/mass spectrometry using a headspace automatic sampler

    Directory of Open Access Journals (Sweden)

    Leonardo César AMSTALDEN

    2001-01-01

    Full Text Available Employing an automatic headspace sampler, the headspaces of three commercial brands of ground roasted coffee were qualitatively and quantitatively analyzed by gas chromatography / mass spectrometry for the volatile compounds responsible for the aroma. Since the methodology did not involve aroma isolation or concentration, the natural proportions of the volatiles were maintained, providing a more accurate picture of the flavor composition and simplifying sample preparation. The automatic sampler also allowed good resolution of the chromatographic peaks without cryofocusing the samples at the head of the column during injection, reducing analysis time. Ninety-one compounds were identified, while some known coffee volatiles, such as dimethyl sulphide, methional and furfuryl mercaptan, were not detected. The more concentrated volatiles could be quantified with the aid of two internal standards. The technique proved viable both for the characterization and for the quantification of coffee volatiles.

  18. Vol draadwerk

    African Journals Online (AJOL)

    Owner

    The motto of Marius Crous's third collection, Vol draadwerk (2012), is borrowed from the father of psychoanalysis, Sigmund Freud, and reads: "Everywhere I go I find a poet has been there before me." Vol draadwerk appears six years after his previous collection, Aan 'n beentjie sit en kluif (2006). For his collection Brief uit die kolonies ...

  19. A computer program for automatic gamma-ray spectra analysis with isotope identification for the purpose of activation analysis

    International Nuclear Information System (INIS)

    Weigel, H.; Dauk, J.

    1974-01-01

    A FORTRAN IV program for a PDP-9 computer with 16K storage capacity has been developed to perform automatic analysis of complex gamma spectra taken with Ge(Li) detectors. It searches for full-energy peaks and evaluates the peak areas. The program features an automatically performed isotope identification. It is written in such a flexible manner that, after reactor irradiation, spectra from samples of any composition can be evaluated for activation analysis. The peak search routine is based on the following criteria: the counting rate has to increase for two successive channels, and the amplitude of the corresponding maximum has to be greater than or equal to F1 times the statistical error of the counting rate in the valley just before the maximum. In order to detect superimposed peaks, it is assumed that the dependence of the FWHM on channel number is roughly approximated by a linear function, and the actual and ''theoretical'' FWHM values are compared. To determine the net peak area, a Gaussian-based function is fitted to each peak. The isotope identification is based on the procedure developed by Adams and Dams. (T.G.)
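
    The two stated criteria translate almost line-for-line into code. Below is a minimal Python rendering of the peak-search logic (the original is FORTRAN IV); the valley-tracking detail and the toy spectrum are our simplifications, and Poisson statistics (error = sqrt(counts)) are assumed for the valley error.

    import numpy as np

    def find_peaks(counts, f1=3.0):
        """Peak search per the two criteria: (1) the counting rate increases
        over two successive channels, and (2) the candidate maximum exceeds
        the preceding valley by at least f1 times sqrt(valley)."""
        counts = np.asarray(counts, dtype=float)
        peaks, valley = [], counts[0]
        for i in range(2, len(counts) - 1):
            valley = min(valley, counts[i - 2])        # lowest point since the last peak
            rising = counts[i - 2] < counts[i - 1] < counts[i]
            if rising and counts[i] >= counts[i + 1]:  # local maximum after the rise
                if counts[i] - valley >= f1 * np.sqrt(max(valley, 1.0)):
                    peaks.append(i)
                valley = counts[i]                     # restart valley tracking
        return peaks

    # Flat background of ~100 counts with one full-energy peak at channel 6
    spectrum = [100, 102, 98, 101, 180, 260, 300, 250, 160, 105, 99, 101]
    print(find_peaks(spectrum))                        # -> [6]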

  20. Writ Large on Your Face: Observing Emotions Using Automatic Facial Analysis

    Directory of Open Access Journals (Sweden)

    Dieckmann Anja

    2014-05-01

    Full Text Available Emotions affect all of our daily decisions and, of course, they also influence our evaluations of brands, products and advertisements. But what exactly do consumers feel when they watch a TV commercial, visit a website or when they interact with a brand in different ways? Measuring such emotions is not an easy task. In the past, the effectiveness of marketing material was evaluated mostly by subsequent surveys. Now, with the emergence of neuroscientific approaches like EEG, the measurement of real-time reactions is possible, for instance, when watching a commercial. However, most neuroscientific procedures are fairly invasive and irritating. For an EEG, for instance, numerous electrodes need to be placed on the participant's scalp. Furthermore, data analysis is highly complex. Scientific expertise is necessary for interpretation, so the procedure remains a black box to most practitioners and the results are still rather controversial. By contrast, automatic facial analysis provides similar information without having to wire study participants. In addition, the results of such analyses are intuitive and easy to interpret even for laypeople. These convincing advantages led GfK Company to decide on facial analysis and to develop a tool suitable for measuring emotional responses to marketing stimuli, making it easily applicable in marketing research practice.

  1. Fractal Analysis of Elastographic Images for Automatic Detection of Diffuse Diseases of Salivary Glands: Preliminary Results

    Directory of Open Access Journals (Sweden)

    Alexandru Florin Badea

    2013-01-01

    Full Text Available The geometry of some medical images of tissues, obtained by elastography and ultrasonography, is characterized in terms of complexity parameters such as the fractal dimension (FD). It is well known that in any image there are very subtle details that are not easily detectable by the human eye. However, in many cases, as in medical imaging diagnosis, these details are very important since they might contain hidden information about the possible existence of certain pathological lesions such as tissue degeneration, inflammation, or tumors. Therefore, an automatic method of analysis could be an expedient tool for physicians to give a faultless diagnosis. Fractal analysis is of great importance for a quantitative evaluation of "real-time" elastography, a procedure considered to be operator dependent in current clinical practice. Mathematical analysis reveals significant discrepancies between normal and pathological image patterns. The main objective of our work is to demonstrate the clinical utility of this procedure on an ultrasound image corresponding to a submandibular diffuse pathology.
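
    The fractal dimension (FD) used in such studies is commonly estimated by box counting: cover the binarized image with boxes of shrinking size s, count the occupied boxes N(s), and take FD as the slope of log N(s) against log(1/s). A generic sketch of that estimator, not the authors' exact procedure:

    import numpy as np

    def box_counting_dimension(binary_img):
        """Box-counting estimate of the fractal dimension of a binary image."""
        img = np.asarray(binary_img, dtype=bool)
        sizes = [2 ** k for k in range(1, int(np.log2(min(img.shape))))]
        counts = []
        for s in sizes:
            h, w = img.shape[0] // s * s, img.shape[1] // s * s
            blocks = img[:h, :w].reshape(h // s, s, w // s, s)
            counts.append(blocks.any(axis=(1, 3)).sum())    # occupied boxes N(s)
        slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
        return -slope                                       # FD = -d log N / d log s

    rng = np.random.default_rng(3)
    print(box_counting_dimension(rng.random((256, 256)) > 0.5))   # ~2 for a filled plane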

  2. An automatic analysis of strain-depth profile in X-ray microdiffraction

    International Nuclear Information System (INIS)

    Lagomarsino, S.; Giannini, C.; Guagliardi, A.; Cedola, A.; Scarinci, F.; Aruta, C.

    2004-01-01

    The increasing demand for high-spatial-resolution analysis of local strain/stress has given an impulse to the development of new X-ray microdiffraction techniques. In particular, spatial resolutions of the order of 100-300 nm have recently been obtained using X-ray waveguides as optical elements. However, the great number of datasets which must be acquired and analyzed to probe the full field of strain variations renders the high-resolution technique unsuitable for systematic analysis. In this communication, we present a data treatment procedure for the automatic analysis of microdiffraction profiles measured with an X-ray waveguide to obtain quantitative information about local strain variations. The presented procedure allows a depth-dependent strain profile to be extracted directly from the measured data and used as an initial guess for calculating the diffraction profile by means of the dynamical theory in the Takagi-Taupin recursive formalism. A Monte Carlo fitting refinement is then applied to optimize the strain profile.

  3. Design and Simulation of Electrocardiogram Circuit with Automatic Analysis of ECG Signal

    Directory of Open Access Journals (Sweden)

    Tosin Jemilehin

    2016-10-01

    Full Text Available An electrocardiogram (ECG) is the graphical record of the bioelectric signal generated by the human body during the cardiac cycle, and it tells a lot about the medical status of an individual. A typical ECG waveform consists of the P, Q, R, S and T waves. Automatic ECG signal analysis comprises computational methods for extracting important features and classifying the ECG waveform. This paper presents a concise ECG circuit design using an instrumentation amplifier and a passive band-pass filter. It also presents the process involved in the analysis of the ECG signal. The first stage is pre-filtering, followed by feature extraction: the QRS complex is extracted first, followed by P and T wave detection, and the FFT of the signal is also computed. These features are fed into a classifier. A pattern recognition neural network is used for classification; prior to full deployment, it is trained with pre-recorded ECG signals downloaded from the MIT-BIH Arrhythmia database. The neural network gave a satisfactory result, with an accuracy of around 87%. The whole ECG signal analysis is packaged into a MATLAB GUI for ease of use.

  4. Towards semi-automatic rock mass discontinuity orientation and set analysis from 3D point clouds

    Science.gov (United States)

    Guo, Jiateng; Liu, Shanjun; Zhang, Peina; Wu, Lixin; Zhou, Wenhui; Yu, Yinan

    2017-06-01

    Obtaining accurate information on rock mass discontinuities for deformation analysis and the evaluation of rock mass stability is important. Obtaining measurements for high and steep zones with the traditional compass method is difficult. Photogrammetry, three-dimensional (3D) laser scanning and other remote sensing methods have gradually become mainstream methods. In this study, a method based on a 3D point cloud is proposed to semi-automatically extract rock mass structural plane information. The original data are pre-treated prior to segmentation by removing outlier points. The next step is to segment the point cloud into different point subsets. Various parameters, such as the normal vector, dip direction and dip, can be calculated for each point subset after obtaining the equation of the best-fit plane for the relevant point subset. A cluster analysis (a point subset that satisfies some conditions and thus forms a cluster) is performed based on the normal vectors by introducing the firefly algorithm (FA) and the fuzzy c-means (FCM) algorithm. Finally, clusters that belong to the same discontinuity sets are merged and coloured for visualization purposes. A prototype system is developed based on this method to extract the points of the rock discontinuity from a 3D point cloud. A comparison with existing software shows that this method is feasible. This method can provide a reference for rock mechanics, 3D geological modelling and other related fields.
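
    For each segmented point subset, the best-fit plane and its geological orientation follow from a singular value decomposition of the centred coordinates. The sketch below assumes x = east, y = north, z = up; the conversion of the upward normal to dip direction and dip is standard, but the function and test values are ours.

    import numpy as np

    def plane_orientation(points):
        """Fit a plane to a point subset and return (dip direction, dip) in
        degrees. The normal is the singular vector of the centred coordinates
        with the smallest singular value."""
        pts = np.asarray(points, dtype=float)
        _, _, vt = np.linalg.svd(pts - pts.mean(axis=0), full_matrices=False)
        n = vt[-1]
        if n[2] < 0:
            n = -n                                             # use the upward normal
        dip = np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0)))
        dip_dir = np.degrees(np.arctan2(n[0], n[1])) % 360.0   # azimuth of steepest descent
        return dip_dir, dip

    # Synthetic discontinuity dipping 30 degrees toward the east (azimuth 090)
    rng = np.random.default_rng(4)
    xy = rng.uniform(-1.0, 1.0, (200, 2))
    z = -np.tan(np.radians(30.0)) * xy[:, 0] + rng.normal(0.0, 1e-3, 200)
    print(plane_orientation(np.column_stack([xy, z])))         # ~ (90.0, 30.0)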

  5. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.

  6. An empirical analysis of the methodology of automatic imitation research in a strategic context.

    Science.gov (United States)

    Aczel, Balazs; Kekecs, Zoltan; Bago, Bence; Szollosi, Aba; Foldes, Andrei

    2015-08-01

    Since the discovery of the mirror neuron system, it has been proposed that the automatic tendency to copy observed actions exists in humans and that this mechanism might be responsible for a range of social behavior. A strong argument for automatic behavior can be made when actions are executed against the motivation to do otherwise. Strategic games in which imitation is disadvantageous serve as ideal designs for studying the automatic nature of participants' behavior. Most recently, Belot, Crawford, and Heyes (2013) conducted an explorative study using a modified version of the Rock-Paper-Scissors game and suggested that, in the case of asynchrony in the execution of the gestures, automatic imitation can be observed early on after the opponent's presentation. In our study, we video recorded the games, which allowed us to examine the effect of delay on imitative behavior as well as the sensitivity of the previously employed analyses. The examination of the recorded images revealed that more than 80% of the data were irrelevant to the study of automatic behavior. Additional bias in the paradigm became apparent, as previously presented gestures were found to affect the behavior of the players. After noise filtering, we found no evidence of automatic imitation in either the whole filtered data set or in selected time windows based on delay length. Besides questioning the strength of the results of previous analyses, we propose several experimental and statistical modifications for further research on automatic imitation. (c) 2015 APA, all rights reserved.

  7. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    Science.gov (United States)

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  8. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    Directory of Open Access Journals (Sweden)

    Rhodri eCusack

    2015-01-01

    Full Text Available Recent years have seen neuroimaging data becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complex to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  9. On automatic data processing and well-test analysis in real-time reservoir management applications

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Stig

    2011-06-15

    The use of pressure and rate sensors for continuous measurements in oil and gas wells is becoming more common. This provides better and more measurements in real time that can be analyzed to optimize the extraction of oil and gas. One analysis that can provide valuable information on oil and gas production is transient analysis, in which the pressure build-up in a well when it is shut in is analyzed and parameters that describe the flow of oil and gas in the reservoir are estimated. However, it is very time consuming to manage and analyze real-time data, and the result is often that only a limited amount of the available data is analyzed. It is therefore desirable to have more effective methods for analyzing real-time data from oil and gas wells. Olsen automated transient analysis in order to extract the information in real-time data in an efficient and labor-saving manner. The analysis must be initialized with well- and reservoir-specific data, but once this is done, the analysis is performed automatically each time the well is shut in. For each shut-in, the parameters that describe the flow of oil and gas in the reservoir are estimated, so that over repeated shut-ins time series of the estimated parameters emerge. One of the goals of the automated transient analysis is to detect any changes in these time series so that the engineers can focus on analysis results that deviate from normal. As part of this work it was also necessary to develop automated data filters for noise removal and data compression. The filters are designed to continuously filter the data using methods that are optimized for the typical pressure and rate signals measured in oil and gas wells. In the thesis, Olsen shows examples of the use of automated data filtering and automated transient analysis on both synthetic data and real data from a field in the North Sea. (AG)

  10. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis

    NARCIS (Netherlands)

    Liu, C.; van Netten, Jaap J.; van Baal, Jeff G.; Bus, Sicco A.; van der Heijden, Ferdinand

    2015-01-01

    Early identification of diabetic foot complications and their precursors is essential in preventing their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost effective. Infrared

  11. Neutron activation analysis of regolith delivered by the ''Luna-20'' automatic vehicle

    International Nuclear Information System (INIS)

    Ivanov, I.N.; Kirnozov, F.F.; Kolesov, Y.M.; Ryvkin, B.N.; Surkov, Yu.A.; Shpanov, A.P.

    1975-01-01

    Results of neutron activation analysis of the regolith brought to Earth by the automatic space station ''Luna-20'' are reported. Two modifications of the neutron activation method have been used: one instrumental, the other employing chemical decomposition of the sample. The intensity and spectral composition of the irradiated samples and the reference samples have been measured with a Ge(Li) gamma spectrometer consisting of a Ge(Li) detector and a 4096-channel pulse analyser fitted with a computer. The sensitive volume of the detector is 65.5 cm³, its resolution at the 1332.5 keV Co-60 line being about 4.5 keV. The energy calibration of the device is carried out with Co-57, Sn-113, Cs-137 and Co-60 reference sources. The data reported, in particular the low iron, chromium, potassium and rare-earth element contents and the high calcium, aluminium and sodium contents, characterize the continental regolith as an anorthosite rock, while the marine regolith is chiefly a basalt rock.

  12. Making computers noble. An experiment in automatic analysis of medieval texts

    Directory of Open Access Journals (Sweden)

    Andrea Colli

    2016-02-01

    Full Text Available Computer analysis of texts, creation of databases, hypertexts and digital editions are not the final frontier of research anymore. Quite the contrary, for many years they have represented a significant contribution to medieval studies. Therefore, we do not mean to make the computer able to grasp the meaning of human language and penetrate its secrets, but rather we aim at improving its tools, so that it can become a true research collaborator. This paper is conceived as a sort of technical report, with the proposed task of verifying whether an automatic identification of certain word associations within a selected group of medieval writings produces suggestions on the subject of the processed texts that can be used in a theoretical inquiry.

  13. Automatic extraction of faults and fractal analysis from remote sensing data

    Directory of Open Access Journals (Sweden)

    R. Gloaguen

    2007-01-01

    Full Text Available Object-based classification is a promising technique for image classification. Unlike pixel-based methods, which only use the measured radiometric values, object-based techniques can also use shape and context information of scene textures. These extra degrees of freedom provided by the objects allow the automatic identification of geological structures. In this article, we present an evaluation of object-based classification in the context of the extraction of geological faults. Digital elevation models and radar data of an area near Lake Magadi (Kenya) have been processed. We then determine the statistics of the fault populations. The fractal dimensions of the fault populations are similar to the fractal dimensions measured directly on remote sensing images of the study area using power spectral density (PSD) and variograms. These methods allow unbiased fault statistics and help us to understand the evolution of the fault systems in extensional domains. Furthermore, the direct analysis of image texture is a good indicator of the fault statistics and allows us to classify the intensity and type of deformation. We propose that extensional fault networks can be modeled by iterated function systems (IFS).

  14. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications

    Directory of Open Access Journals (Sweden)

    Francesco Nex

    2009-05-01

    Full Text Available In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A² SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.

  15. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis

    Directory of Open Access Journals (Sweden)

    Christian Held

    2013-01-01

    Full Text Available Introduction: Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although, on the one hand, digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. Methods: In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. Results: This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. Conclusion: The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as they can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often end in a local maximum.
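
    Coordinate descent, one of the optimizers mentioned, tunes one parameter at a time while the others are held fixed, and, being greedy, can indeed stall on a local performance maximum. A generic sketch with a toy score standing in for a real segmentation quality measure; all names and grids are illustrative.

    import numpy as np

    def coordinate_descent(score_fn, grids, n_sweeps=3):
        """Sweep each parameter over its grid while holding the others fixed,
        keep improvements, and repeat for a few sweeps."""
        params = [g[0] for g in grids]
        best_s = score_fn(params)
        for _ in range(n_sweeps):
            for i, grid in enumerate(grids):
                for v in grid:
                    trial = params.copy()
                    trial[i] = v
                    s = score_fn(trial)
                    if s > best_s:
                        params, best_s = trial, s
        return params, best_s

    def toy_score(p):                             # stand-in for, e.g., a Dice score
        return -(p[0] - 3.0) ** 2 - (p[1] - 0.5) ** 2

    grids = [np.linspace(0, 10, 51), np.linspace(0, 1, 21)]
    print(coordinate_descent(toy_score, grids))   # parameters converge near [3.0, 0.5]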

  16. Risk analysis of Leksell Gamma Knife Model C with automatic positioning system

    International Nuclear Information System (INIS)

    Goetsch, Steven J.

    2002-01-01

    Purpose: This study was conducted to evaluate the decrease in the risk of misadministration with the new Leksell Gamma Knife Model C with Automatic Positioning System compared with previous models. Methods and Materials: Elekta Instruments, A.B. of Stockholm has introduced a new computer-controlled Leksell Gamma Knife Model C which uses motor-driven trunnions to reposition the patient between isocenters (shots) without human intervention. Previous models required the operators to set coordinates manually from a printed list, creating opportunities for coordinate transposition, incorrect helmet size, incorrect treatment times, missing shots, or repeated shots. Results: A risk analysis was conducted comparing craniotomy involving hospital admission with outpatient Gamma Knife radiosurgery. A report of the Institute of Medicine of the National Academies dated November 29, 1999 estimated that medical errors kill between 44,000 and 98,000 people each year in the United States. Another report, from the National Nosocomial Infections Surveillance System, estimates that 2.1 million nosocomial infections occur annually in United States acute care hospitals alone, against 31 million total admissions. Conclusions: All medical procedures have attendant risks of morbidity and possibly mortality. Each patient should be counseled as to the risk of adverse effects as well as the likelihood of good results for alternative treatment strategies. This paper seeks to fill a gap in the existing medical literature, which has a paucity of data on risk estimates for stereotactic radiosurgery

  17. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    Science.gov (United States)

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A² SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.

  18. Preoperative automatic visual behavioural analysis as a tool for intraocular lens choice in cataract surgery

    Directory of Open Access Journals (Sweden)

    Heloisa Neumann Nogueira

    2015-04-01

    Full Text Available Purpose: Cataract is the main cause of blindness, affecting 18 million people worldwide, with the highest incidence in the population above 50 years of age. The low visual acuity caused by cataract may have a negative impact on patient quality of life. The current treatment is surgery to replace the natural lens with an artificial intraocular lens (IOL), which can be mono- or multifocal. However, due to potential side effects, IOLs must be carefully chosen to ensure higher patient satisfaction. Thus, studies on the visual behavior of these patients may be an important tool in determining the best type of IOL implantation. This study proposes an anamnestic add-on for optimizing the choice of IOL. Methods: We used a camera that automatically takes pictures, documenting the patient's visual routine in order to obtain additional information about the frequency of distant, intermediate, and near sights. Results: The results yielded an estimated frequency percentage for each viewing distance, suggesting that visual analysis of the routine photographic records of a patient with cataract may be useful for understanding gaze behaviour and for choosing the visual management strategy after cataract surgery, while also stimulating interest in customized IOL manufacturing according to individual needs.

  19. Standard test methods for determining average grain size using semiautomatic and automatic image analysis

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2015-01-01

    1.1 These test methods are used to determine grain size from measurements of grain intercept lengths, intercept counts, intersection counts, grain boundary length, and grain areas. 1.2 These measurements are made with a semiautomatic digitizing tablet or by automatic image analysis using an image of the grain structure produced by a microscope. 1.3 These test methods are applicable to any type of grain structure or grain size distribution as long as the grain boundaries can be clearly delineated by etching and subsequent image processing, if necessary. 1.4 These test methods are applicable to measurement of other grain-like microstructures, such as cell structures. 1.5 This standard deals only with the recommended test methods and nothing in it should be construed as defining or establishing limits of acceptability or fitness for purpose of the materials tested. 1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user ...

  20. Automatic analysis and characterization of the hummingbird wings motion using dense optical flow features

    International Nuclear Information System (INIS)

    Martínez, Fabio; Romero, Eduardo; Manzanera, Antoine

    2015-01-01

    A new method for automatic analysis and characterization of recorded hummingbird wing motion is proposed. The method starts by computing a multiscale dense optical flow field, which is used to segment the wings, i.e., the pixels with larger velocities. Then, the kinematics and deformation of the wings are characterized as a temporal set of global and local measures: a global angular acceleration as a time function of each wing, and a local acceleration profile that approximates the dynamics of the different wing segments. Additionally, the variance of the apparent velocity orientation identifies those wing foci with larger deformation, and a local measure of the orientation highlights the regions with maximal deformation. The approach was evaluated on a total of 91 flight cycles, captured using three different setups. The proposed measures follow the yaw turn hummingbird flight dynamics, with a strong correlation of all computed paths, reporting a standard deviation of 0.31 rad/frame² and 1.9 (rad/frame)² for the global angular acceleration and the global wing deformation, respectively. (paper)
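
    The first stage, a dense optical flow field whose magnitude separates the fast-moving wings from the background, can be reproduced with OpenCV's Farneback estimator. A minimal sketch; the speed threshold and the synthetic frame pair are our stand-ins for the hummingbird footage.

    import cv2
    import numpy as np

    def wing_mask(prev_gray, next_gray, speed_thresh=2.0):
        """Dense optical flow between two frames; pixels whose apparent speed
        exceeds speed_thresh (pixels/frame) are kept as the moving wings."""
        # Positional args: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
        flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        speed = np.linalg.norm(flow, axis=2)   # apparent velocity magnitude
        return speed > speed_thresh, flow

    # Synthetic pair: a bright patch shifted 5 px to the right between frames
    a = np.zeros((120, 120), np.uint8); a[40:60, 40:60] = 255
    b = np.zeros_like(a);               b[40:60, 45:65] = 255
    mask, flow = wing_mask(a, b)
    print("moving pixels:", int(mask.sum()))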

  1. Automatic analysis of selected choroidal diseases in OCT images of the eye fundus

    Science.gov (United States)

    2013-01-01

    Introduction This paper describes a method for automatic analysis of the choroid in OCT images of the eye fundus in ophthalmology. The problem of vascular lesions occurs e.g. in a large population of patients having diabetes or macular degeneration. Their correct diagnosis and quantitative assessment of the treatment progress are a critical part of the eye fundus diagnosis. Material and method The study analysed about 1,000 OCT images acquired using SOCT Copernicus (Optopol Tech. SA, Zawiercie, Poland). The proposed algorithm made it possible to analyse the texture of the choroid portion located beneath the RPE (Retinal Pigment Epithelium) layer. The analysis was performed using a profiled algorithm based on morphological analysis, texture analysis, and a classifier in the form of decision trees. Results The location of the centres of gravity of individual objects present in the image beneath the RPE layer proved to be important in the evaluation of different types of images. In addition, the value of the standard deviation and the number of objects in a scene were equally important. These features enabled classification of three different forms of the choroid that were related to retinal pathology: diabetic edema (the classification gave accuracy ACC1 = 0.73), ischemia of the inner retinal layers (ACC2 = 0.83) and scarring fibrovascular tissue (ACC3 = 0.69). For the pruned decision tree the results were as follows: ACC1 = 0.76, ACC2 = 0.81, ACC3 = 0.68. Conclusions The created decision tree made it possible to obtain satisfactory classification results for the three types of choroidal imaging. In addition, it was shown that for the assumed characteristics and the developed classifier, the location of the B-scan does not significantly affect the results. The image analysis method for texture analysis presented in the paper confirmed its usefulness in choroid imaging. Currently the application is further studied in the Clinical Department
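
    The classification stage, a decision tree over texture descriptors, is easy to reproduce generically. A hedged sketch assuming scikit-learn; the file names and label encoding are hypothetical placeholders for a feature table containing the descriptors named above (object centre-of-gravity positions, standard deviation, object count).

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X = np.load("choroid_features.npy")  # hypothetical (n_scans, n_features) table
y = np.load("choroid_labels.npy")    # hypothetical labels: 0=edema, 1=ischemia, 2=scarring

clf = DecisionTreeClassifier(max_depth=5, random_state=0)  # depth limit as simple pruning
print(cross_val_score(clf, X, y, cv=5).mean())             # mean accuracy (cf. ACC1..ACC3)
```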

  2. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. (comp.)

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
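
    The "neither symbolic nor numeric" point is easiest to see in forward-mode automatic differentiation with dual numbers, where every value carries its derivative and the chain rule is applied operation by operation. A minimal self-contained illustration (not from the bibliography itself):

```python
class Dual:
    """A value paired with its derivative; arithmetic applies the chain rule."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

x = Dual(3.0, 1.0)      # seed dx/dx = 1
f = x * x + 2 * x + 1   # f(x) = x^2 + 2x + 1
print(f.val, f.der)     # 16.0 8.0, and indeed f'(3) = 2*3 + 2 = 8
```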

  3. Automatic Decision Support for Clinical Diagnostic Literature Using Link Analysis in a Weighted Keyword Network.

    Science.gov (United States)

    Li, Shuqing; Sun, Ying; Soergel, Dagobert

    2017-12-23

    We present a novel approach to recommending articles from the medical literature that support clinical diagnostic decision-making, giving detailed descriptions of the associated ideas and principles. The specific goal is to retrieve biomedical articles that help answer questions of a specified type about a particular case. Based on the filtered keywords, the MeSH (Medical Subject Headings) lexicon, and the automatically extracted acronyms, the relationship between keywords and articles was built. The paper gives a detailed description of the process by which keywords were weighted and relevant articles identified based on link analysis in a weighted keyword network. Some important challenges identified in this study include the extraction of diagnosis-related keywords and a collection of valid sentences based on keyword co-occurrence analysis and existing descriptions of symptoms. All data were taken from medical articles provided in the TREC (Text Retrieval Conference) clinical decision support track 2015. Ten standard topics and one demonstration topic were tested. In each case, a maximum of five articles with the highest relevance were returned. The total user satisfaction of 3.98 was 33% higher than average. The results also suggested that the smaller the number of results, the higher the average satisfaction. However, a few shortcomings were also revealed, since medical literature recommendation for clinical diagnostic decision support is so complex a topic that it cannot be fully addressed through the semantic information carried solely by keywords in existing descriptions of symptoms. Nevertheless, the fact that these articles are actually relevant will no doubt inspire future research.
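
    Link analysis over a weighted keyword-article graph can be sketched with standard graph tooling. A hedged illustration assuming networkx and PageRank as the link-analysis step; the node names and edge weights are invented, and the paper's actual scoring scheme may differ.

```python
import networkx as nx

G = nx.Graph()
# Edge weight = strength of keyword/article association (e.g., co-occurrence score)
G.add_edge("dyspnea", "article_17", weight=0.9)
G.add_edge("dyspnea", "article_42", weight=0.4)
G.add_edge("troponin", "article_42", weight=0.8)

scores = nx.pagerank(G, weight="weight")  # link-analysis importance scores
articles = {n: s for n, s in scores.items() if n.startswith("article_")}
print(sorted(articles, key=articles.get, reverse=True)[:5])  # top recommendations
```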

  4. Automatic extraction of soft tissues from 3D MRI head images using model driven analysis

    International Nuclear Information System (INIS)

    Jiang, Hao; Yamamoto, Shinji; Imao, Masanao.

    1995-01-01

    This paper presents an automatic extraction system (called TOPS-3D: Top-Down Parallel Pattern Recognition System for 3D Images) of soft tissues from 3D MRI head images using a model-driven analysis algorithm. As in the system TOPS we developed previously, two concepts have been considered in the design of system TOPS-3D. One is a system having a hierarchical structure of reasoning using model information at a higher level, and the other is a parallel image processing structure used to extract plural candidate regions for a destination entity. The new points of system TOPS-3D are as follows. (1) TOPS-3D is a three-dimensional image analysis system including 3D model construction and 3D image processing techniques. (2) A technique is proposed to increase the connectivity between knowledge processing at the higher level and image processing at the lower level. The technique is realized by applying the opening operation of mathematical morphology, in which a structural model function defined at the higher level by knowledge representation is used directly as the filter function of the opening operation in the lower-level image processing. The system TOPS-3D applied to 3D MRI head images consists of three levels. The first and second levels are the reasoning part, and the third level is the image processing part. In experiments, we applied 5 samples of 3D MRI head images of size 128 x 128 x 128 pixels to the system TOPS-3D to extract the regions of soft tissues such as the cerebrum, cerebellum and brain stem. The experimental results show that the system is robust to variation of the input data thanks to the use of model information, and that the position and shape of soft tissues are extracted in correspondence with the anatomical structure. (author)
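
    The bridge between the model level and the image level is a morphological opening whose structuring element comes from the model. A minimal sketch of that mechanism assuming SciPy; the volume and structuring element here are placeholders, not the paper's actual model function.

```python
import numpy as np
from scipy import ndimage

volume = np.random.rand(64, 64, 64)             # placeholder for a 3D MRI volume
model_element = np.ones((3, 3, 3), dtype=bool)  # structuring element standing in for the model
# Grey-scale opening: erosion then dilation with the model-derived element,
# suppressing structures smaller than the model shape.
opened = ndimage.grey_opening(volume, footprint=model_element)
```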

  5. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book gives descriptions of automatic control for electrical and electronic engineering, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, state space, state-space analysis of electric systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, the procedure to draw a root locus, frequency response, and the design of control systems.

  6. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences which are produced by a finite automaton. Although they are not random, they may appear random. They are complicated, in the sense of not being ultimately periodic; they may also look complicated, in the sense that it may not be easy to name the rule by which the sequence is generated, yet such a rule exists. The concept of automatic sequences has applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: a general introduction to automatic sequences; the basic (combinatorial) properties of automatic sequences; the algebraic approach to automatic sequences; and geometric objects related to automatic sequences.
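
    A concrete example of such a sequence is the Thue-Morse sequence, a 2-automatic sequence: a two-state automaton reading the binary digits of n outputs the parity of the number of 1-bits. A minimal sketch:

```python
def thue_morse(n):
    """n-th Thue-Morse term: output of a 2-state automaton reading n's binary digits."""
    return bin(n).count("1") % 2

print([thue_morse(n) for n in range(16)])
# [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0] - never ultimately periodic
```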

  7. Automatization of the neutron activation analysis method in the nuclear analysis laboratory

    International Nuclear Information System (INIS)

    Gonzalez, N.R.; Rivero, D del C.; Gonzalez, M.A.; Larramendi, F.

    1993-01-01

    In the present paper the work done to automate the neutron activation analysis technique with a neutron generator is described. An interface between an IBM-compatible microcomputer and the equipment in use for this kind of measurement was developed, including the specialized software for the system.

  8. Brain activity across the development of automatic categorization: A comparison of categorization tasks using multi-voxel pattern analysis

    Science.gov (United States)

    Soto, Fabian A.; Waldschmidt, Jennifer G.; Helie, Sebastien; Ashby, F. Gregory

    2013-01-01

    Previous evidence suggests that relatively separate neural networks underlie initial learning of rule-based and information-integration categorization tasks. With the development of automaticity, categorization behavior in both tasks becomes increasingly similar and exclusively related to activity in cortical regions. The present study uses multi-voxel pattern analysis to directly compare the development of automaticity in different categorization tasks. Each of three groups of participants received extensive training in a different categorization task: either an information-integration task, or one of two rule-based tasks. Four training sessions were performed inside an MRI scanner. Three different analyses were performed on the imaging data from a number of regions of interest (ROIs). The common patterns analysis had the goal of revealing ROIs with similar patterns of activation across tasks. The unique patterns analysis had the goal of revealing ROIs with dissimilar patterns of activation across tasks. The representational similarity analysis aimed at exploring (1) the similarity of category representations across ROIs and (2) how those patterns of similarities compared across tasks. The results showed that common patterns of activation were present in motor areas and basal ganglia early in training, but only in the former later on. Unique patterns were found in a variety of cortical and subcortical areas early in training, but they were dramatically reduced with training. Finally, patterns of representational similarity between brain regions became increasingly similar across tasks with the development of automaticity. PMID:23333700

  9. The Swiss-Army-Knife Approach to the Nearly Automatic Analysis for Microearthquake Sequences.

    Science.gov (United States)

    Kraft, T.; Simon, V.; Tormann, T.; Diehl, T.; Herrmann, M.

    2017-12-01

    Many Swiss earthquake sequences have been studied using relative location techniques, which often made it possible to constrain the active fault planes and shed light on the tectonic processes that drove the seismicity. Yet, in the majority of cases the number of located earthquakes was too small to infer the details of the space-time evolution of the sequences, or their statistical properties. Therefore, it has mostly been impossible to resolve clear patterns in the seismicity of individual sequences, which are needed to improve our understanding of the mechanisms behind them. Here we present a nearly automatic workflow that combines well-established seismological analysis techniques and allows us to significantly improve the completeness of detected and located earthquakes of a sequence. We start from the manually timed routine catalog of the Swiss Seismological Service (SED), which contains the larger events of a sequence. From these well-analyzed earthquakes we dynamically assemble a template set and perform a matched-filter analysis on the station with the best SNR for the sequence and a recording history of at least 10-15 years, our typical analysis period. This usually allows us to detect events several orders of magnitude below the SED catalog detection threshold. The waveform similarity of the events is then further exploited to derive accurate and consistent magnitudes. The enhanced catalog is then analyzed statistically to derive high-resolution time-lines of the a- and b-value and consequently the occurrence probability of larger events. Many of the detected events are strong enough to be located using double-differences. No further manual interaction is needed; we simply time-shift the arrival-time pattern of the detecting template to the associated detection. Waveform similarity assures a good approximation of the expected arrival-times, which we use to calculate event-pair arrival-time differences by cross correlation. After a SNR and cycle-skipping quality
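
    The matched-filter step amounts to sliding a template waveform across continuous data and declaring detections where the normalized cross-correlation exceeds a threshold. A minimal NumPy sketch of that idea, with an illustrative threshold rather than the authors' detection criterion:

```python
import numpy as np

def matched_filter(data, template, threshold=0.7):
    """Return sample indices where the template correlates above threshold."""
    m = len(template)
    t = (template - template.mean()) / template.std()
    hits = []
    for i in range(len(data) - m):
        w = data[i:i + m]
        if w.std() == 0:
            continue  # skip flat (dead) segments
        cc = np.dot(t, (w - w.mean()) / w.std()) / m  # normalized correlation in [-1, 1]
        if cc > threshold:
            hits.append(i)
    return hits
```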

  10. Regulatory analysis for the resolution of Generic Issue 125.II.7 ''Reevaluate Provision to Automatically Isolate Feedwater from Steam Generator During a Line Break''

    International Nuclear Information System (INIS)

    Basdekas, D.L.

    1988-09-01

    Generic Issue 125.II.7 addresses the concern related to the automatic isolation of auxiliary feedwater (AFW) to a steam generator with a broken steam or feedwater line. This regulatory analysis provides a quantitative assessment of the costs and benefits associated with the removal of the AFW automatic isolation and concludes that no new regulatory requirements are warranted. 21 refs., 7 tabs

  11. Automatic detection of outlines. Application to the quantitative analysis of renal scintiscanning pictures

    International Nuclear Information System (INIS)

    Morcos-Ghrab, Nadia.

    1979-01-01

    The purpose of the work described is the finalizing of a method making it possible automatically to extract the significant outlines on a renal scintiscanning picture. The algorithms must be simple and of high performance, their routine execution on a mini-computer must be fast enough to compete effectively with human performances. However, the method that has been developed is general enough to be adapted, with slight modifications, to another type of picture. The first chapter is a brief introduction to the principle of scintiscanning, the equipment used and the type of picture obtained therefrom. In the second chapter the various approaches used for form recognition and scene analysis are very briefly described with the help of examples. The third chapter deals with pretreatment techniques (particularly the machine operators) used for segmenting the pictures. Chapter four presents techniques which segment the picture by parallel processing of all its points. In chapter five a description is given of the sequential research techniques of the outline elements, drawing inspiration from the methods used in artificial intelligence for resolving the optimization problem. The sixth chapter shows the difficulties encountered in extracting the renal outlines and the planning technique stages adopted to overcome these difficulties. Chapter seven describes in detail the two research methods employed for generating the plan. In chapter eight, the methods used for extending the areas obtained on the plan and for refining the outlines that bound them are dealt with. Chapter nine is a short presentation of the organization of the programmes and of their data structure. Finally, examples of results are given in chapter ten [fr

  12. Comparative Analysis of Music Recordings from Western and Non-Western traditions by Automatic Tonal Feature Extraction

    Directory of Open Access Journals (Sweden)

    Emilia Gómez

    2008-09-01

    The automatic analysis of large musical corpora by means of computational models overcomes some limitations of manual analysis, and the unavailability of scores for most existing music makes it necessary to work with audio recordings. Until now, research in this area has focused on music from the Western tradition. Nevertheless, we might ask whether the available methods are suitable when analyzing music from other cultures. We present an empirical approach to the comparative analysis of audio recordings, focusing on tonal features and data mining techniques. Tonal features are related to the pitch class distribution, pitch range, and the employed scale, gamut and tuning system. We provide our initial but promising results obtained when trying to automatically distinguish music from Western and non-Western traditions; we analyze which descriptors are most relevant and study their distribution over 1500 pieces from different traditions and styles. As a result, some feature distributions differ for Western and non-Western music, and the obtained classification accuracy is higher than 80% for different classification algorithms and an independent test set. These results show that automatic description of audio signals together with data mining techniques provides a means to characterize huge music collections from different traditions and complement manual musicological analyses.
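
    One of the tonal descriptors named above, the pitch class distribution, can be approximated from chroma features. A hedged sketch assuming librosa; the audio file name is illustrative, and the paper's actual feature extraction may differ in detail.

```python
import numpy as np
import librosa

y, sr = librosa.load("recording.wav")             # hypothetical input file
chroma = librosa.feature.chroma_stft(y=y, sr=sr)  # 12 x n_frames pitch-class energies
pcd = chroma.mean(axis=1)
pcd /= pcd.sum()                                  # normalized pitch-class distribution
print(np.round(pcd, 3))                           # one 12-bin descriptor per recording
```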

  13. Development of user interface to support automatic program generation of nuclear power plant analysis by module-based simulation system

    International Nuclear Information System (INIS)

    Yoshikawa, Hidekazu; Mizutani, Naoki; Nakaya, Ken-ichiro; Wakabayashi, Jiro

    1988-01-01

    The Module-based Simulation System (MSS) has been developed to realize a new software work environment enabling versatile dynamic simulation of complex nuclear power systems in a flexible way. The MSS makes full use of modern software technology to replace a large fraction of the human software work in complex, large-scale program development by computer automation. The fundamental methods utilized in MSS and a developmental study on the human interface system SESS-1, which helps users generate integrated simulation programs automatically, are summarized as follows: (1) To enhance the usability and 'communality' of program resources, the basic mathematical models in common usage in nuclear power plant analysis are programmed as 'modules' and stored in a module library. The information on the usage of individual modules is stored in a module database with easy registration, update and retrieval via an interactive management system. (2) Target simulation programs and the input/output files are automatically generated with simple block-wise languages by a precompiler system for module integration. (3) The working time for program development and analysis in an example study of an LMFBR plant thermal-hydraulic transient analysis was shown to be remarkably shortened with the introduction of the interface system SESS-1, developed as an automatic program generation environment. (author)

  14. Accuracy of coronary plaque detection and assessment of interobserver agreement for plaque quantification using automatic coronary plaque analysis software on coronary CT angiography

    Energy Technology Data Exchange (ETDEWEB)

    Laqmani, A.; Quitzke, M.; Creder, D.D.; Adam, G.; Lund, G. [University Medical Center Hamburg-Eppendorf, Hamburg (Germany). Dept. of Diagnostic and Interventional Radiology and Nuclearmedicine; Klink, T. [Wuerzburg Univ. (Germany). Inst. of Diagnostic and Interventional Radiology

    2016-10-15

    To evaluate the accuracy of automatic plaque detection and the interobserver agreement of automatic versus manually adjusted quantification of coronary plaques on coronary CT angiography (cCTA) using commercially available software. 10 cCTA datasets were evaluated using plaque software. First, the automatically detected plaques were verified. Second, two observers independently performed plaque quantification without revising the automatically constructed plaque contours (automatic approach). Then, each observer adjusted the plaque contours according to plaque delineation (adjusted approach). The interobserver agreement of both approaches was analyzed. 32 of 114 automatically identified findings were true-positive plaques, while 82 (72 %) were false-positive. 20 of 52 plaques (38 %) were missed by the software (false-negative). The automatic approach provided good interobserver agreement with relative differences of 0.9 ± 16.0 % for plaque area and -3.3 ± 33.8 % for plaque volume. Both observers independently adjusted all contours because they did not represent the plaque delineation. Interobserver agreement decreased for the adjusted approach with relative differences of 25.0 ± 24.8 % for plaque area and 20.0 ± 40.4 % for plaque volume. The automatic plaque analysis software is of limited value due to high numbers of false-positive and false-negative plaque findings. The automatic approach was reproducible but it necessitated adjustment of all constructed plaque contours resulting in deterioration of the interobserver agreement.

  15. MECHANICAL PARAMETERS CONTROL OF THE NEUTRAL RELAY OF RAIL AUTOMATICS BASED ON WAVELET ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. I. Havryliuk

    2016-12-01

    Purpose. The paper focuses on the development of a method for controlling the mechanical parameters of the neutral relay of railway automatics by analyzing the currents in the coil and the relay contacts during switching, based on the wavelet transform. Methodology. The methodology was based on measuring the relay coil and contact currents during switching and analyzing the obtained results using the wavelet transform. Findings. The time dependences of the current in the coil and in the contacts during switching on and off were measured for operable relays and for relays with certain defects, at different voltages applied to the coil (10, 12 and 14 V). When the voltage applied to the coil was increased, the rate of rise of the coil current increased, but the time constant of the circuit changed only slightly with voltage. For operable relays, the maximum current at which the relay armature begins to move depended only slightly on the applied voltage, as did the time constant over the interval in which the armature is fully attracted to the relay core. Similar results were obtained for the current decay time constant when the voltage was switched off. The pick-up and release currents of the armature increase for a relay with a loaded armature, in proportion to the weight of the load, as well as for a relay with the rear contact bent down, while for a rear contact bent up these currents have smaller values. The large-scale (low-frequency) DWT coefficients can be used for a more accurate comparison of the pick-up and release currents and of the time constants of the transients. The small-scale (high-frequency) coefficients can be used as distinguishing features of defects
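
    The wavelet step, splitting the switching transient into large-scale (low-frequency) and small-scale (high-frequency) coefficients, can be sketched with PyWavelets. The file name, wavelet family, and decomposition level below are assumptions for illustration, not the paper's settings.

```python
import numpy as np
import pywt

current = np.load("relay_coil_current.npy")  # hypothetical sampled coil current i(t)
coeffs = pywt.wavedec(current, wavelet="db4", level=4)
approx, details = coeffs[0], coeffs[1:]
# approx: low-frequency coefficients, for comparing pick-up/release transients;
# details: high-frequency coefficients, candidate defect signatures.
print(len(approx), [len(d) for d in details])
```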

  16. Toward automatic regional analysis of pulmonary function using inspiration and expiration thoracic CT

    DEFF Research Database (Denmark)

    Murphy, Keelin; Pluim, Josien P. W.; Rikxoort, Eva M. van

    2012-01-01

    Purpose: To analyze pulmonary function using a fully automatic technique which processes pairs of thoracic CT scans acquired at breath-hold inspiration and expiration, respectively. The following research objectives are identified to: (a) describe and systematically analyze the processing pipeline...

  17. A Meta-Analysis on the Malleability of Automatic Gender Stereotypes

    Science.gov (United States)

    Lenton, Alison P.; Bruder, Martin; Sedikides, Constantine

    2009-01-01

    This meta-analytic review examined the efficacy of interventions aimed at reducing automatic gender stereotypes. Such interventions included attentional distraction, salience of within-category heterogeneity, and stereotype suppression. A small but significant main effect (g = 0.32) suggests that these interventions are successful but that their…

  18. Analysis of individual classification of lameness using automatic measurement of back posture in dairy cattle

    NARCIS (Netherlands)

    Viazzi, S.; Schlageter Tello, A.A.; Hertem, van T.; Romanini, C.E.B.; Pluk, A.; Halachmi, I.; Lokhorst, C.; Berckmans, D.

    2013-01-01

    Currently, diagnosis of lameness at an early stage in dairy cows relies on visual observation by the farmer, which is time-consuming and often omitted. Many studies have tried to develop automatic cow lameness detection systems. However, those studies apply thresholds to the whole population to

  19. Energy balance of a glacier surface: analysis of Automatic Weather Station data from the Morteratschgletscher, Switzerland

    NARCIS (Netherlands)

    Oerlemans, J.; Klok, E.J.

    2002-01-01

    We describe and analyze a complete 1-yr data set from an automatic weather station (AWS) located on the snout of the Morteratschgletscher, Switzerland. The AWS stands freely on the glacier surface and measures pressure, windspeed, wind direction, air temperature and humidity, incoming and

  20. Comparative analysis of automatic approaches to building detection from multi-source aerial data

    NARCIS (Netherlands)

    Frontoni, E.; Khoshelham, K.; Nardinocchi, C.; Nedkov, S.; Zingaretti, P.

    2008-01-01

    Automatic building detection has been a hot topic since the early 1990s. Early approaches were based on a single aerial image. Detecting buildings is a difficult task, so detection can be more effective when multiple sources of information are obtained and fused. The objective of this paper is to provide a

  1. Fast automatic analysis of antenatal dexamethasone on micro-seizure activity in the EEG

    International Nuclear Information System (INIS)

    Rastin, S.J.; Unsworth, C.P.; Bennet, L.

    2010-01-01

    Full text: In this work we develop an automatic scheme for studying the effect of antenatal dexamethasone on EEG activity. To do so, an FFT (Fast Fourier Transform) based detector was designed and applied to EEG recordings obtained from two groups of fetal sheep. Both groups received two injections with a time delay of 24 h between them, but the injected substance differed between the groups (dexamethasone and saline). The detector was used to automatically identify and classify micro-seizures that occurred in the frequency bands corresponding to EEG transients known as slow waves (2.5-14 Hz). For each second of the data recordings the spectrum was computed, and a rise of the energy in each predefined frequency band was counted when the energy level exceeded a predefined corresponding threshold level (the threshold level was obtained from the long-term average of the spectral points in each band). Our results demonstrate that it is possible to automatically count the micro-seizures in the three different bands in a time-effective manner. It was found that the number of transients did not strongly depend on the nature of the injected substance, which was consistent with the results obtained manually by an EEG expert. In conclusion, the automatic detection scheme presented here allows rapid micro-seizure event identification in hours of highly sampled EEG data, thus providing a valuable time-saving tool.
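
    The detector described above, per-second FFT band energies compared against a threshold derived from the long-term average, is straightforward to sketch in NumPy. The band and the threshold multiplier k are illustrative assumptions:

```python
import numpy as np

def count_transients(eeg, fs, band=(2.5, 14.0), k=3.0):
    """Count 1-s windows whose band energy exceeds k times the long-term mean."""
    win = int(fs)                                  # one second of samples
    n_win = len(eeg) // win
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    sel = (freqs >= band[0]) & (freqs <= band[1])  # spectral points in the band
    energies = np.array([
        np.sum(np.abs(np.fft.rfft(eeg[i * win:(i + 1) * win]))[sel] ** 2)
        for i in range(n_win)])
    threshold = k * energies.mean()                # long-term average sets the threshold
    return int(np.sum(energies > threshold))
```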

  2. Toward automatic regional analysis of pulmonary function using inspiration and expiration thoracic CT.

    NARCIS (Netherlands)

    Murphy, K.; Pluim, J.P.; Rikxoort, E.M. van; Jong, P.A. de; Hoop, B.J. de; Gietema, H.A.; Mets, O.; Bruijne, M. de; Lo, P.; Prokop, M.; Ginneken, B. van

    2012-01-01

    PURPOSE: To analyze pulmonary function using a fully automatic technique which processes pairs of thoracic CT scans acquired at breath-hold inspiration and expiration, respectively. The following research objectives are identified to: (a) describe and systematically analyze the processing pipeline

  3. ANALYSIS OF TRACK CIRCUITS WORK AND AUTOMATIC SIGNALING ON PASS SECTION WITH STEEP GRADIENT

    Directory of Open Access Journals (Sweden)

    A. P. Razgonov

    2011-01-01

    The functioning of track circuits and the automatic block system on a pass section with a steep gradient has been investigated. The traction calculations, the thermal operating modes of the choke transformers, and the operating regimes of track circuits with two choke transformers installed at the supply and relay ends have been determined.

  4. Advances on Automatic Speech Analysis for Early Detection of Alzheimer Disease: A Non-linear Multi-task Approach.

    Science.gov (United States)

    Lopez-de-Ipina, Karmele; Martinez-de-Lizarduy, Unai; Calvo, Pilar M; Mekyska, Jiri; Beitia, Blanca; Barroso, Nora; Estanga, Ainara; Tainta, Milkel; Ecay-Torres, Mirian

    2018-01-01

    Nowadays proper detection of cognitive impairment has become a challenge for the scientific community. Alzheimer's Disease (AD), the most common cause of dementia, has a high prevalence that is increasing at a fast pace towards epidemic level. In the not-so-distant future this fact could have a dramatic social and economic impact. In this scenario, an early and accurate diagnosis of AD could help to decrease its effects on patients, relatives and society. Over the last decades there have been useful advances not only in classic assessment techniques, but also in novel non-invasive screening methodologies. Among these methods, automatic analysis of speech (one of the first skills damaged in AD patients) is a natural and useful low-cost tool for diagnosis. In this paper a non-linear multi-task approach based on automatic speech analysis is presented. Three tasks with different language complexity levels are analyzed, and promising results that encourage a deeper assessment are obtained. Automatic classification was carried out using classic Multilayer Perceptrons (MLP) and Deep Learning by means of Convolutional Neural Networks (CNN), biologically-inspired variants of MLPs, over the tasks with classic linear features, perceptual features, the Castiglioni fractal dimension and Multiscale Permutation Entropy. Finally, the most relevant features are selected by means of the non-parametric Mann-Whitney U-test.

  5. Automatic and objective oral cancer diagnosis by Raman spectroscopic detection of keratin with multivariate curve resolution analysis

    Science.gov (United States)

    Chen, Po-Hsiung; Shimada, Rintaro; Yabumoto, Sohshi; Okajima, Hajime; Ando, Masahiro; Chang, Chiou-Tzu; Lee, Li-Tzu; Wong, Yong-Kie; Chiou, Arthur; Hamaguchi, Hiro-O.

    2016-01-01

    We have developed an automatic and objective method for detecting human oral squamous cell carcinoma (OSCC) tissues with Raman microspectroscopy. We measure 196 independent Raman spectra from 196 different points of one oral tissue sample and globally analyze these spectra using Multivariate Curve Resolution (MCR) analysis. Discrimination of OSCC tissues is made automatically and objectively by spectral matching of the MCR-decomposed Raman spectra against the standard Raman spectrum of keratin, a well-established molecular marker of OSCC. We use a total of 24 tissue samples: 10 OSCC and 10 normal tissues from the same 10 patients, and 3 OSCC and 1 normal tissue from different patients. Following the newly developed protocol presented here, we have been able to detect OSCC tissues with 77 to 92% sensitivity (depending on how positivity is defined) and 100% specificity. The present approach lends itself to a reliable clinical diagnosis of OSCC substantiated by the “molecular fingerprint” of keratin.
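
    The spectral-matching step can be sketched generically: each MCR-resolved component spectrum is scored against the keratin reference, and the tissue is flagged when the best match is high enough. A minimal NumPy sketch using cosine similarity and an illustrative cutoff (the paper's matching metric and threshold may differ):

```python
import numpy as np

def is_keratin_positive(mcr_spectra, keratin_ref, cutoff=0.9):
    """mcr_spectra: list of 1-D component spectra; keratin_ref: reference spectrum."""
    ref = keratin_ref / np.linalg.norm(keratin_ref)
    scores = [np.dot(s / np.linalg.norm(s), ref) for s in mcr_spectra]
    return max(scores) >= cutoff, scores  # positive if any component matches keratin
```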

  6. Automatic classification of retinal three-dimensional optical coherence tomography images using principal component analysis network with composite kernels.

    Science.gov (United States)

    Fang, Leyuan; Wang, Chong; Li, Shutao; Yan, Jun; Chen, Xiangdong; Rabbani, Hossein

    2017-11-01

    We present an automatic method, termed the principal component analysis network with composite kernels (PCANet-CK), for the classification of three-dimensional (3-D) retinal optical coherence tomography (OCT) images. Specifically, the proposed PCANet-CK method first utilizes the PCANet to automatically learn features from each B-scan of the 3-D retinal OCT images. Then, multiple kernels are separately applied to a set of very important features of the B-scans, and these kernels are fused together, which can jointly exploit the correlations among features of the 3-D OCT images. Finally, the fused (composite) kernel is incorporated into an extreme learning machine for the OCT image classification. We tested our proposed algorithm on two real 3-D spectral domain OCT (SD-OCT) datasets (of normal subjects and subjects with macular edema and age-related macular degeneration), which demonstrated its effectiveness.

  7. Automatic teeth axes calculation for well-aligned teeth using cost profile analysis along teeth center arch.

    Science.gov (United States)

    Kim, Gyehyun; Lee, Jeongjin; Seo, Jinwook; Lee, Wooshik; Shin, Yeong-Gil; Kim, Bohyoung

    2012-04-01

    In dental implantology and virtual dental surgery planning using computed tomography (CT) images, examination of the axes of neighboring and/or biting teeth is important both for the performance of the masticatory system and for aesthetics. However, due to its high connectivity to neighboring teeth and jawbones, a tooth and its axis are elusive to identify automatically in dental CT images. This paper presents a novel method for automatically calculating individual teeth axes. The planes separating the individual teeth are automatically calculated using cost profile analysis along the teeth center arch. In this calculation, a novel plane cost function, which considers the intensity and the gradient, is proposed to favor teeth separation planes crossing the teeth interstice and suppress inappropriately detected separation planes crossing the soft pulp. The soft pulp and dentine of each individually separated tooth are then segmented by a fast marching method with two newly proposed speed functions that take their specific anatomical characteristics into account. The axis of each tooth is finally calculated using principal component analysis on the segmented soft pulp and dentine. In experimental results using 20 clinical datasets, the average angle and minimum distance differences between the teeth axes manually specified by two dentists and those automatically calculated by the proposed method were 1.94° ± 0.61° and 1.13 ± 0.56 mm, respectively. The proposed method identified the individual teeth axes accurately, demonstrating that it can give dentists substantial assistance during dental surgery such as dental implant placement and orthognathic surgery.
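
    The final step, PCA on the segmented voxels, reduces to taking the first principal component of the voxel coordinates as the tooth axis. A minimal NumPy sketch of that computation:

```python
import numpy as np

def tooth_axis(voxel_coords):
    """voxel_coords: (n, 3) array of segmented pulp/dentine voxel positions (x, y, z)."""
    centered = voxel_coords - voxel_coords.mean(axis=0)
    # SVD of the centered point cloud; rows of vt are the principal directions
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]  # unit vector along the direction of largest variance = tooth axis
```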

  8. Assessment of Severe Apnoea through Voice Analysis, Automatic Speech, and Speaker Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Rubén Fernández Pozo

    2009-01-01

    This study is part of an ongoing collaborative effort between the medical and the signal processing communities to promote research on applying standard Automatic Speech Recognition (ASR) techniques for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based detection could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we describe an acoustic search for distinctive apnoea voice characteristics. We also study abnormal nasalization in OSA patients by modelling vowels in nasal and non-nasal phonetic contexts using Gaussian Mixture Model (GMM) pattern recognition on speech spectra. Finally, we present experimental findings regarding the discriminative power of GMMs applied to severe apnoea detection. We have achieved an 81% correct classification rate, which is very promising and underpins the interest in this line of inquiry.

  9. Automatic stress-relieving music recommendation system based on photoplethysmography-derived heart rate variability analysis.

    Science.gov (United States)

    Shin, Il-Hyung; Cha, Jaepyeong; Cheon, Gyeong Woo; Lee, Choonghee; Lee, Seung Yup; Yoon, Hyung-Jin; Kim, Hee Chan

    2014-01-01

    This paper presents an automatic stress-relieving music recommendation system (ASMRS) for individual music listeners. The ASMRS uses a portable, wireless photoplethysmography module with a finger-type sensor, and a program that translates heartbeat signals from the sensor to the stress index. The sympathovagal balance index (SVI) was calculated from heart rate variability to assess the user's stress levels while listening to music. Twenty-two healthy volunteers participated in the experiment. The results have shown that the participants' SVI values are highly correlated with their prespecified music preferences. The sensitivity and specificity of the favorable music classification also improved as the number of music repetitions increased to 20 times. Based on the SVI values, the system automatically recommends favorable music lists to relieve stress for individuals.
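
    The sympathovagal balance index is commonly defined as the LF/HF power ratio of heart rate variability. A hedged sketch under that assumption, using SciPy's Lomb-Scargle periodogram to handle the unevenly spaced RR-interval series; the band limits follow the usual HRV conventions, not necessarily this paper's exact definition.

```python
import numpy as np
from scipy.signal import lombscargle

def svi(rr_ms):
    """LF/HF power ratio from a sequence of RR intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    t = np.cumsum(rr) / 1000.0                     # beat times in seconds
    x = rr - rr.mean()                             # zero-mean tachogram
    lf = np.linspace(0.04, 0.15, 100) * 2 * np.pi  # LF band (rad/s)
    hf = np.linspace(0.15, 0.40, 100) * 2 * np.pi  # HF band (rad/s)
    return lombscargle(t, x, lf).sum() / lombscargle(t, x, hf).sum()
```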

  10. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    OpenAIRE

    Shang-Liang Chen; Yin-Ting Cheng; Chin-Fa Su

    2015-01-01

    Recently, intelligent systems have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal-condition warnings and the improvement of cutting efficiency. During processing, the motion of the spindle unit involves the most frequently used and important parts, such as the automatic tool changer. The vibration detection system includes the development of hardware and software, such as ...

  11. A novel automatic film changer for high-speed analysis of nuclear emulsions

    International Nuclear Information System (INIS)

    Borer, K.; Damet, J.; Hess, M.; Kreslo, I.; Moser, U.; Pretzl, K.; Savvinov, N.; Schuetz, H.-U.; Waelchli, T.; Weber, M.

    2006-01-01

    This paper describes the recent development of a novel automatic computer-controlled manipulator for emulsion sheet placement and removal at the microscope object table (also called the stage). The manipulator is designed for mass scanning of emulsions for the OPERA neutrino oscillation experiment and provides an emulsion changing time shorter than 30 s with an emulsion sheet positioning accuracy as good as 20 μm RMS

  12. Automatic untargeted metabolic profiling analysis coupled with Chemometrics for improving metabolite identification quality to enhance geographical origin discrimination capability.

    Science.gov (United States)

    Han, Lu; Zhang, Yue-Ming; Song, Jing-Jing; Fan, Mei-Juan; Yu, Yong-Jie; Liu, Ping-Ping; Zheng, Qing-Xia; Chen, Qian-Si; Bai, Chang-Cai; Sun, Tao; She, Yuan-Bin

    2018-03-16

    Untargeted metabolic profiling analysis is employed to screen metabolites for specific purposes, such as geographical origin discrimination. However, the data analysis remains a challenging task. In this work, a new automatic untargeted metabolic profiling analysis coupled with a chemometric strategy was developed to improve the metabolite identification results and to enhance the geographical origin discrimination capability. Automatic untargeted metabolic profiling analysis with chemometrics (AuMPAC) was used to screen the total ion chromatographic (TIC) peaks that showed significant differences among the various geographical regions. Then, a chemometric peak resolution strategy was employed for the screened TIC peaks. The retrieved components were further analyzed using ANOVA, and those that showed significant differences were used to build a geographical origin discrimination model using two-way encoding partial least squares. To demonstrate its performance, geographical origin discrimination of flaxseed samples from six geographical regions in China was conducted, and 18 TIC peaks were screened. A total of 19 significantly different metabolites were obtained after the peak resolution. The accuracy of the geographical origin discrimination was up to 98%. A comparison of AuMPAC, AMDIS, and XCMS indicated that AuMPAC obtained the best geographical origin discrimination results. In conclusion, AuMPAC provides another method for data analysis.

  13. AUTOCASK (AUTOmatic Generation of 3-D CASK models). A microcomputer based system for shipping cask design review analysis

    International Nuclear Information System (INIS)

    Gerhard, M.A.; Sommer, S.C.

    1995-04-01

    AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests

  14. AUTOCASK (AUTOmatic Generation of 3-D CASK models). A microcomputer based system for shipping cask design review analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gerhard, M.A.; Sommer, S.C. [Lawrence Livermore National Lab., CA (United States)

    1995-04-01

    AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.

  15. Texture analysis of automatic graph cuts segmentations for detection of lung cancer recurrence after stereotactic radiotherapy

    Science.gov (United States)

    Mattonen, Sarah A.; Palma, David A.; Haasbeek, Cornelis J. A.; Senan, Suresh; Ward, Aaron D.

    2015-03-01

    Stereotactic ablative radiotherapy (SABR) is a treatment for early-stage lung cancer with local control rates comparable to surgery. After SABR, benign radiation induced lung injury (RILI) results in tumour-mimicking changes on computed tomography (CT) imaging. Distinguishing recurrence from RILI is a critical clinical decision determining the need for potentially life-saving salvage therapies whose high risks in this population dictate their use only for true recurrences. Current approaches do not reliably detect recurrence within a year post-SABR. We measured the detection accuracy of texture features within automatically determined regions of interest, with the only operator input being the single line segment measuring tumour diameter, normally taken during the clinical workflow. Our leave-one-out cross validation on images taken 2-5 months post-SABR showed robustness of the entropy measure, with classification error of 26% and area under the receiver operating characteristic curve (AUC) of 0.77 using automatic segmentation; the results using manual segmentation were 24% and 0.75, respectively. AUCs for this feature increased to 0.82 and 0.93 at 8-14 months and 14-20 months post SABR, respectively, suggesting even better performance nearer to the date of clinical diagnosis of recurrence; thus this system could also be used to support and reinforce the physician's decision at that time. Based on our ongoing validation of this automatic approach on a larger sample, we aim to develop a computer-aided diagnosis system which will support the physician's decision to apply timely salvage therapies and prevent patients with RILI from undergoing invasive and risky procedures.
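
    The study's most robust feature is the entropy of the grey levels inside the automatically determined region of interest. A minimal NumPy sketch of one plausible definition, Shannon entropy of the ROI grey-level histogram; the bin count is an illustrative choice, and the paper's exact texture definition may differ.

```python
import numpy as np

def roi_entropy(roi_values, bins=64):
    """Shannon entropy (bits) of the grey-level histogram of ROI voxel intensities."""
    hist, _ = np.histogram(roi_values, bins=bins)
    p = hist[hist > 0] / hist.sum()        # empirical probabilities, zeros dropped
    return -np.sum(p * np.log2(p))
```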

  16. Analysis of Relationship between Pioneer Brand Status and Consumer’s Attitude toward a Brand (Case on Yamaha Automatic vs. Honda Automatic Transmission Motorcycle in Indonesia

    Directory of Open Access Journals (Sweden)

    Arga Hananto

    2011-06-01

    Previous research has indicated that brand pioneership provides advantages such as high market share, barriers to entry, and consumer preference, as well as more favourable consumer attitudes. This paper explores the relationship between perceived brand pioneership and consumers' brand attitude. The study focuses on two competing brands from the automatic transmission motorcycle category, namely Yamaha and Honda. Based on results from 90 respondents, this study confirms the perception that Yamaha (although not the true pioneer) is perceived by the majority of respondents as the pioneering brand in the automatic transmission motorcycle category. This study also found that respondents who perceived Yamaha as the pioneer brand tended to ascribe a higher brand attitude toward Yamaha than toward Honda. Results from this study add to the repository of studies concerning brand pioneership, as well as to the repository of knowledge about Indonesian consumer behavior.

  17. Automatic categorization of Spanish texts into linguistic registers: a contrastive analysis

    Directory of Open Access Journals (Sweden)

    John Roberto Rodríguez

    2013-07-01

    Collaborative software such as recommender systems can benefit from the automatic classification of texts into linguistic registers. First, the linguistic register provides information about the users' profiles and the context of the recommendation. Second, considering the characteristics of each type of text can help to improve existing natural language processing methods. In this paper we contrast two approaches to register categorization for Spanish. The first approach is focused on morphosyntactic patterns and the second one on lexical patterns. For the experimental evaluation we tested 38 machine learning algorithms, achieving a precision higher than 89%.

  18. Computational text analysis and reading comprehension exam complexity towards automatic text classification

    CERN Document Server

    Liontou, Trisevgeni

    2014-01-01

    This book delineates a range of linguistic features that characterise the reading texts used at the B2 (Independent User) and C1 (Proficient User) levels of the Greek State Certificate of English Language Proficiency exams in order to help define text difficulty per level of competence. In addition, it examines whether specific reader variables influence test takers' perceptions of reading comprehension difficulty. The end product is a Text Classification Profile per level of competence and a formula for automatically estimating text difficulty and assigning levels to texts consistently and re

  19. A fast automatic plate changer for the analysis of nuclear emulsions

    Science.gov (United States)

    Balestra, S.; Bertolin, A.; Bozza, C.; Calligola, P.; Cerroni, R.; D'Ambrosio, N.; Degli Esposti, L.; De Lellis, G.; De Serio, M.; Di Capua, F.; Di Crescenzo, A.; Di Ferdinando, D.; Di Marco, N.; Dusini, S.; Esposito, L. S.; Fini, R. A.; Giacomelli, G.; Giacomelli, R.; Grella, G.; Ieva, M.; Kose, U.; Longhin, A.; Mandrioli, G.; Mauri, N.; Medinaceli, E.; Monacelli, P.; Muciaccia, M. T.; Pasqualini, L.; Pastore, A.; Patrizii, L.; Pozzato, M.; Pupilli, F.; Rescigno, R.; Rosa, G.; Ruggieri, A.; Russo, A.; Sahnoun, Z.; Simone, S.; Sioli, M.; Sirignano, C.; Sirri, G.; Stellacci, S. M.; Strolin, P.; Tenti, M.; Tioukov, V.; Togo, V.; Valieri, C.

    2013-07-01

    This paper describes the design and performance of a computer controlled emulsion Plate Changer for the automatic placement and removal of nuclear emulsion films for the European Scanning System microscopes. The Plate Changer is used for mass scanning and measurement of the emulsions of the OPERA neutrino oscillation experiment at the Gran Sasso lab on the CNGS neutrino beam. Unlike other systems it works with both dry and oil objectives. The film changing takes less than 20 s and the accuracy on the positioning of the emulsion films is about 10 μm. The final accuracy in retrieving track coordinates after fiducial marks measurement is better than 1 μm.

  20. Finite element analysis of osteosynthesis screw fixation in the bone stock: an appropriate method for automatic screw modelling.

    Directory of Open Access Journals (Sweden)

    Jan Wieding

    The use of finite element analysis (FEA) has become a more and more important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery, solid modelling of screws and drill holes represents a limitation of their use for individual cases and an increase of computational costs. To cope with these requirements, different methods for numerical screw modelling have therefore been investigated to improve its application diversity. Exemplarily, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e. without screw modelling, screws as solid elements, and screws as structural elements. The latter offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were parametrically generated by a Python script for automatic generation in the FE software Abaqus/CAE, on both a tetrahedral and a hexahedral meshed femur. The accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both the deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling, the computational time could be reduced by 85% by using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with

  1. Finite element analysis of osteosynthesis screw fixation in the bone stock: an appropriate method for automatic screw modelling.

    Science.gov (United States)

    Wieding, Jan; Souffrant, Robert; Fritsche, Andreas; Mittelmeier, Wolfram; Bader, Rainer

    2012-01-01

    The use of finite element analysis (FEA) has become a more and more important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery, solid modelling of screws and drill holes represents a limitation of their use for individual cases and an increase of computational costs. To cope with these requirements, different methods for numerical screw modelling have therefore been investigated to improve its application diversity. Exemplarily, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e. without screw modelling, screws as solid elements, and screws as structural elements. The latter offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were parametrically generated by a Python script for automatic generation in the FE software Abaqus/CAE, on both a tetrahedral and a hexahedral meshed femur. The accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both the deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling, the computational time could be reduced by 85% by using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with screws in the adjacent

  2. Automatic three-dimensional quantitative analysis for evaluation of facial movement.

    Science.gov (United States)

    Hontanilla, B; Aubá, C

    2008-01-01

    The aim of this study is to present a new 3D capture system for facial movements called FACIAL CLIMA. It is an automatic optical motion system that involves placing special reflecting dots on the subject's face and video recording, with three infrared-light cameras, the subject performing several facial movements such as smiling, mouth puckering, eye closure and forehead elevation. Images from the cameras are automatically processed with a software program that generates customised information such as 3D data on velocities and areas. The study was performed in 20 healthy volunteers. The accuracy of the measurement process and the intrarater and interrater reliabilities were evaluated. Comparison of a known distance and angle with those obtained by FACIAL CLIMA shows that the system is accurate to within 0.13 mm and 0.41 degrees. In conclusion, the accuracy of the FACIAL CLIMA system for the evaluation of facial movements is demonstrated, as is its high intrarater and interrater reliability. It has advantages with respect to other systems that have been developed for the evaluation of facial movements, such as a short calibration time, a short measuring time and ease of use, and it provides not only distances but also velocities and areas. Thus the FACIAL CLIMA system can be considered an adequate tool to assess the outcome of facial paralysis reanimation surgery, allowing patients with facial paralysis to be compared between surgical centres so that the effectiveness of facial reanimation operations can be evaluated.

  3. Semi-automatic segmentation for 3D motion analysis of the tongue with dynamic MRI

    Science.gov (United States)

    Lee, Junghoon; Woo, Jonghye; Xing, Fangxu; Murano, Emi Z.; Stone, Maureen; Prince, Jerry L.

    2014-01-01

    Dynamic MRI has been widely used to track the motion of the tongue and measure its internal deformation during speech and swallowing. Accurate segmentation of the tongue is a prerequisite step to define the target boundary and constrain the tracking to tissue points within the tongue. Segmentation of 2D slices or 3D volumes is challenging because of the large number of slices and time frames involved in the segmentation, as well as the incorporation of numerous local deformations that occur throughout the tongue during motion. In this paper, we propose a semi-automatic approach to segment 3D dynamic MRI of the tongue. The algorithm steps include seeding a few slices at one time frame, propagating seeds to the same slices at different time frames using deformable registration, and random walker segmentation based on these seed positions. This method was validated on the tongue of five normal subjects carrying out the same speech task with multi-slice 2D dynamic cine-MR images obtained at three orthogonal orientations and 26 time frames. The resulting semi-automatic segmentations of a total of 130 volumes showed an average dice similarity coefficient (DSC) score of 0.92 with less segmented volume variability between time frames than in manual segmentations. PMID:25155697
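
    The last stage, random walker segmentation from propagated seeds, is available off the shelf. A hedged sketch assuming scikit-image; the file names are hypothetical placeholders, and the label convention (0 = unlabeled, 1 = tongue, 2 = background) and beta value are illustrative choices, not the authors' settings.

```python
import numpy as np
from skimage.segmentation import random_walker

volume = np.load("cine_mri_frame.npy")   # hypothetical 3D volume for one time frame
seeds = np.load("propagated_seeds.npy")  # hypothetical label volume from registration
# Each unlabeled voxel is assigned the label a random walker starting there
# would most probably reach first.
mask = random_walker(volume, seeds, beta=130, mode="bf") == 1
```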

  4. Automatic geometric modeling, mesh generation and FE analysis for pipelines with idealized defects and arbitrary location

    Energy Technology Data Exchange (ETDEWEB)

    Motta, R.S.; Afonso, S.M.B.; Willmersdorf, R.B.; Lyra, P.R.M. [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil); Cabral, H.L.D. [TRANSPETRO, Rio de Janeiro, RJ (Brazil); Andrade, E.Q. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    Although the Finite Element Method (FEM) has proved to be a powerful tool for predicting the failure pressure of corroded pipes, the generation of good computational models of pipes with corrosion defects can take several days, which makes computational simulation difficult to apply in practice. The main purpose of this work is to develop a set of computational tools to automatically produce models of pipes with defects, ready to be analyzed with commercial FEM programs, starting from a few parameters that locate and provide the main dimensions of the defect or a series of defects. The defects can be internal or external and can assume general spatial locations along the pipe; idealized rectangular and elliptic geometries can be generated. These tools are based on the MSC.PATRAN pre- and post-processing programs and were written in PCL (Patran Command Language). The program for the automatic generation of models (PIPEFLAW) has a simplified and customized graphical interface, so that an engineer with basic notions of computational simulation with the FEM can rapidly generate models that result in precise and reliable simulations. Some examples of models of pipes with defects generated by the PIPEFLAW system are shown, and the results of numerical analyses, done with the tools presented in this work, are compared with empirical results. (author)

  5. Automatic Condition Monitoring of Industrial Rolling-Element Bearings Using Motor’s Vibration and Current Analysis

    DEFF Research Database (Denmark)

    Yang, Zhenyu

    2015-01-01

    An automatic condition monitoring system for a class of industrial rolling-element bearings is developed based on vibration as well as stator-current analysis. The considered fault scenarios include a single-point defect, multiple-point defects, and a type of distributed defect. Motivated by the potential commercialization, the developed system is promoted mainly using off-the-shelf techniques, that is, the high-frequency resonance technique with envelope detection and the average of the short-time Fourier transform. In order to test the flexibility and robustness, the monitoring performance is extensively studied under diverse operating conditions: different sensor locations, motor speeds, loading conditions, and data samples from different time segments. The experimental results showed the powerful capability of vibration analysis in bearing point-defect fault diagnosis. The current analysis…
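
    A minimal sketch of the high-frequency resonance technique with envelope detection named above (the resonance band and filter order are illustrative assumptions, not the system's settings):

    ```python
    import numpy as np
    from scipy.signal import hilbert, butter, filtfilt

    def envelope_spectrum(x, fs, band=(2000.0, 8000.0)):
        """Envelope spectrum for bearing-defect detection.

        Band-pass around an assumed structural resonance, take the Hilbert
        envelope, and return the spectrum in which bearing defect
        frequencies (BPFO, BPFI, ...) show up as discrete lines.
        """
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype='band')
        xb = filtfilt(b, a, x)               # isolate the resonance band
        env = np.abs(hilbert(xb))            # demodulated envelope
        env -= env.mean()                    # drop the DC component
        spec = np.abs(np.fft.rfft(env)) / len(env)
        freqs = np.fft.rfftfreq(len(env), 1.0 / fs)
        return freqs, spec
    ```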

  6. Automatic sampling and analysis of organics and biomolecules by capillary action-supported contactless atmospheric pressure ionization mass spectrometry.

    Science.gov (United States)

    Hsieh, Cheng-Huan; Meher, Anil Kumar; Chen, Yu-Chie

    2013-01-01

    The contactless atmospheric pressure ionization (C-API) method has recently been developed for mass spectrometric analysis. A tapered capillary is used as both the sampling tube and spray emitter in C-API, and no electric contact is required on the capillary tip during C-API mass spectrometric analysis. The simple design of the ionization method enables the automation of the C-API sampling system. In this study, we propose an automatic C-API sampling system consisting of a capillary (∼1 cm), an aluminium sample holder, and a movable XY stage for the mass spectrometric analysis of organics and biomolecules. The aluminium sample holder is controlled by the movable XY stage. The outlet of the C-API capillary is placed in front of the orifice of a mass spectrometer, whereas the sample well on the sample holder is moved underneath the capillary inlet. The sample droplet in the well can be readily infused into the C-API capillary through capillary action. When the sample solution reaches the capillary outlet, a spray readily forms in the proximity of the mass spectrometer, to which a high electric field is applied. The gas-phase ions generated from the spray can be readily monitored by the mass spectrometer. We demonstrate that six samples can be analyzed in sequence within 3.5 min using this automatic C-API MS setup. Furthermore, wells containing rinsing solvent are arranged alternately between the sample wells, so the C-API capillary can be readily flushed between runs. No carryover problems were observed during the analyses. The sample volume required for C-API MS analysis is minimal, with less than 1 nL of sample solution being sufficient for analysis. The feasibility of using this setup for quantitative analysis is also demonstrated.

  7. ProMoIJ: A new tool for automatic three-dimensional analysis of microglial process motility.

    Science.gov (United States)

    Paris, Iñaki; Savage, Julie C; Escobar, Laura; Abiega, Oihane; Gagnon, Steven; Hui, Chin-Wai; Tremblay, Marie-Ève; Sierra, Amanda; Valero, Jorge

    2018-04-01

    Microglia, the immune cells of the central nervous system, continuously survey the brain to detect alterations and maintain tissue homeostasis. The motility of microglial processes is indicative of their surveying capacity in normal and pathological conditions. The gold standard technique to study motility involves the use of two-photon microscopy to obtain time-lapse images from brain slices or the cortex of living animals. This technique generates four dimensionally-coded images which are analyzed manually using time-consuming, non-standardized protocols. Microglial process motility analysis is frequently performed using Z-stack projections with the consequent loss of three-dimensional (3D) information. To overcome these limitations, we developed ProMoIJ, a pack of ImageJ macros that perform automatic motility analysis of cellular processes in 3D. The main core of ProMoIJ is formed by two macros that assist the selection of processes, automatically reconstruct their 3D skeleton, and analyze their motility (process and tip velocity). Our results show that ProMoIJ presents several key advantages compared with conventional manual analysis: (1) reduces the time required for analysis, (2) is less sensitive to experimenter bias, and (3) is more robust to varying numbers of processes analyzed. In addition, we used ProMoIJ to demonstrate that commonly performed 2D analysis underestimates microglial process motility, to reveal that only cells adjacent to a laser injured area extend their processes toward the lesion site, and to demonstrate that systemic inflammation reduces microglial process motility. ProMoIJ is a novel, open-source, freely-available tool which standardizes and accelerates the time-consuming labor of 3D analysis of microglial process motility. © 2017 Wiley Periodicals, Inc.

  8. Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques

    Directory of Open Access Journals (Sweden)

    Kemal Akyol

    2016-01-01

    With the advances in the computer field, methods and techniques in automatic image processing and analysis provide the opportunity to automatically detect change and degeneration in retinal images. Localization of the optic disc is extremely important for determining hard exudate lesions or neovascularization, the later phase of diabetic retinopathy, in computer-aided eye disease diagnosis systems. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region may be difficult in retinal images affected by diabetic retinopathy; in terms of machine learning, information related to the optic disc and to hard exudates can sometimes look the same. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. The approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested the proposed technique on three public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved on the DIARETDB1, DRIVE, and ROC public datasets, respectively.
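
    A compact sketch of the keypoint/visual-dictionary idea in the same spirit (ORB descriptors, a k-means vocabulary and an SVM stand in here; the paper's actual features, texture measures and classifier may differ):

    ```python
    import cv2
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC

    def bovw_histograms(images, vocab_size=50):
        """Keypoint descriptors -> visual dictionary -> per-image histograms."""
        orb = cv2.ORB_create()
        descs = []
        for img in images:                       # grayscale uint8 images
            _, d = orb.detectAndCompute(img, None)
            descs.append(d if d is not None else np.empty((0, 32), np.uint8))
        # the "visual dictionary": k-means centroids of all descriptors
        vocab = KMeans(n_clusters=vocab_size, n_init=10).fit(
            np.vstack(descs).astype(float))
        hists = np.array([np.bincount(vocab.predict(d.astype(float)),
                                      minlength=vocab_size)
                          if len(d) else np.zeros(vocab_size) for d in descs])
        return hists, vocab

    # hists, vocab = bovw_histograms(gray_patches)   # hypothetical inputs
    # clf = SVC().fit(hists, labels)                 # optic-disc region or not
    ```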

  9. fMRat: an extension of SPM for a fully automatic analysis of rodent brain functional magnetic resonance series.

    Science.gov (United States)

    Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel

    2016-05-01

    The purpose of this study was to develop a multi-platform automatic software tool for the full processing of rodent fMRI studies. Existing tools require several different plug-ins, significant user interaction and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and the percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks®) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either NIfTI or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series, on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the primary somatosensory cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat .

  10. Automatic system for quantification and visualization of lung aeration on chest computed tomography images: the Lung Image System Analysis - LISA

    International Nuclear Information System (INIS)

    Felix, John Hebert da Silva; Cortez, Paulo Cesar; Holanda, Marcelo Alcantara

    2010-01-01

    High Resolution Computed Tomography (HRCT) is the exam of choice for the diagnostic evaluation of lung parenchyma diseases. There is increasing interest in computational systems able to automatically analyze the radiological densities of the lungs in CT images. The main objective of this study is to present a system for the automatic quantification and visualization of lung aeration in HRCT images of different degrees of aeration, called Lung Image System Analysis (LISA). The secondary objective is to compare LISA to the Osiris system and to a specific lung segmentation algorithm (ALS) regarding the accuracy of lung segmentation. The LISA system automatically extracts the following image attributes: lung perimeter, cross-sectional area, volume, radiological density histograms, the mean lung density (MLD) in Hounsfield units (HU), the relative area of the lungs occupied by voxels with density values lower than -950 HU (RA950), and the 15th percentile of the lung density histogram (PERC15). Furthermore, LISA has a colored-mask algorithm that applies pseudo-colors to the lung parenchyma according to pre-defined radiological density ranges chosen by the system user. The lung segmentations of 102 images of 8 healthy volunteers and 141 images of 11 patients with Chronic Obstructive Pulmonary Disease (COPD) were compared for accuracy and concordance among the three methods. LISA was more effective at lung segmentation than the other two methods. LISA's color mask tool improves the spatial visualization of the degrees of lung aeration, and the various attributes of the image that can be extracted may help physicians and researchers to better assess lung aeration both quantitatively and qualitatively. LISA may have important clinical and research applications for the assessment of global and regional lung aeration and therefore deserves further development and validation studies. (author)
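
    The three density attributes are direct summaries of the segmented voxel values; a minimal sketch assuming a CT volume `hu` in Hounsfield units and a boolean `lung_mask` (both hypothetical inputs):

    ```python
    import numpy as np

    def aeration_indices(hu, lung_mask):
        """Density indices of the kind LISA reports, from a HU volume."""
        voxels = hu[lung_mask]                     # lung voxels only
        mld = voxels.mean()                        # mean lung density (HU)
        ra950 = (voxels < -950).mean() * 100.0     # % voxels below -950 HU
        perc15 = np.percentile(voxels, 15)         # 15th percentile density
        return mld, ra950, perc15
    ```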

  11. Design of Low Power Algorithms for Automatic Embedded Analysis of Patch ECG Signals

    DEFF Research Database (Denmark)

    Saadi, Dorthe Bodholt

    … several different cable-free wireless patch-type ECG recorders have recently reached the market. One of these recorders is the ePatch, designed by the Danish company DELTA. The extended monitoring period available with patch recorders has been demonstrated to increase the diagnostic yield of outpatient ECG monitoring. Furthermore, patch recorders facilitate outpatient ECG monitoring in new clinically relevant areas, e.g. telemedicine monitoring of cardiac patients in their homes. Some of these new applications could benefit from real-time embedded interpretation of the recorded ECGs. Such algorithms could allow the real-time transmission of clinically relevant information to a central monitoring station. The first step in embedded ECG interpretation is the automatic detection of each individual heartbeat. An important part of this project was therefore to design a novel algorithm…

  12. An automatic detector of drowsiness based on spectral analysis and wavelet decomposition of EEG records.

    Science.gov (United States)

    Garces Correa, Agustina; Laciar Leber, Eric

    2010-01-01

    An algorithm to automatically detect drowsiness episodes has been developed. It uses only one EEG channel to differentiate the stages of alertness and drowsiness. In this work the feature vectors are built by combining Power Spectral Density (PSD) and Wavelet Transform (WT) features. The features extracted from the PSD of the EEG signal are: the central frequency, the first quartile frequency, the maximum frequency, the total energy of the spectrum, and the power of the theta and alpha bands. In the wavelet domain, the number of zero crossings and the integrated values of scales 3, 4 and 5 of a Daubechies-2 WT were computed. Epochs are classified with neural networks. The detection results obtained with this technique are 86.5% for drowsiness stages and 81.7% for alertness segments. These results show that the extracted features and the classifier are able to identify drowsy EEG segments.
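
    A sketch of part of this feature extraction (a subset of the listed features; sampling rate, epoch length and wavelet settings are illustrative assumptions):

    ```python
    import numpy as np
    import pywt
    from scipy.signal import welch

    def eeg_features(epoch, fs=128):
        """Feature-vector sketch for one EEG epoch (PSD + wavelet features)."""
        f, psd = welch(epoch, fs=fs, nperseg=min(len(epoch), 2 * fs))
        total = np.trapz(psd, f)                    # total spectral energy
        central = np.trapz(f * psd, f) / total      # central frequency
        theta_m = (f >= 4) & (f < 8)
        alpha_m = (f >= 8) & (f < 13)
        theta = np.trapz(psd[theta_m], f[theta_m])  # theta-band power
        alpha = np.trapz(psd[alpha_m], f[alpha_m])  # alpha-band power
        # Daubechies-2 decomposition; coeffs[1:4] are details at scales 5, 4, 3
        coeffs = pywt.wavedec(epoch, 'db2', level=5)
        zc = sum(int(np.sum(np.diff(np.sign(c)) != 0)) for c in coeffs[1:4])
        integ = sum(float(np.sum(np.abs(c))) for c in coeffs[1:4])
        return np.array([central, total, theta, alpha, zc, integ])
    ```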

  13. Automatic measurement and analysis of neonatal O2 consumption and CO2 production

    Science.gov (United States)

    Chang, Jyh-Liang; Luo, Ching-Hsing; Yeh, Tsu-Fuh

    1996-02-01

    It is difficult to estimate daily energy expenditure unless continuous O2 consumption (VO2) and CO2 production (VCO2) can be measured. This study describes a simple method for calculating daily and interim changes in O2 consumption and CO2 production in neonates, especially premature infants. Oxygen consumption and CO2 production are measured using a flow-through technique in which the total VO2 and VCO2 over a given period of time are determined by a computerized system. The system automatically calculates VO2 and VCO2 not only minute by minute but also over longer periods, e.g. 24 h. As a result, it provides a better indirect reflection of the true energy expenditure in an infant's daily life and can be used at the bedside during ongoing nursery care.
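
    The flow-through computation itself is a simple gas balance; a minimal sketch (simplified, without Haldane or STPD corrections; the example numbers are made up):

    ```python
    def gas_exchange(flow_lpm, fio2, feo2, fico2, feco2):
        """Flow-through VO2/VCO2 sketch.

        flow_lpm: constant gas flow through the hood (L/min);
        fi*/fe*: inlet/outlet gas fractions averaged over one minute.
        Returns (VO2, VCO2) in mL/min.
        """
        vo2 = flow_lpm * (fio2 - feo2) * 1000.0     # O2 removed by the infant
        vco2 = flow_lpm * (feco2 - fico2) * 1000.0  # CO2 added by the infant
        return vo2, vco2

    # e.g. 10 L/min flow, 20.93% -> 20.60% O2, 0.04% -> 0.33% CO2
    vo2, vco2 = gas_exchange(10.0, 0.2093, 0.2060, 0.0004, 0.0033)
    ```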

  14. Statistical Analysis of Automatic Seed Word Acquisition to Improve Harmful Expression Extraction in Cyberbullying Detection

    Directory of Open Access Journals (Sweden)

    Suzuha Hatakeyama

    2016-04-01

    We study the social problem of cyberbullying, defined as a new form of bullying that takes place in the Internet space. This paper proposes a method for the automatic acquisition of seed words to improve the performance of the original cyberbullying detection method by Nitta et al. [1]. We conducted an experiment in exactly the same settings and found that the method, based on a Web mining technique, had lost over 30 percentage points of its performance since being proposed in 2013. We therefore hypothesize about the reasons for the decrease in performance and propose a number of improvements, from which we experimentally choose the best one. Furthermore, we collect several seed word sets using different approaches and evaluate their precision. We found that the influential factor in the extraction of harmful expressions is not the number of seed words, but the way the seed words were collected and filtered.

  15. Development of an Algorithm for Automatic Analysis of the Impedance Spectrum Based on a Measurement Model

    Science.gov (United States)

    Kobayashi, Kiyoshi; Suzuki, Tohru S.

    2018-03-01

    A new algorithm for the automatic estimation of an equivalent circuit and subsequent parameter optimization is developed by combining the data-mining concept with the complex least-squares method. In this algorithm, the program generates an initial equivalent-circuit model based on the sampled data and then attempts to optimize the parameters. The basic hypothesis is that the measured impedance spectrum can be reproduced by the sum of partial impedance spectra represented by a resistor, an inductor, a resistor connected in parallel with a capacitor, and a resistor connected in parallel with an inductor. The adequacy of the model is determined using a simple artificial-intelligence function applied to the output of the Levenberg-Marquardt module. By iterating model modifications, the program finds an adequate equivalent-circuit model without any user input to the equivalent-circuit model.
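
    A minimal sketch of the complex least-squares step for one candidate circuit (a series resistor plus one parallel RC element; the model choice and starting values are illustrative, not the paper's algorithm):

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def z_model(p, w):
        """Series resistor plus one RC arc: Z = Rs + Rp / (1 + j*w*Rp*C)."""
        rs, rp, c = p
        return rs + rp / (1 + 1j * w * rp * c)

    def fit_impedance(w, z_meas, p0=(1.0, 100.0, 1e-6)):
        # complex least squares: stack real and imaginary residuals
        def residuals(p):
            dz = z_model(p, w) - z_meas
            return np.concatenate([dz.real, dz.imag])
        return least_squares(residuals, p0, method='lm')  # Levenberg-Marquardt

    w = 2 * np.pi * np.logspace(0, 5, 50)
    z = z_model((2.0, 50.0, 2e-6), w)     # synthetic "measured" spectrum
    fit = fit_impedance(w, z)             # fit.x recovers (Rs, Rp, C)
    ```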

  16. Automatic image analysis methods for the determination of stereological parameters - application to the analysis of densification during solid state sintering of WC-Co compacts

    Science.gov (United States)

    Missiaen; Roure

    2000-08-01

    Automatic image analysis methods used to determine microstructural parameters of sintered materials are presented. The estimation of stereological parameters at interfaces when the system contains more than two phases is treated in particular detail. It is shown that the specific surface areas and mean curvatures of the various interfaces can be estimated in the numerical space of the images. The methods are applied to the analysis of densification during solid-state sintering of WC-Co compacts, and the microstructural evolution is commented on. The application of microstructural measurements to the analysis of densification kinetics is also discussed.

  17. Quantitative right and left ventricular functional analysis during gated whole-chest MDCT: A feasibility study comparing automatic segmentation to semi-manual contouring

    International Nuclear Information System (INIS)

    Coche, Emmanuel; Walker, Matthew J.; Zech, Francis; Crombrugghe, Rodolphe de; Vlassenbroek, Alain

    2010-01-01

    Purpose: To evaluate the feasibility of an automatic, whole-heart segmentation algorithm for measuring global heart function from gated, whole-chest MDCT images. Material and methods: Fifteen patients with suspected pulmonary embolism (PE) underwent whole-chest contrast-enhanced MDCT with retrospective ECG synchronization. Two observers computed right and left ventricular functional indices using a semi-manual and an automatic whole-heart segmentation algorithm. The two techniques were compared using Bland-Altman analysis and the paired Student's t-test, and measurement reproducibility was calculated using the intraclass correlation coefficient. Results: Ventricular analysis with automatic segmentation was successful in 13/15 (86%) patients for the right ventricle and in 15/15 (100%) for the left ventricle. Reproducibility was perfect (ICC: 1.00) for automatic measurements and very good for semi-manual measurements. Ventricular volumes and functional indices obtained with the automatic method, with the exception of right ventricular ejection fraction, were significantly higher for the RV than those obtained with the semi-manual method. Conclusions: The automatic, whole-heart segmentation algorithm enabled highly reproducible global heart function to be obtained rapidly in patients undergoing gated whole-chest MDCT for the assessment of acute chest pain with suspicion of pulmonary embolism.

  18. Automatic detection of pulmonary nodules in CT images by incorporating 3D tensor filtering with local image feature analysis.

    Science.gov (United States)

    Gong, Jing; Liu, Ji-Yu; Wang, Li-Jia; Sun, Xi-Wen; Zheng, Bin; Nie, Sheng-Dong

    2018-02-01

    Computer-aided detection (CAD) technology has been developed and has demonstrated its potential to assist radiologists in detecting pulmonary nodules, especially at an early stage. In this paper, we present a novel scheme for the automatic detection of pulmonary nodules in CT images based on a 3D tensor filtering algorithm and local image feature analysis. We first apply a series of preprocessing steps to segment the lung volume and generate isotropic volumetric CT data. Next, a unique 3D tensor filtering approach and local image feature analysis are used to detect nodule candidates. A 3D level set segmentation method subsequently corrects and refines the boundaries of the nodule candidates. We then extract features of the detected candidates and select the optimal features using a CFS (Correlation Feature Selection) subset evaluator. Finally, a random forest classifier is trained to classify the detected candidates. The performance of this CAD scheme was validated using two datasets, namely the LUNA16 (Lung Nodule Analysis 2016) and ANODE09 (Automatic Nodule Detection 2009) databases. Applying 10-fold cross-validation, the CAD scheme yielded a sensitivity of 79.3% at an average of 4 false positive detections per scan (FP/scan) for the former dataset, and a sensitivity of 84.62% at 2.8 FP/scan for the latter. These detection results show that 3D tensor filtering combined with local image feature analysis constitutes an effective approach for detecting pulmonary nodules. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  19. Conversation analysis at work: detection of conflict in competitive discussions through semi-automatic turn-organization analysis.

    Science.gov (United States)

    Pesarin, Anna; Cristani, Marco; Murino, Vittorio; Vinciarelli, Alessandro

    2012-10-01

    This study proposes a semi-automatic approach aimed at detecting conflict in conversations. The approach is based on statistical techniques capable of identifying turn-organization regularities associated with conflict. The only manual step of the process is the segmentation of the conversations into turns (time intervals during which only one person talks) and overlapping speech segments (time intervals during which several persons talk at the same time). The rest of the process takes place automatically and the results show that conflictual exchanges can be detected with Precision and Recall around 70% (the experiments have been performed over 6 h of political debates). The approach brings two main benefits: the first is the possibility of analyzing potentially large amounts of conversational data with a limited effort, the second is that the model parameters provide indications on what turn-regularities are most likely to account for the presence of conflict.

  20. Automatic spike sorting for extracellular electrophysiological recording using unsupervised single linkage clustering based on grey relational analysis

    Science.gov (United States)

    Lai, Hsin-Yi; Chen, You-Yin; Lin, Sheng-Huang; Lo, Yu-Chun; Tsang, Siny; Chen, Shin-Yuan; Zhao, Wan-Ting; Chao, Wen-Hung; Chang, Yao-Chuan; Wu, Robby; Shih, Yen-Yu I.; Tsai, Sheng-Tsung; Jaw, Fu-Shan

    2011-06-01

    Automatic spike sorting is a prerequisite for neuroscience research on multichannel extracellular recordings of neuronal activity. A novel spike sorting framework, combining efficient feature extraction with an unsupervised clustering method, is described here. The wavelet transform (WT) is adopted to extract features from each detected spike, and the Kolmogorov-Smirnov test (KS test) is utilized to select discriminative wavelet coefficients from the extracted features. Next, an unsupervised single linkage clustering method based on grey relational analysis (GSLC) is applied for spike clustering. The GSLC uses the grey relational grade as the similarity measure instead of the Euclidean distance, and the number of clusters is automatically determined by the elbow criterion in the threshold-cumulative distribution. Four simulated data sets with four noise levels, together with electrophysiological data recorded from the subthalamic nucleus of eight patients with Parkinson's disease during deep brain stimulation surgery, are used to evaluate the performance of GSLC. Feature extraction results from the use of the WT with the KS test indicate a reduced number of feature coefficients as well as good noise rejection, despite similar spike waveforms. Accordingly, the use of GSLC for spike sorting achieves high classification accuracy in all simulated data sets. Moreover, J-measure results on the electrophysiological data indicate that the quality of spike sorting with GSLC is adequate.
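
    A minimal sketch of the feature-selection step (wavelet decomposition plus a KS test for deviation from normality; the wavelet family, level and test variant are illustrative assumptions):

    ```python
    import numpy as np
    import pywt
    from scipy.stats import kstest

    def select_wavelet_features(spikes, n_keep=10):
        """Rank wavelet coefficients of detected spikes by non-normality.

        Coefficients whose distribution across spikes deviates most from a
        Gaussian (largest KS statistic) are the most likely to separate
        distinct units.
        """
        # one row of concatenated wavelet coefficients per spike waveform
        coeffs = np.array([np.concatenate(pywt.wavedec(s, 'haar', level=4))
                           for s in spikes])
        ks = []
        for j in range(coeffs.shape[1]):
            c = coeffs[:, j]
            z = (c - c.mean()) / (c.std() + 1e-12)  # standardize, then test
            ks.append(kstest(z, 'norm').statistic)
        best = np.argsort(ks)[::-1][:n_keep]        # most non-Gaussian first
        return coeffs[:, best], best
    ```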

  1. Automatic classification for mammogram backgrounds based on bi-rads complexity definition and on a multi content analysis framework

    Science.gov (United States)

    Wu, Jie; Besnehard, Quentin; Marchessoux, Cédric

    2011-03-01

    Clinical studies for the validation of new medical imaging devices require hundreds of images. An important step in creating and tuning the study protocol is the classification of images into "difficult" and "easy" cases, based on features like the complexity of the background and the visibility of the disease (lesions). An automatic background classification tool for mammograms would therefore help in such clinical studies. The classification tool presented here is based on a multi-content analysis (MCA) framework which was originally developed to recognize the image content of computer screenshots. With the implementation of new texture features and a defined breast density scale, the MCA framework is able to automatically classify digital mammograms with satisfying accuracy. The BI-RADS (Breast Imaging Reporting and Data System) density scale, which standardizes mammography reporting terminology and assessment and recommendation categories, is used for grouping the mammograms. Selected features are input into a decision-tree classification scheme in the MCA framework, forming a so-called "weak classifier" (any classifier with a global error rate below 50%). With the AdaBoost iteration algorithm, these weak classifiers are combined into a "strong classifier" (a classifier with a low global error rate) for classifying one category. The classification results for one strong classifier show good accuracy with high true-positive rates. For the four categories the results are: TP = 90.38%, TN = 67.88%, FP = 32.12% and FN = 9.62%.
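
    The weak-to-strong combination described above corresponds to standard boosting; a minimal sketch with scikit-learn (version 1.2 or later), where shallow decision trees play the role of the weak classifiers (the feature and label names are hypothetical):

    ```python
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    # Shallow decision trees act as the "weak classifiers" (error < 50%);
    # AdaBoost re-weights misclassified images each round and combines the
    # trees into one "strong classifier" per BI-RADS density category.
    weak = DecisionTreeClassifier(max_depth=2)
    strong = AdaBoostClassifier(estimator=weak, n_estimators=100)

    # strong.fit(texture_features, in_category)    # one-vs-rest per category
    # tp_rate = strong.score(test_features, test_labels)
    ```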

  2. Detection of viable myocardium by dobutamine stress tagging magnetic resonance imaging with three-dimensional analysis by automatic trace method

    Energy Technology Data Exchange (ETDEWEB)

    Saito, Isao [Yatsu Hoken Hospital, Narashino, Chiba (Japan); Watanabe, Shigeru; Masuda, Yoshiaki

    2000-07-01

    The present study attempted to detect myocardial viability by quantitative automatic 3-dimensional analysis of the improvement of regional wall motion using a magnetic resonance imaging (MRI) tagging method. Twenty-two subjects with ischemic heart disease who had abnormal wall motion on resting echocardiography were enrolled. All patients underwent dobutamine stress echocardiography (DSE), coronary arteriography and left ventriculography, and the results were compared with those of 7 normal volunteers. MRI studies were done with myocardial tagging using the spatial modulation of magnetization technique. Automatic tracing with an original program was performed, and wall motion was compared before and during dobutamine infusion. The evaluation of myocardial viability with MRI and echocardiography gave similar results in 19 (86.4%) of the 22 patients; 20 patients were studied by positron emission tomography or thallium-201 single photon emission computed tomography for myocardial viability, or for improvement of wall motion following coronary intervention. The sensitivity of dobutamine stress MRI (DSMRI) with tagging was 75.9%, whereas that of DSE was 65.5%. The specificity of DSMRI was 85.7% (6/7) and that of DSE was 100% (7/7). The accuracy of DSMRI was 77.8% (28/36) and that of DSE 72.2% (26/36). DSMRI was thus shown to be superior to DSE in the evaluation of myocardial viability. (author)

  3. Automatic analysis and reduction of reaction mechanisms for complex fuel combustion

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, Daniel

    2001-05-01

    This work concentrates on automatic procedures for simplifying chemical models of realistic fuels using skeletal mechanism construction and the Quasi Steady-State Approximation (QSSA) applied to detailed reaction mechanisms. To automate the selection of species for removal or approximation, different indices for species ranking are proposed. Reaction flow rates are combined with sensitivity information targeting a certain quantity, and used to determine a level of redundancy for automatic skeletal mechanism construction by exclusion of redundant species. For QSSA reduction, a measure of species lifetime can be used for species ranking as-is, weighted by concentrations or molecular transport timescales, and/or combined with species sensitivity. Maximum values of the indices are accumulated over ranges of parameters (e.g. fuel-air ratio and octane number), and species with low accumulated index values are selected for removal or steady-state approximation. In the case of QSSA, a model with a certain degree of reduction is automatically implemented as FORTRAN code by setting a certain index limit. The code calculates source terms of explicitly handled species from reaction rates and the steady-state concentrations by internal iteration. Homogeneous-reactor and one-dimensional laminar-flame models were used as test cases. A staged combustor fuelled by ethylene with monomethylamine addition is modelled by two homogeneous reactors in sequence, i.e. a PSR (Perfectly Stirred Reactor) followed by a PFR (Plug Flow Reactor). A modified PFR model was applied to simulate a Homogeneous Charge Compression Ignition (HCCI) engine fuelled with four-component natural gas, whereas a two-zone model was required for a knocking Spark Ignition (SI) engine powered by Primary Reference Fuel (PRF). Finally, a laminar one-dimensional model was used to simulate premixed flames burning methane and an aeroturbine kerosene surrogate consisting of n-decane and toluene. …

  4. Automatic tracking and measurement of the motion of blood cells in microvessels based on analysis of multiple spatiotemporal images

    International Nuclear Information System (INIS)

    Chen, Yuan; Liu, Lei; Li, Hongjun; Zhao, Zhimin

    2011-01-01

    Automatic blood cell tracking and velocity measurement in microvessels is a crucial task in biomedical and physiological research. For the analysis of the motion of blood cells in microvessels, a commonly used method for blood cell tracking and velocity estimation is spatiotemporal image-based analysis. However, in the generation of the spatiotemporal image, a single spatial path is commonly used, i.e. the centreline, which is not suitable for the many situations in which cells do not move strictly along the central axis of the microvessel. In this paper, we propose a new method for the automatic tracking and measurement of the motion of blood cells in a microvessel based on the analysis of multiple spatiotemporal images. First, the proposed method adopts three spatial paths (the centreline and the inner and outer contours of the microvessel) to generate three spatiotemporal images; then, the traces of blood cells in the spatiotemporal images are extracted, and trace grouping and fusion processes are subsequently applied to track the cell trajectories. For extracting traces in the spatiotemporal images, a steerable filter is employed to enhance the traces in the raw images, and a noise suppression function and an orientation-filtering function are designed to extract trace candidates. In the subsequent grouping and fusion process, trace candidates are grouped by the proposed trace grouping rule, and the trajectories are then calculated by the proposed trace fusion approach. The results validate the proposed method for blood cell tracking and its accuracy for blood cell velocity measurement. Moreover, for larger microvessels, we discuss the criterion for selecting the optimal number of spatial paths using both simulated and real experiments; this can serve as a criterion for blood cell tracking in microvessels.

  5. Rapid automatized naming (RAN) in children with ADHD: An ex-Gaussian analysis

    Science.gov (United States)

    Ryan, Matthew; Jacobson, Lisa A.; Hague, Cole; Bellows, Alison; Denckla, Martha B.; Mahone, E. Mark

    2016-01-01

    Children with ADHD demonstrate frequent “lapses” in performance on tasks in which the stimulus presentation rate is externally controlled, leading to increased variability in response times. It is less clear whether these lapses are also evident during performance on self-paced tasks, e.g., rapid automatized naming (RAN), or whether RAN inter-item pause time variability uniquely predicts reading performance. A total of 80 children aged 9 to 14 years (45 children with attention-deficit/hyperactivity disorder [ADHD] and 35 typically developing [TD] children) completed RAN and reading fluency measures. RAN responses were digitally recorded for analyses. Inter-stimulus pause time distributions (excluding between-row pauses) were analyzed using traditional (mean, standard deviation [SD], coefficient of variation [CV]) and ex-Gaussian (mu, sigma, tau) methods. Children with ADHD were found to be significantly slower than TD children on RAN and reading fluency. RAN response time distributions were also significantly more variable (SD, tau) in children with ADHD. Hierarchical regression revealed that the exponential component (tau) of the letter-naming response time distribution uniquely predicted reading fluency in children with ADHD, after controlling for word reading, ADHD symptom severity and age. The findings suggest that children with ADHD (without word-level reading difficulties) show slowed performance on tasks of reading fluency; however, this “slowing” may be due in part to lapses from ongoing performance that can be assessed directly using ex-Gaussian methods, which capture excessively long response times. PMID:27108619
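
    Fitting the ex-Gaussian parameters (mu, sigma, tau) to pause-time data is straightforward with scipy; a minimal sketch (the simulated numbers are made up for illustration):

    ```python
    import numpy as np
    from scipy.stats import exponnorm

    def ex_gaussian_fit(pause_times):
        """Fit mu, sigma, tau to inter-item pause times (seconds).

        scipy parameterizes the ex-Gaussian as exponnorm(K, loc, scale)
        with mu = loc, sigma = scale, tau = K * scale.
        """
        k, loc, scale = exponnorm.fit(pause_times)
        return {'mu': loc, 'sigma': scale, 'tau': k * scale}

    rng = np.random.default_rng(0)
    # Gaussian component plus an exponential tail of long pauses
    sim = rng.normal(0.45, 0.05, 500) + rng.exponential(0.15, 500)
    params = ex_gaussian_fit(sim)     # tau captures the long-pause tail
    ```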

  6. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Recently, intelligent functions have become one of the major items in the development of machine tools. One crucial technology is machinery status monitoring, which is required for abnormality warnings and the improvement of cutting efficiency. During processing, the motion of the spindle unit is among the most frequent and important actions, for example in the automatic tool changer. The vibration detection system includes the development of hardware and software, such as the vibration meter, signal acquisition card, data processing platform, and machine control program. Meanwhile, because mechanical configurations and desired characteristics differ, it is difficult to build a vibration detection system directly from commercially available kits. For this reason, the system was selected as an item for self-development research, along with the exploration of a parametric study significant enough to represent the machine's characteristics and states. The development of the functional parts of the system was launched simultaneously. Finally, the conditions and parameters generated from both the states and the characteristics were entered into the developed system to verify its feasibility.

  7. Automatic parameterization and analysis of stellar atmospheres: a study of the DA white dwarfs

    International Nuclear Information System (INIS)

    McMahan, R.K. Jr.

    1986-01-01

    A method for automatically calculating the atmospheric parameters of hydrogen-rich degenerate stars from low-resolution spectra is advanced and then applied to the spectra of 53 DA white dwarfs. All data were taken using the Mark II spectrograph on the McGraw-Hill 1.3 m telescope and cover the spectral range λλ4100-7000 at a resolution of eight Angstroms. The model grid was generated at Dartmouth using the atmosphere code LUCIFER; it contained over 275 synthetic spectra extending from 6000 to 100,000 K in effective temperature and 7.4-9.3 in log g. A new value for the width of the DA mass distribution was obtained using the techniques presented here, with accuracies in the atmospheric parameters more than twice those previously published. These results place strict constraints on the magnitude of mass loss in stars in the red giant phase, as well as on the mechanisms responsible for the loss.

  8. SISSY: An efficient and automatic algorithm for the analysis of EEG sources based on structured sparsity.

    Science.gov (United States)

    Becker, H; Albera, L; Comon, P; Nunes, J-C; Gribonval, R; Fleureau, J; Guillotel, P; Merlet, I

    2017-08-15

    Over the past decades, a multitude of different brain source imaging algorithms have been developed to identify the neural generators underlying surface electroencephalography measurements. While most of these techniques focus on determining the source positions, only a small number of recently developed algorithms provide an indication of the spatial extent of the distributed sources. In a recent comparison of brain source imaging approaches, the VB-SCCD algorithm was shown to be one of the most promising among these methods. However, this technique suffers from several problems: it leads to amplitude-biased source estimates, it has difficulties in separating close sources, and it has a high computational complexity due to its implementation using second-order cone programming. To overcome these problems, we propose to include an additional regularization term that imposes sparsity in the original source domain and to solve the resulting optimization problem using the alternating direction method of multipliers. Furthermore, we show that the algorithm yields more robust solutions by taking into account the temporal structure of the data. We also propose a new method to automatically threshold the estimated source distribution, which permits delineation of the active brain regions. The new algorithm, called Source Imaging based on Structured Sparsity (SISSY), is analyzed by means of realistic computer simulations and is validated on the clinical data of four patients. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. A Comparison of Physiological Signal Analysis Techniques and Classifiers for Automatic Emotional Evaluation of Audiovisual Contents

    Directory of Open Access Journals (Sweden)

    Adrián Colomer Granero

    2016-07-01

    This work focuses on finding the most discriminatory or representative features that allow commercials to be classified as having negative, neutral or positive effectiveness based on the Ace Score index. For this purpose, an experiment involving forty-seven participants was carried out, in which electroencephalography (EEG), electrocardiography (ECG), galvanic skin response (GSR) and respiration data were acquired while subjects watched thirty minutes of audiovisual content. This content was composed of a submarine documentary and nine commercials (one of them the ad under evaluation). After signal pre-processing, four sets of features were extracted from the physiological signals using different state-of-the-art metrics. These features, computed in the time and frequency domains, are the inputs to several basic and advanced classifiers. An average of 89.76% of the instances was correctly classified according to the Ace Score index. The best results were obtained by a classifier consisting of a combination of AdaBoost and Random Forest with automatic feature selection; the selected features were those extracted from the GSR and HRV signals. These results are promising for the audiovisual content evaluation field by means of physiological signal processing.

  10. Automatic Approach to Morphological Classification of Galaxies With Analysis of Galaxy Populations in Clusters

    Science.gov (United States)

    Sultanova, Madina; Barkhouse, Wayne; Rude, Cody

    2018-01-01

    The classification of galaxies based on their morphology is a field in astrophysics that aims to understand galaxy formation and evolution from their physical differences. Whether structural differences are due to internal factors or are a result of the local environment, the dominant mechanism that determines galaxy type needs to be robustly quantified in order to have a thorough grasp of the origin of the different types of galaxies. The main subject of my Ph.D. dissertation is to explore the use of computers to automatically classify and analyze large numbers of galaxies according to their morphology, and to analyze sub-samples of galaxies selected by type to understand galaxy formation in various environments. I have developed a computer code to classify galaxies by measuring five parameters from their images in FITS format. The code was trained and tested using visually classified SDSS galaxies from Galaxy Zoo and the EFIGI data set. I apply my morphology software to numerous galaxies from diverse data sets, among them 15 Abell galaxy clusters (0.03 < z < …). The use of computer software to classify and analyze the morphology of galaxies will be extremely important in terms of efficiency, and this research aims to contribute to the solution of this problem.

  11. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol 1, Basic Concepts

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets. These CP-nets are shown to be a full-fledged language for the design, specification, simulation, validation and implementation of large software systems. The introductory first volume contains the formal definition of CP-nets and the mathematical theory behind their analysis methods. It gives a detailed presentation of many small examples and a brief overview of some industrial applications. The purpose of the book is to teach the reader how to construct CP-net models…

  12. Balance of the uranium market. Contribution to an economic analysis. Vol. 1

    International Nuclear Information System (INIS)

    Ehsani, J.

    1983-11-01

    The evolution of the world economic and energy situation over the last 40 years is first described, with emphasis on electricity production, and the place of nuclear energy in the energy market is analyzed taking socio-political factors into account. In the second part, the costs of electricity production from different sources (coal, nuclear power, fuel oil) are compared, and two models for world uranium supply and demand are developed. In the third part, the balance of the uranium market is examined. The analysis comprises three steps: determination of total uranium demand from 1983 to 2000; determination of total uranium supply by tranche of production cost; and comparison of supply and demand under different hypotheses about their future evolution. [fr]

  13. Standard practice for determining the inclusion or second-phase constituent content of metals by automatic image analysis

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2003-01-01

    1.1 This practice describes a procedure for obtaining stereological measurements that describe basic characteristics of the morphology of indigenous inclusions in steels and other metals using automatic image analysis. The practice can be applied to provide such data for any discrete second phase. Note 1—Stereological measurement methods are used in this practice to assess the average characteristics of inclusions or other second-phase particles on a longitudinal plane-of-polish. This information, by itself, does not produce a three-dimensional description of these constituents in space as deformation processes cause rotation and alignment of these constituents in a preferred manner. Development of such information requires measurements on three orthogonal planes and is beyond the scope of this practice. 1.2 This practice specifically addresses the problem of producing stereological data when the features of the constituents to be measured make attainment of statistically reliable data difficult. 1.3 Thi...

  14. Evaluation of needle trap micro-extraction and automatic alveolar sampling for point-of-care breath analysis.

    Science.gov (United States)

    Trefz, Phillip; Rösner, Lisa; Hein, Dietmar; Schubert, Jochen K; Miekisch, Wolfram

    2013-04-01

    Needle trap devices (NTDs) have shown many advantages, such as improved detection limits, reduced sampling time and volume, and improved stability and reproducibility, compared with other techniques used in breath analysis such as solid-phase extraction and solid-phase micro-extraction. The effects of sampling flow (2-30 ml/min) and volume (10-100 ml) were investigated in dry gas standards containing hydrocarbons, aldehydes, and aromatic compounds, and in humid breath samples. NTDs contained (single-bed) polymer packing and (triple-bed) combinations of divinylbenzene/Carbopack X/Carboxen 1000. Substances were desorbed from the NTDs by means of thermal expansion and analyzed by gas chromatography-mass spectrometry. An automated CO2-controlled sampling device for direct alveolar sampling at the point-of-care was developed and tested in pilot experiments. Adsorption efficiency for small volatile organic compounds decreased, and breakthrough increased, when sampling was done with polymer needles from a water-saturated matrix (breath) instead of from dry gas. Humidity did not affect analysis with triple-bed NTDs; these NTDs showed only a small dependence on sampling flow and low breakthrough of 1-5%. The new sampling device was able to control crucial parameters such as sampling flow and volume. With triple-bed NTDs, substance amounts increased linearly with sample volume when alveolar breath was pre-concentrated automatically. When compared with manual sampling, automatic sampling showed comparable or better results. Thorough control of sampling and an adequate choice of adsorption material are mandatory for the application of needle trap micro-extraction in vivo. The new CO2-controlled sampling device allows direct alveolar sampling at the point-of-care without the need for any additional sampling, storage, or pre-concentration steps.

  15. ANALYSIS OF THE DISTANCES COVERED BY FIRST DIVISION BRAZILIAN SOCCER PLAYERS OBTAINED WITH AN AUTOMATIC TRACKING METHOD

    Directory of Open Access Journals (Sweden)

    Ricardo M. L. Barros

    2007-06-01

    Methods based on visual estimation are still the most widely used for analyzing the distances covered by soccer players during matches, and most descriptions available in the literature were obtained using such an approach. Recently, systems based on computer vision techniques have appeared and the first results are available for comparison. The aim of the present study was to analyse the distances covered by Brazilian soccer players and compare the results to those of European players, with both sets of data measured by automatic tracking systems. Four regular Brazilian First Division Championship matches between different teams were filmed. Applying a previously developed automatic tracking system (DVideo, Campinas, Brazil), the results for 55 outfield players who participated in the whole game (n = 55) are presented. The mean distance covered, standard deviation (s) and coefficient of variation (cv) after 90 minutes were 10,012 m, s = 1,024 m and cv = 10.2%, respectively. The results of a three-way ANOVA by playing position showed that the distances covered by external defenders (10,642 ± 663 m), central midfielders (10,476 ± 702 m) and external midfielders (10,598 ± 890 m) were greater than those of forwards (9,612 ± 772 m), and forwards covered greater distances than central defenders (9,029 ± 860 m). The greatest distances were covered standing, walking, or jogging (5,537 ± 263 m), followed by moderate-speed running (1,731 ± 399 m), low-speed running (1,615 ± 351 m), high-speed running (691 ± 190 m) and sprinting (437 ± 171 m). The mean distance covered in the first half was 5,173 m (s = 394 m, cv = 7.6%), significantly greater (p < 0.001) than the mean of 4,808 m (s = 375 m, cv = 7.8%) in the second half. A minute-by-minute analysis revealed that after eight minutes of the second half, player performance had already decreased, and this reduction was maintained throughout the second half.

  16. Automatic Monitoring System Design and Failure Probability Analysis for River Dikes on Steep Channel

    Science.gov (United States)

    Chang, Yin-Lung; Lin, Yi-Jun; Tung, Yeou-Koung

    2017-04-01

    The purposes of this study include: (1) designing an automatic monitoring system for river dikes; and (2) developing a framework which enables the determination of dike failure probabilities for various failure modes during a rainstorm. The historical dike failure data collected in this study indicate that most dikes in Taiwan collapsed under discharges at or below the 20-year return period, which means the probability of dike failure is much higher than that of overtopping. We installed the dike monitoring system on the Chiu-She Dike, located on the middle reach of the Dajia River, Taiwan. The system includes: (1) vertically distributed pore water pressure sensors in front of and behind the dike; (2) Time Domain Reflectometry (TDR) to measure the displacement of the dike; (3) a wireless floating device to measure the scouring depth at the toe of the dike; and (4) a water level gauge. The monitoring system recorded the variation of pore pressure inside the Chiu-She Dike and the scouring depth during Typhoon Megi. The recorded data showed that the highest groundwater level inside the dike occurred 15 hours after the peak discharge. We developed a framework which accounts for the uncertainties in the return period discharge, Manning's n, scouring depth, soil cohesion, and friction angle, and enables the determination of dike failure probabilities for failure modes such as overtopping, surface erosion, mass failure, toe sliding and overturning. The framework was applied to the Chiu-She, Feng-Chou, and Ke-Chuang Dikes on the Dajia River. The results indicate that toe sliding or overturning has a higher probability than the other failure modes. Furthermore, the overall failure probability (integrating the different failure modes) reaches 50% under a 10-year return period flood, which agrees with the historical failure data for the study reaches.
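
    A minimal sketch of how such failure probabilities can be estimated by Monte Carlo sampling of the uncertain inputs (the distributions and the limit-state function below are invented for illustration, not the paper's calibrated model):

    ```python
    import numpy as np

    def sliding_failure_prob(n=100_000, seed=0):
        """Monte Carlo sketch of one failure mode (toe sliding).

        Failure occurs when the driving water force exceeds the sliding
        resistance; all numbers here are illustrative assumptions.
        """
        rng = np.random.default_rng(seed)
        cohesion = rng.lognormal(np.log(15.0), 0.3, n)     # kPa
        friction = np.radians(rng.normal(30.0, 3.0, n))    # friction angle
        water_force = rng.gumbel(120.0, 25.0, n)           # kN/m, flood load
        weight = 400.0                                     # kN/m, dike weight
        # resistance: cohesion over an assumed 5 m base plus friction
        resistance = cohesion * 5.0 + weight * np.tan(friction)
        return np.mean(water_force > resistance)           # P(failure)

    pf = sliding_failure_prob()
    ```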

  17. A locally designed mobile laboratory for radiation analysis and monitoring in qatar. Vol. 4

    International Nuclear Information System (INIS)

    Abou-Leila, H.; El-Samman, H.; Mahmoud, H.

    1996-01-01

    A description is given of a mobile laboratory for radiation analysis and monitoring, completely designed in Qatar and equipped at Qatar University. It consists of a van equipped with three scintillation detectors mounted on the front bumper; the detectors can monitor gamma radiation along the path of the laboratory over an angular range of 120 degrees. One Eberline radiation monitoring station is mounted on the roof. The laboratory is also equipped with several gamma and neutron survey meters in addition to some sampling equipment. All equipment is powered by solar panels, and the characteristics and performance of the solar-power/stabilized-AC conversion are given. Data acquisition from the three scintillation detectors is performed by adding the outputs of the three detectors and storing the total as a function of time in a computer-based multi-channel analyzer (MCA) operated in MCS mode. The acquisition can easily be switched to PHA mode to analyze gamma spectra from any possible contamination source. The laboratory has been used in several environmental and possible-contamination missions, and some results obtained during these missions are given. 4 figs

  18. Biodosimetry estimation using the ratio of the longest:shortest length in the premature chromosome condensation (PCC) method applying autocapture and automatic image analysis.

    Science.gov (United States)

    González, Jorge E; Romero, Ivonne; Gregoire, Eric; Martin, Cécile; Lamadrid, Ana I; Voisin, Philippe; Barquinero, Joan-Francesc; García, Omar

    2014-09-01

    The combination of automatic image acquisition and automatic image analysis of premature chromosome condensation (PCC) spreads was tested as a rapid biodosimetry protocol. Human peripheral lymphocytes were irradiated with (60)Co gamma rays at single doses between 1 and 20 Gy, stimulated with phytohaemagglutinin and incubated for 48 h, division-blocked with Colcemid, and PCC-induced by Calyculin A. Images of chromosome spreads were captured and analysed automatically by combining the Metafer 4 and CellProfiler platforms. Automatic measurement of chromosome lengths allows the calculation of the length ratio (LR) of the longest and shortest pieces, which can be used for dose estimation since this ratio is correlated with ionizing radiation dose. The LR of the longest and shortest chromosome pieces showed the best goodness-of-fit to a linear model in the dose interval tested. The application of automatic analysis increases the potential use of the PCC method for triage in the event of mass radiation casualties. © The Author 2014. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
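
    The dose estimation step reduces to inverting a linear calibration of the length ratio (LR) against dose; a minimal sketch with made-up calibration values:

    ```python
    import numpy as np

    # Illustrative calibration: fit LR = a + b * dose on reference samples,
    # then invert to estimate an unknown dose (all values below are made up).
    doses = np.array([1.0, 5.0, 10.0, 15.0, 20.0])     # Gy
    lr = np.array([2.1, 4.0, 6.2, 8.5, 10.3])          # longest:shortest ratio
    b, a = np.polyfit(doses, lr, 1)                    # slope, intercept
    estimated_dose = (7.4 - a) / b                     # invert for LR = 7.4
    ```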

  19. Automatic knowledge extraction in sequencing analysis with multiagent system and grid computing.

    Science.gov (United States)

    González, Roberto; Zato, Carolina; Benito, Rocío; Bajo, Javier; Hernández, Jesús M; De Paz, Juan F; Vera, Vicente; Corchado, Juan M

    2012-12-01

    Advances in bioinformatics have contributed towards a significant increase in available information. Information analysis requires the use of distributed computing systems to best engage the process of data analysis. This study proposes a multiagent system that incorporates grid technology to facilitate distributed data analysis by dynamically incorporating the roles associated to each specific case study. The system was applied to genetic sequencing data to extract relevant information about insertions, deletions or polymorphisms.

  20. Automatic knowledge extraction in sequencing analysis with multiagent system and grid computing

    Directory of Open Access Journals (Sweden)

    González Roberto

    2012-12-01

    Advances in bioinformatics have contributed towards a significant increase in available information. Information analysis requires the use of distributed computing systems to best engage the process of data analysis. This study proposes a multiagent system that incorporates grid technology to facilitate distributed data analysis by dynamically incorporating the roles associated to each specific case study. The system was applied to genetic sequencing data to extract relevant information about insertions, deletions or polymorphisms.

  1. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Y [National Cheng Kung University, Tainan, Taiwan (China); Huang, H [Chang Gung University, Taoyuan, Taiwan (China); Su, T [Chang Gung Memorial Hospital, Taoyuan, Taiwan (China)

    2015-06-15

    Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts in applying such techniques to cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring image heterogeneity. Clinical data were used to evaluate the preliminary performance of such methods. Methods: Myocardial perfusion images of thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia of more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the left ventricle (LV). The segmented voxel intensities were then carried into texture analysis with our open-source software Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under the curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%) and an AUC of 0.82. Such performance is similar to that obtained from the semi-automatic QPS software, which gives a sensitivity of 71% and a specificity of 77%. Conclusion: Based on fully automatic procedures for data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination of coronary stenosis.
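
    The ROC indices reported above can be reproduced generically; the sketch below (synthetic labels and heterogeneity scores, not the study's data) shows how the AUC, sensitivity and specificity are commonly computed with scikit-learn:

      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      # Synthetic example: 1 = stenosis >= 70% on PCI, scores = heterogeneity index
      y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0, 1, 1])
      scores = np.array([0.2, 0.4, 0.8, 0.7, 0.9, 0.3, 0.6, 0.5, 0.85, 0.75])

      auc = roc_auc_score(y_true, scores)
      fpr, tpr, thresholds = roc_curve(y_true, scores)

      # Operating point maximizing Youden's J = sensitivity + specificity - 1
      best = np.argmax(tpr - fpr)
      sensitivity, specificity = tpr[best], 1 - fpr[best]
      print(f"AUC={auc:.2f}, sens={sensitivity:.2f}, spec={specificity:.2f}")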

  2. Automatic tool for the analysis of the constancy of image of the radiodiagnostic equipment

    International Nuclear Information System (INIS)

    Verdu Martin, G.; Mayo Nogueira, P.; Rodenas Escriba, F.; Martin, B.; Campayo Esteban, J. M.

    2010-01-01

    In this work we have developed a graphical interface for analysing images obtained with digital radiographic equipment, which is progressively replacing conventional analog imaging. The increased use of digital technology requires a study of the quality of the images obtained with this type of equipment; such analysis is necessary to ensure adequate images that allow a correct diagnosis.

  3. Clarifying Inconclusive Functional Analysis Results: Assessment and Treatment of Automatically Reinforced Aggression

    Science.gov (United States)

    Saini, Valdeep; Greer, Brian D.; Fisher, Wayne W.

    2016-01-01

    We conducted a series of studies in which multiple strategies were used to clarify the inconclusive results of one boy’s functional analysis of aggression. Specifically, we (a) evaluated individual response topographies to determine the composition of aggregated response rates, (b) conducted a separate functional analysis of aggression after high rates of disruption masked the consequences maintaining aggression during the initial functional analysis, (c) modified the experimental design used during the functional analysis of aggression to improve discrimination and decrease interaction effects between conditions, and (d) evaluated a treatment matched to the reinforcer hypothesized to maintain aggression. An effective yet practical intervention for aggression was developed based on the results of these analyses and from data collected during the matched-treatment evaluation. PMID:25891269

  4. Tidal analysis and Arrival Process Mining Using Automatic Identification System (AIS) Data

    Science.gov (United States)

    2017-01-01

    Only fragments of this report are indexed. They indicate that tidal data were obtained via the National Oceanic and Atmospheric Administration (NOAA) tides and currents application program interface (API), http://tidesandcurrents.noaa.gov/api/, and that the AIS data files, organized by location, were processed using the Python programming language (van Rossum and Drake 2001) and the Pandas data analysis library (McKinney, W. 2012. Python for data analysis. Sebastopol, CA: O'Reilly Media, Inc.). A truncated citation also points to Mitchell, K. N. (April 2012), a review of coastal navigation assets.
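
    The fragments name Python and Pandas as the processing stack. The sketch below illustrates the kind of arrival-mining workflow such a report describes; the file name, the column schema (mmsi, timestamp, lat, lon) and the bounding box are invented assumptions, not the report's actual data:

      import pandas as pd

      # Load AIS position reports (file and schema are hypothetical)
      ais = pd.read_csv("ais_positions.csv", parse_dates=["timestamp"])

      # Keep reports inside a hypothetical harbor bounding box
      harbor = ais[ais.lat.between(29.0, 29.5) & ais.lon.between(-95.0, -94.5)]

      # First report of each vessel inside the box approximates an arrival time
      arrivals = harbor.sort_values("timestamp").groupby("mmsi").first()
      print(arrivals[["timestamp"]].head())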

  5. Analysis of DGNB-DK criteria for BIM-based Model Checking automatization

    DEFF Research Database (Denmark)

    Gade, Peter Nørkjær; Svidt, Kjeld; Jensen, Rasmus Lund

    This report includes the results of an analysis of the automation potential of the Danish edition of the building sustainability assessment method Deutsche Gesellschaft für Nachhaltiges Bauen (DGNB) for office buildings, version 2014 1.1. The analysis investigates the criteria related to DGNB-DK and whether they would be suited for automation through the technological concept BIM-based Model Checking (BMC).

  6. SplitRacer - a semi-automatic tool for the analysis and interpretation of teleseismic shear-wave splitting

    Science.gov (United States)

    Reiss, Miriam Christina; Rümpker, Georg

    2017-04-01

    We present a semi-automatic, graphical user interface tool for the analysis and interpretation of teleseismic shear-wave splitting in MATLAB. Shear-wave splitting analysis is a standard tool to infer seismic anisotropy, which is often interpreted as due to lattice-preferred orientation of e.g. mantle minerals or shape-preferred orientation caused by cracks or alternating layers in the lithosphere, and hence provides a direct link to the earth's kinematic processes. The increasing number of permanent stations and temporary experiments results in comprehensive studies of seismic anisotropy world-wide. Their successive comparison with a growing number of global models of mantle flow further advances our understanding of the earth's interior. However, increasingly large data sets pose the inevitable question as to how to process them. Well-established routines and programs are accurate but often slow and impractical for analyzing a large amount of data. Additionally, shear-wave splitting results are seldom evaluated using the same quality criteria, which complicates a straightforward comparison. SplitRacer consists of several processing steps: i) download of data per FDSNWS; ii) direct reading of miniSEED files and an initial screening and categorizing of XKS waveforms using a pre-set SNR threshold; iii) analysis of the particle motion of selected phases and successive correction of the sensor misalignment based on the long axis of the particle motion; iv) splitting analysis of selected events: seismograms are first rotated into radial and transverse components, then the energy-minimization method is applied, which provides the polarization and delay time of the phase; to estimate errors, the analysis is done for different randomly chosen time windows; v) joint splitting analysis of all events for one station, where the energy content of all phases is inverted simultaneously. This decreases the influence of noise and increases the robustness of the measurement.

  7. Neutron activation analysis of regolith delivered by automatic station ''Luna-20''

    International Nuclear Information System (INIS)

    Ivanov, I.N.; Kirnozov, F.F.; Kolesov, G.M.; Ryvkin, B.N.; Surkov, Yu.A.; Shpanov, A.P.

    1975-01-01

    The aim of the investigation was to establish the difference between Moon soil samples - the ''continental'' and the ''marine'' regoliths - on the basis of their content of a number of chemical elements. A highly sensitive method of chemical analysis, neutron activation analysis, was used. Data on the content of chemical elements in the samples were tabulated; on the whole they agreed with preliminary data obtained in other investigations, although differences appeared in the content of Pr, Gd and Tm. These data characterize the lunar continental regolith as an anorthosite rock, whereas the marine regolith is basically a basaltic rock.

  8. Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis

    Science.gov (United States)

    Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.

    2017-01-01

    This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.

  9. Chi-squared Automatic Interaction Detection Decision Tree Analysis of Risk Factors for Infant Anemia in Beijing, China.

    Science.gov (United States)

    Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin

    2016-05-20

    In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of the population in recent years, available information on infant anemia is inconclusive in large cities of China, especially regarding comparisons between native residents and the floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6-12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. The prevalence of anemia was 12.60%, with a range of 3.47%-40.00% across subgroups. The CHAID decision tree model demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model (maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population), CHAID decision tree analysis identified a fourth risk factor, maternal educational level, with higher overall classification accuracy and a larger area under the receiver operating characteristic curve. The infant anemic status in a metropolis is complex and should be carefully considered by primary health care practitioners. CHAID decision tree analysis demonstrated a better performance in the hierarchical analysis of a population with great heterogeneity. Risk factors identified by this study might be meaningful for the early detection and prompt treatment of infant anemia in large cities.
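
    Full CHAID also merges categories and applies Bonferroni-adjusted tests; the sketch below only illustrates its core splitting idea, choosing the predictor most strongly associated with the outcome by a chi-squared test. The data frame and variable names are invented for illustration:

      import pandas as pd
      from scipy.stats import chi2_contingency

      # Toy data standing in for survey records (invented)
      df = pd.DataFrame({
          "maternal_anemia": ["yes", "no", "yes", "no", "yes", "no", "yes", "no"],
          "breastfeeding":   ["yes", "yes", "no", "no", "yes", "no", "no", "yes"],
          "anemic":          [1, 0, 1, 0, 1, 0, 1, 0],
      })

      def best_split(data, predictors, target):
          """Return the predictor with the smallest chi-squared p-value."""
          pvals = {}
          for col in predictors:
              table = pd.crosstab(data[col], data[target])
              _, p, _, _ = chi2_contingency(table)
              pvals[col] = p
          return min(pvals, key=pvals.get), pvals

      split_var, pvals = best_split(df, ["maternal_anemia", "breastfeeding"], "anemic")
      print(split_var, pvals)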

  10. Parameter design and performance analysis of shift actuator for a two-speed automatic mechanical transmission for pure electric vehicles

    Directory of Open Access Journals (Sweden)

    Jianjun Hu

    2016-08-01

    Full Text Available Recent developments of pure electric vehicles have shown that pure electric vehicles equipped with two-speed or multi-speed gearbox possess higher energy efficiency by ensuring the drive motor operates at its peak performance range. This article presents the design, analysis, and control of a two-speed automatic mechanical transmission for pure electric vehicles. The shift actuator is based on a motor-controlled camshaft where a special geometric groove is machined, and the camshaft realizes the axial positions of the synchronizer sleeve for gear engaging, disengaging, and speed control of the drive motor. Based on the force analysis of shift process, the parameters of shift actuator and shift motor are designed. The drive motor’s torque control strategy before shifting, speed governing control strategy before engaging, shift actuator’s control strategy during gear engaging, and drive motor’s torque recovery strategy after shift process are proposed and implemented with a prototype. To validate the performance of the two-speed gearbox, a test bed was developed based on dSPACE that emulates various operation conditions. The experimental results indicate that the shift process with the proposed shift actuator and control strategy could be accomplished within 1 s under various operation conditions, with shift smoothness up to passenger car standard.

  11. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure

    Science.gov (United States)

    Castellazzi, Giovanni; D’Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a finely discretized geometry with a reduced amount of time, ready to be used in structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978
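
    A minimal sketch of the voxelization idea at the heart of such a procedure: rasterize a point cloud into an occupancy grid from which solid (voxel) finite elements could be generated. The point cloud and voxel size below are placeholders, not the paper's data:

      import numpy as np

      # Placeholder point cloud: x, y, z coordinates in metres
      points = np.random.rand(10000, 3) * np.array([10.0, 8.0, 15.0])
      voxel_size = 0.5

      # Map every point to integer voxel indices
      idx = np.floor(points / voxel_size).astype(int)
      shape = idx.max(axis=0) + 1

      # Occupancy grid: a voxel is solid if at least one point falls inside it
      grid = np.zeros(shape, dtype=bool)
      grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True

      # Each True voxel would become one hexahedral element in the FE mesh
      print(f"{grid.sum()} solid voxels out of {grid.size}")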

  12. I-AbACUS: a Reliable Software Tool for the Semi-Automatic Analysis of Invasion and Migration Transwell Assays.

    Science.gov (United States)

    Cortesi, Marilisa; Llamosas, Estelle; Henry, Claire E; Kumaran, Raani-Yogeeta A; Ng, Benedict; Youkhana, Janet; Ford, Caroline E

    2018-02-28

    The quantification of invasion and migration is an important aspect of cancer research, used both in the study of the molecular processes involved in this collection of diseases and in the evaluation of the efficacy of new potential treatments. The transwell assay, while one of the most widely used techniques for the evaluation of these characteristics, shows a high dependence on the operator's ability to correctly identify the cells and a low level of protocol standardization. Here we present I-AbACUS, a software tool specifically designed to aid the analysis of transwell assays, which automatically and specifically recognizes cells in images of stained membranes and provides the user with a suggested cell count. A complete description of this instrument, together with its validation against the standard analysis technique for this assay, is presented. Furthermore, we show that I-AbACUS is versatile and able to process images containing cells with different morphologies, and that the obtained results are less dependent on the operator and their experience. We anticipate that this instrument, freely available (GNU General Public License v2) at www.marilisacortesi.com as a standalone application, could significantly improve the quantification of invasion and migration of cancer cells.

  13. The ACODEA Framework: Developing Segmentation and Classification Schemes for Fully Automatic Analysis of Online Discussions

    Science.gov (United States)

    Mu, Jin; Stegmann, Karsten; Mayfield, Elijah; Rose, Carolyn; Fischer, Frank

    2012-01-01

    Research related to online discussions frequently faces the problem of analyzing huge corpora. Natural Language Processing (NLP) technologies may allow automating this analysis. However, the state-of-the-art in machine learning and text mining approaches yields models that do not transfer well between corpora related to different topics. Also,…

  14. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    International Nuclear Information System (INIS)

    Wei, J; Yuan, A; Li, G

    2014-01-01

    Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload from the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully-automated 3D lung segmentation algorithm was designed and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, lung and its air component, and local volume changes at the diaphragm and chest wall to characterize breathing pattern. Segmented lung volumes in 12 patients were compared with those from a treatment planning system (TPS). Voxel conversion was introduced from CT number to other physical parameters, such as gravity-induced pressure, to create a secondary 4D image. A demons algorithm was applied in deformable image registration and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to and will be integrated with the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this toolkit and the TPS is <±2% and the time saving is 1-2 orders of magnitude. Conclusion: A framework of 4D toolkit has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational and clinical applications.

  15. Automatic quantitative computed tomography segmentation and analysis of aerated lung volumes in acute respiratory distress syndrome-A comparative diagnostic study.

    Science.gov (United States)

    Klapsing, Philipp; Herrmann, Peter; Quintel, Michael; Moerer, Onnen

    2017-12-01

    Quantitative lung computed tomographic (CT) analysis yields objective data regarding lung aeration but is currently not used in clinical routine, primarily because of the labor-intensive process of manual CT segmentation. Automatic lung segmentation could help to shorten processing times significantly. In this study, we assessed bias and precision of lung CT analysis using automatic segmentation compared with manual segmentation. In this monocentric clinical study, 10 mechanically ventilated patients with mild to moderate acute respiratory distress syndrome were included who had received lung CT scans at 5- and 45-mbar airway pressure during a prior study. Lung segmentations were performed both automatically using a computerized algorithm and manually. Automatic segmentation yielded similar lung volumes compared with manual segmentation, with clinically minor differences both at 5 and 45 mbar. At 5 mbar, results were as follows: overdistended lung 49.58mL (manual, SD 77.37mL) and 50.41mL (automatic, SD 77.3mL), P=.028; normally aerated lung 2142.17mL (manual, SD 1131.48mL) and 2156.68mL (automatic, SD 1134.53mL), P=.1038; and poorly aerated lung 631.68mL (manual, SD 196.76mL) and 646.32mL (automatic, SD 169.63mL), P=.3794. At 45 mbar, values were as follows: overdistended lung 612.85mL (manual, SD 449.55mL) and 615.49mL (automatic, SD 451.03mL), P=.078; normally aerated lung 3890.12mL (manual, SD 1134.14mL) and 3907.65mL (automatic, SD 1133.62mL), P=.027; and poorly aerated lung 413.35mL (manual, SD 57.66mL) and 469.58mL (automatic, SD 70.14mL), P=.007. Bland-Altman analyses revealed the following mean biases and limits of agreement at 5 mbar for automatic vs manual segmentation: overdistended lung +0.848mL (±2.062mL), normally aerated +14.51mL (±49.71mL), and poorly aerated +14.64mL (±98.16mL). At 45 mbar, results were as follows: overdistended +2.639mL (±8.231mL), normally aerated 17.53mL (±41.41mL), and poorly aerated 56.23mL (±100.67mL). Automatic segmentation thus agreed closely with manual segmentation while promising substantially shorter processing times.
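
    The Bland-Altman quantities reported above (mean bias and 1.96-SD limits of agreement) follow from a short computation; the two arrays below are placeholder volume measurements, not the study's data:

      import numpy as np

      # Paired measurements of the same quantity by two methods (placeholders)
      manual    = np.array([49.6, 2142.2, 631.7, 612.9, 3890.1, 413.4])
      automatic = np.array([50.4, 2156.7, 646.3, 615.5, 3907.7, 469.6])

      diff = automatic - manual
      bias = diff.mean()
      loa = 1.96 * diff.std(ddof=1)   # half-width of the 95% limits of agreement

      print(f"bias = {bias:+.2f} mL, limits of agreement = ±{loa:.2f} mL")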

  16. Automatic analysis of quality of images from X-ray digital flat detectors

    International Nuclear Information System (INIS)

    Le Meur, Y.

    2009-04-01

    Over the last decade, medical imaging has grown with the development of new digital imaging techniques. In the field of X-ray radiography, new detectors are progressively replacing older techniques based on film or X-ray image intensifiers. These digital detectors offer higher sensitivity and reduced overall dimensions. This work was prepared with Trixell, the world-leading company in flat detectors for medical radiography. It deals with quality control of digital images stemming from these detectors. The high quality standards of medical imaging demand a close analysis of the defects that can appear on the images. This work describes a complete process for the quality analysis of such images, with a particular focus on the detection of defects, using methods well adapted to our context of spatially correlated defects against a noisy background. (author)

  17. Automatic Inference of Cryptographic Key Length Based on Analysis of Proof Tightness

    Science.gov (United States)

    2016-06-01

    Only front matter and table-of-contents fragments of this thesis are indexed: an acronym list (UML: Unified Modeling Language), acknowledgments, and contents entries for a cryptographic attack tree analysis model (Chapter 3), covering attack trees (3.1) and attack trees with reductions (3.2).

  18. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    OpenAIRE

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-01-01

    Objective: To develop a method for objective quantification of PD motor symptoms related to Off episodes and peak dose dyskinesias, using spiral data gathered by using a touch screen telemetry device. The aim was to objectively characterize predominant motor phenotypes (bradykinesia and dyskinesia), to help in automating the process of visual interpretation of movement anomalies in spirals as rated by movement disorder specialists. Background: A retrospective analysis was conducted on recordi...

  19. Analysis of cannabis in oral fluid specimens by GC-MS with automatic SPE.

    Science.gov (United States)

    Choi, Hyeyoung; Baeck, Seungkyung; Kim, Eunmi; Lee, Sooyeun; Jang, Moonhee; Lee, Juseon; Choi, Hwakyung; Chung, Heesun

    2009-12-01

    Methamphetamine (MA) is the most commonly abused drug in Korea, followed by cannabis. Traditionally, MA analysis is carried out on both urine and hair samples, and cannabis analysis on urine samples only. Despite the fact that oral fluid has become increasingly popular as an alternative specimen in the fields of driving under the influence of drugs (DUID) and workplace drug testing, its application has not been expanded to drug analysis in Korea. Oral fluid is easy to collect and handle and can provide an indication of recent drug abuse. In this study, we present an analytical method using GC-MS to determine tetrahydrocannabinol (THC) and its main metabolite 11-nor-delta9-tetrahydrocannabinol-9-carboxylic acid (THC-COOH) in oral fluid. The validated method was applied to oral fluid samples collected from drug abuse suspects and the results were compared with those in urine. The stability of THC and THC-COOH in oral fluid stored in different containers was also investigated. Oral fluid specimens from 12 drug abuse suspects, submitted by the police, were collected by direct expectoration. The samples were screened with a microplate ELISA. For confirmation they were extracted using automated SPE with a mixed-mode cation-exchange cartridge, derivatized and analyzed by GC-MS using selected ion monitoring (SIM). The concentrations of THC and THC-COOH in oral fluid showed a large variation, and the results from oral fluid and urine samples from cannabis abusers did not show any correlation. Thus, detailed information about the time interval between drug use and sample collection is needed to interpret oral fluid results properly. In addition, further investigation of the detection time window of THC and THC-COOH in oral fluid is required before oral fluid can substitute for urine in drug testing.

  20. Automatic mechanical fault assessment of small wind energy systems in microgrids using electric signature analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Marhadi, Kun Saptohartyadi; Jensen, Bogi Bech

    2013-01-01

    A microgrid is a cluster of power generation, consumption and storage systems capable of operating either independently or as part of a macrogrid. The mechanical condition of the power production units, such as small wind turbines, is considered of crucial importance, especially in the case of islanded operation. In this paper, fault assessment is achieved efficiently and consistently via electric signature analysis (ESA). In ESA, fault-related frequency components are manifested as sidebands of the existing current and voltage time harmonics. The energy content between the fundamental, 5...

  1. SMASH - semi-automatic muscle analysis using segmentation of histology: a MATLAB application.

    Science.gov (United States)

    Smith, Lucas R; Barton, Elisabeth R

    2014-01-01

    Histological assessment of skeletal muscle tissue is commonly applied to many areas of skeletal muscle physiological research. Histological parameters including fiber distribution, fiber type, centrally nucleated fibers, and capillary density are all frequently quantified measures of skeletal muscle. These parameters reflect functional properties of muscle and undergo adaptation in many muscle diseases and injuries. While standard operating procedures have been developed to guide analysis of many of these parameters, software to analyze them freely, efficiently, and consistently is not readily available. In order to provide this service to the muscle research community we developed an open-source MATLAB script to analyze immunofluorescent muscle sections, incorporating user controls for muscle histological analysis. The software consists of multiple functions designed to provide tools for the analysis selected. Initial segmentation and fiber-filter functions segment the image and remove non-fiber elements based on user-defined parameters to create a fiber mask. Using parameters set by the user, the software outputs data on fiber size and type, centrally nucleated fibers, and other structures. These functions were evaluated on stained soleus muscle sections from 1-year-old wild-type and mdx mice, a model of Duchenne muscular dystrophy. In accordance with previously published data, fiber size was not different between groups, but mdx muscles had much higher fiber size variability. The mdx muscle had a significantly greater proportion of type I fibers, but type I fibers did not change in size relative to type II fibers. Centrally nucleated fibers were highly prevalent in mdx muscle and were significantly larger than peripherally nucleated fibers. The MATLAB code described and provided along with this manuscript is designed for image processing of skeletal muscle immunofluorescent histological sections. The program allows for semi-automated fiber detection.
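
    SMASH itself is a MATLAB script; as a hedged Python analogue of its segmentation and fiber-filter steps (threshold, fill holes, remove small objects, label fibers), the sketch below assumes scikit-image and an invented file name and size cutoff:

      import numpy as np
      from scipy import ndimage
      from skimage import io, filters, measure, morphology

      # Load one fluorescence channel (file name is hypothetical)
      image = io.imread("muscle_section.tif", as_gray=True)

      # Initial segmentation: Otsu threshold, then clean up the mask
      mask = image > filters.threshold_otsu(image)
      mask = ndimage.binary_fill_holes(mask)
      mask = morphology.remove_small_objects(mask, min_size=200)  # fiber filter

      # Label candidate fibers and report simple size statistics
      labels = measure.label(mask)
      areas = [r.area for r in measure.regionprops(labels)]
      print(f"{labels.max()} fibers, mean cross-sectional area {np.mean(areas):.1f} px")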

  2. Developing Automatic Form and Design System Using Integrated Grey Relational Analysis and Affective Engineering

    Directory of Open Access Journals (Sweden)

    Chen-Yuan Liu

    2018-01-01

    Full Text Available In the modern, highly competitive global marketplace, product quality improvements that shorten development time and reduce production costs are effective ways to promote the competitiveness of a product over its ever-shorter lifecycle. Since the design stage is the best time to control these parameters, a systematic design process that fits the product more closely to market requirements is a key factor in developing a successful product. In this paper, a combined affective engineering method and grey relational analysis are used to develop a product design process. First, design image scale technology is used to acquire the best design criteria factors; then affective engineering methods are used to establish the relationships between customer needs and production factors. Finally, grey relational analysis is used to select the optimal design strategy. Using this systematic design method, a higher-quality product can be developed with a shorter lead time, improving business competitiveness.
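
    The grey relational analysis step can be sketched generically, as commonly defined; the criteria matrix, the choice of the ideal reference, and the distinguishing coefficient rho = 0.5 below are illustrative assumptions, not values from the paper:

      import numpy as np

      # Rows: candidate designs; columns: normalized benefit criteria in [0, 1]
      x = np.array([[0.8, 0.6, 0.9],
                    [0.5, 0.9, 0.7],
                    [0.9, 0.4, 0.6]])
      reference = x.max(axis=0)     # ideal design: best value per criterion

      delta = np.abs(x - reference)
      rho = 0.5                     # distinguishing coefficient
      coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())

      grades = coeff.mean(axis=1)   # grey relational grade per design
      print("ranking (best first):", np.argsort(-grades))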

  3. Elemental analysis of soil and plant samples at El-Manzala lake neutron activation analysis technique. Vol. 4

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; Abd El-Haleem, A.S.; El-Abbady, W.H.

    1996-01-01

    Soil and plant samples were collected from two locations, Bahr El-Baker and Bahr Kados, at Manzala Lake, where high pollution is expected. The samples were specially treated and prepared for investigation by thermal neutron activation analysis (NAA). The irradiation facilities of the first Egyptian research reactor (ET-RR-1) and a hyper-pure germanium (HPGe) detection system were used for the analysis. Among the 34 elements identified, Fe, Co, As, Cd, Te, La, Sm, Rb, Hg, Th and U are of special significance because of their toxic, deleterious impact on living organisms. This work is part of a research project concerning pollution studies on the River Nile and some lakes of Egypt. The data obtained in the present work stand as a basic reference record for any future follow-up of contamination levels. 1 tab

  4. Stiffness analysis of spring mechanism for semi automatic gripper motion of tendon driven remote manipulator

    International Nuclear Information System (INIS)

    Yu, Seung Nam; Lee, Jong Kwang

    2012-01-01

    Remote handling manipulators are widely used for performing hazardous tasks, and it is essential to ensure the reliable performance of such systems. Toward this end, tendon-driven mechanisms are adopted in such systems to reduce the weight of the distal parts of the manipulator while maintaining handling performance. In this study, several approaches to the design of a gripper system for a tendon-driven remote handling system are introduced. This gripper has an underactuated spring mechanism combined with a slave manipulator triggered by a master operator. Based on the requirements of the specified tendon-driven mechanism, the connecting position of the spring system on the gripper mechanism is determined and a kinematic influence coefficient (KIC) analysis is performed. As a result, a suitable combination of components for the proper design of the target system is presented and verified.

  5. Automatic classification of the interferential tear film lipid layer using colour texture analysis.

    Science.gov (United States)

    Remeseiro, B; Penas, M; Barreira, N; Mosquera, A; Novo, J; García-Resúa, C

    2013-07-01

    The tear film lipid layer is heterogeneous among the population. Its classification depends on its thickness and can be done using the interference pattern categories proposed by Guillon. This paper presents an exhaustive study of the characterisation of the interference phenomena as a texture pattern, using different feature extraction methods in different colour spaces. These methods are first analysed individually and then combined to achieve the best possible results. The principal component analysis (PCA) technique has also been tested to reduce the dimensionality of the feature vectors. The proposed methodologies have been tested on a dataset composed of 105 images from healthy subjects, with a classification rate of over 95% in some cases. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  6. Automatic schizophrenic discrimination on fNIRS by using complex brain network analysis and SVM.

    Science.gov (United States)

    Song, Hong; Chen, Lei; Gao, RuiQi; Bogdan, Iordachescu Ilie Mihaita; Yang, Jian; Wang, Shuliang; Dong, Wentian; Quan, Wenxiang; Dang, Weimin; Yu, Xin

    2017-12-20

    Schizophrenia is a serious mental illness. Due to the lack of objective physiological data and a unified data analysis method, doctors can only rely on subjective experience to distinguish healthy people from patients, which easily leads to misdiagnosis. In recent years, functional near-infrared spectroscopy (fNIRS) has been widely used in clinical diagnosis; it measures haemoglobin concentration through variations in optical intensity. Firstly, prefrontal brain networks were constructed based on oxy-Hb signals from 52-channel fNIRS data of schizophrenia patients and healthy controls. Then, complex brain network analysis (CBNA) was used to extract features from the prefrontal brain networks. Finally, a classifier based on a support vector machine (SVM) was designed and trained to discriminate schizophrenia patients from healthy controls. We recruited a sample of 34 healthy controls and 42 schizophrenia patients who performed a one-back memory task. The haemoglobin response was measured in the prefrontal cortex during the task using a 52-channel fNIRS system. The experimental results indicate that the proposed method can achieve a satisfactory classification, with an accuracy of 85.5% overall: 92.8% for schizophrenia samples and 76.5% for healthy controls. Our results suggest that, with an appropriate classification method, fNIRS has the potential to be an effective objective biomarker for the diagnosis of schizophrenia.
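
    A minimal sketch of the classification stage only: an SVM with cross-validation on network-derived features. The feature matrix is random placeholder data standing in for the CBNA features; the accuracies quoted above come from the study, not from this sketch:

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.normal(size=(76, 20))        # 76 subjects x 20 network features
      y = np.array([1] * 42 + [0] * 34)    # 1 = schizophrenia, 0 = healthy control

      # Standardize features, then train an RBF-kernel SVM
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
      scores = cross_val_score(clf, X, y, cv=5)
      print(f"cross-validated accuracy: {scores.mean():.2f}")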

  7. Suitability of UK Biobank Retinal Images for Automatic Analysis of Morphometric Properties of the Vasculature.

    Science.gov (United States)

    MacGillivray, Thomas J; Cameron, James R; Zhang, Qiuli; El-Medany, Ahmed; Mulholland, Carl; Sheng, Ziyan; Dhillon, Bal; Doubal, Fergus N; Foster, Paul J; Trucco, Emmanuel; Sudlow, Cathie

    2015-01-01

    To assess the suitability of retinal images held in the UK Biobank--the largest retinal data repository in a prospective population-based cohort--for computer-assisted vascular morphometry, generating measures that are commonly investigated as candidate biomarkers of systemic disease. Non-mydriatic fundus images from both eyes of 2,690 participants--people with a self-reported history of myocardial infarction (n=1,345) and a matched control group (n=1,345)--were analysed using VAMPIRE software. These images were drawn from those of 68,554 UK Biobank participants who underwent retinal imaging at recruitment. Four operators were trained in the use of the software to measure retinal vascular tortuosity and bifurcation geometry. Total operator time was approximately 360 hours (4 minutes per image). 2,252 (84%) of participants had at least one image of sufficient quality for the software to process, i.e. there was sufficient detection of retinal vessels in the image by the software to attempt the measurement of the target parameters. 1,604 (60%) of participants had an image of at least one eye that was adequately analysed by the software, i.e. the measurement protocol was successfully completed. Increasing age was associated with a reduced proportion of images that could be processed (p=0.0004) and analysed. A substantial proportion of images in the UK Biobank is of insufficient quality for automated analysis. However, the large size of the UK Biobank means that tens of thousands of images are available and suitable for computational analysis. Parametric information measured from the retinas of participants with suspected cardiovascular disease was significantly different from that measured from a matched control group.

  8. An automatic flow injection analysis procedure for photometric determination of ethanol in red wine without using a chromogenic reagent.

    Science.gov (United States)

    Borges, Sivanildo S; Frizzarin, Rejane M; Reis, Boaventura F

    2006-05-01

    An automatic reagentless photometric procedure for the determination of ethanol in red wine is described. The procedure was based on a falling-drop system implemented with a flow injection analysis manifold. The detection system comprised an infrared LED and a phototransistor. The experimental arrangement was designed to ensure that the wine drop grew between these devices, thus causing a decrease in the intensity of the radiation beam coming from the LED. Since the ethanol content affected the size of the wine drop, this feature was exploited to develop an analytical procedure for the photometric determination of ethanol in red wine without using a chromogenic reagent. To prove the usefulness of the proposed procedure, a set of red wines was analysed. No significant difference between our results and those obtained with a reference method was observed at the 95% confidence level. Other advantages of our method were a linear response ranging from 0.17 up to 5.14 mol L(-1) (1.0 up to 30.0%) ethanol (R = 0.999); a limit of detection of 0.05 mol L(-1) (0.3%) ethanol; a relative standard deviation of 2.5% (n = 10) for a typical wine sample containing 2.14 mol L(-1) (12.5%) ethanol; and a sampling rate of 50 determinations per hour.

  9. Automatic CT-based finite element model generation for temperature-based death time estimation: feasibility study and sensitivity analysis.

    Science.gov (United States)

    Schenkl, Sebastian; Muggenthaler, Holger; Hubig, Michael; Erdmann, Bodo; Weiser, Martin; Zachow, Stefan; Heinrich, Andreas; Güttler, Felix Victor; Teichgräber, Ulf; Mall, Gita

    2017-05-01

    Temperature-based death time estimation is based either on simple phenomenological models of corpse cooling or on detailed physical heat transfer models. The latter are much more complex but allow a higher accuracy of death time estimation, as, in principle, all relevant cooling mechanisms can be taken into account. Here, a complete workflow for finite element-based cooling simulation is presented, with the following steps demonstrated on a CT phantom: (1) computed tomography (CT) scan; (2) segmentation of the CT images for thermodynamically relevant features of individual geometries and compilation in a geometric computer-aided design (CAD) model; (3) conversion of the segmentation result into a finite element (FE) simulation model; (4) computation of the model cooling curve (MOD); (5) calculation of the cooling time (CTE). For the first time in FE-based cooling time estimation, the steps from the CT image over segmentation to FE model generation are performed semi-automatically. The cooling time calculation results are compared to cooling measurements performed on the phantoms under controlled conditions; in this context, the method is validated using a CT phantom. Some of the phantoms' thermodynamic material parameters had to be determined via independent experiments. Moreover, the impact of geometry and material parameter uncertainties on the estimated cooling time is investigated by a sensitivity analysis.

  10. Automatic spot preparation and image processing of paper microzone-based assays for analysis of bioactive compounds in plant extracts.

    Science.gov (United States)

    Vaher, M; Borissova, M; Seiman, A; Aid, T; Kolde, H; Kazarjan, J; Kaljurand, M

    2014-01-15

    The colorimetric determination of the concentration of phytochemicals in plant extract samples using an automatic spotting system, a mobile phone camera and a computer with developed quantification software is described. Method automation was achieved by using a robotic system for spotting. The instrument was set to dispense the appropriate aliquots of the reagents and sample onto a Whatman paper sheet. Spots were photographed and analysed by ImageJ software or by applying the developed MATLAB-based algorithm. The developed assay was found to be effective, with a linear response in the concentration range of 0.03-0.25 g/L for polyphenols. The detection limit of the proposed method is below 0.03 g/L. Paper microzone-based assays for flavonoids and amino acids/peptides were also developed and evaluated. Comparing the results with conventional PμZP methods demonstrates that both methods yield similar results. At the same time, the proposed method has an attractive advantage in analysis time and repeatability/reproducibility. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Music and speech in early development: automatic analysis and classification of prosodic features from two Portuguese variants

    Directory of Open Access Journals (Sweden)

    Inês Salselas

    2011-06-01

    Full Text Available In the present study we aim to capture rhythmic and melodic patterning in speech and singing directed to infants. We address this issue by exploring the acoustic features that best predict different classification problems. We built a database composed of infant-directed speech from two Portuguese variants (European vs. Brazilian Portuguese) and infant-directed singing from the two cultures, comprising 977 tokens. Machine learning experiments were conducted to automatically discriminate between language variants for speech and for vocal songs, and between interaction contexts. Descriptors related to rhythm exhibited strong predictive ability for the language-variant discrimination tasks in both speech and singing, presenting different rhythmic patterning for each variant. Common features could be used by a classifier to discriminate speech from singing, indicating that the processing of speech and singing may share the analysis of the same stimulus properties. With respect to discriminating interaction contexts, pitch-related descriptors showed better performance. We conclude that prosodic cues present in the surrounding sonic environment of an infant are rich sources of information, not only for making distinctions between different communicative contexts through melodic cues, but also for providing specific cues about the rhythmic identity of the mother tongue. These prosodic differences may lead to further research on their influence on the development of the infant's musical representations.

  12. Experimental assessment of an automatic breast density classification algorithm based on principal component analysis applied to histogram data

    Science.gov (United States)

    Angulo, Antonio; Ferrer, Jose; Pinto, Joseph; Lavarello, Roberto; Guerrero, Jorge; Castaneda, Benjamín.

    2015-01-01

    Breast parenchymal density is considered a strong indicator of cancer risk. However, measures of breast density are often qualitative and require the subjective judgment of radiologists. This work proposes a supervised algorithm to automatically assign a BI-RADS breast density score to a digital mammogram. The algorithm applies principal component analysis to the histograms of a training dataset of digital mammograms to create four different spaces, one for each BI-RADS category. Scoring is achieved by projecting the histogram of the image to be classified onto the four spaces and assigning it to the closest class. In order to validate the algorithm, a training set of 86 images and a separate testing database of 964 images were built. All mammograms were acquired in the craniocaudal view from female patients without any visible pathology. Eight experienced radiologists categorized the mammograms according to a BI-RADS score, and the mode of their evaluations was considered the ground truth. Results show better agreement between the algorithm and ground truth for the training set (kappa=0.74) than for the test set (kappa=0.44), which suggests the method may be used for BI-RADS classification but that better training is required.
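
    The paper assigns a histogram to the "closest" of four per-class PCA spaces; one plausible reading, used in the sketch below with synthetic data, is to pick the class whose PCA subspace reconstructs the histogram with the smallest error:

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(1)
      # Synthetic training histograms: 4 classes x 20 images x 256 bins
      train = {c: rng.normal(loc=c, size=(20, 256)) for c in range(4)}

      # One PCA space per BI-RADS class
      models = {c: PCA(n_components=5).fit(h) for c, h in train.items()}

      def classify(histogram):
          """Assign the class whose subspace reconstructs the histogram best."""
          errors = {}
          for c, pca in models.items():
              proj = pca.inverse_transform(pca.transform(histogram[None, :]))
              errors[c] = np.linalg.norm(histogram - proj[0])
          return min(errors, key=errors.get)

      print("predicted BI-RADS class:", classify(rng.normal(loc=2, size=256)))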

  13. Automatic moment segmentation and peak detection analysis of heart sound pattern via short-time modified Hilbert transform.

    Science.gov (United States)

    Sun, Shuping; Jiang, Zhongwei; Wang, Haibin; Fang, Yu

    2014-05-01

    This paper proposes a novel automatic method for the moment segmentation and peak detection analysis of heart sound (HS) patterns, with special attention to the characteristics of the envelopes of HS and the properties of the Hilbert transform (HT). The moment segmentation and peak location are accomplished in two steps. First, by applying the Viola integral waveform method in the time domain, the envelope (E(T)) of the HS signal is obtained, with an emphasis on the first heart sound (S1) and the second heart sound (S2). Then, based on the characteristics of E(T) and the properties of the HT of convex and concave functions, a novel method, the short-time modified Hilbert transform (STMHT), is proposed to automatically locate the moment segmentation and peak points for the HS from the zero crossing points of the STMHT. A fast algorithm for calculating the STMHT of E(T) can be expressed as multiplying E(T) by an equivalent window (W(E)). According to the range of heart beats, and based on numerical experiments and the important parameters of the STMHT, a moving window width of N=1 s is validated for locating the moment segmentation and peak points for HS. The proposed moment segmentation and peak location method is validated on sounds from the Michigan HS database and sounds from clinical heart diseases, such as ventricular septal defect (VSD), atrial septal defect (ASD), tetralogy of Fallot (TOF), rheumatic heart disease (RHD), and so on. As a result, for the sounds where S2 can be separated from S1, the average accuracies achieved for the peak of S1 (AP₁), the peak of S2 (AP₂), the moment segmentation points from S1 to S2 (AT₁₂) and the cardiac cycle (ACC) are 98.53%, 98.31%, 98.36% and 97.37%, respectively. For the sounds where S1 cannot be separated from S2, the average accuracies achieved for the peak of S1 and S2 (AP₁₂) and the cardiac cycle (ACC) are 100% and 96.69%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
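
    The paper's envelope is computed with a Viola integral and the STMHT is its own contribution; neither is reproduced here. As a generic illustration of envelope-based peak picking, the sketch below uses the standard analytic-signal (Hilbert) envelope from SciPy on a synthetic heart-sound-like signal:

      import numpy as np
      from scipy.signal import hilbert, find_peaks

      fs = 2000                                  # sampling rate in Hz
      t = np.arange(0, 2.0, 1 / fs)
      # Synthetic signal: one S1-like burst at the start of every second
      pulse = np.exp(-((t % 1.0) / 0.05) ** 2)
      signal = np.sin(2 * np.pi * 50 * t) * pulse + 0.01 * np.random.randn(t.size)

      # Analytic-signal envelope via the Hilbert transform
      envelope = np.abs(hilbert(signal))

      # Candidate peaks: local envelope maxima above a simple threshold,
      # at least half a second apart
      peaks, _ = find_peaks(envelope, height=0.5 * envelope.max(), distance=fs // 2)
      print("candidate peak times (s):", t[peaks])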

  14. AVID: Automatic Visualization Interface Designer

    National Research Council Canada - National Science Library

    Chuah, Mei

    2000-01-01

    .... Automatic generation offers great flexibility in performing data and information analysis tasks, because new designs are generated on a case by case basis to suit current and changing future needs...

  15. Automatic analysis of online image data for law enforcement agencies by concept detection and instance search

    Science.gov (United States)

    de Boer, Maaike H. T.; Bouma, Henri; Kruithof, Maarten C.; ter Haar, Frank B.; Fischer, Noëlle M.; Hagendoorn, Laurens K.; Joosten, Bart; Raaijmakers, Stephan

    2017-10-01

    The information available on-line and off-line, from open as well as from private sources, is growing at an exponential rate and places an increasing demand on the limited resources of Law Enforcement Agencies (LEAs). The absence of appropriate tools and techniques to collect, process, and analyze the volumes of complex and heterogeneous data has created a severe information overload. If a solution is not found, the impact on law enforcement will be dramatic, e.g. because important evidence is missed or the investigation time is too long. Furthermore, there is an uneven level of capabilities to deal with the large volumes of complex and heterogeneous data that come from multiple open and private sources at national level across the EU, which hinders cooperation and information sharing. Consequently, there is a pertinent need to develop tools, systems and processes which expedite online investigations. In this paper, we describe a suite of analysis tools to identify and localize generic concepts, instances of objects and logos in images, which constitutes a significant portion of everyday law enforcement data. We describe how incremental learning based on only a few examples and large-scale indexing are addressed in both concept detection and instance search. Our search technology allows querying of the database by visual examples and by keywords. Our tools are packaged in a Docker container to guarantee easy deployment on a system and our tools exploit possibilities provided by open source toolboxes, contributing to the technical autonomy of LEAs.

  16. Comparative Analysis between LDR and HDR Images for Automatic Fruit Recognition and Counting

    Directory of Open Access Journals (Sweden)

    Tatiana M. Pinho

    2017-01-01

    Full Text Available Precision agriculture is gaining increasing interest in the current farming paradigm. This new production concept relies on the use of information technology (IT) to provide a control and supervision structure that can lead to better management policies. In this framework, imaging techniques that provide visual information over the farming area play an important role in monitoring production status. As such, accurate representation of the gathered production images is a major concern, especially if those images are used in detection and classification tasks. Real scenes, observed in natural environments, present high dynamic ranges that cannot be represented by common LDR (Low Dynamic Range) devices. However, this issue can be handled by High Dynamic Range (HDR) images, since they store luminance information similarly to the human visual system. In order to prove their advantage in image processing, a comparative analysis between LDR and HDR images for fruit detection and counting was carried out. The obtained results show that the use of HDR images improves detection performance by more than 30% compared with LDR.

  17. Automatic Wave Equation Migration Velocity Analysis by Focusing Subsurface Virtual Sources

    KAUST Repository

    Sun, Bingbing

    2017-11-03

    Macro velocity model building is important for subsequent pre-stack depth migration and full waveform inversion. Wave equation migration velocity analysis (WEMVA) utilizes the band-limited waveform to invert for the velocity. Normally, the inversion is implemented by focusing the subsurface-offset common image gathers (SOCIGs). We re-examine this concept from a different perspective: in the subsurface-offset domain, using extended Born modeling, the recorded data can be considered invariant with respect to simultaneous perturbations of the positions of the virtual sources and of the velocity. A linear system connecting the perturbation of the positions of those virtual sources and the velocity is derived and subsequently solved by the conjugate gradient method. In theory, the perturbation of the positions of the virtual sources is given by the Rytov approximation. Thus, compared to the Born approximation, it relaxes the dependency on amplitude and makes the proposed method more applicable to real data. We demonstrate the effectiveness of the approach by applying the proposed method to both isotropic and anisotropic VTI synthetic data. A real-data example verifies the robustness of the proposed method.

  18. Elucidating the genotype–phenotype map by automatic enumeration and analysis of the phenotypic repertoire

    Science.gov (United States)

    Lomnitz, Jason G; Savageau, Michael A

    2015-01-01

    Background: The gap between genotype and phenotype is filled by complex biochemical systems most of which are poorly understood. Because these systems are complex, it is widely appreciated that quantitative understanding can only be achieved with the aid of mathematical models. However, formulating models and measuring or estimating their numerous rate constants and binding constants is daunting. Here we present a strategy for automating difficult aspects of the process. Methods: The strategy, based on a system design space methodology, is applied to a class of 16 designs for a synthetic gene oscillator that includes seven designs previously formulated on the basis of experimentally measured and estimated parameters. Results: Our strategy provides four important innovations by automating: (1) enumeration of the repertoire of qualitatively distinct phenotypes for a system; (2) generation of parameter values for any particular phenotype; (3) simultaneous realization of parameter values for several phenotypes to aid visualization of transitions from one phenotype to another, in critical cases from functional to dysfunctional; and (4) identification of ensembles of phenotypes whose expression can be phased to achieve a specific sequence of functions for rationally engineering synthetic constructs. Our strategy, applied to the 16 designs, reproduced previous results and identified two additional designs capable of sustained oscillations that were previously missed. Conclusions: Starting with a system’s relatively fixed aspects, its architectural features, our method enables automated analysis of nonlinear biochemical systems from a global perspective, without first specifying parameter values. The examples presented demonstrate the efficiency and power of this automated strategy. PMID:26998346

  19. Elucidating the genotype-phenotype map by automatic enumeration and analysis of the phenotypic repertoire.

    Science.gov (United States)

    Lomnitz, Jason G; Savageau, Michael A

    The gap between genotype and phenotype is filled by complex biochemical systems most of which are poorly understood. Because these systems are complex, it is widely appreciated that quantitative understanding can only be achieved with the aid of mathematical models. However, formulating models and measuring or estimating their numerous rate constants and binding constants is daunting. Here we present a strategy for automating difficult aspects of the process. The strategy, based on a system design space methodology, is applied to a class of 16 designs for a synthetic gene oscillator that includes seven designs previously formulated on the basis of experimentally measured and estimated parameters. Our strategy provides four important innovations by automating: (1) enumeration of the repertoire of qualitatively distinct phenotypes for a system; (2) generation of parameter values for any particular phenotype; (3) simultaneous realization of parameter values for several phenotypes to aid visualization of transitions from one phenotype to another, in critical cases from functional to dysfunctional; and (4) identification of ensembles of phenotypes whose expression can be phased to achieve a specific sequence of functions for rationally engineering synthetic constructs. Our strategy, applied to the 16 designs, reproduced previous results and identified two additional designs capable of sustained oscillations that were previously missed. Starting with a system's relatively fixed aspects, its architectural features, our method enables automated analysis of nonlinear biochemical systems from a global perspective, without first specifying parameter values. The examples presented demonstrate the efficiency and power of this automated strategy.

  20. Automatic CTF correction for single particles based upon multivariate statistical analysis of individual power spectra.

    Science.gov (United States)

    Sander, B; Golas, M M; Stark, H

    2003-06-01

    Three-dimensional electron cryomicroscopy of randomly oriented single particles is a method that is suitable for the determination of three-dimensional structures of macromolecular complexes at molecular resolution. However, the electron-microscopical projection images are modulated by a contrast transfer function (CTF) that prevents the calculation of three-dimensional reconstructions of biological complexes at high resolution from uncorrected images. We describe here an automated method for the accurate determination and correction of the CTF parameters defocus, twofold astigmatism and amplitude-contrast proportion from single-particle images. At the same time, the method allows the frequency-dependent signal decrease (B factor) and the non-convoluted background signal to be estimated. The method involves the classification of the power spectra of single-particle images into groups with similar CTF parameters; this is done by multivariate statistical analysis (MSA) and hierarchically ascending classification (HAC). Averaging over several power spectra generates class averages with enhanced signal-to-noise ratios. The correct CTF parameters can be deduced from these class averages by applying an iterative correlation procedure with theoretical CTF functions; they are then used to correct the raw images. Furthermore, the method enables the tilt axis of the sample holder to be determined and allows the elimination of individual poor-quality images that show high drift or charging effects.
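
    The defocus-fitting step described above reduces to correlating an averaged power spectrum with theoretical CTF curves over a grid of defocus values. The following Python sketch illustrates that idea under simplifying assumptions (1D rotationally averaged spectrum, no astigmatism, no B-factor decay); the parameter names and values are illustrative, not taken from the paper.

        import numpy as np

        def theoretical_ctf(k, defocus_um, wavelength_A=0.0197, cs_mm=2.0, amp=0.07):
            """1D CTF with phase chi(k) = pi*lambda*dz*k^2 - (pi/2)*Cs*lambda^3*k^4.
            k in 1/Angstrom; a wavelength of 0.0197 A corresponds to ~300 kV electrons."""
            dz = defocus_um * 1e4                      # defocus in Angstrom
            cs = cs_mm * 1e7                           # spherical aberration in Angstrom
            chi = np.pi * wavelength_A * dz * k**2 - 0.5 * np.pi * cs * wavelength_A**3 * k**4
            return -(np.sqrt(1 - amp**2) * np.sin(chi) + amp * np.cos(chi))

        def estimate_defocus(radial_psd, k, defocus_grid_um):
            """Return the defocus whose squared CTF best correlates with a class-average PSD."""
            best_dz, best_corr = None, -np.inf
            for dz in defocus_grid_um:
                model = theoretical_ctf(k, dz)**2
                corr = np.corrcoef(radial_psd, model)[0, 1]
                if corr > best_corr:
                    best_dz, best_corr = dz, corr
            return best_dz, best_corr

        k = np.linspace(0.005, 0.2, 400)               # radial frequency bins
        psd = theoretical_ctf(k, 1.5)**2 + 0.05 * np.random.rand(k.size)  # synthetic input
        print(estimate_defocus(psd, k, np.arange(0.5, 3.0, 0.01)))        # recovers ~1.5 um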

  1. Meta-server for automatic analysis, scoring and ranking of docking models.

    Science.gov (United States)

    Anashkina, Anastasia A; Kravatsky, Yuri; Kuznetsov, Eugene; Makarov, Alexander A; Adzhubei, Alexei A

    2017-09-18

    Modelling with multiple servers that use different docking algorithms results in more reliable predictions of interaction sites. However, the scoring and comparison of all models by an expert is time-consuming and is not feasible for the large volumes of data generated by such modelling. QASDOM Server (Quality ASsessment of DOcking Models) is a simple and efficient tool for real-time simultaneous analysis, scoring and ranking of datasets of receptor-ligand complexes built by a range of docking techniques. This meta-server is designed to analyse large datasets of docking models and rank them by scoring criteria developed in this study. It produces two types of output: the likelihood of specific residues and clusters of residues being involved in receptor-ligand interactions, and the ranking of models. The server also allows visualising residues that form interaction sites in the receptor and ligand sequence, and displays three-dimensional model structures of the receptor-ligand complexes. http://qasdom.eimb.ru. Supplementary data are available at Bioinformatics online.

  2. Automatic analysis of multichannel time series data applied to MHD fluctuations

    International Nuclear Information System (INIS)

    Pretty, D.G.; Blackwell, B.D.; Detering, F.; Howard, J.; Oliver, D.; Hegland, M.; Hole, M.J.; Harris, J.H.

    2008-01-01

    We present a data mining technique for the analysis of multichannel oscillatory timeseries data and show an application using poloidal arrays of magnetic sensors installed in the H-1 heliac. The procedure is highly automated, and scales well to large datasets. In a preprocessing step, the timeseries data is split into short time segments to provide time resolution, and each segment is represented by a singular value decomposition (SVD). By comparing power spectra of the temporal singular vectors, singular values are grouped into subsets which define fluctuation structures. Thresholds for the normalised energy of the fluctuation structure and the normalised entropy of the SVD are used to filter the dataset. We assume that distinct classes of fluctuations are localised in the space of phase differences (n, n+1) between each pair of nearest neighbour channels. An expectation maximisation (EM) clustering algorithm is used to locate the distinct classes of fluctuations, and a cluster tree mapping is used to visualise the results. Different classes of fluctuations in H-1 distinguished by this procedure are shown to be associated with MHD activity around separate resonant surfaces, with corresponding toroidal and poloidal mode numbers. Equally interesting are some clusters that don't exhibit this behaviour. (author)
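
    The segment-wise SVD filtering and EM clustering of nearest-neighbour phase differences can be illustrated with numpy and scikit-learn, whose GaussianMixture estimator is an EM implementation. This is a hypothetical sketch: the array shapes, the dominant-frequency phase estimate and the energy threshold are our assumptions, not the authors' code.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def phase_differences(segment):
            """Phase differences between neighbouring channels at the dominant frequency.
            segment: (n_samples, n_channels) array for one short time window."""
            spectra = np.fft.rfft(segment, axis=0)
            f0 = (np.abs(spectra)**2).sum(axis=1)[1:].argmax() + 1   # skip the DC bin
            phases = np.angle(spectra[f0, :])
            return np.angle(np.exp(1j * np.diff(phases)))            # wrap to (-pi, pi]

        def top_structure_energy(segment):
            """Normalised energy of the leading SVD structure of a segment."""
            s = np.linalg.svd(segment - segment.mean(axis=0), compute_uv=False)
            return s[0]**2 / (s**2).sum()

        data = np.random.randn(100_000, 16)        # placeholder for 16 magnetic probes
        segments = data.reshape(-1, 500, 16)       # short segments for time resolution
        feats = np.array([phase_differences(s) for s in segments
                          if top_structure_energy(s) > 0.05])  # keep coherent segments only
        labels = GaussianMixture(n_components=4).fit_predict(feats)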

  3. Genetic analysis of seasonal runoff based on automatic techniques of hydrometeorological data processing

    Science.gov (United States)

    Kireeva, Maria; Sazonov, Alexey; Rets, Ekaterina; Ezerova, Natalia; Frolova, Natalia; Samsonov, Timofey

    2017-04-01

    Detection of the river feeding type is a complex, multifactor task. Such partitioning should be based, on the one hand, on the genesis of the feeding water and, on the other hand, on its physical path; at the same time it should relate the feeding type to the corresponding phase of the water regime. Because of these difficulties, there are many variants of separating the flow hydrograph into feeding types. The most common method is extraction of a so-called base component that in one way or another reflects the groundwater feeding of the river. In this case the selection is most often based on the principle of local minima or on graphical separation of this component; however, neither the origin of the water nor the corresponding phase of the water regime is then considered. In this paper, the authors propose a method of complex automated analysis of the genetic components of river feeding together with the separation of specific phases of the water regime. The objects of the study are medium and large rivers of European Russia that have a pronounced spring flood formed by melt water, and summer-autumn and winter low-water periods periodically interrupted by rain or thaw floods. The method is based on the genetic hydrograph separation proposed in the 1960s by B. I. Kudelin, developed for large rivers with a hydraulic connection to groundwater horizons during floods. For better detection of flood genesis, the analysis uses reanalysis data on temperature and precipitation. Separation is based on the following fundamental graphic-analytical principles:
    • Ground feeding during the passage of the flood peak tends to zero
    • The beginning of the flood is determined as the exceedance of a critical low-water discharge
    • Flood periods are determined on the basis of exceeding the critical low-water discharge; they relate to thaw in the case of above-zero temperatures
    • During thaw and rain floods
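
    For comparison, the conventional local-minima baseflow separation that the authors contrast with their genetic approach can be written in a few lines. This is a generic textbook sketch, not the method of the paper; the window length and the synthetic series are arbitrary.

        import numpy as np
        from scipy.signal import argrelmin

        def baseflow_local_minima(q, order=15):
            """Classical local-minima separation: connect local minima of the daily
            discharge series by linear interpolation, clipped by the hydrograph."""
            idx = np.unique(np.concatenate(([0], argrelmin(q, order=order)[0], [q.size - 1])))
            base = np.interp(np.arange(q.size), idx, q[idx])
            return np.minimum(base, q)          # baseflow cannot exceed total flow

        q = np.abs(np.random.randn(365)).cumsum() % 50 + 5   # synthetic daily discharge
        base = baseflow_local_minima(q)
        quickflow = q - base   # the genetic analysis would further attribute these
                               # excess volumes to thaw or rain events using reanalysis data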

  4. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Mevludin Memedi

    2015-09-01

    well as had good test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  5. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    Science.gov (United States)

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-01-01

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms. PMID:26393595

  6. Automatic classification of communication logs into implementation stages via text analysis.

    Science.gov (United States)

    Wang, Dingding; Ogihara, Mitsunori; Gallo, Carlos; Villamar, Juan A; Smith, Justin D; Vermeer, Wouter; Cruden, Gracelyn; Benbow, Nanette; Brown, C Hendricks

    2016-09-06

    To improve the quality, quantity, and speed of implementation, careful monitoring of the implementation process is required. However, some health organizations have only a limited capacity to collect, organize, and synthesize the information relevant to their decision to implement an evidence-based program, the preparation steps necessary for successful program adoption, the fidelity of program delivery, and the sustainment of the program over time. When a large health system implements an evidence-based program across multiple sites, a trained intermediary or broker may provide such monitoring and feedback, but this task is labor intensive and not easily scaled up for large numbers of sites. We present a novel approach to producing an automated system for monitoring implementation stage entrances and exits based on a computational analysis of communication log notes generated by implementation brokers. Potentially discriminating keywords are identified using the definitions of the stages and experts' coding of a portion of the log notes. A machine learning algorithm produces a decision rule to classify the remaining, unclassified log notes. We applied this procedure to log notes in the implementation trial of multidimensional treatment foster care in the California 40-county implementation trial (CAL-40) project, using the stages of implementation completion (SIC) measure. We found that a semi-supervised non-negative matrix factorization method accurately identified most stage transitions. Another computational model was built for determining the start and the end of each stage. This automated system demonstrated feasibility in this proof-of-concept challenge. We provide suggestions on how such a system can be used to improve the speed, quality, quantity, and sustainment of implementation. The innovative methods presented here are not intended to replace the expertise and judgement of an expert rater already in place. Rather, these can be used when human monitoring and
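
    The stage-classification idea, factorizing broker log notes into topics and labelling the factors with a small expert-coded subset, can be sketched with scikit-learn. This is a loose stand-in for the paper's semi-supervised non-negative matrix factorization, with invented toy notes and stage names.

        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import NMF

        notes = ["site contacted about program adoption",
                 "fidelity review completed for foster care sites",
                 "staff training scheduled next month",
                 "adoption agreement signed by county"]        # broker log notes (toy)
        coded = {0: "engagement", 3: "engagement", 1: "delivery"}  # expert-coded subset

        X = TfidfVectorizer(stop_words="english").fit_transform(notes)
        W = NMF(n_components=2, random_state=0).fit_transform(X)   # note-by-factor loadings
        factor = W.argmax(axis=1)                                  # dominant factor per note

        stage_of_factor = {}
        for f in set(factor):           # label each factor by majority vote of coded notes
            votes = [coded[i] for i in coded if factor[i] == f]
            stage_of_factor[f] = max(set(votes), key=votes.count) if votes else "unlabelled"
        print([stage_of_factor[f] for f in factor])    # predicted stage for every note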

  7. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    Science.gov (United States)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for the fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof and easy creation of Mifs, without the programming skills usually necessary for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as local magnetization data selection for output. Our software allows for the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing automated creation of Matlab scripts suitable for analysing such data with Fourier and wavelet transforms as well as user-defined operations.
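
    The Mif-generation side of such a tool boils down to filling a parameter template and writing one file per point of a sweep. The sketch below shows that pattern only; the Specify blocks are a simplified illustration of MIF 2.1 syntax and are not guaranteed to match MAGE's actual output or a complete, valid OOMMF problem description.

        # Template-based Mif generation for a hypothetical parameter sweep.
        MIF_TEMPLATE = """# MIF 2.1
        Specify Oxs_BoxAtlas:atlas {{
          xrange {{0 {size_x}}} yrange {{0 {size_y}}} zrange {{0 {size_z}}}
        }}
        Specify Oxs_RectangularMesh:mesh {{
          cellsize {{5e-9 5e-9 5e-9}} atlas :atlas
        }}
        """

        for i, bx in enumerate([0.01, 0.02, 0.05]):        # sweep values (illustrative)
            body = MIF_TEMPLATE.format(size_x="200e-9", size_y="100e-9", size_z="5e-9")
            body += f"# applied field x-component: {bx} T (placeholder block)\n"
            with open(f"sweep_{i:03d}.mif", "w") as fh:    # one configuration per value
                fh.write(body)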

  8. Is automatic CPAP titration as effective as manual CPAP titration in OSAHS patients? A meta-analysis.

    Science.gov (United States)

    Gao, Weijie; Jin, Yinghui; Wang, Yan; Sun, Mei; Chen, Baoyuan; Zhou, Ning; Deng, Yuan

    2012-06-01

    It is costly and time-consuming to conduct standard manual titration to identify an effective pressure before continuous positive airway pressure (CPAP) treatment for obstructive sleep apnea (OSA) patients. Automatic titration is cheaper and more easily available than manual titration. The purpose of this systematic review was to evaluate the effect of automatic titration, compared with manual titration, in identifying a pressure and on the improvement of the apnea/hypopnea index (AHI) and somnolence, the change in sleep quality, and the acceptance of and compliance with CPAP treatment. A systematic search was made of the PubMed, EMBASE, Cochrane Library, SCI, China Academic Journals Full-text, Chinese Biomedical Literature, Chinese Scientific Journals and Chinese Medical Association Journals databases. Randomized controlled trials comparing automatic and manual titration were reviewed. Studies were pooled to yield odds ratios (OR) or mean differences (MD) with 95% confidence intervals (CI). Ten trials involving 849 patients met the inclusion criteria. It is hard to identify a trend in the pressures determined by either automatic or manual titration. Automatic titration can improve the AHI (MD = 0.03/h, 95% CI = -4.48 to 4.53) and the Epworth sleepiness scale (SMD = -0.02, 95% CI = -0.34 to 0.31) as effectively as manual titration. There is no difference in sleep architecture between automatic and manual titration. The acceptance of CPAP treatment (OR = 0.96, 95% CI = 0.60 to 1.55) and the compliance with treatment (MD = -0.04, 95% CI = -0.17 to 0.10) after automatic titration do not differ from manual titration. Automatic titration is as effective as standard manual titration in improving AHI and somnolence while maintaining sleep quality similar to the standard method. In addition, automatic titration has the same effect on the acceptance of and compliance with CPAP treatment as manual titration. With the potential advantage
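
    The pooling behind figures like "MD = 0.03/h, 95% CI = -4.48 to 4.53" is standard inverse-variance meta-analysis, sketched below with synthetic trial numbers (the real per-trial data are in the paper and are not reproduced here).

        import numpy as np

        def pool_mean_difference(md, se):
            """Fixed-effect inverse-variance pooling of per-trial mean differences."""
            w = 1.0 / np.asarray(se)**2
            pooled = np.sum(w * md) / np.sum(w)
            pooled_se = np.sqrt(1.0 / np.sum(w))
            return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

        md = np.array([0.5, -0.8, 0.2])   # hypothetical AHI mean differences (events/h)
        se = np.array([1.1, 0.9, 1.4])    # their standard errors
        print(pool_mean_difference(md, se))   # pooled MD with its 95% CI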

  9. Statistical analysis of automatically detected ion density variations recorded by DEMETER and their relation to seismic activity

    Directory of Open Access Journals (Sweden)

    Michel Parrot

    2012-04-01

    Many examples of ionospheric perturbations observed during large seismic events were recorded by the low-altitude satellite DEMETER. However, ionospheric variations also occur without seismic activity. The present study is devoted to a statistical analysis of night-time ion density variations. Software was implemented to detect variations in the data before earthquakes world-wide. Earthquakes with magnitudes >4.8 were selected and classified according to their magnitudes, depths and locations (land, close to the coast, or below the sea). For each earthquake, an automatic search for ion density variations was conducted over the 15 days before the earthquake, whenever the satellite orbit track passed within 1,500 km of the earthquake epicenter. This first step provided the variations relative to the background in the vicinity of the epicenter for each of the 15 days before each earthquake. In the second step, the largest variations over the 15 days were compared with the earthquake magnitudes. The statistical analysis is based on median values calculated as a function of the various seismic parameters (magnitude, depth, location). A comparison was also carried out with two other databases in which, on the one hand, the locations of the epicenters were randomly modified and, on the other hand, the longitudes of the epicenters were shifted. The results show that the intensities of the ionospheric perturbations are larger prior to earthquakes than prior to random events, and that the perturbations increase with earthquake magnitude.


  10. Large-scale tracking and classification for automatic analysis of cell migration and proliferation, and experimental optimization of high-throughput screens of neuroblastoma cells.

    Science.gov (United States)

    Harder, Nathalie; Batra, Richa; Diessl, Nicolle; Gogolin, Sina; Eils, Roland; Westermann, Frank; König, Rainer; Rohr, Karl

    2015-06-01

    Computational approaches for automatic analysis of image-based high-throughput and high-content screens are gaining increased importance to cope with the large amounts of data generated by automated microscopy systems. Typically, automatic image analysis is used to extract phenotypic information once all images of a screen have been acquired. However, also in earlier stages of large-scale experiments image analysis is important, in particular, to support and accelerate the tedious and time-consuming optimization of the experimental conditions and technical settings. We here present a novel approach for automatic, large-scale analysis and experimental optimization with application to a screen on neuroblastoma cell lines. Our approach consists of cell segmentation, tracking, feature extraction, classification, and model-based error correction. The approach can be used for experimental optimization by extracting quantitative information which allows experimentalists to optimally choose and to verify the experimental parameters. This involves systematically studying the global cell movement and proliferation behavior. Moreover, we performed a comprehensive phenotypic analysis of a large-scale neuroblastoma screen including the detection of rare division events such as multi-polar divisions. Major challenges of the analyzed high-throughput data are the relatively low spatio-temporal resolution in conjunction with densely growing cells as well as the high variability of the data. To account for the data variability we optimized feature extraction and classification, and introduced a gray value normalization technique as well as a novel approach for automatic model-based correction of classification errors. In total, we analyzed 4,400 real image sequences, covering observation periods of around 120 h each. We performed an extensive quantitative evaluation, which showed that our approach yields high accuracies of 92.2% for segmentation, 98.2% for tracking, and 86.5% for

  11. Study of medical isotope production facility stack emissions and noble gas isotopic signature using automatic gamma-spectra analysis platform

    Science.gov (United States)

    Zhang, Weihua; Hoffmann, Emmy; Ungar, Kurt; Dolinar, George; Miley, Harry; Mekarski, Pawel; Schrom, Brian; Hoffman, Ian; Lawrie, Ryan; Loosz, Tom

    2013-04-01

    The nuclear industry emissions of the four CTBT (Comprehensive Nuclear-Test-Ban Treaty) relevant radioxenon isotopes are unavoidably detected by the IMS along with possible treaty violations. Another civil source of radioxenon emissions which contributes to the global background is radiopharmaceutical production companies. To better understand the source terms of these background emissions, a joint project between HC, ANSTO, PNNL and CRL was formed to install real-time detection systems to support 135Xe, 133Xe, 131mXe and 133mXe measurements at the ANSTO and CRL 99Mo production facility stacks as well as the CANDU (CANada Deuterium Uranium) primary coolant monitoring system at CRL. At each site, high resolution gamma spectra were collected every 15 minutes using a HPGe detector to continuously monitor a bypass feed from the stack or CANDU primary coolant system as it passed through a sampling cell. HC also conducted atmospheric monitoring for radioxenon at approximately 200 km distant from CRL. A program was written to transfer each spectrum into a text file format suitable for the automatic gamma-spectra analysis platform and then email the file to a server. Once the email was received by the server, it was automatically analysed with the gamma-spectrum software UniSampo/Shaman to perform radionuclide identification and activity calculation for a large number of gamma-spectra in a short period of time (less than 10 seconds per spectrum). The results of nuclide activity together with other spectrum parameters were saved into the Linssi database. This database contains a large amount of radionuclide information which is a valuable resource for the analysis of radionuclide distribution within the noble gas fission product emissions. The results could be useful to identify the specific mechanisms of the activity release. The isotopic signatures of the various radioxenon species can be determined as a function of release time. Comparison of 133mXe and 133Xe activity
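
    The transfer step described above (convert each spectrum to a text file, e-mail it to the analysis server) can be sketched with the Python standard library. The host names, addresses and file naming below are hypothetical; only the attach-and-send pattern per 15-minute acquisition is illustrated.

        import glob
        import smtplib
        from email.message import EmailMessage

        SMTP_HOST = "mail.example.org"            # hypothetical relay
        ANALYSIS_INBOX = "shaman@example.org"     # server running UniSampo/Shaman

        def send_spectrum(path):
            """E-mail one text-converted gamma spectrum to the analysis server."""
            msg = EmailMessage()
            msg["From"] = "stack-monitor@example.org"
            msg["To"] = ANALYSIS_INBOX
            msg["Subject"] = f"spectrum {path}"
            with open(path, "rb") as fh:
                msg.add_attachment(fh.read(), maintype="text",
                                   subtype="plain", filename=path)
            with smtplib.SMTP(SMTP_HOST) as smtp:
                smtp.send_message(msg)

        for spectrum in sorted(glob.glob("*.txt")):   # one file per 15-minute spectrum
            send_spectrum(spectrum)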

  12. Automatic Functional Harmonic Analysis

    NARCIS (Netherlands)

    de Haas, W.B.|info:eu-repo/dai/nl/304841250; Magalhães, J.P.; Wiering, F.|info:eu-repo/dai/nl/141928034; Veltkamp, R.C.|info:eu-repo/dai/nl/084742984

    2013-01-01

    Music scholars have been studying tonal harmony intensively for centuries, yielding numerous theories and models. Unfortunately, a large number of these theories are formulated in a rather informal fashion and lack mathematical precision. In this article we present HarmTrace, a functional model of

  13. Fluid Dynamic Analysis of Hand-Pump Infuser and UROMAT Endoscopic Automatic System for Irrigation Through a Flexible Ureteroscope.

    Science.gov (United States)

    Lama, Daniel J; Owyong, Michael; Parkhomenko, Egor; Patel, Roshan M; Landman, Jaime; Clayman, Ralph V

    2018-02-13

    To evaluate the flow characteristics produced by a manual and an automated-pump irrigation system connected to a flexible ureteroscope. An in vitro analysis of a manual hand-pump infuser (HP) and the UROMAT Endoscopic Automatic System for Irrigation® (E.A.S.I.) pump was performed. Standard irrigation tubing was used to connect a three-way valve to a flexible ureteroscope, the irrigation system, and a digital manometer. Flow rate and irrigation pressure measurements were recorded over a 15-minute period using pressure settings of 150 and 200 mm Hg for both irrigation pump systems. Once the HP was inflated to the initial pressure, it was not reinflated over the course of the trial. Data were collected with the working channel unoccupied and with placement of a 200 μm (0.6F) holmium laser fiber, a 1.7F nitinol stone retrieval basket, or a 2.67F guidewire. The difference in pressure measured at the site of inflow of irrigation to the ureteroscope was significantly greater using the HP compared to the E.A.S.I. pump at pressure settings of 150 mm Hg with and without the use of ureteroscopic instrumentation (p pump systems. The flow rates of irrigation produced by the HP and the E.A.S.I. pump are similar at pressures of 150 and 200 mm Hg irrespective of the occupancy of the ureteroscope's working channel during the first 5 minutes of irrigation. Irrigation pressure at the entry site of the ureteroscope is subject to significant variability with use of the HP compared to the E.A.S.I. pump irrigation system.

  14. Comparison of automatic procedures in the selection of peaks over threshold in flood frequency analysis: A Canadian case study in the context of climate change

    Science.gov (United States)

    Durocher, M.; Mostofi Zadeh, S.; Burn, D. H.; Ashkar, F.

    2017-12-01

    Floods are among the most costly natural hazards, and frequency analysis of river discharges is an important part of the tools at our disposal to evaluate their inherent risks and to provide an adequate response. In comparison to the common examination of annual streamflow maxima, peaks over threshold (POT) is an interesting alternative that makes better use of the available information by including more than one flood event per year (on average). However, a major challenge is the selection of a satisfactory threshold above which peaks can be assumed to respect the conditions necessary for an adequate estimation of the risk. Additionally, studies have shown that POT is a valuable approach for investigating the evolution of flood regimes in the context of climate change. Recently, automatic procedures were suggested to guide the selection of the threshold, a choice that otherwise relies on graphical tools and expert judgment. Furthermore, an objective automatic procedure allows the analysis to be repeated quickly on a large number of samples, which is useful for large databases or for uncertainty analysis based on resampling. This study investigates the impact of such procedures in a case study including many sites across Canada. A simulation study is conducted to evaluate the bias and predictive power of the automatic procedures in similar conditions, as well as the power of derived nonstationarity tests. The results obtained are also evaluated in the light of expert judgments established in a previous study. Ultimately, this study provides a thorough examination of the considerations that need to be addressed when conducting POT analysis using automatic threshold selection.
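
    One family of automatic threshold rules scans candidate thresholds, fits a generalized Pareto distribution (GPD) to the exceedances, and keeps the lowest threshold with an acceptable fit. The sketch below (scipy) illustrates that generic pattern; it does not reproduce the specific procedures compared in the study, and the sample-size floor and test level are arbitrary.

        import numpy as np
        from scipy import stats

        def select_threshold(peaks, candidates, alpha=0.05, min_exceedances=30):
            """Lowest candidate threshold whose exceedances pass a GPD KS test."""
            for u in np.sort(candidates):
                excess = peaks[peaks > u] - u
                if excess.size < min_exceedances:
                    break                        # too few peaks left to fit reliably
                c, _, scale = stats.genpareto.fit(excess, floc=0.0)
                if stats.kstest(excess, "genpareto", args=(c, 0.0, scale)).pvalue > alpha:
                    return u, c, scale
            return None

        peaks = stats.genpareto.rvs(0.1, scale=20, size=2000) + 50    # synthetic peaks
        print(select_threshold(peaks, np.quantile(peaks, np.linspace(0.5, 0.95, 10))))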

  15. A Noise-Assisted Data Analysis Method for Automatic EOG-Based Sleep Stage Classification Using Ensemble Learning.

    Science.gov (United States)

    Olesen, Alexander Neergaard; Christensen, Julie A E; Sorensen, Helge B D; Jennum, Poul J

    2016-08-01

    Reducing the number of recording modalities for sleep staging research can benefit both researchers and patients, under the condition that they provide as accurate results as conventional systems. This paper investigates the possibility of exploiting the multisource nature of the electrooculography (EOG) signals by presenting a method for automatic sleep staging using the complete ensemble empirical mode decomposition with adaptive noise algorithm, and a random forest classifier. It achieves a high overall accuracy of 82% and a Cohen's kappa of 0.74 indicating substantial agreement between automatic and manual scoring.
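
    The classification stage can be illustrated with scikit-learn's random forest; for brevity, the sketch below substitutes simple relative band-power features for the CEEMDAN-derived features of the paper, and all data are synthetic placeholders.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        def bandpower_features(epoch, fs=100):
            """Relative power in four classic bands for one 30 s EOG epoch."""
            freqs = np.fft.rfftfreq(epoch.size, 1 / fs)
            psd = np.abs(np.fft.rfft(epoch))**2
            total = psd[(freqs >= 0.5) & (freqs < 30)].sum()
            return [psd[(freqs >= lo) & (freqs < hi)].sum() / total
                    for lo, hi in [(0.5, 4), (4, 8), (8, 13), (13, 30)]]

        rng = np.random.default_rng(0)
        epochs = rng.standard_normal((500, 3000))   # 500 synthetic 30 s epochs at 100 Hz
        stages = rng.integers(0, 5, size=500)       # W, N1, N2, N3, REM labels
        X = np.array([bandpower_features(e) for e in epochs])
        forest = RandomForestClassifier(n_estimators=200, random_state=0)
        print(cross_val_score(forest, X, stages, cv=5).mean())  # chance level here, by design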

  16. Comparison of fabric analysis of snow samples by Computer-Integrated Polarization Microscopy and Automatic Ice Texture Analyzer

    Science.gov (United States)

    Leisinger, Sabine; Montagnat, Maurine; Heilbronner, Renée; Schneebeli, Martin

    2014-05-01

    Accurate knowledge of fabric anisotropy is crucial to understanding the mechanical behavior of snow and firn, and is also important for understanding metamorphism. The Computer-Integrated Polarization Microscopy (CIP) method used for fabric analysis was developed by Heilbronner and Pauli in the early 1990s and uses a slightly modified traditional polarization microscope. First developed for quartz, it can be applied to other uniaxial minerals. Until now this method has mainly been used in structural geology, but it is also well suited to the fabric analysis of snow, firn and ice. The method is based on the analysis of first-order interference color images acquired with a slightly modified optical polarization microscope, a grayscale camera and a computer. The microscope is fitted with high-quality objectives, a rotating stage and two polarizers that can be introduced above and below the thin section, as well as a full-wave plate. Additionally, two quarter-wave plates are needed for circular polarization; alternatively, circular polarization can be created from a set of crossed-polarized images through image processing. A narrow-band interference filter transmitting wavelengths between 660 and 700 nm is also required. Finally, a monochrome digital camera is used to capture the input images. The idea is to record the change of interference colors while the thin section is rotated through 180°. The azimuth and inclination of the c-axis are defined by the color change: recording the color change through a red filter produces a signal with a well-defined amplitude and phase angle. An advantage of this method lies in the simple conversion of an ordinary optical microscope into a fabric analyzer. The Automatic Ice Texture Analyzer (AITA), the first fully functional instrument to measure c-axis orientation, was developed by Wilson and others (2003). Most recent fabric analysis of snow and firn samples was carried

  17. An automatic image recognition approach

    Directory of Open Access Journals (Sweden)

    Tudor Barbu

    2007-07-01

    Our paper focuses on the graphical analysis domain. We propose an automatic image recognition technique. This approach consists of two main pattern recognition steps. First, it performs an image feature extraction operation on an input image set, using statistical dispersion features. Then, an unsupervised classification process is performed on the previously obtained graphical feature vectors. An automatic region-growing based clustering procedure is proposed and utilized in the classification stage.

  18. Sentiment Analysis and Social Cognition Engine (SEANCE): An automatic tool for sentiment, social cognition, and social-order analysis.

    Science.gov (United States)

    Crossley, Scott A; Kyle, Kristopher; McNamara, Danielle S

    2017-06-01

    This study introduces the Sentiment Analysis and Cognition Engine (SEANCE), a freely available text analysis tool that is easy to use, works on most operating systems (Windows, Mac, Linux), is housed on a user's hard drive (as compared to being accessed via an Internet interface), allows for batch processing of text files, includes negation and part-of-speech (POS) features, and reports on thousands of lexical categories and 20 component scores related to sentiment, social cognition, and social order. In the study, we validated SEANCE by investigating whether its indices and related component scores can be used to classify positive and negative reviews in two well-known sentiment analysis test corpora. We contrasted the results of SEANCE with those from Linguistic Inquiry and Word Count (LIWC), a similar tool that is popular in sentiment analysis, but is pay-to-use and does not include negation or POS features. The results demonstrated that both the SEANCE indices and component scores outperformed LIWC on the categorization tasks.
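
    The negation handling that distinguishes tools like SEANCE from a plain word count can be shown in a few lines. The three-entry lexicon below is purely illustrative and unrelated to SEANCE's actual categories or component scores.

        LEXICON = {"good": 1.0, "excellent": 1.5, "awful": -1.5}
        NEGATORS = {"not", "never", "no"}

        def lexicon_score(text, window=3):
            """Sum lexicon weights, flipping the sign of a hit when a negator
            occurs within the preceding `window` tokens."""
            tokens = text.lower().split()
            score = 0.0
            for i, tok in enumerate(tokens):
                if tok in LEXICON:
                    negated = any(t in NEGATORS for t in tokens[max(0, i - window):i])
                    score += -LEXICON[tok] if negated else LEXICON[tok]
            return score

        print(lexicon_score("the plot was not good but the cast was excellent"))  # 0.5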

  19. Automatic method of analysis of OCT images in assessing the severity degree of glaucoma and the visual field loss.

    Science.gov (United States)

    Koprowski, Robert; Rzendkowski, Marek; Wróbel, Zygmunt

    2014-02-14

    In many practical situations in ophthalmology, it is necessary to assess the severity of glaucoma in cases where, for various reasons, it is impossible to perform a visual field test (static perimetry), or where the visual field test result is not reliable, e.g. in advanced AMD (age-related macular degeneration). In these cases, the severity of glaucoma must be determined mainly on the basis of the optic nerve head (ONH) and retinal nerve fibre layer (RNFL) structure. OCT is one of the diagnostic methods capable of analysing glaucomatous changes in both the ONH and the RNFL. OCT images of the eye fundus of 55 patients (110 eyes) were obtained with the SOCT Copernicus (Optopol Tech. SA, Zawiercie, Poland). The authors proposed a new method for automatic determination of the RNFL and other parameters using mathematical morphology and profiled segmentation based on morphometric information of the eye fundus. A quantitative ratio of the quality of the optic disk and RNFL, BGA (biomorphological glaucoma advancement), was also proposed. The results were compared with those obtained from a static perimeter: correlations between the known optic disk parameters, as well as those suggested by the authors, and the static perimetry results were calculated. The correlation with static perimetry was 0.78 for the existing methods of image analysis and 0.86 for the proposed method. The practical usefulness of the proposed BGA ratio and the impact of its three most important features were assessed; the correlations obtained for the three proposed classes were: cup/disk diameter 0.84, disk diameter 0.97 and RNFL 1.0. Thus, the expected visual field result in glaucoma can be estimated from OCT images of the eye fundus alone. The calculations and analyses performed with the proposed algorithm and BGA ratio confirm that it is possible to

  20. Vibration Theory, Vol. 1B

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Nielsen, Søren R. K.

    The present collection of MATLAB exercises has been published as a supplement to the textbook, Svingningsteori, Bind 1, and the collection of exercises in Vibration Theory, Vol. 1A, Solved Problems. Throughout the exercises, references are made to these books. The purpose of the MATLAB exercises is to give a better understanding of the physical problems in linear vibration theory and to suppress the mathematical analysis used to solve the problems. For this purpose the MATLAB environment is excellent.

  1. How well Do Phonological Awareness and Rapid Automatized Naming Correlate with Chinese Reading Accuracy and Fluency? A Meta-Analysis

    Science.gov (United States)

    Song, Shuang; Georgiou, George K.; Su, Mengmeng; Hua, Shu

    2016-01-01

    Previous meta-analyses on the relationship between phonological awareness, rapid automatized naming (RAN), and reading have been conducted primarily in English, an atypical alphabetic orthography. Here, we aimed to examine the association between phonological awareness, RAN, and word reading in a nonalphabetic language (Chinese). A random-effects…

  2. Trend analysis of nuclear reactor automatic trip events subjected to operator's human error at United States nuclear power plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi

    2009-01-01

    Trends in nuclear reactor automatic trip events due to human errors during plant operating modes have been analyzed by extracting 20 events which took place in the United States during the seven years from 2002 to 2008, as cited in the Licensee Event Reports (LERs) submitted to the US Nuclear Regulatory Commission (NRC). The yearly number of events was relatively large before 2005 and thereafter decreased. A period of stable operation, in which the yearly number was kept very small, continued for about three years, and then the yearly number turned upward again. Before 2005, automatic trip events occurred more frequently during periodic inspections or start-up/shut-down operations. The recent trends, however, indicate that trip events have become more frequent due to human errors during daily operations. Throughout the whole period, human errors were mostly caused by the overconfidence and carelessness of operators. The trends in the yearly number of events might be explained as follows. The decrease in automatic trip events is attributed to the sharing of trouble information, leading to improvement of the manuals and training for operations which carry a higher potential risk of automatic trip. Then, while the period of stable operation continued, some operators came to pay less attention to preventing human errors and lost interest in the training, eventually leading to automatic trip events caused by erroneous operation. From these analyses of trouble experiences in the US, we learned the following lessons for preventing similar troubles in Japan: operators should, as the persons directly concerned, be thoroughly skilled in the basic actions that prevent human errors; and it should be further emphasized that they should train themselves by imagining actual plant operations, even when simulator training gives them successful experiences. (author)

  3. Microcomputer-based systems for automatic control of sample irradiation and chemical analysis of short-lived isotopes

    International Nuclear Information System (INIS)

    Bourret, S.C.

    1974-01-01

    Two systems resulted from the need to study the nuclear decay of short-lived radionuclides. Automation was required for better repeatability, speed of chemical separation after irradiation, and protection from the high radiation fields of the samples. An MCS-8 computer was used as the nucleus of the automatic sample irradiation system because the control system required an extensive multiple-sequential circuit; this approach reduced the sequential problem to a computer program. The automatic chemistry control system is a mixture of a fixed and a computer-based programmable control system. The fixed control receives the irradiated liquid sample from the reactor, extracts the liquid and disposes of the used sample container. The programmable control executes the chemistry program that the user has entered through the teletype. (U.S.)

  4. Comparative analysis of different implementations of a parallel algorithm for automatic target detection and classification of hyperspectral images

    Science.gov (United States)

    Paz, Abel; Plaza, Antonio; Plaza, Javier

    2009-08-01

    Automatic target detection in hyperspectral images is a task that has attracted a lot of attention recently. In the last few years, several algorithms have been developed for this purpose, including the well-known RX algorithm for anomaly detection and the automatic target detection and classification algorithm (ATDCA), which uses an orthogonal subspace projection (OSP) approach to extract a set of spectrally distinct targets automatically from the input hyperspectral data. Depending on the complexity and dimensionality of the analyzed image scene, the target/anomaly detection process may be computationally very expensive, a fact that limits the possibility of utilizing this process in time-critical applications. In this paper, we develop computationally efficient parallel versions of both the RX and ATDCA algorithms for near real-time exploitation. In the case of ATDCA, we use several distance metrics in addition to the OSP approach. The parallel versions are quantitatively compared in terms of target detection accuracy, using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) over the World Trade Center in New York, five days after the terrorist attack of September 11th, 2001, and also in terms of parallel performance, using a massively parallel Beowulf cluster available at NASA's Goddard Space Flight Center in Maryland.
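
    The OSP iteration at the heart of ATDCA is compact enough to sketch serially in numpy: repeatedly project all pixels onto the orthogonal complement of the targets found so far and take the pixel with the largest residual. The parallel partitioning evaluated in the paper is omitted here; the data are synthetic.

        import numpy as np

        def atdca_osp(pixels, n_targets):
            """Serial OSP target extraction. pixels: (n_pixels, n_bands) array."""
            norms = np.einsum("ij,ij->i", pixels, pixels)
            targets = [int(np.argmax(norms))]              # start from the brightest pixel
            for _ in range(n_targets - 1):
                U = pixels[targets].T                      # bands x targets found so far
                P = np.eye(U.shape[0]) - U @ np.linalg.pinv(U)   # orthogonal projector
                residual = pixels @ P                      # P is symmetric
                scores = np.einsum("ij,ij->i", residual, residual)
                targets.append(int(np.argmax(scores)))     # most orthogonal pixel next
            return targets

        cube = np.random.rand(10_000, 50)    # synthetic scene: 10k pixels, 50 bands
        print(atdca_osp(cube, 5))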

  5. Contribution to automatic speech recognition. Analysis of the direct acoustical signal. Recognition of isolated words and phoneme identification

    International Nuclear Information System (INIS)

    Dupeyrat, Benoit

    1981-01-01

    This report deals with the acoustical-phonetic step of automatic speech recognition. The parameters used are the extrema of the acoustical signal (coded in amplitude and duration). This coding method, whose properties are described, is simple and well adapted to digital processing, and the quality and intelligibility of the coded signal after reconstruction are particularly satisfactory. An experiment in the automatic recognition of isolated words has been carried out using this coding system. We have designed a filtering algorithm operating on the parameters of the coding, from which the characteristics of the formants can be derived under certain conditions, which are discussed. Using these characteristics, the identification of a large part of the phonemes for a given speaker was achieved. Continuing the studies required the development of a particular real-time processing methodology which allowed immediate evaluation of program improvements. Such processing of the temporal coding of the acoustical signal is extremely powerful and, used in connection with other methods, could represent an efficient tool for the automatic processing of speech. (author) [fr]

  6. Automatic Evaluation Of Interferograms

    Science.gov (United States)

    Becker, Friedhelm; Meier, Gerd E. A.; Wegner, Horst

    1983-03-01

    A system for the automatic evaluation of interference patterns has been developed. After digitizing the interferograms from classical and holographic interferometers with a television digitizer and performing different picture enhancement operations, the fringe loci are extracted by use of a floating-threshold method. The fringes are numbered using a special scheme after the removal of any fringe disconnections which might appear if there was insufficient contrast in the interferograms. The reconstruction of the object function from the numbered fringe field is achieved by a local polynomial least-squares approximation. Applications are given, demonstrating the evaluation of interferograms of supersonic flow fields and the analysis of holographic interferograms of car tyres.

  7. Expert system for the automatic analysis of eddy current signals from the monitoring of steam generators of a PWR-type reactor

    International Nuclear Information System (INIS)

    Benoist, P.; David, B.; Pigeon, M.

    1990-01-01

    An expert system for the automatic analysis of eddy current signals is presented. The system was developed to detect and analyse defects which may exist in steam generators. The extraction of a signal from high-level background noise is possible. The organization of the work during the system's development, the results of the technique for extracting the signal from the background noise, and an example concerning the interpretation of the signal from a defect are presented. [fr]

  8. Calculation and analysis of the technical and economic parameters in the automatic technological control system of the WWER-440 reactor power unit

    International Nuclear Information System (INIS)

    Borisova, N.N.; Zhidkova, L.P.; Komarov, N.F.; Litvinov, V.K.; Ruzankov, V.N.; Sen'kin, V.I.; Spirina, A.A.

    1981-01-01

    An algorithm for the calculation and analysis of the technical specifications and economic parameters (TSEP) of the WWER-440 power unit is described. The algorithm makes it possible to automatically obtain TSEP characterizing the real and nominal efficiency of the power unit and its elements; to determine the structure of fuel overconsumption; and to present data on the thermal efficiency of the power unit equipment, differentiated for the various control levels. The algorithm flowsheet is presented. The system for measuring the parameters needed for the calculations, the reliability control of input and output data, and the presentation of data to the personnel are also described. [ru]

  9. The proportionator: unbiased stereological estimation using biased automatic image analysis and non-uniform probability proportional to size sampling

    DEFF Research Database (Denmark)

    Gardi, Jonathan Eyal; Nyengaard, Jens Randel; Gundersen, Hans Jørgen Gottlieb

    2008-01-01

    is 2- to 15-fold more efficient than the common systematic, uniformly random sampling. The simulations also indicate that the lack of a simple predictor of the coefficient of error (CE) due to field-to-field variation is a more severe problem for uniform sampling strategies than anticipated. Because of its entirely different sampling strategy, based on known but non-uniform sampling probabilities, the proportionator for the first time allows the real CE at the section level to be automatically estimated (not just predicted), unbiased, for all estimators and at no extra cost to the user.

  10. Numerical analysis of resonances induced by s wave neutrons in transmission time-of-flight experiments with an IBM 7094 II computer

    Energy Technology Data Exchange (ETDEWEB)

    Corge, Ch. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1969-01-01

    Numerical analysis of transmission resonances induced by s wave neutrons in time-of-flight experiments can be achieved in a fairly automatic way on an IBM 7094/II computer. The computations involved are carried out following a four-step scheme: 1 - the experimental raw data are processed to obtain the resonant transmissions; 2 - the values of the experimental quantities for each resonance are derived from these transmissions; 3 - the resonance parameters are determined using a least-squares method to solve the overdetermined system obtained by equating theoretical functions to the corresponding experimental values (four analysis methods are gathered in the same code); 4 - a graphical check of the results is performed. (author)

  11. Automatic gallbladder segmentation using combined 2D and 3D shape features to perform volumetric analysis in native and secretin-enhanced MRCP sequences.

    Science.gov (United States)

    Gloger, Oliver; Bülow, Robin; Tönnies, Klaus; Völzke, Henry

    2017-11-24

    We aimed to develop the first fully automated 3D gallbladder segmentation approach to perform volumetric analysis in volume data of magnetic resonance (MR) cholangiopancreatography (MRCP) sequences. Volumetric gallbladder analysis is performed for non-contrast-enhanced and secretin-enhanced MRCP sequences. Native and secretin-enhanced MRCP volume data were produced with a 1.5-T MR system. Images of coronal maximum intensity projections (MIP) are used to automatically compute 2D characteristic shape features of the gallbladder in the MIP images. A gallbladder shape space is generated to derive 3D gallbladder shape features, which are then combined with 2D gallbladder shape features in a support vector machine approach to detect gallbladder regions in MRCP volume data. A region-based level set approach is used for fine segmentation. Volumetric analysis is performed for both sequences to calculate gallbladder volume differences between both sequences. The approach presented achieves segmentation results with mean Dice coefficients of 0.917 in non-contrast-enhanced sequences and 0.904 in secretin-enhanced sequences. This is the first approach developed to detect and segment gallbladders in MR-based volume data automatically in both sequences. It can be used to perform gallbladder volume determination in epidemiological studies and to detect abnormal gallbladder volumes or shapes. The positive volume differences between both sequences may indicate the quantity of the pancreatobiliary reflux.
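
    The detection step, combining the two feature families in a support vector machine, can be sketched with scikit-learn. The feature contents and labels below are synthetic stand-ins; only the concatenate-and-classify pattern reflects the approach described.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        feat_2d = rng.standard_normal((200, 6))    # MIP-derived 2D shape descriptors
        feat_3d = rng.standard_normal((200, 8))    # shape-space 3D coefficients
        label = rng.integers(0, 2, 200)            # is this candidate region gallbladder?

        X = np.hstack([feat_2d, feat_3d])          # combine 2D and 3D shape features
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        clf.fit(X, label)
        print(clf.predict(X[:5]))                  # region-level detection decisions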

  12. Quantitative Analysis of Heavy Metals in Water Based on LIBS with an Automatic Device for Sample Preparation

    International Nuclear Information System (INIS)

    Hu Li; Zhao Nanjing; Liu Wenqing; Meng Deshuo; Fang Li; Wang Yin; Yu Yang; Ma Mingjun

    2015-01-01

    Heavy metals in water can be deposited on graphite flakes, which can be used as an enrichment method for laser-induced breakdown spectroscopy (LIBS); this approach is studied in this paper. The graphite samples were prepared with an automatic device composed of a loading and unloading module, a quantitative solution-adding module, a rapid heating and drying module and a precise rotating module. The experimental results showed that the sample preparation method had no significant effect on the sample distribution, and the LIBS signal accumulated over 20 pulses was stable and repeatable. With an increasing amount of sample solution on the graphite flake, the peak intensity at Cu I 324.75 nm followed an exponential function with a correlation coefficient of 0.9963, while the background intensity remained unchanged. The limit of detection (LOD) was calculated through linear fitting of the peak intensity versus the concentration. The LOD decreased rapidly with an increasing amount of sample solution until the amount exceeded 20 mL, and the correlation coefficient of the exponential fit was 0.991. The LOD of Pb, Ni, Cd, Cr and Zn after evaporating different amounts of sample solution on the graphite flakes was measured, and the variation of their LOD with sample solution amount was similar to that of Cu. The experimental data and conclusions could provide a reference for automatic sample preparation and in situ detection of heavy metals. (paper)
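
    The LOD computation from a calibration line follows the usual 3-sigma convention, sketched below with synthetic numbers (not the paper's data).

        import numpy as np

        def limit_of_detection(conc, intensity, blank_intensity):
            """LOD = 3 * sigma(blank) / slope of the calibration line."""
            slope, _ = np.polyfit(conc, intensity, 1)
            return 3.0 * np.std(blank_intensity, ddof=1) / slope

        conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0])             # Cu standards, mg/L
        peak = np.array([12.0, 260.0, 505.0, 1010.0, 2490.0])  # Cu I 324.75 nm intensity
        blank = np.array([11.0, 13.0, 12.5, 11.8, 12.2])       # repeated blank readings
        print(f"LOD = {limit_of_detection(conc, peak, blank):.3f} mg/L")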

  13. Analysis of hydrogen and methane in seawater by "Headspace" method: Determination at trace level with an automatic headspace sampler.

    Science.gov (United States)

    Donval, J P; Guyader, V

    2017-01-01

    "Headspace" technique is one of the methods for the onboard measurement of hydrogen (H 2 ) and methane (CH 4 ) in deep seawater. Based on the principle of an automatic headspace commercial sampler, a specific device has been developed to automatically inject gas samples from 300ml syringes (gas phase in equilibrium with seawater). As valves, micro pump, oven and detector are independent, a gas chromatograph is not necessary allowing a reduction of the weight and dimensions of the analytical system. The different steps from seawater sampling to gas injection are described. Accuracy of the method is checked by a comparison with the "purge and trap" technique. The detection limit is estimated to 0.3nM for hydrogen and 0.1nM for methane which is close to the background value in deep seawater. It is also shown that this system can be used to analyze other gases such as Nitrogen (N 2 ), carbon monoxide (CO), carbon dioxide (CO 2 ) and light hydrocarbons. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Automatic selection of localized region-based active contour models using image content analysis applied to brain tumor segmentation.

    Science.gov (United States)

    Ilunga-Mbuyamba, Elisee; Avina-Cervantes, Juan Gabriel; Cepeda-Negrete, Jonathan; Ibarra-Manzano, Mario Alberto; Chalopin, Claire

    2017-12-01

    Brain tumor segmentation is a routine process in a clinical setting and provides useful information for diagnosis and treatment planning. Manual segmentation, performed by physicians or radiologists, is a time-consuming task due to the large quantity of medical data generated presently. Hence, automatic segmentation methods are needed, and several approaches have been introduced in recent years, including Localized Region-based Active Contour Models (LRACM). Many LRACM are popular, but each presents strengths and weaknesses. In this paper, the automatic selection of an LRACM based on image content, and its application to brain tumor segmentation, is presented. To this end, a framework to select one of three LRACM, i.e., Local Gaussian Distribution Fitting (LGDF), localized Chan-Vese (C-V) and the Localized Active Contour Model with Background Intensity Compensation (LACM-BIC), is proposed. Twelve visual features are extracted to properly select the method that should process a given input image. The system is based on a supervised approach. Applied specifically to Magnetic Resonance Imaging (MRI) images, the experiments showed that the proposed system is able to correctly select the suitable LRACM to handle a specific image. Consequently, the selection framework achieves better accuracy than the three LRACM separately.

  15. Automatic fluid dispenser

    Science.gov (United States)

    Sakellaris, P. C. (Inventor)

    1977-01-01

    Fluid automatically flows to individual dispensing units at predetermined times from a fluid supply and is available only for a predetermined interval of time after which an automatic control causes the fluid to drain from the individual dispensing units. Fluid deprivation continues until the beginning of a new cycle when the fluid is once again automatically made available at the individual dispensing units.

  16. University of Mauritius Research Journal - Vol 4 (2002)

    African Journals Online (AJOL)

    University of Mauritius Research Journal - Vol 4 (2002). Contents include: Growth and Export Expansion in Mauritius - A Time Series Analysis (R.V. Sannassee, R. Pearce); The Historical Development of the Mixed Legal System of Mauritius during the French and British Colonial Periods (P.R. Domingue).

  17. Development of automatic extraction of the corpus callosum from magnetic resonance imaging of the head and examination of the early dementia objective diagnostic technique in feature analysis

    International Nuclear Information System (INIS)

    Kodama, Naoki; Kaneko, Tomoyuki

    2005-01-01

    We examined the objective diagnosis of dementia based on changes in the corpus callosum. We examined midsagittal head MR images of 17 early dementia patients (2 men and 15 women; mean age, 77.2±3.3 years) and 18 healthy elderly controls (2 men and 16 women; mean age, 73.8±6.5 years), 35 subjects altogether. First, the corpus callosum was automatically extracted from the MR images. Next, early dementia was compared with the healthy elderly individuals using 5 features of the straight-line methods, 5 features of the Run-Length Matrix, and 6 features of the Co-occurrence Matrix from the corpus callosum. Automatic extraction of the corpus callosum showed an accuracy rate of 84.1±3.7%. A statistically significant difference was found in 6 of the 16 features between early dementia patients and healthy elderly controls. Discriminant analysis using the 6 features demonstrated a sensitivity of 88.2% and specificity of 77.8%, with an overall accuracy of 82.9%. These results indicate that feature analysis based on changes in the corpus callosum can be used as an objective diagnostic technique for early dementia. (author)
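
    The discriminant step that yields sensitivity and specificity figures like those above can be reproduced generically with scikit-learn, here with leave-one-out cross-validation on synthetic features (the study's actual features and subject data are not reproduced).

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        rng = np.random.default_rng(2)
        X = rng.standard_normal((35, 6))     # 6 selected features per subject
        y = np.array([1] * 17 + [0] * 18)    # 17 early dementia, 18 healthy controls

        pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
        print("sensitivity:", (pred[y == 1] == 1).mean())
        print("specificity:", (pred[y == 0] == 0).mean())   # near chance here, by design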

  18. Automatic extraction of corpus callosum from midsagittal head MR image and examination of Alzheimer-type dementia objective diagnostic system in feature analysis

    International Nuclear Information System (INIS)

    Kaneko, Tomoyuki; Kodama, Naoki; Kaeriyama, Tomoharu; Fukumoto, Ichiro

    2004-01-01

    We studied the objective diagnosis of Alzheimer-type dementia based on changes in the corpus callosum. We examined midsagittal head MR images of 40 Alzheimer-type dementia patients (15 men and 25 women; mean age, 75.4±5.5 years) and 31 healthy elderly persons (10 men and 21 women; mean age, 73.4±7.5 years), 71 subjects altogether. First, the corpus callosum was automatically extracted from midsagittal head MR images. Next, Alzheimer-type dementia was compared with the healthy elderly individuals using the features of shape factor and six features of Co-occurrence Matrix from the corpus callosum. Automatic extraction of the corpus callosum succeeded in 64 of 71 individuals, for an extraction rate of 90.1%. A statistically significant difference was found in 7 of the 9 features between Alzheimer-type dementia patients and the healthy elderly adults. Discriminant analysis using the 7 features demonstrated a sensitivity rate of 82.4%, specificity of 89.3%, and overall accuracy of 85.5%. These results indicated the possibility of an objective diagnostic system for Alzheimer-type dementia using feature analysis based on change in the corpus callosum. (author)

  19. Semi-automatic fluoroscope

    International Nuclear Information System (INIS)

    Tarpley, M.W.

    1976-10-01

    Extruded aluminum-clad uranium-aluminum alloy fuel tubes must pass many quality control tests before irradiation in Savannah River Plant nuclear reactors. Nondestructive test equipment has been built to automatically detect high and low density areas in the fuel tubes using x-ray absorption techniques with a video analysis system. The equipment detects areas as small as 0.060-in. dia with 2 percent penetrameter sensitivity. These areas are graded as to size and density by an operator using electronic gages. Video image enhancement techniques permit inspection of ribbed cylindrical tubes and make possible the testing of areas under the ribs. Operation of the testing machine, the special low light level television camera, and analysis and enhancement techniques are discussed

  20. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  1. Automatic segmentation of diatom images for classification

    NARCIS (Netherlands)

    Jalba, Andrei C.; Wilkinson, Michael H.F.; Roerdink, Jos B.T.M.

    A general framework for automatic segmentation of diatom images is presented. This segmentation is a critical first step in contour-based methods for automatic identification of diatoms by computerized image analysis. We review existing results, adapt popular segmentation methods to this difficult

  2. TU-H-CAMPUS-JeP2-05: Can Automatic Delineation of Cardiac Substructures On Noncontrast CT Be Used for Cardiac Toxicity Analysis?

    International Nuclear Information System (INIS)

    Luo, Y; Liao, Z; Jiang, W; Gomez, D; Williamson, R; Court, L; Yang, J

    2016-01-01

    Purpose: To evaluate the feasibility of using an automatic segmentation tool to delineate cardiac substructures from computed tomography (CT) images for cardiac toxicity analysis in non-small cell lung cancer (NSCLC) patients after radiotherapy. Methods: A multi-atlas segmentation tool developed in-house was used to automatically delineate eleven cardiac substructures, including the whole heart, four heart chambers, and six great vessels, from the averaged 4DCT planning images of 49 NSCLC patients. The automatically segmented contours were edited appropriately by two experienced radiation oncologists. The modified contours were compared with the auto-segmented contours using the Dice similarity coefficient (DSC) and mean surface distance (MSD) to evaluate how much modification was needed. In addition, the dose-volume histograms (DVH) of the modified contours were compared with those of the auto-segmented contours to evaluate the dosimetric difference between modified and auto-segmented contours. Results: For the eleven structures, the averaged DSC values ranged from 0.73 ± 0.08 to 0.95 ± 0.04 and the averaged MSD values ranged from 1.3 ± 0.6 mm to 2.9 ± 5.1 mm over the 49 patients. Overall, the modifications were small. The pulmonary vein (PV) and the inferior vena cava required the most modification. The V30 (volume receiving 30 Gy or above) for the whole heart and the mean dose to the whole heart and four heart chambers did not show statistically significant differences between modified and auto-segmented contours. The maximum dose to the great vessels did not show a statistically significant difference except for the PV. Conclusion: The automatic segmentation of the cardiac substructures did not require substantial modification. The dosimetric evaluation showed no statistically significant difference between auto-segmented and modified contours except for the PV, which suggests that auto-segmented contours for the cardiac dose response study are feasible in the clinical
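
    Both agreement measures are easy to compute from binary masks; a minimal sketch assuming boolean NumPy volumes and a known voxel spacing (not the in-house tool itself):

      # Hedged sketch: Dice similarity coefficient and mean surface distance.
      import numpy as np
      from scipy import ndimage

      def dice(a, b):
          a, b = np.asarray(a, bool), np.asarray(b, bool)
          return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

      def mean_surface_distance(a, b, spacing=(1.0, 1.0, 1.0)):
          a, b = np.asarray(a, bool), np.asarray(b, bool)
          sa = a ^ ndimage.binary_erosion(a)      # surface voxels of a
          sb = b ^ ndimage.binary_erosion(b)      # surface voxels of b
          da = ndimage.distance_transform_edt(~sa, sampling=spacing)
          db = ndimage.distance_transform_edt(~sb, sampling=spacing)
          return (da[sb].mean() + db[sa].mean()) / 2.0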

  4. Automatic landslide detection from LiDAR DTM derivatives by geographic-object-based image analysis based on open-source software

    Science.gov (United States)

    Knevels, Raphael; Leopold, Philip; Petschko, Helene

    2017-04-01

    With high-resolution airborne Light Detection and Ranging (LiDAR) data more commonly available, many studies have been performed to exploit the detailed information on the earth's surface that these data provide and to analyse their limitations. Specifically in the field of natural hazards, digital terrain models (DTM) have been used to map hazardous processes such as landslides, mainly by visual interpretation of LiDAR DTM derivatives. However, new approaches strive towards automatic detection of landslides to speed up the generation of landslide inventories. These studies usually use a combination of optical imagery and terrain data, and are designed in commercial software packages such as ESRI ArcGIS, Definiens eCognition, or MathWorks MATLAB. The objective of this study was to investigate the potential of open-source software for automatic landslide detection based only on high-resolution LiDAR DTM derivatives, in a study area within the federal state of Burgenland, Austria. The study area is very prone to landslides, which have been mapped with different methodologies in recent years. The free development environment R was used to integrate open-source geographic information system (GIS) software such as SAGA (System for Automated Geoscientific Analyses), GRASS (Geographic Resources Analysis Support System), and TauDEM (Terrain Analysis Using Digital Elevation Models). The implemented geographic-object-based image analysis (GEOBIA) consisted of (1) derivation of land surface parameters, such as slope, surface roughness, curvature, or flow direction, (2) finding the optimal scale parameter by the use of an objective function, (3) multi-scale segmentation, (4) classification of landslide parts (main scarp, body, flanks) by k-means thresholding, (5) assessment of the classification performance using a pre-existing landslide inventory, and (6) post-processing analysis for further use in landslide inventories. The results of the developed open-source approach demonstrated good
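
    Steps (1) and (4) can be sketched in a few lines; the snippet uses a placeholder DEM array and scikit-learn's k-means as a stand-in for the R/SAGA/GRASS tool chain described above:

      # Hedged sketch: land-surface parameters plus k-means clustering.
      # 'dem' is a hypothetical 2-D elevation array with 1 m cell size.
      import numpy as np
      from scipy import ndimage
      from sklearn.cluster import KMeans

      dem = np.random.rand(200, 200) * 50.0        # placeholder DEM
      dzdy, dzdx = np.gradient(dem, 1.0)
      slope = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
      mean = ndimage.uniform_filter(dem, size=5)   # local surface roughness
      meansq = ndimage.uniform_filter(dem ** 2, size=5)
      roughness = np.sqrt(np.maximum(meansq - mean ** 2, 0.0))
      X = np.column_stack([slope.ravel(), roughness.ravel()])
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)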

  5. A fully-automatic caudate nucleus segmentation of brain MRI: Application in volumetric analysis of pediatric attention-deficit/hyperactivity disorder

    Directory of Open Access Journals (Sweden)

    Igual Laura

    2011-12-01

    Full Text Available Abstract Background Accurate automatic segmentation of the caudate nucleus in magnetic resonance images (MRI) of the brain is of great interest in the analysis of developmental disorders. Segmentation methods based on a single atlas or on multiple atlases have been shown to suitably localize caudate structure. However, the atlas prior information may not represent the structure of interest correctly. It may therefore be useful to introduce a more flexible technique for accurate segmentations. Method We present CaudateCut: a new fully-automatic method of segmenting the caudate nucleus in MRI. CaudateCut combines an atlas-based segmentation strategy with the Graph Cut energy-minimization framework. We adapt the Graph Cut model to make it suitable for segmenting small, low-contrast structures, such as the caudate nucleus, by defining new energy function data and boundary potentials. In particular, we exploit information concerning the intensity and geometry, and we add supervised energies based on contextual brain structures. Furthermore, we reinforce boundary detection using a new multi-scale edgeness measure. Results We apply the novel CaudateCut method to the segmentation of the caudate nucleus in a new set of 39 pediatric attention-deficit/hyperactivity disorder (ADHD) patients and 40 control children, as well as to a public database of 18 subjects. We evaluate the quality of the segmentation using several volumetric and voxel-by-voxel measures. Our results show improved segmentation performance compared to state-of-the-art approaches, obtaining a mean overlap of 80.75%. Moreover, we present a quantitative volumetric analysis of caudate abnormalities in pediatric ADHD, the results of which show strong correlation with expert manual analysis. Conclusion CaudateCut generates segmentation results that are comparable to gold-standard segmentations and which are reliable in the analysis of differentiating neuroanatomical abnormalities

  6. Automatic classification of sources of volcanic tremors at the Klyuchevskoy volcanic group (Kamchatka) based on the seismic network covariance matrix analysis

    Science.gov (United States)

    Soubestre, Jean; Shapiro, Nikolai M.; Seydoux, Léonard; de Rosny, Julien; Droznin, Dimitry V.; Droznina, Svetlana Ya.; Senyukov, Sergey L.; Gordeev, Evgeny I.

    2017-04-01

    Volcanic tremors may be caused by magma moving through narrow fractures, by fragmentation and pulsation of pressurized fluids within the volcano, or by the escape of pressurized steam and gases from fumaroles. They are an important attribute of volcanic unrest, and their detection and characterization are used in volcano monitoring systems. Tremors might be generated within different parts of volcanoes and might characterize different types of volcanic activity. The main goal of the present study is to develop a method for the automatic classification of different types (sources) of tremors based on analysis of continuous records from a network of seismographs. The proposed method is based on the analysis of eigenvalues and eigenvectors of the seismic array covariance matrix. First, we followed an approach developed by Seydoux et al. (2016) and analyzed the width of the covariance matrix eigenvalue distribution to detect time periods with strong volcanic tremors. In the next step, we analyzed the frequency-dependent eigenvectors of the covariance matrix. The eigenvectors corresponding to the strongest eigenvalues can be used as fingerprints of the dominating seismic sources during the period over which the covariance matrix was calculated. We applied the method to data recorded by the permanent seismic monitoring network of 19 stations operated in the vicinity of the Klyuchevskoy group of volcanoes (KVG) located in Kamchatka, Russia. The KVG is composed of 13 stratovolcanoes, 3 of which (Klyuchevskoy, Bezymianny, and Tolbachik) have been very active during the last decades. In addition, two other active volcanoes, Shiveluch and Kizimen, are located immediately north and south of the KVG. This exceptional concentration of active volcanoes provides us with the multiplicity of seismic tremor sources required to validate the method. We used 4.5 years of vertical-component records from the 19 stations and computed network covariance matrices from day-long windows. We then analyzed
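
    The core computation can be sketched as follows, assuming a placeholder array of synchronized day-long traces; this is a simplified time-domain stand-in for the frequency-dependent covariance matrices described above:

      # Hedged sketch: covariance-matrix eigenvalues and dominant eigenvector.
      import numpy as np

      records = np.random.randn(19, 86400)          # placeholder station traces
      C = records @ records.T / records.shape[1]    # network covariance matrix
      lam, vec = np.linalg.eigh(C)                  # ascending eigenvalues
      lam = lam[::-1] / lam.sum()
      spectral_width = np.sum(np.arange(len(lam)) * lam)  # small => one source
      fingerprint = vec[:, -1]                      # eigenvector of the source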

  7. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through the description of the Micro-Videomat automatic image analysis system, applied to the volumetric percentage of pearlite in nodular cast irons, porosity and average grain size in high-density sintered UO2 pellets, and grain size of ferritic steel. The techniques adopted are described and the results obtained are compared with those of the direct counting processes: systematic point counting (grid) to measure volume fraction, and the intercept method, using a circumference of known radius, for average grain size. The technique adopted for nodular cast iron was dictated by the small difference in optical reflectivity between graphite and pearlite. Porosity evaluation of sintered UO2 pellets is also analyzed [pt

  8. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.

    1976-01-01

    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system, which employed an "Image Processing System", was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs

  9. Information manager vol 10

    African Journals Online (AJOL)

    Library _info_Sc_ 1

    The Information Manager Vol.10 (1 & 2) 2010. Page 9. Conceptual Art: Perceptions And Appearances by ... perception and appearances of conceptual art. Introduction. Conceptual art is largely seen as an aspect of ..... The creation of conceptual art works can generate sensation, for they are works not commonly practiced in ...

  10. JMBR vol 7.cdr

    African Journals Online (AJOL)

    Tope

    Hence, it may be necessary to use rheomodulators in the management of diabetes mellitus. INTRODUCTION. Cardiovascular morbidity and mortality represent a main challenge in diabetic patients [1, 2]. Aggressive blood pressure control. JMBR: A Peer-review Journal of Biomedical Sciences. 2008 Edition Vol.7 Nos.1 & 2.

  11. Automatic inverse methods for the analysis of pulse tests: application to four pulse tests at the Leuggern borehole

    International Nuclear Information System (INIS)

    Carrera, J.; Samper, J.; Vives, L.; Kuhlmann, U.

    1989-07-01

    Four pulse tests performed for NAGRA at the Leuggern borehole were analyzed using automatic test-sequence matching techniques. Severe identifiability problems were revealed in the process. Because of these problems, the identifiability of aquifer parameters (hydraulic conductivity, storativity and skin conductivity) from pulse tests similar to those performed in the Leuggern borehole was studied in two synthetic examples, the first with a positive skin effect and the second with a negative skin effect. These synthetic examples showed that, for the test conditions at the Leuggern borehole, estimating formation hydraulic conductivity may be nearly impossible for both positive and negative skin factors. In addition, identifiability appears to be quite sensitive to the values of the parameters and to other factors such as skin thickness. Nevertheless, largely because of the manner in which the tests were conducted (i.e. relatively long injection time and performance of both injection and withdrawal), identifiability of the actual tests was much better than the synthetic examples suggested; only one of the four tests was nearly nonidentifiable. The match between measured and computed aquifer responses was excellent for all the tests, and formation hydraulic conductivities were estimated within a relatively narrow uncertainty interval. (author) 19 refs., 59 figs., 28 tabs

  12. Contribution to automatic image recognition. Application to analysis of plain scenes of overlapping parts in robot technology

    International Nuclear Information System (INIS)

    Tan, Shengbiao

    1987-01-01

    A method for object modeling and automatic recognition of overlapping objects is presented. Our work is composed of three essential parts: image processing, object modeling, and evaluation of the implementation of the stated concepts. In the first part, we present a method of edge encoding based on a re-sampling of data encoded according to Freeman; this method generates an isotropic, homogeneous and very precise representation. The second part relates to object modeling. This important step makes the recognition work much easier. The proposed method characterizes a model with two groups of information: a description group containing the primitives, and a discrimination group containing data packs called 'transition vectors'. Based on this original method of information organization, a 'relative learning' is able to select, ignore and update the information concerning objects already learned, according to the new information to be included in the database. Recognition is a two-pass process: the first pass determines very efficiently the presence of objects by making use of each object's particularities, and this hypothesis is then either confirmed or rejected by a fine verification pass. The last part describes the experimentation results in detail. We demonstrate the robustness of the algorithms on images with both poor lighting and overlapping objects. The system, named SOFIA, has been installed into an industrial vision system series and works in real time. (author) [fr

  13. Development of an automatic scanning system for nuclear emulsion analysis in the OPERA experiment and study of neutrino interactions location

    International Nuclear Information System (INIS)

    Arrabito, L.

    2007-10-01

    Following the Super-Kamiokande and K2K experiments, Opera (Oscillation Project with Emulsion tracking Apparatus) aims to confirm neutrino oscillation in the atmospheric sector. Taking advantage of a technique already employed in Chorus and in Donut, the Emulsion Cloud Chamber (ECC), Opera will be able to observe the νμ → ντ oscillation through ντ appearance in a pure νμ beam. The Opera experiment, with its ∼100,000 m² of nuclear emulsions, needs a very fast automatic scanning system. Optical and mechanical components have been customized in order to achieve a speed of about 20 cm²/hour per emulsion layer (44 μm thick), while keeping a sub-micrometric resolution. The first part of this thesis was dedicated to the optimization of 4 scanning systems at the French scanning station, based in Lyon. An experimental study of a dry-objective scanning system has also been carried out. The results obtained show that the performance of dry scanning is similar to that of the traditional oil scanning, so that it can be successfully used for Opera. The second part of this work was devoted to the study of the neutrino interaction location and reconstruction strategy currently used in Opera. A dedicated test beam was performed at CERN in order to simulate Opera conditions. The results obtained definitely confirm that the proposed strategy is well adapted for the tau search. (author)

  14. On the Selection of Non-Invasive Methods Based on Speech Analysis Oriented to Automatic Alzheimer Disease Diagnosis

    Directory of Open Access Journals (Sweden)

    Unai Martinez de Lizardui

    2013-05-01

    Full Text Available The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer disease (AD) detection, and it focuses on evaluating the suitability of a new approach for early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients, in order to contribute to the improvement of the diagnosis of AD and its degree of severity. In this sense, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two human capabilities have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, have been explored. The approach is non-invasive, low cost and without any side effects. The experimental results obtained were very satisfactory and promising for early diagnosis and classification of AD patients.
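
    One of the non-linear features mentioned, the fractal dimension, can be estimated with Higuchi's algorithm; a self-contained sketch on a hypothetical sample vector (not the authors' code):

      # Hedged sketch: Higuchi fractal dimension of a 1-D signal.
      import numpy as np

      def higuchi_fd(x, kmax=10):
          x, n, curve = np.asarray(x, float), len(x), []
          for k in range(1, kmax + 1):
              lengths = []
              for m in range(k):
                  idx = np.arange(m, n, k)
                  if len(idx) < 2:
                      continue
                  dist = np.abs(np.diff(x[idx])).sum()
                  lengths.append(dist * (n - 1) / ((len(idx) - 1) * k * k))
              curve.append(np.mean(lengths))
          k = np.arange(1, kmax + 1)
          return np.polyfit(np.log(1.0 / k), np.log(curve), 1)[0]

      print(higuchi_fd(np.random.randn(4000)))   # close to 2 for white noise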

  15. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  16. Automatic method of analysis of OCT images in the assessment of the tooth enamel surface after orthodontic treatment with fixed braces.

    Science.gov (United States)

    Koprowski, Robert; Machoy, Monika; Woźniak, Krzysztof; Wróbel, Zygmunt

    2014-04-22

    Fixed orthodontic appliances, despite years of research and development, still raise a lot of controversy because of their potentially destructive influence on enamel. Therefore, it is necessary to quantitatively assess the condition, and therein the thickness, of tooth enamel in order to select the appropriate orthodontic bonding and debonding methodology, as well as to assess the quality of enamel after treatment and the clean-up procedure, in order to choose the most advantageous course of treatment. One of the assessment methods is optical coherence tomography, where the measurement of enamel thickness and the 3D reconstruction of image sequences can be performed fully automatically. OCT images of 180 teeth were obtained with the Topcon 3D OCT-2000 camera. The images were obtained in vitro by performing 7 stages of treatment sequentially on all the teeth: before any interference with the enamel, polishing with orthodontic paste, etching and application of a bonding system, orthodontic bracket bonding, orthodontic bracket removal, and cleaning off adhesive residue. A dedicated method for the analysis and processing of images involving median filtering, mathematical morphology, binarization, polynomial approximation and the active contour method has been proposed. The obtained results enable automatic measurement of tooth enamel thickness in 5 seconds on a Core i5 M460 CPU @ 2.5 GHz with 4 GB RAM. For one patient, the proposed method of analysis confirms an enamel thickness loss of 80 μm (from 730 ± 165 μm to 650 ± 129 μm) after polishing with paste, an enamel thickness loss of 435 μm (from 730 ± 165 μm to 295 ± 55 μm) after etching and bonding resin application, and growth of a layer 265 μm thick (from 295 ± 55 μm to 560 ± 98 μm after etching), which is the adhesive system. After removing an orthodontic bracket, the adhesive residue was 105 μm, and after cleaning it off, the enamel thickness was 605 μm. The enamel thickness before and
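
    The filtering and binarization stages named above can be sketched on a placeholder B-scan; the polynomial-approximation and active-contour stages are omitted, and the structuring-element sizes are hypothetical:

      # Hedged sketch: median filter, Otsu binarization, morphological clean-up.
      import numpy as np
      from scipy import ndimage
      from skimage import filters, morphology

      bscan = np.random.rand(512, 512)                 # placeholder OCT image
      smoothed = ndimage.median_filter(bscan, size=3)  # median filtering
      mask = smoothed > filters.threshold_otsu(smoothed)
      mask = morphology.binary_closing(mask, morphology.disk(2))
      mask = morphology.remove_small_objects(mask, min_size=64)
      thickness_px = mask.sum(axis=0)   # per-column thickness in pixels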

  17. Large Scale Automatic Analysis and Classification of Roof Surfaces for the Installation of Solar Panels Using a Multi-Sensor Aerial Platform

    Directory of Open Access Journals (Sweden)

    Luis López-Fernández

    2015-09-01

    Full Text Available A low-cost multi-sensor aerial platform (an aerial trike) equipped with visible and thermographic sensors is used to acquire all the data needed for the automatic analysis and classification of roof surfaces with regard to their suitability to harbor solar panels. The geometry of a georeferenced 3D point cloud generated from visible images using photogrammetric and computer vision algorithms, together with the temperatures measured on thermographic images, is decisive in evaluating the areas, tilts, orientations and the existence of obstacles, and thus in locating the optimal zones inside each roof surface for the installation of solar panels. This information is complemented with an estimation of the solar irradiation received by each surface. In this way, large areas may be efficiently analyzed, obtaining as the final result the optimal locations for the placement of solar panels as well as the information necessary (location, orientation, tilt, area and solar irradiation) to estimate the productivity of a solar panel from its technical characteristics.

  18. The IDA-80 measurement evaluation programme on mass spectrometric isotope dilution analysis of uranium and plutonium. Vol. 1

    International Nuclear Information System (INIS)

    Beyrich, W.; Golly, W.; Spannagel, G.; Kernforschungszentrum Karlsruhe G.m.b.H.; Bievre, P. de; Wolters, W.

    1984-12-01

    The main objective was the acquisition of basic data on the uncertainties involved in the mass spectrometric isotope dilution analysis as applied to the determination of uranium and plutonium in active feed solutions of reprocessing plants. The element concentrations and isotopic compositions of all test materials used were determined by CBNM and NBS with high accuracy. The more than 60000 analytical data reported by the participating laboratories were evaluated by statistical methods applied mainly to the calculation of estimates of the variances for the different uncertainty components contributing to the total uncertainty of this analytical technique. Attention was given to such topics as sample ageing, influence of fission products, spike calibration, ion fractionation, Pu-241 decay correction, minor isotope measurement and errors in data transfer. Furthermore, the performance of the 'dried sample' technique and the 'in-situ' spiking method of undiluted samples of reprocessing fuel solution with U-235/Pu-242 metal alloy spikes, were tested successfully. Considerable improvement of isotope dilution analysis in this safeguards relevant application during the last decade is shown as compared to the results obtained in the IDA-72 interlaboratory experiment, organized by KfK in 1972 on the same subject. (orig./HP) [de

  19. Capillary electrophoresis enhanced by automatic two-way background correction using cubic smoothing splines and multivariate data analysis applied to the characterisation of mixtures of surfactants.

    Science.gov (United States)

    Bernabé-Zafón, Virginia; Torres-Lapasió, José R; Ortega-Gadea, Silvia; Simó-Alfonso, Ernesto F; Ramis-Ramos, Guillermo

    2005-02-18

    Mixtures of the surfactant classes coconut diethanolamide, cocamido propyl betaine and alkylbenzene sulfonate were separated by capillary electrophoresis in several media containing organic solvents and anionic solvophobic agents. Good resolution between both the surfactant classes and the homologues within the classes was achieved in a BGE containing 80 mM borate buffer of pH 8.5, 20% n-propanol and 40 mM sodium deoxycholate. Full resolution, assistance in peak assignment to the classes (including the recognition of solutes not belonging to the classes), and improvement of the signal-to-noise ratio were achieved by multivariate data analysis of the time-wavelength electropherograms. Cubic smoothing splines were used to develop an algorithm capable of automatically modelling the two-way background, which increased the sensitivity and reliability of the multivariate analysis of the corrected signal. The exclusion of significant signals from the background model was guaranteed by the conservativeness of the criteria used and the safeguards adopted all along the point selection process, where the CSS algorithm supported the addition of new points to the initially reduced background sample. Efficient background modelling made the application of multivariate deconvolution within extensive time windows possible. This increased the probability of finding quality spectra for each solute class by the orthogonal projection approach. The concentration profiles of the classes were improved by subsequent application of alternating least squares. The two-way electropherograms were automatically processed, with minimal supervision by the user, in less than 2 min. The procedure was successfully applied to the identification and quantification of the surfactants in household cleaners.
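
    A one-dimensional analogue of the background-modelling idea, on a synthetic electropherogram: a cubic smoothing spline is fitted iteratively while points far above the model (candidate peaks) are conservatively excluded; the smoothing and rejection parameters are placeholders:

      # Hedged sketch: iterative spline background estimation.
      import numpy as np
      from scipy.interpolate import UnivariateSpline

      t = np.linspace(0, 10, 2000)
      signal = np.exp(-((t - 4) ** 2) / 0.01) + 0.05 * t   # peak on a drift
      keep = np.ones_like(t, dtype=bool)
      for _ in range(5):
          spl = UnivariateSpline(t[keep], signal[keep], k=3, s=keep.sum() * 1e-4)
          resid = signal - spl(t)
          keep = resid < 2.0 * resid[keep].std()   # drop points above the model
      corrected = signal - spl(t)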

  20. Inter-observer reproducibility of semi-automatic tumor diameter measurement and volumetric analysis in patients with lung cancer.

    Science.gov (United States)

    Dinkel, J; Khalilzadeh, O; Hintze, C; Fabel, M; Puderbach, M; Eichinger, M; Schlemmer, H-P; Thorn, M; Heussel, C P; Thomas, M; Kauczor, H-U; Biederer, J

    2013-10-01

    Therapy monitoring in oncologic patients requires precise measurement methods. In order to improve the precision of measurements, we used a semi-automated generic segmentation algorithm to measure the size of large lung cancer tumors. The reproducibility of computer-assisted measurements was assessed and compared with manual measurements. CT scans of 24 consecutive lung cancer patients who were referred to our hospital over a period of 6 months were analyzed. The tumor sizes were measured manually by 3 independent radiologists, according to World Health Organization (WHO) and Revised Response Evaluation Criteria in Solid Tumors (RECIST) guidelines. At least 10 months later, measurements were repeated semi-automatically on the same scans by the same radiologists. The inter-observer reproducibility of all measurements was assessed and compared between manual and semi-automated measurements. Manual measurements of the tumor longest diameter were significantly (p < 0.05) smaller than the semi-automated measurements. The intra-class correlation coefficients were significantly higher for measurements of the longest diameter (0.998 vs. 0.986; p < 0.001) and area (0.995 vs. 0.988; p = 0.032) using the semi-automated rather than the manual method. The variation coefficient for manual measurement of the tumor area (WHO guideline, 15.7% vs. 7.3%) and the longest diameter (RECIST guideline, 7.7% vs. 2.7%) was 2-3 times that of semi-automated measurement. By using computer-assisted size assessment of primary lung tumors, inter-observer variability can be reduced to between one half and one third of that of standard manual measurements. This indicates a high potential value for therapy monitoring in lung cancer patients. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  1. Automatic content-based analysis of georeferenced image data: Detection of Beggiatoa mats in seafloor video mosaics from the Håkon Mosby Mud Volcano

    Science.gov (United States)

    Jerosch, K.; Lüdtke, A.; Schlüter, M.; Ioannidis, G. T.

    2007-02-01

    The combination of new underwater technology such as remotely operated vehicles (ROVs), high-resolution video imagery, and software to compute georeferenced mosaics of the seafloor provides new opportunities for marine geological or biological studies and applications in the offshore industry. Even single surveys by ROVs or towed systems compile large amounts of images. While these underwater techniques are now well-engineered, there is still a lack of methods for the automatic analysis of the acquired image data. During ROV dives, more than 4200 georeferenced video mosaics were compiled for the Håkon Mosby Mud Volcano (HMMV). Mud volcanoes such as HMMV are considered significant source locations for methane and are characterised by unique chemoautotrophic communities such as Beggiatoa mats. For the detection and quantification of the spatial distribution of Beggiatoa mats, an automated image analysis technique was developed, which applies watershed transformation and relaxation-based labelling of pre-segmented regions. Comparison of the data derived by visual inspection of 2840 video images with the automated image analysis revealed agreement with a precision better than 90%. We consider this a step towards a time-efficient and accurate analysis of seafloor images for the computation of geochemical budgets and the identification of habitats at the seafloor.
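
    The watershed step is available off the shelf in scikit-image; a sketch on a placeholder grayscale mosaic, with hypothetical marker thresholds and without the relaxation-based labelling stage:

      # Hedged sketch: watershed pre-segmentation of a seafloor mosaic.
      import numpy as np
      from skimage.filters import sobel
      from skimage.segmentation import watershed

      mosaic = np.random.rand(256, 256)      # placeholder video mosaic
      gradient = sobel(mosaic)
      markers = np.zeros_like(mosaic, dtype=int)
      markers[mosaic > 0.8] = 1              # candidate bright mat pixels
      markers[mosaic < 0.2] = 2              # candidate background
      labels = watershed(gradient, markers)
      mat_fraction = (labels == 1).mean()    # areal coverage of class 1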

  2. A new and fast methodology to assess oxidative damage in cardiovascular diseases risk development through eVol-MEPS-UHPLC analysis of four urinary biomarkers.

    Science.gov (United States)

    Mendes, Berta; Silva, Pedro; Mendonça, Isabel; Pereira, Jorge; Câmara, José S

    2013-11-15

    In this work, a new, fast and reliable methodology using digitally controlled microextraction by packed sorbent (eVol®-MEPS) followed by ultra-high pressure liquid chromatography (UHPLC) analysis with photodiode array (PDA) detection was developed to establish the urinary profile levels of four putative oxidative stress biomarkers (OSBs) in healthy subjects and patients evidencing cardiovascular diseases (CVDs). These data were used to verify the suitability of the selected OSBs (uric acid-UAc, malondialdehyde-MDA, 5-(hydroxymethyl)uracil-5-HMUra and 8-hydroxy-2'-deoxyguanosine-8-oxodG) as potential biomarkers of CVD progression. Important parameters affecting the efficiency of the extraction process were optimized, particularly stationary phase selection, pH influence, sample volume, number of extraction cycles, and washing and elution volumes. The experimental conditions that allowed the best extraction efficiency, expressed in terms of total area of the target analytes and data reproducibility, include a 10-fold dilution and pH adjustment of the urine samples to 6.0, followed by gradient elution through the C8 adsorbent with 5×50 µL of 0.01% formic acid and 3×50 µL of 20% methanol in 0.01% formic acid. The chromatographic separation of the target analytes was performed with an HSS T3 column (100 mm × 2.1 mm, 1.7 µm particle size) using 0.01% formic acid/20% methanol at 250 µL min⁻¹. The methodology was validated in terms of selectivity, linearity, instrumental limit of detection (LOD), method limit of quantification (LOQ), matrix effect, accuracy and precision (intra- and inter-day). Good results were obtained in terms of selectivity and linearity (r²>0.9906), as well as LOD and LOQ, whose values were low, ranging from 0.00005 to 0.72 µg mL⁻¹ and 0.00023 to 2.31 µg mL⁻¹, respectively. The recovery results (91.1-123.0%), intra-day (1.0-8.3%) and inter-day precision (4.6-6.3%) and the matrix effect (60.1-110.3%) of eVol

  3. Petroleum product refining: plant level analysis of costs and competitiveness. Implications of greenhouse gas emission reductions. Vol 1

    International Nuclear Information System (INIS)

    Kelly, S.J.; Crandall, G.R.; Houlton, G.A.; Kromm, R.B.

    1999-01-01

    Implications for the Canadian refining industry of reducing greenhouse gas (GHG) emissions to meet Canada's Kyoto commitment are assessed, based on a plant-level analysis of costs, benefits and economic and competitive impacts. It was determined, on the basis of demand estimates prepared by Natural Resources Canada, that refining industry carbon dioxide emissions could be as much as 38 per cent higher than 1990 levels in 2010. Achieving a six per cent reduction below 1990 levels from this business-as-usual case is considered a very difficult target, unless refinery shutdowns occur. Shutdowns would require higher imports to meet Canada's petroleum products demand, leaving total carbon dioxide emissions virtually unchanged. A range of options, classified as (1) low capital, operating efficiency projects, (2) medium capital, process/utility optimization projects, (3) high capital, refinery specific projects, and (4) high operating cost GHG projects, were evaluated. Of these four alternatives, only the low capital, operating efficiency projects were judged to have the potential to be economically viable. Energy efficiency projects in these four groups were evaluated under several policy initiatives, including accelerated depreciation and a $200 per tonne carbon tax. Results showed that an accelerated depreciation policy would lower the hurdle rate for refinery investments and could achieve a four per cent reduction in GHG emissions below 1990 levels, assuming no further shutdown of refinery capacity. The carbon tax was judged to be potentially damaging to the Canadian refining industry since it would penalize cracking refineries (most Canadian refineries are of this type) and would add uncertainty and risk, such that industry might not be able to justify investments to reduce emissions. The overall assessment is that the Canadian refining industry could not meet the pro-rata Kyoto GHG reduction target through implementation of economically

  4. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice, processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  5. Focusing Automatic Code Inspections

    NARCIS (Netherlands)

    Boogerd, C.J.

    2010-01-01

    Automatic Code Inspection tools help developers in early detection of defects in software. A well-known drawback of many automatic inspection approaches is that they yield too many warnings and require a clearer focus. In this thesis, we provide such focus by proposing two methods to prioritize

  6. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are automatically found. The method is illustrated by simple examples. Source code in FORTRAN is provided
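
    The idea is easy to demonstrate with forward-mode dual numbers; a minimal Python sketch, not the FORTRAN source provided with the report:

      # Hedged sketch: forward-mode automatic differentiation.
      import math

      class Dual:
          def __init__(self, val, der=0.0):
              self.val, self.der = val, der
          def __add__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val + o.val, self.der + o.der)
          __radd__ = __add__
          def __mul__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val * o.val,
                          self.der * o.val + self.val * o.der)
          __rmul__ = __mul__

      def sin(x):   # chain rule for an elementary function
          return Dual(math.sin(x.val), math.cos(x.val) * x.der)

      x = Dual(1.5, 1.0)           # seed dx/dx = 1
      y = sin(x) * x + 2.0         # f(x) = x sin(x) + 2
      print(y.val, y.der)          # exact f(1.5) and f'(1.5), no truncation error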

  7. AUTOMATIC INTRAVENOUS DRIP CONTROLLER*

    African Journals Online (AJOL)

    Both the nursing staff shortage and the need for precise control in the intravenous administration of dangerous drugs have led to the development of various devices to achieve an automatic system. The continuous automatic control of the drip rate eliminates errors due to any physical effect such as movement of the ...

  8. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual...

  9. Morphotectonic mapping from the analysis of automatically extracted lineaments using Landsat 8 images and SRTM data in the Hindukush-Pamir

    Science.gov (United States)

    Rahnama, Mehdi; Gloaguen, Richard

    2014-05-01

    Modern deformation, fault movements, and induced earthquakes in the Hindukush-Pamir region are driven by the collision between the northward-moving Indian subcontinent and Eurasia. We investigated neotectonic activity and generated tectonic maps of this area. We developed a Matlab-based toolbox for the automatic extraction of image discontinuities. The approach consists of frequency-domain filtering, edge detection in the spatial domain, Hough transformation, segment grouping, polynomial interpolation and geostatistical analysis of the lineament patterns. Counts, lengths, azimuth frequencies, density distributions, and orientations are quantified statistically to understand the tectonic activity, to explain the prominent structural trends, and to demarcate the contribution of different faulting styles. Morphotectonic lineaments in the study area were automatically extracted from the panchromatic band of Landsat 8 with 15-m resolution and the SRTM digital elevation model (DEM) with 90-m resolution. These data were then analyzed to characterize the tectonic trends that dominated the geologic evolution of the area. We show that the SW-Pamir is mainly controlled by the Chaman-Herat-Central Badakhshan fault systems and, to a lesser extent, by the Darvaz fault zone. The extracted lineaments and the intensity of the characterized tectonic trends correspond well with reference data. In addition, the results are consistent with the styles of faulting determined from focal mechanisms of historical earthquake epicenters in the region. The presented results are applicable to different geological questions that rest on a good knowledge of the lineament patterns and the spatial relationships between them, including geodynamics, seismic risk assessment, mineral exploration and hydrogeological research.
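
    The edge-detection and Hough stages translate directly to scikit-image; a sketch on a placeholder raster tile, with parameter values chosen for illustration only:

      # Hedged sketch: Canny edges, probabilistic Hough lines, azimuths.
      import numpy as np
      from skimage.feature import canny
      from skimage.transform import probabilistic_hough_line

      tile = np.random.rand(300, 300)          # placeholder panchromatic tile
      edges = canny(tile, sigma=2.0)           # spatial-domain edge detection
      segments = probabilistic_hough_line(edges, threshold=10,
                                          line_length=25, line_gap=3)
      azimuths = [np.degrees(np.arctan2(y1 - y0, x1 - x0)) % 180
                  for (x0, y0), (x1, y1) in segments]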

  10. Operational Automatic Remote Sensing Image Understanding Systems: Beyond Geographic Object-Based and Object-Oriented Image Analysis (GEOBIA/GEOOIA. Part 1: Introduction

    Directory of Open Access Journals (Sweden)

    Andrea Baraldi

    2012-09-01

    Full Text Available According to existing literature and despite their commercial success, state-of-the-art two-stage non-iterative geographic object-based image analysis (GEOBIA) systems and three-stage iterative geographic object-oriented image analysis (GEOOIA) systems, where GEOOIA ⊃ GEOBIA, remain affected by a lack of productivity, general consensus and research. To outperform the degree of automation, accuracy, efficiency, robustness, scalability and timeliness of existing GEOBIA/GEOOIA systems in compliance with the Quality Assurance Framework for Earth Observation (QA4EO) guidelines, this methodological work is split into two parts. The present first paper provides a multi-disciplinary Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis of the GEOBIA/GEOOIA approaches that augments similar analyses proposed in recent years. In line with constraints stemming from human vision, this SWOT analysis promotes a shift of learning paradigm in the pre-attentive vision first stage of a remote sensing (RS) image understanding system (RS-IUS), from sub-symbolic statistical model-based (inductive) image segmentation to symbolic physical model-based (deductive) image preliminary classification. Hence, a symbolic deductive pre-attentive vision first stage accomplishes image sub-symbolic segmentation and image symbolic pre-classification simultaneously. In the second part of this work a novel hybrid (combined deductive and inductive) RS-IUS architecture featuring a symbolic deductive pre-attentive vision first stage is proposed and discussed in terms of: (a) computational theory (system design); (b) information/knowledge representation; (c) algorithm design; and (d) implementation. As proof-of-concept of a symbolic physical model-based pre-attentive vision first stage, the spectral knowledge-based, operational, near real-time Satellite Image Automatic Mapper™ (SIAM™) is selected from existing literature. To the best of these authors' knowledge, this is the first time a

  11. [Automatic morphometric analysis as a method for determining the level of extracellular matrix components and for quantifying nuclear antigens].

    Science.gov (United States)

    Gumenyuk, I S; Chuprinenko, L M; Sotnichenko, A S; Gaivoronskaya, T V; Gumenyuk, S E; Gubareva, E A; Kuevda, E V; Krutova, V A; Alekseenko, S N

    Automated image analysis methods are highly important for biotechnology research. The authors developed and tested a program for the morphometric analysis of photomicrographs of sections processed using standard immunohistochemical examination protocols. The color deconvolution method used in the algorithm was proven to be effective in mapping the distribution of the DAB chromogen in samples containing multiple dyes. The experiments demonstrated that the level of extracellular matrix proteins can be comparatively quantified in different groups of samples. Effective methods for the quantitative analysis of the Ki-67 labelling index were also tested using the same algorithms. The developed program was published under the free GPL 3.0 license.
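
    Colour deconvolution of a DAB-stained image is available off the shelf in scikit-image; a sketch on a placeholder RGB micrograph (the threshold is hypothetical, and a true Ki-67 labelling index would count positive nuclei rather than positive area):

      # Hedged sketch: DAB channel separation by colour deconvolution.
      import numpy as np
      from skimage.color import rgb2hed

      img = np.random.rand(512, 512, 3)    # placeholder RGB photomicrograph
      hed = rgb2hed(img)                   # haematoxylin, eosin, DAB channels
      dab = hed[:, :, 2]
      positive = dab > 0.05                # hypothetical positivity threshold
      positive_area_fraction = positive.mean()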

  12. Development of an optimized procedure bridging design and structural analysis codes for the automatized design of the SMART

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Park, Keun Bae; Choi, Suhn; Kim, Kang Soo; Jeong, Kyeong Hoon; Lee, Gyu Mahn

    1998-09-01

    In this report, an optimized design and analysis procedure is established and applied to the development of SMART (System-integrated Modular Advanced ReacTor). The purpose of the optimized procedure is to minimize time consumption and engineering effort by streamlining the design and feedback interactions. To achieve this goal, the data and information generated through design development should be transferred directly to the analysis program with minimal manual operation. Verification of the design concept requires considerable effort, since communication between design and analysis involves a time-consuming stage for the conversion of input information. In this report, an optimized procedure bridging the design and analysis stages is established utilizing IDEAS, ABAQUS and ANSYS. (author). 3 refs., 2 tabs., 5 figs

  13. Multielement X-ray radiometric analysis with application of semiconductor detectors and automatic processing of the results of measurements

    International Nuclear Information System (INIS)

    Berezkin, V.V.; Mamikonyan, S.V.; Shchekin, K.I.

    1979-01-01

    Problems of complex extraction of useful components from ores of compound composition demand multielement analysis with accuracy sufficient for practical purposes. X-ray radiometric analysis with semiconductor detectors (SD), combined with processing of the measurement results by mini- or micro-computers, offers great possibilities here. The present state of detection and computing techniques permits the introduction of such instruments into practical use in the analytical laboratories of mining enterprises. On the basis of a discussion of the practical tasks in the analysis of different types of ores, the paper formulates the basic principles of multielement X-ray radiometric analysis for industrial purposes. First of all, a few-channel installation is required, the main design requirement being high reliability and stability of performance. A variant of such an analyzer is described, constructed with Si(Li) or Ge detecting blocks. The possibility of quickly interchanging excitation sources from a set of iron-55, cadmium-109, americium-241 or cobalt-57 ensures effective excitation of elements in the range from calcium to uranium. Some practical methods of analysis are discussed, based on both passive and active experiments at the calibration stage. The accuracy of these methods is sufficient to replace ordinary chemical analysis with the radiometric one. Problems of applying mini- and micro-computers to process information according to the developed methods are discussed. Some examples are given of the practical realization of multielement X-ray radiometric analysis of lead-zinc, copper-molybdenum, lead-barite and some other types of ores, as well as of the products of ore processing [ru

  14. Appropriate threshold levels of cardiac beat-to-beat variation in semi-automatic analysis of equine ECG recordings

    DEFF Research Database (Denmark)

    Madsen, Mette Flethøj; Kanters, Jørgen K.; Pedersen, Philip Juul

    2016-01-01

    Background: Although premature beats are a matter of concern in horses, the interpretation of equine ECG recordings is complicated by a lack of standardized analysis criteria and a limited knowledge of the normal beat-to-beat variation of equine cardiac rhythm. The purpose of this study was to determine the appropriate threshold levels of maximum acceptable deviation of RR intervals in equine ECG analysis, and to evaluate a novel two-step timing algorithm by quantifying the frequency of arrhythmias in a cohort of healthy adult endurance horses. Results: Beat-to-beat variation differed ... (range 1–24). Conclusions: Beat-to-beat variation of equine cardiac rhythm varies according to HR, and threshold levels in equine ECG analysis should be adjusted accordingly. Standardization of the analysis criteria will enable comparisons of studies and follow-up examinations of patients. A small number...
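
    A minimal sketch of threshold-based beat classification, assuming an RR-interval series in milliseconds; the 20% deviation limit is a placeholder, not one of the HR-dependent thresholds derived in the study:

      # Hedged sketch: flag beats whose RR interval deviates from the
      # local median by more than a percentage threshold.
      import numpy as np

      def flag_premature(rr_ms, window=10, threshold_pct=20.0):
          rr = np.asarray(rr_ms, dtype=float)
          flags = np.zeros(rr.shape, dtype=bool)
          for i in range(1, len(rr)):
              lo = max(0, i - window)
              local = np.median(rr[lo:i])
              flags[i] = abs(rr[i] - local) / local * 100.0 > threshold_pct
          return flags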

  15. Analysis of Chi-square Automatic Interaction Detection (CHAID) and Classification and Regression Tree (CRT) for Classification of Corn Production

    Science.gov (United States)

    Susanti, Yuliana; Zukhronah, Etik; Pratiwi, Hasih; Respatiwulan; Sri Sulistijowati, H.

    2017-11-01

    To achieve food resilience in Indonesia, food diversification by exploring the potential of local foods is required. Corn is one of the alternative staple foods of Javanese society. For that reason, corn production needs to be improved by considering the influencing factors. CHAID and CRT are data mining methods which can be used to classify the influencing variables. The present study seeks to provide information on the potential local availability of corn in the regencies and cities of Java Island. CHAID analysis yields four classifications with an accuracy of 78.8%, while CRT analysis yields seven classifications with an accuracy of 79.6%.
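
    The CRT side of the comparison corresponds to scikit-learn's CART implementation (CHAID has no scikit-learn equivalent); a sketch with placeholder predictors and production classes:

      # Hedged sketch: CART-style classification tree with cross-validation.
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.model_selection import cross_val_score

      X = np.random.rand(100, 5)              # placeholder regency predictors
      y = np.random.randint(0, 4, size=100)   # placeholder production classes
      tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=5,
                                    random_state=0)
      accuracy = cross_val_score(tree, X, y, cv=5).mean()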

  16. Advances in boundary elements. Vol. 1-3

    International Nuclear Information System (INIS)

    Brebbia, C.A.; Connor, J.J.

    1989-01-01

    This book contains some of the edited papers presented at the 11th Boundary Element Conference, held in Cambridge, Massachusetts, during August 1989. The papers are arranged in three different books comprising the following topics: Vol. 1: Computations and Fundamentals - comprises sections on fundamentals, adaptive techniques, error and convergence, numerical methods and computational aspects. (283 p.). Vol. 2: Field and fluid flow solutions - includes the following topics: potential problems, thermal studies, electrical and electromagnetic problems, wave propagation, acoustics and fluid flow. (484 p.). Vol. 3: Stress analysis - deals with advances in linear problems, nonlinear problems, fracture mechanics, contact mechanics, optimization, geomechanics, plates and shells, vibrations and industrial applications. (450 p). (orig./HP)

  17. A universal electronic adaptation of automatic biochemical analyzers to a central processing computer by applying CAMAC signals

    International Nuclear Information System (INIS)

    Schaefer, R.

    1975-01-01

    A universal expansion of a CAMAC subsystem (BORER 3000) for adapting biochemical analysis instruments to a processing computer is described. The possibility of standardizing input interfaces for laboratory instruments with such circuits is discussed, and the advantages achieved by applying the CAMAC specifications are described.

  18. Semi-automatic motion compensation of contrast-enhanced ultrasound images from abdominal organs for perfusion analysis

    Czech Academy of Sciences Publication Activity Database

    Schafer, S.; Nylund, K.; Saevik, F.; Engjom, T.; Mézl, M.; Jiřík, Radovan; Dimcevski, G.; Gilja, O.H.; Tönnies, K.

    2015-01-01

    Vol. 63, Aug 1 (2015), pp. 229-237, ISSN 0010-4825. R&D Projects: GA ČR GAP102/12/2380. Institutional support: RVO:68081731. Keywords: ultrasonography * motion analysis * motion compensation * registration * CEUS * contrast-enhanced ultrasound * perfusion * perfusion modeling. Subject RIV: FS - Medical Facilities; Equipment. Impact factor: 1.521, year: 2015

  19. CSReport: A New Computational Tool Designed for Automatic Analysis of Class Switch Recombination Junctions Sequenced by High-Throughput Sequencing.

    Science.gov (United States)

    Boyer, François; Boutouil, Hend; Dalloul, Iman; Dalloul, Zeinab; Cook-Moreau, Jeanne; Aldigier, Jean-Claude; Carrion, Claire; Herve, Bastien; Scaon, Erwan; Cogné, Michel; Péron, Sophie

    2017-05-15

    B cells ensure humoral immune responses through the production of Ag-specific memory B cells and Ab-secreting plasma cells. In secondary lymphoid organs, Ag-driven B cell activation induces terminal maturation and Ig isotype class switch (class switch recombination [CSR]). CSR creates a virtually unique IgH locus in every B cell clone by intrachromosomal recombination between two switch (S) regions upstream of each C region gene. The amount and structural features of CSR junctions reveal valuable information about the CSR mechanism, and analysis of CSR junctions is useful in basic and clinical research studies of B cell functions. To provide an automated tool able to analyze large data sets of CSR junction sequences produced by high-throughput sequencing (HTS), we designed CSReport, a software program dedicated to the analysis of CSR junctions sequenced with an HTS-based protocol (Ion Torrent technology). CSReport was assessed using simulated data sets of CSR junctions and then used for analysis of Sμ-Sα and Sμ-Sγ1 junctions from CH12F3 cells and primary murine B cells, respectively. CSReport identifies junction segment breakpoints on reference sequences and junction structure (blunt-ended junctions or junctions with insertions or microhomology). Besides the ability to analyze unprecedentedly large libraries of junction sequences, CSReport will provide a unified framework for CSR junction studies. Our results show that CSReport is an accurate tool for analysis of sequences from our HTS-based protocol for CSR junctions, thereby facilitating and accelerating their study. Copyright © 2017 by The American Association of Immunologists, Inc.

  20. Determination of Free and Total Sulfites in Wine using an Automatic Flow Injection Analysis System with Voltammetric Detection

    OpenAIRE

    Gonçalves, Luís Moreira; Pacheco, João Grosso; Magalhães, Paulo Jorge; Rodrigues, José António; Barros, Aquiles Araújo

    2009-01-01

    Abstract An automated Flow Injection Analysis (FIA) system based on an initial analyte separation by gas diffusion and subsequent determination by square-wave voltammetry (SWV) in a flow cell is proposed for the determination of the total and free content of sulphur dioxide (SO2) in wine. The proposed method was compared with two iodometric methodologies (the Ripper method and the simplified method commonly used by the wine industry). The developed method showed repeatability (RSD lower ...

  1. Design, Construction and Effectiveness Analysis of Hybrid Automatic Solar Tracking System for Amorphous and Crystalline Solar Cells

    OpenAIRE

    Bhupendra Gupta

    2013-01-01

    This paper concerns the design and construction of a hybrid solar tracking system. The constructed device was implemented by integrating an amorphous and a crystalline solar panel, a three-dimensional freedom mechanism and a microcontroller. The amount of power available from a photovoltaic panel is determined by three parameters: the type of solar tracker, the solar panel material, and the intensity of the sunlight. The objective of this paper is to present an analysis of the use of two differ...

  2. Quantitative analysis of selected minor and trace elements through use of a computerized automatic x-ray spectrograph

    International Nuclear Information System (INIS)

    Fabbi, B.P.; Elsheimer, H.N.; Espos, L.F.

    1976-01-01

    Upgrading a manual X-ray spectrograph, interfacing it with an 8K computer, and employing interelement correction programs have resulted in a several-fold increase in productivity for routine quantitative analysis and an accompanying decrease in operator bias, both in measurement procedures and in calculations. Factors such as dead time and self-absorption are now also computer-corrected, resulting in improved accuracy. All conditions of analysis except the X-ray tube voltage are controlled by the computer, which enhances the precision of analysis. Elemental intensities are corrected for matrix effects, and from these the percent concentrations are calculated and printed via teletype. Interelement correction programs utilizing multiple linear regression are employed for the determination of the following minor and trace elements: K, S, Rb, Sr, Y, and Zr in silicate rocks, and Ba, As, Sb, and Zn in both silicate and carbonate rock samples. The last-named elements use the same regression curves for both rock types. All these elements are determined in concentrations generally ranging from 0.0025 percent to 4.00 percent. The sensitivities obtainable range from 0.0001 percent for barium to 0.001 percent for antimony. The accuracy, as measured by the percent relative error for a variety of silicate and carbonate rocks, is on the order of 1-7 percent. The exception is yttrium

  3. Automatic Test Systems Aquisition

    National Research Council Canada - National Science Library

    1994-01-01

    We are providing this final memorandum report for your information and use. This report discusses the efforts to achieve commonality in standards among the Military Departments as part of the DoD policy for automatic test systems (ATS...

  4. Automatic Verification of Serializers.

    Science.gov (United States)

    1980-03-01

    Programming Languages, Academic Press, New York, 1968. Dijkstra 71: E. Dijkstra, Hierarchical Ordering of Sequential Processes, Acta Informatica, vol. 1... Programming Languages, Las Vegas, January 1980, 174-185. Lampson and Redell 79: B. Lampson, D. Redell, Experience with monitors and processes in

  5. How automatic is the musical Stroop effect? Commentary on “The musical Stroop effect: opening a new avenue to research on automatisms” by L. Grégoire, P. Perruchet, and B. Poulin-Charronnat (Experimental Psychology, 2013, vol. 60, pp. 269–278).

    Science.gov (United States)

    Moeller, Birte; Frings, Christian

    2014-01-01

    Grégoire, Perruchet, and Poulin-Charronnat (2013) investigated a musical variant of the reversed Stroop effect. According to the authors, one big advantage of this variant is that the automaticity of note naming can be better controlled than in other Stroop variants as musicians are very practiced in note reading whereas non-musicians are not. In this comment we argue that at present the exact impact of automaticity in this Stroop variant remains somewhat unclear for at least three reasons, namely due to the type of information that is automatically retrieved when notes are encountered, due to the possible influence of object-based attention, and finally due to the fact that the exact influence of expertise on interference cannot be pinpointed with an extreme group design.

  6. Automatic requirements traceability

    OpenAIRE

    Andžiulytė, Justė

    2017-01-01

    This paper focuses on automatic requirements traceability and algorithms that automatically find recommendation links for requirements. The main objective of this paper is the evaluation of these algorithms and preparation of the method defining algorithms to be used in different cases. This paper presents and examines probabilistic, vector space and latent semantic indexing models of information retrieval and association rule mining, using the authors' own implementations of these algorithms and o...

  7. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book describes methods of position determination and their characteristics, control methods for position determination and design considerations, sensor selection for position detectors, position determination in digital control systems, the application of clutch brakes in high-frequency position determination, automation techniques for position determination, position determination by electromagnetic clutches and brakes, air cylinders, cams and solenoids, stop-position control of automatic guided vehicles, stacker cranes and automatic transfer control.

  8. Automatic screening of obstructive sleep apnea from the ECG based on empirical mode decomposition and wavelet analysis

    International Nuclear Information System (INIS)

    Mendez, M O; Cerutti, S; Bianchi, A M; Corthout, J; Van Huffel, S; Matteucci, M; Penzel, T

    2010-01-01

    This study analyses two different methods to detect obstructive sleep apnea (OSA) during sleep time based only on the ECG signal. OSA is a common sleep disorder caused by repetitive occlusions of the upper airways, which produces a characteristic pattern on the ECG. ECG features, such as the heart rate variability (HRV) and the QRS peak area, contain information suitable for making a fast, non-invasive and simple screening of sleep apnea. Fifty recordings freely available on Physionet have been included in this analysis, subdivided into a training and a testing set. We investigated the possibility of using the recently proposed method of empirical mode decomposition (EMD) for this application, comparing the results with the ones obtained through the well-established wavelet analysis (WA). By these decomposition techniques, several features have been extracted from the ECG signal and complemented with a series of standard HRV time domain measures. The best performing feature subset, selected through a sequential feature selection (SFS) method, was used as the input of linear and quadratic discriminant classifiers. In this way we were able to classify the signals on a minute-by-minute basis as apneic or nonapneic with different best-subset sizes, obtaining an accuracy up to 89% with WA and 85% with EMD. Furthermore, 100% correct discrimination of apneic patients from normal subjects was achieved independently of the feature extractor. Finally, the same procedure was repeated by pooling features from standard HRV time domain, EMD and WA together in order to investigate if the two decomposition techniques could provide complementary features. The obtained accuracy was 89%, similar to the one achieved using only wavelet analysis as the feature extractor; however, some complementary features in EMD and WA are evident.
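
    As a rough illustration of the pipeline just described (decomposition-based features feeding a discriminant classifier), a minimal sketch in Python follows; the wavelet family, the sub-band energy features and the synthetic RR data are assumptions for illustration, not the authors' settings.

    ```python
    # Illustrative sketch, not the authors' code: wavelet sub-band energies of a
    # one-minute RR-interval series feed a linear discriminant classifier that
    # labels each minute as apneic or non-apneic. Data and settings are invented.
    import numpy as np
    import pywt
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def wavelet_features(rr_minute, wavelet="db4", level=3):
        """Relative energy of each wavelet sub-band of a 1-min RR series."""
        coeffs = pywt.wavedec(rr_minute, wavelet, level=level)
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        return energies / energies.sum()

    rng = np.random.default_rng(0)
    X = np.vstack([wavelet_features(rng.normal(0.8, 0.05, 60)) for _ in range(100)])
    y = rng.integers(0, 2, 100)  # 1 = apneic minute, 0 = normal (synthetic labels)

    clf = LinearDiscriminantAnalysis().fit(X, y)
    print(clf.predict(X[:5]))
    ```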

  9. A rapid automatic processing platform for bead label-assisted microarray analysis: application for genetic hearing-loss mutation detection.

    Science.gov (United States)

    Zhu, Jiang; Song, Xiumei; Xiang, Guangxin; Feng, Zhengde; Guo, Hongju; Mei, Danyang; Zhang, Guohao; Wang, Dong; Mitchelson, Keith; Xing, Wanli; Cheng, Jing

    2014-04-01

    Molecular diagnostics using microarrays are increasingly being used in clinical diagnosis because of their high throughput, sensitivity, and accuracy. However, standard microarray processing takes several hours and involves manual steps during hybridization, slide clean up, and imaging. Here we describe the development of an integrated platform that automates these individual steps as well as significantly shortens the processing time and improves reproducibility. The platform integrates such key elements as a microfluidic chip, flow control system, temperature control system, imaging system, and automated analysis of clinical results. Bead labeling of microarray signals required a simple imaging system and allowed continuous monitoring of the microarray processing. To demonstrate utility, the automated platform was used to genotype hereditary hearing-loss gene mutations. Compared with conventional microarray processing procedures, the platform increases the efficiency and reproducibility of hybridization, speeding microarray processing through to result analysis. The platform also continuously monitors the microarray signals, which can be used to facilitate optimization of microarray processing conditions. In addition, the modular design of the platform lends itself to development of simultaneous processing of multiple microfluidic chips. We believe the novel features of the platform will benefit its use in clinical settings in which fast, low-complexity molecular genetic testing is required.

  10. Automatic Classification of Users' Health Information Need Context: Logistic Regression Analysis of Mouse-Click and Eye-Tracker Data.

    Science.gov (United States)

    Pian, Wenjing; Khoo, Christopher Sg; Chi, Jianxing

    2017-12-21

    Users searching for health information on the Internet may be searching for their own health issue, searching for someone else's health issue, or browsing with no particular health issue in mind. Previous research has found that these three categories of users focus on different types of health information. However, most health information websites provide static content for all users. If the three types of user health information need contexts can be identified by the Web application, the search results or information offered to the user can be customized to increase its relevance or usefulness to the user. The aim of this study was to investigate the possibility of identifying the three user health information contexts (searching for self, searching for others, or browsing with no particular health issue in mind) using just hyperlink clicking behavior; using eye-tracking information; and using a combination of eye-tracking, demographic, and urgency information. Predictive models are developed using multinomial logistic regression. A total of 74 participants (39 females and 35 males) who were mainly staff and students of a university were asked to browse a health discussion forum, Healthboards.com. An eye tracker recorded their examining (eye fixation) and skimming (quick eye movement) behaviors on 2 types of screens: a summary result screen displaying a list of post headers, and a detailed post screen. The following three types of predictive models were developed using logistic regression analysis: model 1 used only the time spent in scanning the summary result screen and reading the detailed post screen, which can be determined from the user's mouse clicks; model 2 used the examining and skimming durations on each screen, recorded by an eye tracker; and model 3 added user demographic and urgency information to model 2. An analysis of variance (ANOVA) found that users' browsing durations were significantly different for the three health information contexts.
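
    A minimal sketch of how such a multinomial model can be fitted is shown below; the feature layout and the synthetic data are assumptions, and only the modeling choice (multinomial logistic regression over screen durations) comes from the record.

    ```python
    # Minimal sketch of a "model 2"-style classifier: examining/skimming
    # durations on the two screen types predict one of the three contexts via
    # multinomial logistic regression. Features and labels are synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    # Assumed columns: [examine summary, skim summary, examine post, skim post] in seconds
    X = rng.gamma(shape=2.0, scale=10.0, size=(74, 4))
    y = rng.integers(0, 3, 74)  # 0 = self, 1 = others, 2 = browsing

    model = LogisticRegression(multi_class="multinomial", max_iter=1000).fit(X, y)
    print(model.predict_proba(X[:3]))  # per-context probabilities
    ```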

  11. Artificial intelligence applied to the automatic analysis of absorption spectra. Objective measurement of the fine structure constant

    Science.gov (United States)

    Bainbridge, Matthew B.; Webb, John K.

    2017-06-01

    A new and automated method is presented for the analysis of high-resolution absorption spectra. Three established numerical methods are unified into one 'artificial intelligence' process: a genetic algorithm (Genetic Voigt Profile FIT, gvpfit); non-linear least-squares with parameter constraints (vpfit); and Bayesian model averaging (BMA). The method has broad application but here we apply it specifically to the problem of measuring the fine structure constant at high redshift. For this we need objectivity and reproducibility. gvpfit is also motivated by the importance of obtaining a large statistical sample of measurements of Δα/α. Interactive analyses are both time consuming and complex, and automation makes obtaining a large sample feasible. In contrast to previous methodologies, we use BMA to derive results using a large set of models and show that this procedure is more robust than a human picking a single preferred model, since BMA avoids the systematic uncertainties associated with model choice. Numerical simulations provide stringent tests of the whole process and we show using both real and simulated spectra that the unified automated fitting procedure outperforms a human interactive analysis. The method should be invaluable in the context of future instrumentation like ESPRESSO on the VLT and indeed future ELTs. We apply the method to the z_abs = 1.8389 absorber towards the z_em = 2.145 quasar J110325-264515. The derived constraint of Δα/α = 3.3 ± 2.9 × 10⁻⁶ is consistent with no variation and also consistent with the tentative spatial variation reported in Webb et al. and King et al.
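
    The model-averaging step can be illustrated with a common BIC-based approximation to BMA; whether gvpfit uses exactly this weighting is not stated in the record, and all numbers below are invented.

    ```python
    # Illustrative BIC-weighted model averaging, a common approximation to BMA;
    # gvpfit's exact weighting scheme may differ. All numbers are invented.
    import numpy as np

    # Hypothetical candidate fits: estimate of delta-alpha/alpha, its variance, BIC
    estimates = np.array([3.1e-6, 3.6e-6, 2.8e-6])
    variances = np.array([8.0e-12, 9.5e-12, 7.5e-12])
    bic = np.array([1012.3, 1010.8, 1015.0])

    w = np.exp(-0.5 * (bic - bic.min()))
    w /= w.sum()  # posterior model weights (BIC approximation)

    mean = np.sum(w * estimates)
    # Total variance: within-model variance plus between-model spread
    var = np.sum(w * (variances + (estimates - mean) ** 2))
    print(f"{mean:.2e} +/- {np.sqrt(var):.2e}")
    ```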

  12. An Automatic Target Detection Algorithm for Swath Sonar Backscatter Imagery, Using Image Texture and Independent Component Analysis

    Directory of Open Access Journals (Sweden)

    Elias Fakiris

    2016-04-01

    Full Text Available In the present paper, a methodological scheme, bringing together common Acoustic Seabed Classification (ASC) systems and a powerful data decomposition approach called Independent Component Analysis (ICA), is demonstrated regarding its suitability for detecting small targets in Side Scan Sonar imagery. Traditional ASC systems extract numerous texture descriptors, leading to a large feature vector, the dimensionality of which is reduced by means of data decomposition techniques, usually Principal Component Analysis (PCA), prior to classification. However, in the target detection issue, data decomposition should point towards finding components that represent sub-ordinary image information (i.e., small targets) rather than a dominant one. ICA has long been proved to be suitable for separating targets from a background, and this study represents a novel exhibition of its applicability to Side Scan Sonar (SSS) images. The present study attempts to build a fully automated target detection approach that combines image-based feature extraction, ICA, and unsupervised classification. The suitability of the proposed approach has been demonstrated using an SSS data set containing more than 70 man-made targets, most of them metallic, validated through a marine magnetic survey or ground-truthing inspection. The method exhibited very good performance, as it was able to detect more than 77% of the targets and produced less than seven false alarms per km². Moreover, it was compared to cases where, in the exact same methodological scheme, no decomposition technique is used, or PCA is employed instead of ICA, achieving the highest detection rate but, more importantly, producing more than six times fewer false alarms, thus proving that ICA successfully manages to maximize target-to-background separation.
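
    The decomposition step alone can be sketched as follows; the texture-feature matrix is a random placeholder and the component count is an assumption.

    ```python
    # Sketch of the decomposition step only: ICA applied to a matrix of texture
    # descriptors (one row per image tile). The feature matrix here is random;
    # component and feature counts are assumptions.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(2)
    features = rng.normal(size=(5000, 40))  # 40 texture descriptors per tile

    ica = FastICA(n_components=8, random_state=0)
    components = ica.fit_transform(features)  # independent components per tile

    # In the paper's scheme, components carrying sub-ordinary information
    # (small targets) are kept and passed to unsupervised classification.
    print(components.shape)
    ```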

  13. Automatic small bowel tumor diagnosis by using multi-scale wavelet-based analysis in wireless capsule endoscopy images.

    Science.gov (United States)

    Barbosa, Daniel C; Roupar, Dalila B; Ramos, Jaime C; Tavares, Adriano C; Lima, Carlos S

    2012-01-11

    Wireless capsule endoscopy has been introduced as an innovative, non-invasive diagnostic technique for evaluation of the gastrointestinal tract, reaching places where conventional endoscopy is unable to reach. However, the output of this technique is an 8-hour video, whose analysis by the expert physician is very time consuming. Thus, a computer-assisted diagnosis tool to help physicians evaluate CE exams faster and more accurately is an important technical challenge and an excellent economical opportunity. The set of features proposed in this paper to code textural information is based on statistical modeling of second order textural measures extracted from co-occurrence matrices. To cope with both joint and marginal non-Gaussianity of second order textural measures, higher order moments are used. These statistical moments are taken from the two-dimensional color-scale feature space, where two different scales are considered. Second and higher order moments of textural measures are computed from the co-occurrence matrices computed from images synthesized by the inverse wavelet transform of the wavelet transform containing only the selected scales for the three color channels. The dimensionality of the data is reduced by using Principal Component Analysis. The proposed textural features are then used as the input of a classifier based on artificial neural networks. Classification performances of 93.1% specificity and 93.9% sensitivity are achieved on real data. These promising results open the path towards a deeper study regarding the applicability of this algorithm in computer aided diagnosis systems to assist physicians in their clinical practice.
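
    A simplified sketch of the feature chain (co-occurrence measures, PCA, neural network) is given below; the paper works on wavelet-reconstructed color channels and higher-order moments, which this sketch omits, and the tiles and labels are synthetic.

    ```python
    # Simplified sketch of the feature chain: grey-level co-occurrence measures,
    # PCA reduction, then a neural-network classifier. Plain grey tiles and
    # synthetic labels stand in for the paper's color-scale feature space.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier

    def texture_features(tile):
        """Second-order measures from a grey-level co-occurrence matrix."""
        glcm = graycomatrix(tile, distances=[1], angles=[0, np.pi / 2], levels=256)
        props = ["contrast", "homogeneity", "energy", "correlation"]
        return np.hstack([graycoprops(glcm, p).ravel() for p in props])

    rng = np.random.default_rng(3)
    tiles = rng.integers(0, 256, size=(200, 32, 32), dtype=np.uint8)
    X = np.vstack([texture_features(t) for t in tiles])
    y = rng.integers(0, 2, 200)  # 1 = tumour frame (synthetic labels)

    X_red = PCA(n_components=5).fit_transform(X)
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0).fit(X_red, y)
    print(clf.score(X_red, y))
    ```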

  14. Automatic small bowel tumor diagnosis by using multi-scale wavelet-based analysis in wireless capsule endoscopy images

    Directory of Open Access Journals (Sweden)

    Barbosa Daniel C

    2012-01-01

    Full Text Available Abstract Background Wireless capsule endoscopy has been introduced as an innovative, non-invasive diagnostic technique for evaluation of the gastrointestinal tract, reaching places where conventional endoscopy is unable to. However, the output of this technique is an 8 hours video, whose analysis by the expert physician is very time consuming. Thus, a computer assisted diagnosis tool to help the physicians to evaluate CE exams faster and more accurately is an important technical challenge and an excellent economical opportunity. Method The set of features proposed in this paper to code textural information is based on statistical modeling of second order textural measures extracted from co-occurrence matrices. To cope with both joint and marginal non-Gaussianity of second order textural measures, higher order moments are used. These statistical moments are taken from the two-dimensional color-scale feature space, where two different scales are considered. Second and higher order moments of textural measures are computed from the co-occurrence matrices computed from images synthesized by the inverse wavelet transform of the wavelet transform containing only the selected scales for the three color channels. The dimensionality of the data is reduced by using Principal Component Analysis. Results The proposed textural features are then used as the input of a classifier based on artificial neural networks. Classification performances of 93.1% specificity and 93.9% sensitivity are achieved on real data. These promising results open the path towards a deeper study regarding the applicability of this algorithm in computer aided diagnosis systems to assist physicians in their clinical practice.

  15. Analysis of Inter-moss Loops in the Solar Transition Region with IRIS and SDO/AIA: Automatic Event Detection and Characterization

    Science.gov (United States)

    Fayock, B.; Winebarger, A. R.; De Pontieu, B.

    2014-12-01

    The transition region of the solar atmosphere is no longer believed to be exclusively a thin boundary layer connecting the chromosphere and the corona. Instead, the emission from this region is dominated by dynamic, low-lying loops with transition-region peak temperatures. These loops contribute to AIA data through the transition region spectral lines in the AIA passbands, but they have not been studied in great detail. The IRIS instrument has resolved these loops both spatially and temporally. With an IRIS image cadence of approximately 10 seconds, we are able to study the evolution of these loops. We have developed a technique to automatically identify events (i.e., brightenings) on a pixel-by-pixel basis by applying a set of selection criteria. The pixels are then grouped according to their proximity in space and relative progression of the event. This method allows us to characterize their overall lifetime and the rate at which these events occur. Our current progress includes identification of these groups of events in IRIS data, determination of their existence in AIA data, and characterization based on a comparison between the two. If the same events appear in both IRIS and AIA data, it may suggest that the intrinsic transition region is not in local thermodynamic equilibrium. We present the results that follow each integral step in the analysis and provide a preliminary characterization of a few example events within our data set.

  16. Automatic flow analysis method to determine traces of Mn²⁺ in sea and drinking waters by a kinetic catalytic process using LWCC-spectrophotometric detection.

    Science.gov (United States)

    Chaparro, Laura; Ferrer, Laura; Leal, Luz O; Cerdà, Víctor

    2016-02-01

    A new automatic kinetic catalytic method has been developed for the measurement of Mn²⁺ in drinking and seawater samples. The method is based on the catalytic effect of Mn²⁺ on the oxidation of tiron by hydrogen peroxide in the presence of Pb²⁺ as an activator. The optimum conditions were obtained at pH 10 with 0.019 mol L⁻¹ 2,2′-bipyridyl, 0.005 mol L⁻¹ tiron and 0.38 mol L⁻¹ hydrogen peroxide. The flow system is based on multisyringe flow injection analysis (MSFIA) coupled with a lab-on-valve (LOV) device, exploiting on-line spectrophotometric detection by a Liquid Waveguide Capillary Cell (LWCC) of 1 m optical path length, performed at 445 nm. Under the conditions optimized by a multivariate approach, the method allowed the measurement of Mn²⁺ in a range of 0.03-35 µg L⁻¹ with a detection limit of 0.010 µg L⁻¹, attaining a repeatability of 1.4% RSD. The method was satisfactorily applied to the determination of Mn²⁺ in environmental water samples. The reliability of the method was also verified by determining the manganese content of the certified standard reference seawater sample, CASS-4.
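
    The figures of merit quoted above rest on routine calibration arithmetic; a minimal sketch with invented numbers (a linear fit of absorbance at 445 nm versus Mn²⁺ standards and a 3-sigma detection limit) follows.

    ```python
    # Sketch of generic calibration arithmetic behind figures of merit such as
    # a 0.010 ug/L detection limit: straight-line fit plus a 3-sigma LOD.
    # All numbers are invented for illustration.
    import numpy as np

    conc = np.array([0.0, 0.5, 1.0, 5.0, 10.0, 20.0])  # standards, ug/L
    absorbance = np.array([0.010, 0.024, 0.039, 0.152, 0.300, 0.594])

    slope, intercept = np.polyfit(conc, absorbance, 1)
    resid_sd = np.std(absorbance - (slope * conc + intercept), ddof=2)
    lod = 3 * resid_sd / slope  # 3-sigma limit of detection
    print(f"slope {slope:.4f} AU per ug/L, LOD ~ {lod:.3f} ug/L")
    ```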

  17. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    Science.gov (United States)

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer, which measures the major and minor axial lengths of the imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; then outputs the results in two different forms in spreadsheets for easy manipulation in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy background that often plague automated imaging processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and diverse quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model
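
    The Snakes step the tool builds on can be sketched with scikit-image's active_contour; the synthetic disc image and contour parameters below are assumptions, not SpheroidSizer's internals.

    ```python
    # Sketch of an active-contour (Snakes) fit on a synthetic bright disc.
    # Parameters and image are illustrative assumptions.
    import numpy as np
    from skimage.draw import disk
    from skimage.filters import gaussian
    from skimage.segmentation import active_contour

    img = np.zeros((200, 200))
    img[disk((100, 100), 45)] = 1.0  # synthetic "spheroid"

    theta = np.linspace(0, 2 * np.pi, 200)
    init = np.column_stack([100 + 70 * np.sin(theta), 100 + 70 * np.cos(theta)])

    snake = active_contour(gaussian(img, sigma=3), init,
                           alpha=0.015, beta=10, gamma=0.001)

    # Enclosed area via the shoelace formula; axial lengths follow from extents
    area = 0.5 * abs(np.dot(snake[:, 0], np.roll(snake[:, 1], 1))
                     - np.dot(snake[:, 1], np.roll(snake[:, 0], 1)))
    print(f"contour area ~ {area:.0f} px^2")
    ```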

  18. Automatic abdominal lymph node detection method based on local intensity structure analysis from 3D x-ray CT images

    Science.gov (United States)

    Nakamura, Yoshihiko; Nimura, Yukitaka; Kitasaka, Takayuki; Mizuno, Shinji; Furukawa, Kazuhiro; Goto, Hidemi; Fujiwara, Michitaka; Misawa, Kazunari; Ito, Masaaki; Nawano, Shigeru; Mori, Kensaku

    2013-03-01

    This paper presents an automated method of abdominal lymph node detection to aid preoperative diagnosis in abdominal cancer surgery. In abdominal cancer surgery, surgeons must resect not only tumors and metastases but also lymph nodes that might harbor a metastasis. This procedure is called lymphadenectomy or lymph node dissection. Insufficient lymphadenectomy carries a high risk of relapse. However, excessive resection decreases a patient's quality of life. Therefore, it is important to identify the location and structure of lymph nodes to make a suitable surgical plan. The proposed method consists of candidate lymph node detection and false positive reduction. Candidate lymph nodes are detected using a multi-scale blob-like enhancement filter based on local intensity structure analysis. To reduce false positives, the proposed method uses a classifier based on a support vector machine with texture and shape information. The experimental results reveal that it detects 70.5% of the lymph nodes with 13.0 false positives per case.

  19. Development of automatic surveillance of animal behaviour and welfare using image analysis and machine learned segmentation technique.

    Science.gov (United States)

    Nilsson, M; Herlin, A H; Ardö, H; Guzhva, O; Åström, K; Bergsten, C

    2015-11-01

    In this paper the feasibility of extracting the proportion of pigs located in different areas of a pig pen by advanced image analysis techniques is explored and discussed for possible applications. For example, pigs generally locate themselves in the wet dunging area at high ambient temperatures in order to avoid heat stress, as wetting the body surface is the major path to dissipate heat by evaporation. Thus, the portions of pigs in the dunging area and resting area, respectively, could be used as an indicator of failure to control the climate in the pig environment, as pigs are not supposed to rest in the dunging area. The computer vision methodology utilizes a learning-based segmentation approach using several features extracted from the image. The learning-based approach applied is based on extended state-of-the-art features in combination with a structured prediction framework based on a logistic regression solver using elastic net regularization. In addition, the method is able to produce a probability per pixel rather than form a hard decision. This overcomes some of the limitations found in a setup using grey-scale information only. The pig pen is a difficult imaging environment because of challenging lighting conditions like shadows, poor lighting and poor contrast between pig and background. In order to test practical conditions, a pen containing nine young pigs was filmed from a top-view perspective by an Axis M3006 camera with a resolution of 640 × 480 in three 10-min sessions under different lighting conditions. The results indicate that a learning-based method improves, in comparison with grey-scale methods, the possibility to reliably identify proportions of pigs in different areas of the pen. Pigs with a changed behaviour (location) in the pen may indicate changed climate conditions. Changed individual behaviour may also indicate inferior health or acute illness.
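
    The per-pixel learning step can be sketched as follows; the six placeholder features and synthetic labels stand in for the paper's extended feature set.

    ```python
    # Sketch of the per-pixel learning step: logistic regression with elastic
    # net regularization emitting a pig/background probability per pixel.
    # Real features would be the paper's descriptors; here they are random.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(4)
    X = rng.normal(size=(10000, 6))  # 6 image features per pixel (placeholder)
    y = rng.integers(0, 2, 10000)    # 1 = pig, 0 = background (synthetic)

    clf = LogisticRegression(penalty="elasticnet", solver="saga",
                             l1_ratio=0.5, C=1.0, max_iter=5000).fit(X, y)
    prob = clf.predict_proba(X)[:, 1]  # probability per pixel, not a hard label
    print(prob[:5])
    ```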

  20. Recommended number of strides for automatic assessment of gait symmetry and regularity in above-knee amputees by means of accelerometry and autocorrelation analysis

    Directory of Open Access Journals (Sweden)

    Tura Andrea

    2012-02-01

    Full Text Available Background: Symmetry and regularity of gait are essential outcomes of gait retraining programs, especially in lower-limb amputees. This study aims at presenting an algorithm to automatically compute symmetry and regularity indices, and at assessing the minimum number of strides for appropriate evaluation of gait symmetry and regularity through autocorrelation of acceleration signals. Methods: Ten transfemoral amputees (AMP) and ten control subjects (CTRL) were studied. Subjects wore an accelerometer and were asked to walk for 70 m at their natural speed (twice). Reference values of the step and stride regularity indices (Ad1 and Ad2) were obtained by autocorrelation analysis of the vertical and antero-posterior acceleration signals, excluding initial and final strides. The Ad1 and Ad2 coefficients were then computed at different stages by analyzing increasing portions of the signals (considering both the signals cleaned of initial and final strides, and the whole signals). At each stage, the difference between the Ad1 and Ad2 values and the corresponding reference values was compared with the minimum detectable difference, MDD, of the index. If that difference was less than the MDD, it was assumed that the portion of signal used in the analysis was of sufficient length to allow reliable estimation of the autocorrelation coefficient. Results: All Ad1 and Ad2 indices were lower in AMP than in CTRL (P ... Conclusions: Without the need to identify and eliminate the phases of gait initiation and termination, twenty strides can provide a reasonable amount of information to reliably estimate gait regularity in transfemoral amputees.
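
    The autocorrelation indices can be sketched in a few lines; the peak-picking details and the synthetic walking signal below are assumptions, with Ad1 and Ad2 read off the first two dominant autocorrelation peaks.

    ```python
    # Sketch of the regularity indices: Ad1 (step) and Ad2 (stride) as the
    # normalized autocorrelation values at the first two dominant lags of the
    # acceleration signal. Peak-picking details are assumptions.
    import numpy as np
    from scipy.signal import find_peaks

    def regularity_indices(acc, fs):
        acc = acc - acc.mean()
        ac = np.correlate(acc, acc, mode="full")[acc.size - 1:]
        ac /= ac[0]  # lag-0 value normalized to 1
        peaks, _ = find_peaks(ac, distance=int(0.4 * fs))
        return ac[peaks[0]], ac[peaks[1]]  # first peak: step lag; second: stride

    fs = 100.0
    t = np.arange(0, 20, 1 / fs)
    # Synthetic walking signal: step frequency 1.8 Hz, stride frequency 0.9 Hz
    acc = np.sin(2 * np.pi * 1.8 * t) + 0.3 * np.sin(2 * np.pi * 0.9 * t)
    ad1, ad2 = regularity_indices(acc, fs)
    print(f"Ad1 = {ad1:.2f}, Ad2 = {ad2:.2f}")
    ```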

  1. Automatic extraction of the cingulum bundle in diffusion tensor tract-specific analysis. Feasibility study in Parkinson's disease with and without dementia

    International Nuclear Information System (INIS)

    Ito, Kenji; Masutani, Yoshitaka; Suzuki, Yuichi; Ino, Kenji; Kunimatsu, Akira; Ohtomo, Kuni; Kamagata, Koji; Yasmin, Hasina; Aoki, Shigeki

    2013-01-01

    Tract-specific analysis (TSA) measures diffusion parameters along a specific fiber that has been extracted by fiber tracking using manual regions of interest (ROIs), but TSA is limited by its requirement for manual operation, poor reproducibility, and high time consumption. We aimed to develop a fully automated extraction method for the cingulum bundle (CB) and to apply the method to TSA in neurobehavioral disorders such as Parkinson's disease (PD). We introduce the voxel classification (VC) and auto diffusion tensor fiber-tracking (AFT) methods of extraction. The VC method directly extracts the CB, skipping the fiber-tracking step, whereas the AFT method uses fiber tracking from automatically selected ROIs. We compared the results of VC and AFT to those obtained by manual diffusion tensor fiber tracking (MFT) performed by 3 operators. We quantified the Jaccard similarity index among the 3 methods in data from 20 subjects (10 normal controls [NC] and 10 patients with Parkinson's disease dementia [PDD]). We used all 3 extraction methods (VC, AFT, and MFT) to calculate the fractional anisotropy (FA) values of the anterior and posterior CB for 15 NC subjects, 15 with PD, and 15 with PDD. The Jaccard index between results of AFT and MFT, 0.72, was similar to the inter-operator Jaccard index of MFT. However, the Jaccard indices between VC and MFT and between VC and AFT were lower. Consequently, the VC method discriminated among the 3 groups (NC, PD, and PDD), whereas the other methods separated only 2 groups (NC, and PD or PDD). For TSA in Parkinson's disease, the VC method can be more useful than the AFT and MFT methods for extracting the CB. In addition, the results of patient data analysis suggest that a reduction of FA in the posterior CB may represent a useful biological index for monitoring PD and PDD. (author)
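
    The agreement measure used above is simple to compute; a sketch with random placeholder masks follows.

    ```python
    # Sketch of the agreement metric: Jaccard similarity between two binary
    # voxel masks of an extracted bundle. Masks here are random placeholders.
    import numpy as np

    def jaccard(mask_a, mask_b):
        inter = np.logical_and(mask_a, mask_b).sum()
        union = np.logical_or(mask_a, mask_b).sum()
        return inter / union

    rng = np.random.default_rng(5)
    vc_mask = rng.random((64, 64, 40)) > 0.97   # e.g., VC extraction (synthetic)
    aft_mask = rng.random((64, 64, 40)) > 0.97  # e.g., AFT extraction (synthetic)
    print(f"Jaccard index = {jaccard(vc_mask, aft_mask):.3f}")
    ```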

  2. [The effects of rumination on automatic thoughts and depressive symptoms].

    Science.gov (United States)

    Nishikawa, Daiji; Matsunaga, Miki; Furutani, Kaichiro

    2013-12-01

    This study investigated the effects of rumination (reflective pondering and brooding) on automatic thoughts (both negative and positive) and depressive symptoms. University students (N=183; 96 men) completed the Self-Rating Depression Scale (SDS), Automatic Thoughts Questionnaire-Revised (ATQ-R), and Response Style Scale (RSS). We conducted a path analysis which included gender as a factor. The results revealed that brooding was associated with negative automatic thoughts. Negative automatic thoughts contributed to the aggravation of depressive symptoms. In contrast, reflective pondering was associated with positive automatic thoughts. Positive automatic thoughts contributed to the reduction of depressive symptoms. These results indicate that rumination does not affect depressive symptoms directly. We suggest that rumination affects depressive symptoms indirectly through automatic thoughts, and that there are gender differences in the influence process.

  3. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS). It presents a recent state of the art, shows the main problems of ADS and the difficulties and solutions provided by the community, and covers recent advances in ADS as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been developed

  4. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...... on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings...

  5. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    NXT tool for annotating dialogue acts in a multiparty conversation. designer, marketing expert and interface designer), and the team participated in a series of four meetings. The meetings took place over 3–4 hours: about half of this time was spent in meetings, the remainder was spent in preparation, with each participant ...

  6. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    provides a way to evaluate the quality of a summary compared with multiple reference summaries, and has become widely used in text summarization, agreeing well with subjective evaluations. However, ROUGE does not correlate well with human judgements for meeting summarization (Liu & Liu 2008). For better evaluation ...

  7. Proceedings of the Seventh Conference of Nuclear Sciences and Applications. Vol.1,2,3

    International Nuclear Information System (INIS)

    Aly, H.F.

    2000-01-01

    The publication has been set up as a textbook for nuclear sciences and applications. Vol.1: (1) radiochemistry; (2) radiation chemistry; (3) isotope production; (4) waste management. Vol.2: (1) nuclear and reactor physics; (2) plasma physics; (3) instrumentation and devices; (4) trace and ultra-trace analysis; (5) environmental studies. Vol.3: (1) radiation protection; (2) radiation health hazards; (3) nuclear safety; (4) biology; (5) agriculture.

  8. Automatic Smoker Detection from Telephone Speech Signals

    DEFF Research Database (Denmark)

    Alavijeh, Amir Hossein Poorjam; Hesaraki, Soheila; Safavi, Saeid

    2017-01-01

    This paper proposes automatic smoking habit detection from spontaneous telephone speech signals. In this method, each utterance is modeled using i-vector and non-negative factor analysis (NFA) frameworks, which yield low-dimensional representations of utterances by applying factor analysis on G...

  9. Expert system for the automatic analysis of the Eddy current signals from the monitoring of vapor generators of a PWR, type reactor

    International Nuclear Information System (INIS)

    Lefevre, F.; Baumaire, A.; Comby, R.; Benas, J.C.

    1990-01-01

    The automation of the monitoring of steam generator tubes required developments in the field of data processing. The monitoring is performed by means of eddy current tests. Improvements in signal processing and in pattern recognition, associated with artificial intelligence techniques, led EDF (the French electricity utility) to develop an automatic signal processing system. The system, named EXTRACSION (French acronym for Expert System for the Processing and Classification of Signals of Nuclear Nature), ensures coherence between the different fields of knowledge (metallurgy, measurement, signals) during data processing by applying an object-oriented representation [fr

  10. Reactor component automatic grapple

    International Nuclear Information System (INIS)

    Greenaway, P.R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment. (author)

  11. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which was successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  12. Modification of Hewlett-Packard Chemstation (G1034C version C.02.00 and G1701AA version C.02.00 and C.03.00) data analysis program for addition of automatic extracted ion chromatographic groups.

    Science.gov (United States)

    Amick, G D

    1998-01-01

    A modification to data analysis macros to create "ion groups" and add a menu item to the data analysis menu bar is described. These "ion groups" consist of up to 10 ions for extracted ion chromatograms. The present manuscript describes modifications that yield a drop-down menu in data analysis containing user-defined names of groups (e.g., opiates, barbiturates, benzodiazepines) of ions that, when selected, will automatically generate extracted ion chromatograms with up to 10 ions in that group for the loaded datafile.

  13. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains a clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.

  14. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of automatic program derivation. The book also includes some papers ... a renewed stimulus for continuing and deepening Bob's research visions. A familiar touch is given to the book by some pictures kindly provided to us by his wife Nieba, the personal recollections of his brother Gary and some of his colleagues and friends.

  15. Automaticity or active control

    DEFF Research Database (Denmark)

    Tudoran, Ana Alina; Olsen, Svein Ottar

    This study addresses the quasi-moderating role of habit strength in explaining action loyalty. A model of loyalty behaviour is proposed that extends the traditional satisfaction-intention-action loyalty network. Habit strength is conceptualised as a cognitive construct referring to the psychological aspects of the construct, such as routine, inertia, automaticity, or very little conscious deliberation. The data consist of 2962 consumers participating in a large European survey. The results show that habit strength significantly moderates the association between satisfaction and action loyalty and, respectively, between intended loyalty and action loyalty. At high levels of habit strength, consumers are more likely to free up cognitive resources and tilt the balance from controlled to routine and automatic-like responses.

  16. Automatic generation of 3D fine mesh geometries for the analysis of the venus-3 shielding benchmark experiment with the Tort code

    International Nuclear Information System (INIS)

    Pescarini, M.; Orsi, R.; Martinelli, T.

    2003-01-01

    In many practical radiation transport applications today, the cost of solving refined, large-size and complex multi-dimensional problems is not so much computing but the cumbersome effort required by an expert to prepare a detailed geometrical model and verify and validate that it is correct and represents, to a specified tolerance, the real design or facility. This situation is, in particular, relevant and frequent in reactor core criticality and shielding calculations with three-dimensional (3D) general-purpose radiation transport codes, requiring a very large number of meshes and high-performance computers. The need for developing tools that make the task easier for the physicist or engineer, by reducing the time required, by facilitating through effective graphical display the verification of correctness and, finally, by helping the interpretation of the results obtained, has clearly emerged. The paper shows the results of efforts in this field through detailed simulations of a complex shielding benchmark experiment. In the context of the activities proposed by the OECD/NEA Nuclear Science Committee (NSC) Task Force on Computing Radiation Dose and Modelling of Radiation-Induced Degradation of Reactor Components (TFRDD), the ENEA-Bologna Nuclear Data Centre contributed with an analysis of the VENUS-3 low-flux neutron shielding benchmark experiment (SCK/CEN-Mol, Belgium). One of the targets of the work was to test the BOT3P system, originally developed at the Nuclear Data Centre in ENEA-Bologna and now released to the OECD/NEA Data Bank for free distribution. BOT3P, an ancillary system of the DORT (2D) and TORT (3D) SN codes, permits a flexible automatic generation of spatial mesh grids in Cartesian or cylindrical geometry, through combinatorial geometry algorithms, following a simplified user-friendly approach. This system demonstrated its validity also in core criticality analyses, as for example the Lewis MOX fuel benchmark, permitting to easily

  17. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments...... and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices....

  18. Automatic Language Identification

    Science.gov (United States)

    2000-08-01

    ...distinguish one language from another. The reader is referred to the linguistics literature... hundreds of input languages would need to be supported, the cost of... eventually obtained bet... [figure residue: a training diagram showing French, German and Spanish training speech utterances fed to a training algorithm, producing a set of models, one model per language] ...(i.e. vowels) for each speech utterance are located automatically. Next, feature vectors ... normalized to be insensitive to overall amplitude, pitch and

  19. Automatic Clustering Using FSDE-Forced Strategy Differential Evolution

    Science.gov (United States)

    Yasid, A.

    2018-01-01

    Clustering analysis is important in data mining for unsupervised data, because no adequate prior knowledge is available. One of the important tasks is defining the number of clusters without user involvement, which is known as automatic clustering. This study aims at acquiring the cluster number automatically utilizing forced strategy differential evolution (AC-FSDE). Two mutation parameters, namely a constant parameter and a variable parameter, are employed to boost differential evolution performance. Four well-known benchmark datasets were used to evaluate the algorithm. Moreover, the result is compared with other state-of-the-art automatic clustering methods. The experiment results evidence that AC-FSDE is better than or competitive with other existing automatic clustering algorithms.
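
    The core idea, clustering as a continuous optimization solved by differential evolution, can be sketched as follows; AC-FSDE additionally evolves the number of clusters, whereas this sketch fixes k for brevity.

    ```python
    # Sketch of clustering by differential evolution: candidate centroids are
    # one flat parameter vector scored by within-cluster distance. AC-FSDE also
    # evolves the number of clusters; k is fixed here for brevity.
    import numpy as np
    from scipy.optimize import differential_evolution

    rng = np.random.default_rng(6)
    data = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in (0.0, 2.0, 4.0)])
    k = 3

    def sse(flat_centroids):
        centroids = flat_centroids.reshape(k, 2)
        d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        return d.min(axis=1).sum()  # each point is assigned its nearest centroid

    bounds = [(float(data.min()), float(data.max()))] * (k * 2)
    result = differential_evolution(sse, bounds, seed=0)
    print(result.x.reshape(k, 2))
    ```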

  20. Automatic gamma spectrometry analytical apparatus

    International Nuclear Information System (INIS)

    Lamargot, J.-P.; Wanin, Maurice.

    1980-01-01

    This invention falls within the area of quantitative or semi-quantitative analysis by gamma spectrometry and particularly refers to a device for bringing the samples into the counting position. The purpose of this invention is precisely to provide an automatic apparatus specifically adapted to the analysis of hard gamma radiations. To this effect, the invention relates to a gamma spectrometry analytical device comprising a lead containment, a detector of which the sensitive part is located inside the containment and additionally comprising a transfer system for bringing the analyzed samples in succession to a counting position inside the containment above the detector. A feed compartment enables the samples to be brought in turn one by one on to the transfer system through a duct connecting the compartment to the transfer system. Sequential systems for the coordinated forward feed of the samples in the compartment and the transfer system complete this device [fr

  1. Upgrade of the automatic analysis system in the TJ-II Thomson Scattering diagnostic: New image recognition classifier and fault condition detection

    International Nuclear Information System (INIS)

    Makili, L.; Vega, J.; Dormido-Canto, S.; Pastor, I.; Pereira, A.; Farias, G.; Portas, A.; Perez-Risco, D.; Rodriguez-Fernandez, M.C.; Busch, P.

    2010-01-01

    An automatic image classification system based on support vector machines (SVM) has been in operation for years in the TJ-II Thomson Scattering diagnostic. It recognizes five different types of images: CCD camera background, measurement of stray light without plasma or in a collapsed discharge, image during the ECH phase, image during the NBI phase and image after reaching the cut-off density during ECH heating. Each kind of image implies the execution of different application software. Because the recognition system is based on a learning system and major modifications have been carried out in both the diagnostic (optics) and TJ-II plasmas (injected power), the classifier model is no longer valid. A new SVM model has been developed for the current conditions. Also, specific error conditions in the data acquisition process can now automatically be detected and managed. The recovery process has been automated, thereby avoiding the loss of data in ensuing discharges.
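
    A minimal sketch of such a multi-class SVM image classifier follows; the image size, data and SVM parameters are assumptions, not the diagnostic's actual model.

    ```python
    # Sketch of a multi-class SVM over flattened, downsampled camera frames,
    # with five classes as in the diagnostic. Data and parameters are synthetic.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(7)
    X = rng.normal(size=(500, 32 * 32))  # one row per downsampled image
    y = rng.integers(0, 5, 500)          # the five image classes

    model = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, y)
    print(model.predict(X[:5]))
    ```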

  2. Automatic detection of particle size distribution by image analysis based on local adaptive canny edge detection and modified circular Hough transform.

    Science.gov (United States)

    Meng, Yingchao; Zhang, Zhongping; Yin, Huaqiang; Ma, Tao

    2018-03-01

    To obtain the size distribution of nanoparticles, scanning electron microscopy (SEM) and transmission electron microscopy (TEM) have been widely adopted, but manual measurement of statistical size distributions from the SEM or TEM images is time-consuming and labor-intensive. Therefore, automatic detection methods are desirable. This paper proposes an automatic image processing algorithm which is mainly based on local adaptive Canny edge detection and a modified circular Hough transform. The proposed algorithm can utilize local thresholds to detect particles in images with different degrees of complexity. Compared with the results produced by applying global thresholds, our algorithm performs much better. The robustness and reliability of this method have been verified by comparing its results with manual measurement, and an excellent agreement has been found. The proposed method can accurately recognize the particles with high efficiency.
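
    The detection chain can be sketched with OpenCV; the fixed Canny/Hough parameters below replace the paper's local adaptive thresholds, and the input is a synthetic micrograph rather than a real SEM/TEM image.

    ```python
    # Sketch of edge detection followed by a circular Hough transform.
    # Fixed thresholds stand in for the paper's locally adaptive ones.
    import cv2
    import numpy as np

    img = np.zeros((256, 256), dtype=np.uint8)
    rng = np.random.default_rng(9)
    for _ in range(15):  # draw bright circular "particles"
        cx, cy, r = rng.integers(20, 236), rng.integers(20, 236), rng.integers(5, 15)
        cv2.circle(img, (int(cx), int(cy)), int(r), 255, -1)

    blur = cv2.GaussianBlur(img, (5, 5), 1.5)
    # HoughCircles runs Canny internally; param1 is the upper Canny threshold
    circles = cv2.HoughCircles(blur, cv2.HOUGH_GRADIENT, dp=1.2, minDist=10,
                               param1=150, param2=15, minRadius=3, maxRadius=20)
    if circles is not None:
        radii = circles[0, :, 2]
        print(f"{len(radii)} particles, mean radius {radii.mean():.1f} px")
    ```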

  3. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J. S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J. P. K.; Geertzen, J. H. B.

    2004-01-01

    This paper describes a new automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients, fitted

  4. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J.S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J.P.K.; Geertzen, J.H.B.

    This paper describes a new, automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients,

  5. Automatic Estimation of Movement Statistics of People

    DEFF Research Database (Denmark)

    Ægidiussen Jensen, Thomas; Rasmussen, Henrik Anker; Moeslund, Thomas B.

    2012-01-01

    Automatic analysis of how people move about in a particular environment has a number of potential applications. However, no system has so far been able to do detection and tracking robustly. Instead, trajectories are often broken into tracklets. The key idea behind this paper is based around...

  6. AUTOMATIC IDENTIFICATION OF ITEMS IN WAREHOUSE MANAGEMENT

    OpenAIRE

    Vladimír Modrák; Peter Knuth

    2010-01-01

    Automatic identification of items saves time and is beneficial in various areas, including warehouse management. Identification can be done by many technologies, but RFID technology seems to be one of the smartest solutions. This article deals with the testing and possible use of RFID technology in warehouse management. All results and measurement outcomes are documented in the form of graphs, followed by a comprehensive analysis.

  7. Analysis of the High-Frequency Content in Human QRS Complexes by the Continuous Wavelet Transform: An Automatized Analysis for the Prediction of Sudden Cardiac Death.

    Science.gov (United States)

    García Iglesias, Daniel; Roqueñi Gutiérrez, Nieves; De Cos, Francisco Javier; Calvo, David

    2018-02-12

    Fragmentation and delayed potentials in the QRS signal of patients have been postulated as risk markers for Sudden Cardiac Death (SCD). The analysis of the high-frequency spectral content may be useful for quantification. Forty-two consecutive patients with a prior history of SCD or malignant arrhythmias (patients) were compared with 120 healthy individuals (controls). The QRS complexes were extracted with a modified Pan-Tompkins algorithm and processed with the Continuous Wavelet Transform to analyze the high-frequency content (85-130 Hz). Overall, the power of the high-frequency content was higher in patients compared with controls (170.9 vs. 47.3 × 10³ nV²Hz⁻¹; p = 0.007), with a prolonged time to reach the maximal power (68.9 vs. 64.8 ms; p = 0.002). An analysis of the signal intensity (instantaneous average of cumulative power) revealed a distinct function between patients and controls. The total intensity was higher in patients compared with controls (137.1 vs. 39 × 10³ nV²Hz⁻¹s⁻¹; p = 0.001) and the time to reach the maximal intensity was also prolonged (88.7 vs. 82.1 ms; p < 0.001). The high-frequency content of the QRS complexes was distinct between patients at risk of SCD and healthy controls. The wavelet transform is an efficient tool for spectral analysis of the QRS complexes that may contribute to stratification of risk.
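
    The spectral step can be sketched with PyWavelets; the sampling rate, QRS window and waveform below are assumptions, with the analysis restricted to the 85-130 Hz band as in the study.

    ```python
    # Sketch of a Morlet continuous wavelet transform of one QRS window,
    # restricted to 85-130 Hz. Sampling rate and waveform are assumptions.
    import numpy as np
    import pywt

    fs = 1000.0                      # Hz (assumed)
    t = np.arange(0, 0.12, 1 / fs)   # 120 ms window around one QRS complex
    rng = np.random.default_rng(8)
    qrs = np.exp(-((t - 0.06) / 0.01) ** 2) + 0.02 * rng.normal(size=t.size)

    freqs = np.arange(85.0, 131.0, 5.0)  # target band (Hz)
    fc = pywt.central_frequency("morl")  # normalized centre frequency
    scales = fc * fs / freqs             # scales mapping onto 85-130 Hz
    coeffs, freqs_hz = pywt.cwt(qrs, scales, "morl", sampling_period=1 / fs)

    band_power = (np.abs(coeffs) ** 2).sum(axis=0)            # power vs. time
    intensity = np.cumsum(band_power) / np.arange(1, t.size + 1)
    print(band_power.sum(), 1000 * intensity.argmax() / fs, "ms")
    ```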

  8. TU-FG-201-03: Automatic Pre-Delivery Verification Using Statistical Analysis of Consistencies in Treatment Plan Parameters by the Treatment Site and Modality

    International Nuclear Information System (INIS)

    Liu, S; Wu, Y; Chang, X; Li, H; Yang, D

    2016-01-01

    Purpose: A novel computer software system, namely APDV (Automatic Pre-Delivery Verification), has been developed for verifying patient treatment plan parameters right before treatment deliveries in order to automatically detect and prevent catastrophic errors. Methods: APDV is designed to continuously monitor new DICOM plan files on the TMS computer at the treatment console. When new plans to be delivered are detected, APDV checks the consistency of plan parameters and high-level plan statistics using underlying rules and statistical properties for the given treatment site, technique and modality. These rules were quantitatively derived by retrospectively analyzing all the EBRT treatment plans of the past 8 years at the authors' institution. Therapists and physicists will be notified with a warning message displayed on the TMS computer if any critical errors are detected, and the check results, confirmations and dismissal actions will be saved into a database for further review. Results: APDV was implemented as a stand-alone program using C# to ensure the required real-time performance. Mean values and standard deviations were quantitatively derived for various plan parameters including MLC usage, MU/cGy ratio, beam SSD, beam weighting, and the beam gantry angles (only for lateral targets) per treatment site, technique and modality. 2D-based rules combining the MU/cGy ratio and averaged SSD values were also derived using joint probabilities of confidence error ellipses. The statistics of these major treatment plan parameters quantitatively evaluate the consistency of any treatment plan, which facilitates the automatic APDV checking procedures. Conclusion: APDV could be useful in detecting and preventing catastrophic errors immediately before treatment deliveries. Future plans include automatic patient identity and patient setup checks after patient daily images are acquired by the machine and become available on the TMS computer. This project is supported by the
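
    A 2-D consistency rule in the spirit of the joint MU/cGy-SSD check can be sketched with a Mahalanobis-distance test against a confidence ellipse; all numbers below are invented.

    ```python
    # Sketch of a 2-D plan consistency rule: flag a new plan whose Mahalanobis
    # distance from the historical mean falls outside a confidence ellipse.
    import numpy as np
    from scipy.stats import chi2

    history = np.array([[1.10, 92.0], [1.20, 90.5], [1.00, 91.2],
                        [1.15, 93.0], [1.05, 91.8], [1.25, 92.5]])  # [MU/cGy, SSD cm]
    mean = history.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(history.T))

    new_plan = np.array([1.90, 97.0])
    d2 = (new_plan - mean) @ cov_inv @ (new_plan - mean)

    if d2 > chi2.ppf(0.999, df=2):  # outside the 99.9% error ellipse
        print("Warn therapists: plan parameters inconsistent with history")
    ```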

  9. Support vector machine for automatic pain recognition

    Science.gov (United States)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion and the interpretation of such expressions of emotion is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects face from the stored video frame using skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural network based and eigenimage based automatic pain recognition systems. The experiment results indicate that using support vector machine as classifier can certainly improve the performance of automatic pain recognition system.

  10. Analysis of the High-Frequency Content in Human QRS Complexes by the Continuous Wavelet Transform: An Automatized Analysis for the Prediction of Sudden Cardiac Death

    Directory of Open Access Journals (Sweden)

    Daniel García Iglesias

    2018-02-01

    Full Text Available Background: Fragmentation and delayed potentials in the QRS signal of patients have been postulated as risk markers for Sudden Cardiac Death (SCD). The analysis of the high-frequency spectral content may be useful for quantification. Methods: Forty-two consecutive patients with a prior history of SCD or malignant arrhythmias (patients) were compared with 120 healthy individuals (controls). The QRS complexes were extracted with a modified Pan-Tompkins algorithm and processed with the Continuous Wavelet Transform to analyze the high-frequency content (85–130 Hz). Results: Overall, the power of the high-frequency content was higher in patients compared with controls (170.9 vs. 47.3 × 10³ nV²Hz⁻¹; p = 0.007), with a prolonged time to reach the maximal power (68.9 vs. 64.8 ms; p = 0.002). An analysis of the signal intensity (instantaneous average of cumulative power) revealed a distinct function between patients and controls. The total intensity was higher in patients compared with controls (137.1 vs. 39 × 10³ nV²Hz⁻¹s⁻¹; p = 0.001) and the time to reach the maximal intensity was also prolonged (88.7 vs. 82.1 ms; p < 0.001). Discussion: The high-frequency content of the QRS complexes was distinct between patients at risk of SCD and healthy controls. The wavelet transform is an efficient tool for spectral analysis of the QRS complexes that may contribute to stratification of risk.

  11. Improved automatic steam distillation combined with oscillation-type densimetry for determining alcoholic strength in spirits and liqueurs.

    Science.gov (United States)

    Lachenmeier, Dirk W; Plato, Leander; Suessmann, Manuela; Di Carmine, Matthew; Krueger, Bjoern; Kukuck, Armin; Kranz, Markus

    2015-01-01

    The determination of the alcoholic strength in spirits and liqueurs is required to control the labelling of alcoholic beverages. The reference methodology prescribes a distillation step followed by densimetric measurement. The classic distillation using a Vigreux rectifying column and a West condenser is time-consuming and error-prone, especially for liqueurs, which may have problems with entrainment and charring. For this reason, this methodology suggests the use of an automated steam distillation device as an alternative. The novel instrument comprises increased steam power, a redesigned condenser geometry, and a larger cooling coil with controllable flow, compared to previously available devices. Method optimization applying D-optimal and central composite designs showed a significant influence of sample volume, distillation time and coolant flow, while other investigated parameters such as steam power, receiver volume, or the use of pipettes or flasks for sample measurement did not significantly influence the results. The method validation was conducted using the following settings: steam power 70 %, sample volume 25 mL transferred using pipettes, receiver volume 50 mL, coolant flow 7 L/min, and distillation time as long as possible, just below the calibration mark. For four different liqueurs covering the typical range of these products between 15 and 35 % vol, the method showed adequate precision, with relative standard deviations below 0.4 % (intraday) and below 0.6 % (interday). The absolute standard deviations were between 0.06 % vol and 0.08 % vol (intraday) and between 0.07 % vol and 0.10 % vol (interday). The improved automatic steam distillation devices offer an excellent alternative for sample cleanup of volatiles from complex matrices. A major advantage is the low cost of consumables per analysis (only distilled water is needed). For alcoholic strength determination, the method has become more rugged than before, and there are only

  12. Automatic quantitative renal scintigraphy

    International Nuclear Information System (INIS)

    Valeyre, J.; Deltour, G.; Delisle, M.J.; Bouchard, A.

    1976-01-01

    Renal scintigraphy data may be analyzed automatically by the use of a processing system coupled to an Anger camera (TRIDAC-MULTI 8 or CINE 200). The computing sequence is as follows: normalization of the images; background noise subtraction on both images; evaluation of mercury-197 uptake by the liver and spleen; calculation of the activity fractions of each kidney with respect to the injected dose, taking into account the kidney depth, with the results referred to normal values; and output of the results. Automation minimizes the scattering parameters and, by its simplification, is a great asset in routine work

  13. Automatic readout micrometer

    Science.gov (United States)

    Lauritzen, T.

    A measuring system is described for surveying and very accurately positioning objects with respect to a reference line. A principal use of this surveying system is for accurately aligning the electromagnets which direct a particle beam emitted from a particle accelerator. Prior art surveying systems require highly skilled surveyors. Prior art systems include, for example, optical surveying systems which are susceptible to operator reading errors, and celestial navigation-type surveying systems, with their inherent complexities. The present invention provides an automatic readout micrometer which can very accurately measure distances. The invention has a simplicity of operation which practically eliminates the possibility of operator optical reading error, owing to the elimination of traditional optical alignments for making measurements. The invention has an extendable arm which carries a laser surveying target. The extendable arm can be continuously positioned over its entire length of travel by either a coarse or fine adjustment, without having the fine adjustment outrun the coarse adjustment, until a reference laser beam is centered on the target as indicated by a digital readout. The length of the micrometer can then be accurately and automatically read by a computer and compared with a standardized set of alignment measurements. Due to its construction, the micrometer eliminates any errors due to temperature changes when the system is operated within a standard operating temperature range.

  14. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    Lattin, Kenneth R.

    1978-01-01

    United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand and shoe monitors. In addition, this prototype system also has a number of new features, including: micro computer control and readout, nineteen large area gas flow detectors, real-time background compensation, self-checking for system failures, and card reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company controlled boundaries. Investigation of the commercially available portal and hand and shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but makes it difficult to bypass. (author)

  15. Conflicts versus analytical redundancy relations: a comparative analysis of the model based diagnosis approach from the artificial intelligence and automatic control perspectives.

    Science.gov (United States)

    Cordier, Marie-Odile; Dague, Philippe; Lévy, François; Montmain, Jacky; Staroswiecki, Marcel; Travé-Massuyès, Louise

    2004-10-01

    Two distinct and parallel research communities have been working along the lines of the model-based diagnosis approach: the fault detection and isolation (FDI) community and the diagnostic (DX) community that have evolved in the fields of automatic control and artificial intelligence, respectively. This paper clarifies and links the concepts and assumptions that underlie the FDI analytical redundancy approach and the DX consistency-based logical approach. A formal framework is proposed in order to compare the two approaches and the theoretical proof of their equivalence together with the necessary and sufficient conditions is provided.

  16. On-line dynamic fractionation and automatic determination of inorganic phosphorus in environmental solid substrates exploiting sequential injection microcolumn extraction and flow injection analysis

    DEFF Research Database (Denmark)

    Buanuam, Janya; Miró, Manuel; Hansen, Elo Harald

    2006-01-01

    associations for phosphorus, that is, exchangeable, Al- and Fe-bound and Ca-bound fractions, were elucidated by accommodation in the flow manifold of the 3 steps of the Hieltjes-Lijklema (HL) scheme involving the use of 1.0 M NH4Cl, 0.1 M NaOH and 0.5 M HCl, respectively, as sequential leaching reagents.... The precise timing and versatility of SI for tailoring various operational extraction modes were utilised for investigating the extractability and extent of phosphorus re-distribution for variable partitioning times. Automatic spectrophotometric determination of soluble reactive phosphorus in soil extracts...

  17. MTA 1527-2000: A fast automatic analyzer for the analysis of mineral raw materials and products of the silicate industry

    International Nuclear Information System (INIS)

    Horvath, H.; Renner, J.; Siklos, A.

    1982-01-01

    A new automatic analytical system designed and constructed for fast measurements on the spot is described. The system can be applied in cars, ships or galleries of mines. It consists of two independent analyzer systems: neutron activation and X-ray fluorescence analyzers. The main purpose of the design was to determine Al, Si, Ca and Fe oxides in bauxite, alumina, clays and in products of the silicate industry, mainly in cements. The measuring and data evaluation processes are fully automated. It can be used for the continuous monitoring and control of cement production. (D.Gy.)

  18. Reachability Games on Automatic Graphs

    Science.gov (United States)

    Neider, Daniel

    In this work we study two-person reachability games on finite and infinite automatic graphs. For the finite case we empirically show that automatic game encodings are competitive with well-known symbolic techniques such as BDDs, SAT and QBF formulas. For the infinite case we present a novel algorithm utilizing algorithmic learning techniques, which makes it possible to solve huge classes of automatic reachability games.

  19. Automatic reactor protection system tester

    International Nuclear Information System (INIS)

    Deliant, J.D.; Jahnke, S.; Raimondo, E.

    1988-01-01

    The object of this paper is to present the automatic tester of reactor protection systems designed and developed by EDF and Framatome. The following points are discussed in order: the necessity for reactor protection system testing; the drawbacks of manual testing; the description and use of the Framatome automatic tester; on-site installation of this system; and the positive results obtained using the Framatome automatic tester in France

  20. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples
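
    For concreteness, one common two-dimensional generalization of the Thue-Morse sequence marks (i, j) according to the parity of the combined binary digit sums of i and j; whether this matches the authors' exact construction is an assumption:

    ```python
    import numpy as np

    def thue_morse_2d(n):
        """2D Thue-Morse set on {0,...,n-1}^2: keep (i, j) when the combined
        number of 1-bits in the binary expansions of i and j is even."""
        ones = np.array([bin(i).count("1") for i in range(n)])
        return (ones[:, None] + ones[None, :]) % 2 == 0

    print(thue_morse_2d(8).astype(int))
    ```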

  1. Automatic classification of written descriptions by healthy adults: An overview of the application of natural language processing and machine learning techniques to clinical discourse analysis.

    Science.gov (United States)

    Toledo, Cíntia Matsuda; Cunha, Andre; Scarton, Carolina; Aluísio, Sandra

    2014-01-01

    Discourse production is an important aspect in the evaluation of brain-injured individuals. We believe that studies comparing the performance of brain-injured subjects with that of healthy controls must use groups with compatible education. A pioneering application of machine learning methods using Brazilian Portuguese for clinical purposes is described, highlighting education as an important variable in the Brazilian scenario. The aims were to describe how to: (i) develop machine learning classifiers using features generated by natural language processing tools to distinguish descriptions produced by healthy individuals into classes based on their years of education; and (ii) automatically identify the features that best distinguish the groups. The approach proposed here extracts linguistic features automatically from the written descriptions with the aid of two Natural Language Processing tools: Coh-Metrix-Port and AIC. It also includes nine task-specific features (three new ones, two extracted manually, besides description time; type of scene described - simple or complex; presentation order - which type of picture was described first; and age). In this study, the descriptions by 144 of the subjects studied in Toledo [18] were used, a study which included 200 healthy Brazilians of both genders. A Support Vector Machine (SVM) with a radial basis function (RBF) kernel is the most recommended approach for the binary classification of our data, classifying three of the four initial classes. CfsSubsetEval (CFS) is a strong candidate to replace manual feature selection methods.
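
    A sketch of the pipeline the abstract implies, feature selection followed by an RBF-kernel SVM, with random placeholder features and sklearn's SelectKBest standing in for Weka's CfsSubsetEval:

    ```python
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import Pipeline
    from sklearn.svm import SVC

    # One row per written description; columns would be Coh-Metrix-Port / task
    # features. Random placeholders keep the sketch self-contained.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(144, 40))
    y = rng.integers(0, 2, size=144)              # binary education-level classes

    pipe = Pipeline([
        ("select", SelectKBest(f_classif, k=10)), # stand-in for CFS selection
        ("svm", SVC(kernel="rbf")),
    ])
    print("cv accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
    ```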

  2. Automatic classification of written descriptions by healthy adults: An overview of the application of natural language processing and machine learning techniques to clinical discourse analysis

    Directory of Open Access Journals (Sweden)

    Cíntia Matsuda Toledo

    Full Text Available Discourse production is an important aspect in the evaluation of brain-injured individuals. We believe that studies comparing the performance of brain-injured subjects with that of healthy controls must use groups with compatible education. A pioneering application of machine learning methods using Brazilian Portuguese for clinical purposes is described, highlighting education as an important variable in the Brazilian scenario. OBJECTIVE: The aims were to describe how to: (i) develop machine learning classifiers using features generated by natural language processing tools to distinguish descriptions produced by healthy individuals into classes based on their years of education; and (ii) automatically identify the features that best distinguish the groups. METHODS: The approach proposed here extracts linguistic features automatically from the written descriptions with the aid of two Natural Language Processing tools: Coh-Metrix-Port and AIC. It also includes nine task-specific features (three new ones, two extracted manually, besides description time; type of scene described - simple or complex; presentation order - which type of picture was described first; and age). In this study, the descriptions by 144 of the subjects studied in Toledo [18] were used, a study which included 200 healthy Brazilians of both genders. RESULTS AND CONCLUSION: A Support Vector Machine (SVM) with a radial basis function (RBF) kernel is the most recommended approach for the binary classification of our data, classifying three of the four initial classes. CfsSubsetEval (CFS) is a strong candidate to replace manual feature selection methods.

  3. AUTOMATIC ARCHITECTURAL STYLE RECOGNITION

    Directory of Open Access Journals (Sweden)

    M. Mathias

    2012-09-01

    Full Text Available Procedural modeling has proven to be a very valuable tool in the field of architecture. In the last few years, research has surged toward automatically creating procedural models from images. However, current algorithms for this process of inverse procedural modeling rely on the assumption that the building style is known. So far, the determination of the building style has remained a manual task. In this paper, we propose an algorithm which automates this process through classification of architectural styles from facade images. Our classifier first identifies the images containing buildings, then separates individual facades within an image and determines the building style. This information could then be used to initialize the building reconstruction process. We have trained our classifier to distinguish between several distinct architectural styles, namely Flemish Renaissance, Haussmannian and Neoclassical. Finally, we demonstrate our approach on various street-side images.

  4. Optical Automatic Car Identification (OACI) : Volume 1. Advanced System Specification.

    Science.gov (United States)

    1978-12-01

    A performance specification is provided in this report for an Optical Automatic Car Identification (OACI) scanner system which features 6% improved readability over existing industry scanner systems. It also includes the analysis and rationale which ...

  5. Automatic alkaloid removal system.

    Science.gov (United States)

    Yahaya, Muhammad Rizuwan; Hj Razali, Mohd Hudzari; Abu Bakar, Che Abdullah; Ismail, Wan Ishak Wan; Muda, Wan Musa Wan; Mat, Nashriyah; Zakaria, Abd

    2014-01-01

    This automated alkaloid removal machine was developed at the Instrumentation Laboratory, Universiti Sultan Zainal Abidin, Malaysia, purposely for removing alkaloid toxicity from Dioscorea hispida (DH) tubers. DH is a poisonous plant; scientific study has shown that its tubers contain a toxic alkaloid constituent, dioscorine, and the tubers can only be consumed after the poison is removed. In this experiment, the tubers need to be blended into powder form before being inserted into the machine basket. The user pushes the START button on the machine controller to switch the water pump ON, thereby creating a turbulent wave of water in the machine tank. The water stops automatically by triggering the outlet solenoid valve. The tuber powder is washed for 10 minutes while 1 liter of water contaminated with the toxin mixture flows out. At this point, the controller automatically triggers the inlet solenoid valve, and new water flows into the machine tank until it reaches the desired level, as determined by an ultrasonic sensor. This process is repeated for 7 h, after which a positive result is achieved, shown to be significant according to several biological parameters: pH, temperature, dissolved oxygen, turbidity, conductivity and fish survival rate or time. These parameters also show a positive result that is near or identical to the control water, and the toxin is assumed to be fully removed when the pH of the DH powder wash is near that of the control water. For the control water, the pH is about 5.3, while the water from this experimental process is 6.0; before running the machine, the pH of the contaminated water is about 3.8, which is too acidic. This automated machine saves time in removing toxicity from DH compared with the traditional method, while requiring less observation by the user.

  6. Programmable automatic alpha--beta air sample counter

    International Nuclear Information System (INIS)

    Howell, W.P.

    1978-01-01

    A programmable automatic alpha-beta air sample counter was developed for routine sample counting by operational health physics personnel. The system is composed of an automatic sample changer utilizing a large silicon diode detector, an electronic counting system with energy analysis capability, an automatic data acquisition controller, an interface module, and a teletypewriter with paper tape punch and paper tape reader. The system is operated through the teletypewriter keyboard and the paper tape reader, which are used to instruct the automatic data acquisition controller. Paper tape programs are provided for background counting, the chi-squared test, and sample counting. Output data are printed by the teletypewriter on standard continuous roll or multifold paper. Data are automatically corrected for background and counter efficiency

  7. Automatic extraction of left ventricle in SPECT myocardial perfusion imaging

    International Nuclear Information System (INIS)

    Liu Li; Zhao Shujun; Yao Zhiming; Wang Daoyu

    1999-01-01

    An automatic method of extracting the left ventricle from SPECT myocardial perfusion data is introduced. The method is based on least-squares analysis of the positions of all short-axis slice pixels from a half sphere-cylinder myocardial model, and uses an iterative reconstruction technique to automatically cut off non-left-ventricular tissue from the perfusion images. This technique thereby provides the basis for further quantitative analysis

  8. Automatic TLI recognition system, general description

    Energy Technology Data Exchange (ETDEWEB)

    Lassahn, G.D.

    1997-02-01

    This report is a general description of an automatic target recognition system developed at the Idaho National Engineering Laboratory for the Department of Energy. A user's manual is a separate volume, Automatic TLI Recognition System, User's Guide, and a programmer's manual is Automatic TLI Recognition System, Programmer's Guide. This system was designed as an automatic target recognition system for fast screening of large amounts of multi-sensor image data, based on low-cost parallel processors. This system naturally incorporates image data fusion, and it gives uncertainty estimates. It is relatively low cost, compact, and transportable. The software is easily enhanced to expand the system's capabilities, and the hardware is easily expandable to increase the system's speed. In addition to its primary function as a trainable target recognition system, this is also a versatile, general-purpose tool for image manipulation and analysis, which can be either keyboard-driven or script-driven. This report includes descriptions of three variants of the computer hardware, a description of the mathematical basis of the training process, and a description with examples of the system capabilities.

  9. Classifying visemes for automatic lipreading

    NARCIS (Netherlands)

    Visser, Michiel; Poel, Mannes; Nijholt, Antinus; Matousek, Vaclav; Mautner, Pavel; Ocelikovi, Jana; Sojka, Petr

    1999-01-01

    Automatic lipreading is automatic speech recognition that uses only visual information. The relevant data in a video signal is isolated and features are extracted from it. From a sequence of feature vectors, where every vector represents one video image, a sequence of higher level semantic elements

  10. A comparative analysis of DBSCAN, K-means, and quadratic variation algorithms for automatic identification of swallows from swallowing accelerometry signals.

    Science.gov (United States)

    Dudik, Joshua M; Kurosu, Atsuko; Coyle, James L; Sejdić, Ervin

    2015-04-01

    Cervical auscultation with high resolution sensors is currently under consideration as a method of automatically screening for specific swallowing abnormalities. To be clinically useful without human involvement, any devices based on cervical auscultation should be able to detect specified swallowing events in an automatic manner. In this paper, we comparatively analyze the density-based spatial clustering of applications with noise algorithm (DBSCAN), a k-means based algorithm, and an algorithm based on quadratic variation as methods of differentiating periods of swallowing activity from periods of time without swallows. These algorithms utilized swallowing vibration data exclusively, and the results were compared to a gold-standard measure of swallowing duration. Data were collected from 23 subjects who were actively suffering from swallowing difficulties. Comparing the performance of the DBSCAN algorithm with a proven segmentation algorithm that utilizes k-means clustering demonstrated that the DBSCAN algorithm had a higher sensitivity and correctly segmented more swallows. Comparing its performance with a threshold-based algorithm that utilized the quadratic variation of the signal showed that the DBSCAN algorithm offered no direct increase in performance. However, it offered several other benefits, including a faster run time and more consistent performance between patients. All algorithms showed noticeable differentiation from the endpoints provided by a videofluoroscopy examination as well as reduced sensitivity. In summary, we showed that the DBSCAN algorithm is a viable method for detecting the occurrence of a swallowing event using cervical auscultation signals, but significant work must be done to improve its performance before it can be implemented in an unsupervised manner. Copyright © 2015 Elsevier Ltd. All rights reserved.
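
    A minimal sketch of DBSCAN-based swallow segmentation: gate the accelerometry signal by amplitude, then cluster the time stamps of the surviving samples so that each dense burst becomes one candidate swallow. The sampling rate, threshold, and eps values are illustrative assumptions:

    ```python
    import numpy as np
    from sklearn.cluster import DBSCAN

    fs = 1000.0                                     # assumed sampling rate (Hz)
    rng = np.random.default_rng(0)
    sig = 0.02 * rng.standard_normal(30 * int(fs))  # placeholder accelerometry
    sig[5000:6500] += 0.5 * rng.standard_normal(1500)   # one synthetic "swallow"

    # Keep only high-amplitude samples, then cluster their time indices.
    active = np.flatnonzero(np.abs(sig) > 0.15)     # amplitude gate (per recording)
    labels = DBSCAN(eps=0.05 * fs, min_samples=50).fit_predict(
        active.reshape(-1, 1).astype(float))

    for k in sorted(set(labels) - {-1}):
        seg = active[labels == k]
        print(f"candidate swallow {k}: {seg.min() / fs:.2f}-{seg.max() / fs:.2f} s")
    ```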

  11. Automatic identification of agricultural terraces through object-oriented analysis of very high resolution DSMs and multispectral imagery obtained from an unmanned aerial vehicle.

    Science.gov (United States)

    Diaz-Varela, R A; Zarco-Tejada, P J; Angileri, V; Loudjani, P

    2014-02-15

    Agricultural terraces are features that provide a number of ecosystem services. As a result, their maintenance is supported by measures established by the European Common Agricultural Policy (CAP). In the framework of CAP implementation and monitoring, there is a current and future need for the development of robust, repeatable and cost-effective methodologies for the automatic identification and monitoring of these features at farm scale. This is a complex task, particularly when terraces are associated to complex vegetation cover patterns, as happens with permanent crops (e.g. olive trees). In this study we present a novel methodology for automatic and cost-efficient identification of terraces using only imagery from commercial off-the-shelf (COTS) cameras on board unmanned aerial vehicles (UAVs). Using state-of-the-art computer vision techniques, we generated orthoimagery and digital surface models (DSMs) at 11 cm spatial resolution with low user intervention. In a second stage, these data were used to identify terraces using a multi-scale object-oriented classification method. Results show the potential of this method even in highly complex agricultural areas, both regarding DSM reconstruction and image classification. The UAV-derived DSM had a root mean square error (RMSE) lower than 0.5 m when the height of the terraces was assessed against field GPS data. The subsequent automated terrace classification yielded an overall accuracy of 90% based exclusively on spectral and elevation data derived from the UAV imagery. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. IJAAAR vol 4 2007

    African Journals Online (AJOL)

    2011

    Impact of Non-governmental Organizations (NGOs) on Rural Poverty Alleviation in ... The incidence, depth and severity of poverty of rural people, and the factors influencing rural poverty, were investigated in the ... [Table footnote: this is equivalent to $386.84 at an average exchange rate of N141.60 to $1 at the Parallel Market. Source: Data analysis.]

  13. Agrosearch 1 Vol. 1

    African Journals Online (AJOL)

    are kept outside of the formal banking system and are often lent to group members to finance emergencies and other expenses. ... [Table residue: coefficients obtained from multiple regression analysis, listing unstandardized coefficients (B, standard error), standardized coefficients (beta), t and significance values.]

  14. IJAAAR vol 4 2007

    African Journals Online (AJOL)

    2011

    produce marketing that engages rural women, who form the majority of the vulnerable ... ceremonies like naming, marriage, funeral and special thanksgiving. The mean socio- ... [Table footnote: this is equivalent to $386.84 at an average exchange rate of N141.60 to $1 at the Parallel Market. Source: Data analysis, 2004.]

  15. IJAAAR 2011 VOL 7

    African Journals Online (AJOL)

    Faculty of Agricultural Sciences Lautech Ogbomoso

    In improving indigenous sesame seed yield, selection for number of capsules and seeds per capsule is reliable, as these traits recorded the highest selection index using heritability and genetic advance parameters. Key words: Variation, selection index, indigenous sesame, year effect, path analysis.

  16. Automatic Classification of Attacks on IP Telephony

    Directory of Open Access Journals (Sweden)

    Jakub Safarik

    2013-01-01

    Full Text Available This article proposes an algorithm for automatic analysis of attack data in an IP telephony network with a neural network. Data for the analysis are gathered from various monitoring applications running in the network. These monitoring systems are a typical part of today's networks, and information from them is usually used only after an attack. Automatic classification of IP telephony attacks makes near-real-time classification possible, enabling counter-attack or mitigation of potential attacks. The classification uses the proposed neural network, and the article covers the design of the neural network and its practical implementation. It also contains methods for neural network learning and data-gathering functions from a honeypot application.
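
    A sketch of the classification stage under stated assumptions: a small feed-forward network over per-event features. The feature set and class labels below are invented for illustration, not taken from the article:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # One row per event from the monitoring/honeypot logs; columns are
    # hypothetical features (request rate, distinct URIs, failed registrations...).
    rng = np.random.default_rng(7)
    X = rng.normal(size=(500, 6))
    y = rng.integers(0, 4, size=500)    # e.g. scan / flood / SPIT / fraud classes

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=7)
    clf.fit(X, y)
    print(clf.predict(X[:5]))           # near-real-time labels for new events
    ```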

  17. Automatic classification of defects in weld pipe

    International Nuclear Information System (INIS)

    Anuar Mikdad Muad; Mohd Ashhar Hj Khalid; Abdul Aziz Mohamad; Abu Bakar Mhd Ghazali; Abdul Razak Hamzah

    2000-01-01

    With the advancement of computer imaging technology, the image on hard radiographic film can be digitized and stored in a computer, and the manual process of defect recognition and classification may be replaced by the computer. In this paper, a computerized method for automatic detection and classification of common defects in film radiography of weld pipe is described. The detection and classification processes consist of automatic selection of the area of interest on the image, followed by classification of common defects using image processing and special algorithms. Analysis of the attributes of each defect, such as area, size, shape and orientation, is carried out by the feature analysis process. These attributes reveal the type of each defect. This method of defect classification results in a high success rate. Our experience showed that sharp film images produced better results

  18. An automatic versatile system integrating solid-phase extraction with ultra-high performance liquid chromatography-tandem mass spectrometry using a dual-dilution strategy for direct analysis of auxins in plant extracts.

    Science.gov (United States)

    Zhong, Qisheng; Qiu, Xiongxiong; Lin, Caiyong; Shen, Lingling; Huo, Yin; Zhan, Song; Yao, Jinting; Huang, Taohong; Kawano, Shin-ichi; Hashi, Yuki; Xiao, Langtao; Zhou, Ting

    2014-09-12

    An automatic versatile system which integrates solid-phase extraction (SPE) with ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) was developed. Diverse commercial SPE columns can be used under ambient pressure in this online system, realized by a dual-dilution strategy. The first dilution enables the direct injection of complex samples with minimal pretreatment, and the second dilution realizes direct introduction of a large volume of strong eluent into the UHPLC column without causing peak broadening or distortion. In addition, a post-column compensation mode was also designed for matrix-effects evaluation. The features of the online system were systematically investigated, including the dilution effect, the capture of desorption solution, the column-head stacking effect and the system recovery. Compared with the offline UHPLC system, this online system showed significant advantages such as larger injection volume, higher sensitivity, shorter analysis time and better repeatability. The feasibility of the system was demonstrated by the direct analysis of three auxins from different plant tissues, including leaves of Dracaena sanderiana, and buds and petals of Bauhinia. Under the optimized conditions, the whole analysis procedure took only 7 min. All the correlation coefficients were greater than 0.9987, and the limits of detection and quantitation were in the range of 0.560-0.800 ng/g and 1.80-2.60 ng/g, respectively. The recoveries of the real samples ranged from 61.0 to 117%. Finally, the post-column compensation mode was applied, and no matrix effects were observed under the analysis conditions. The automatic versatile system is rapid, sensitive and reliable. We expect this system could be extended to other target analytes in complex samples utilizing diverse SPE columns. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Automatic exposure for xeromammography

    International Nuclear Information System (INIS)

    Aichinger, H.

    1977-01-01

    During mammography without intensifying screens, exposure measurements are carried out behind the film. It is, however, difficult to construct an absolutely shadow-free ionization chamber of adequate sensitivity working in the necessary range of 25 to 50 kV. Repeated attempts have been made to utilize the advantages of automatic exposure for xero-mammography. In this case also, the ionization chamber was placed behind the Xerox plate. Depending on tube filtration, object thickness and tube voltage, more than 80%, sometimes even 90%, of the radiation is absorbed by the Xerox plate. In particular, the characteristic Mo radiation of 17.4 keV and 19.6 keV is almost totally absorbed by the plate and cannot therefore be registered by the ionization chamber. This results in a considerable dependence of the exposure on kV and object thickness. The dependence on tube voltage and object thickness has been examined dosimetrically and spectroscopically with a Ge(Li) spectrometer. Finally, the successful use of a shadow-free chamber is described; this has been particularly adapted for xero-mammography and is placed in front of the plate. (orig.)

  20. Historical Review and Perspective on Automatic Journalizing

    OpenAIRE

    Kato, Masaki

    2017-01-01

    Contents: Introduction; 1. EDP Accounting and Automatic Journalizing; 2. Learning System of Automatic Journalizing; 3. Automatic Journalizing by Artificial Intelligence; 4. Direction of the Progress of the Accounting Information System

  1. Electronic amplifiers for automatic compensators

    CERN Document Server

    Polonnikov, D Ye

    1965-01-01

    Electronic Amplifiers for Automatic Compensators presents the design and operation of electronic amplifiers for use in automatic control and measuring systems. This book is composed of eight chapters that consider the problems of constructing input and output circuits of amplifiers, suppression of interference, and ensuring high sensitivity. This work begins with a survey of the operating principles of electronic amplifiers in automatic compensator systems. The succeeding chapters deal with circuit selection and the calculation and determination of the principal characteristics of amplifiers, as

  2. Morphological criteria of feminine upper eyelashes, quantified by a new semi-automatized image analysis: Application to the objective assessment of mascaras.

    Science.gov (United States)

    Shaiek, A; Flament, F; François, G; Vicic, M; Cointereau-Chardron, S; Curval, E; Canevet-Zaida, S; Coubard, O; Idelcaid, Y

    2018-02-01

    The wide diversity of feminine eyelashes in shape, length, and curvature makes them a complex domain that remains to be quantified in vivo, together with the changes brought by application of mascaras, which are visually assessed by women themselves or by make-up experts. Dedicated software was developed to semi-automatically extract and quantify, from digital images (frontal and lateral pictures), the major parameters of the feminine eyelashes of Mexican and Caucasian women, and to record the changes brought by the application of various mascaras and their brushes, whether self-applied or professionally applied. The diversity of feminine eyelashes appears to be a major influencing factor in the application of mascaras and their related results. Eight marketed mascaras and their respective brushes were tested, and their quantitative profiles in terms of coverage, morphology, and curvature were assessed. Standard applications by trained aestheticians led to higher and more homogeneous deposits of mascara, as compared to those resulting from self-applications. The developed software appears to be a valuable tool both for quantifying the major characteristics of eyelashes and for assessing the make-up results brought by mascaras and their associated brushes. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. Analysis of the relationship of automatically and manually extracted lineaments from DEM and geologically mapped tectonic faults around the Main Ethiopian Rift and the Ethiopian Highlands, Ethiopia

    Directory of Open Access Journals (Sweden)

    Michal Kusák

    2017-02-01

    Full Text Available The paper deals with the functions that automatically extract lineaments from the 90 m Shuttle Radar Topographic Mission (SRTM) Digital Elevation Model (DEM) (Consortium for Spatial Information 2014) in the software ArcGIS 10.1 and PCI Geomatica. They were performed for the Main Ethiopian Rift and the Ethiopian Highlands (transregional scale, 1,060,000 km2), which are among the tectonically most active areas in the world. The values of the input parameters – RADI (filter radius value), GTHR (edge gradient threshold), LTHR (curve length), FTHR (line fitting error), ATHR (angular difference), and DTHR (linked distance threshold) – and their influence on the final shape and number of lineaments are discussed. A map of automatically extracted lineaments was created and compared with (1) the tectonic faults on the geological map by the Geological Survey of Ethiopia (Mangesha et al. 1996) and (2) the lineaments based on visual interpretation by the author from the same data set. The predominant azimuth of lineaments is similar to the azimuth of the faults on the geological map. The comparison of lineaments by automated visualization in GIS and visual interpretation of lineaments carried out by the authors around the Jemma River Basin (regional scale, 16,000 km2) proved that both sets of lineaments have the same NE–SW azimuth, which is the orientation of the rift. However, lineament mapping by automated visualization in GIS identifies a larger number of shorter lineaments than visual interpretation.
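
    The PCI Geomatica LINE algorithm itself is proprietary, but an analogous edge-then-link pipeline can be sketched with open tools; here the Canny sigma plays a role similar to GTHR, line_length to LTHR, and line_gap to DTHR (an analogy, not the actual implementation):

    ```python
    import numpy as np
    from skimage.feature import canny
    from skimage.transform import probabilistic_hough_line

    rng = np.random.default_rng(0)
    dem = rng.normal(size=(200, 200))   # placeholder for an SRTM-derived surface
    dem[:, 100] += 5.0                  # one synthetic linear feature

    edges = canny(dem, sigma=2.0)       # edge gating (GTHR analogue)
    lines = probabilistic_hough_line(
        edges,
        threshold=10,
        line_length=25,                 # minimum length (LTHR analogue)
        line_gap=3)                     # linking distance (DTHR analogue)
    print(len(lines), "candidate lineaments")
    ```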

  4. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with ²³⁸Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher-energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  5. Clothes Dryer Automatic Termination Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  6. Automatic segmentation of the colon

    Science.gov (United States)

    Wyatt, Christopher L.; Ge, Yaorong; Vining, David J.

    1999-05-01

    Virtual colonoscopy is a minimally invasive technique that enables detection of colorectal polyps and cancer. Normally, a patient's bowel is prepared with colonic lavage and gas insufflation prior to computed tomography (CT) scanning. An important step for 3D analysis of the image volume is segmentation of the colon. The high-contrast gas/tissue interface that exists in the colon lumen makes segmentation of the majority of the colon relatively easy; however, two factors inhibit automatic segmentation of the entire colon. First, the colon is not the only gas-filled organ in the data volume: the lungs, small bowel, and stomach also meet this criterion. User-defined seed points placed in the colon lumen have previously been required to spatially isolate only the colon. Second, portions of the colon lumen may be obstructed by peristalsis, large masses, and/or residual feces. These complicating factors require increased user interaction during the segmentation process to isolate additional colon segments. To automate the segmentation of the colon, we have developed a method to locate seed points and segment the gas-filled lumen with no user supervision. We have also developed an automated approach to improve lumen segmentation by digitally removing residual contrast-enhanced fluid resulting from a new bowel preparation that liquefies and opacifies any residual feces.
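
    A simplified sketch of the thresholding-and-seeding idea, assuming Hounsfield-unit data: label connected sub-threshold (gas) components, keep the large ones as candidate colon segments, and take their centroids as automatically located seed points. Separating the colon from the lungs or stomach would need the additional anatomical rules described above:

    ```python
    import numpy as np
    from scipy import ndimage

    def segment_gas(volume_hu, air_thresh=-800, min_voxels=1000):
        """Label connected gas-filled regions and return the large components
        with automatically derived seed points (component centroids)."""
        gas = volume_hu < air_thresh                    # gas/tissue interface
        labels, n = ndimage.label(gas)
        sizes = ndimage.sum(gas, labels, index=range(1, n + 1))
        keep = [i + 1 for i, s in enumerate(sizes) if s >= min_voxels]
        seeds = ndimage.center_of_mass(gas, labels, keep)
        return labels, keep, seeds

    vol = np.full((64, 64, 64), 40.0)                   # placeholder tissue (HU)
    vol[20:40, 20:40, 20:40] = -1000.0                  # synthetic gas pocket
    labels, keep, seeds = segment_gas(vol)
    print(keep, seeds)
    ```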

  7. Semi-Automatic Science Workflow Synthesis for High-End Computing on the NASA Earth Exchange

    Data.gov (United States)

    National Aeronautics and Space Administration — Enhance capabilities for collaborative data analysis and modeling in Earth sciences. Develop components for automatic workflow capture, archiving and management....

  8. Research and implementation of software automatic test

    Science.gov (United States)

    Li-hong, LIAN

    2017-06-01

    With the fast development of IT technology, software is increasingly complex and large. Development teams of hundreds of people, thousands of modules and interfaces, and users spread across geographies and systems are no longer unusual. All of this places higher requirements on software testing. Due to its low implementation cost and the effective inheritance and accumulation of test assets, automated software testing has gradually become one of the important means for IT enterprises to ensure software quality. This paper analyzes the advantages of automated testing and common misconceptions about it; identifies unsuitable application scenarios and the best time to intervene; focuses on analyzing the feasibility of interface-level automated testing; and puts forward the functions and elements that interface automation test tools should have, providing a reference for tool selection or custom development in large-scale project interface automated testing.

  9. Semi-automatic drawings surveying system

    International Nuclear Information System (INIS)

    Andriamampianina, Lala

    1983-01-01

    A system for the semi-automatic survey of drawings is presented. Its design is oriented toward reducing the stored information required for drawing reproduction. The equipment consists mainly of a plotter driven by a micro-computer, with the pen of the plotter replaced by a circular photodiode array. Line drawings are first viewed as a concatenation of vectors, with a constant angle between two successive vectors, and then divided into arcs of circles and line segments. A dynamic analysis of line intersections with the circular sensor makes it possible to identify starting points and end points of a line, for the purpose of automatically following connected lines in a drawing. The advantage of the method described is that precision depends practically only on the plotter performance, the sensor resolution being relevant only for the thickness of strokes and the distance between two strokes. (author)

  10. Development of a System for Automatic Recognition of Speech

    Directory of Open Access Journals (Sweden)

    Roman Jarina

    2003-01-01

    Full Text Available The article gives a review of research on the processing and automatic recognition of speech signals (ARR) at the Department of Telecommunications of the Faculty of Electrical Engineering, University of Žilina. On-going research is oriented to speech parametrization using 2-dimensional cepstral analysis, and to the application of HMMs and neural networks for speech recognition in the Slovak language. The article summarizes achieved results and outlines the future orientation of our research in automatic speech recognition.

  11. A Flexible Dynamic System for Automatic Grading of Programming Exercises

    OpenAIRE

    Fonte, Daniela; Cruz, Daniela da; Gançarski, Alda Lopes; Henriques, Pedro Rangel

    2013-01-01

    The research on programs capable of automatically grading source code has been a subject of great interest to many researchers. Automatic Grading Systems (AGS) were born to support programming courses and gained popularity due to their ability to assess, evaluate, grade and manage students' programming exercises, saving teachers from this manual task. This paper discusses semantic analysis techniques and how they can be applied to improve the validation and assessment pr...

  12. Journal of EEA, Vol. 27, 2010 COMPUTATIONALLY EFFICIENT ...

    African Journals Online (AJOL)

    CBC

    Concentrated and uniformly-distributed loads have been ... L − l1 − l2 = 0, l1 ≠ 0, l2 ≠ 0. ANALYSIS OF STRUCTURES WITH NON-PRISMATIC MEMBERS. In implementing the stiffness method of structural analysis [1], member stiffness terms and matrices ... Automatic generation of generalized member-end actions in ...

  13. Flying Qualities (Qualites de Vol)

    Science.gov (United States)

    1991-02-01

    ...de Vol Électriques [Electric Flight Controls]: The Airbus A320 Experience, by J. Farineau and X. Le Tron; MIL-STD-1797 is Not a Cookbook, by D.B. Leggett and G.T. Black. ... Sideslip excursion in the dutch-roll mode, the major consequence of which is its non-oscillatory behaviour. When dipole cancellation does not occur, lateral ... a single dipole pair in each axis is near optimal, interaxis crosstalk in the closed-loop pilot-vehicle system is minimized, etc. Just as the zero

  14. A Level 1+ Probabilistic Safety Assessment of the high flux Australian reactor. Vol. 2. Appendix C: System analysis models and results

    International Nuclear Information System (INIS)

    1998-01-01

    This section contains the results of the quantitative system/top event analysis. Section C.1 gives the basic event coding scheme. Section C.2 shows the master frequency file (MFF), which contains the split fraction names, the top events they belong to, the mean values of the uncertainty distribution generated by the Monte Carlo quantification in the System Analysis module of RISKMAN, and a brief description of each split fraction. The MFF is organized by the systems modeled and, within each system, the top events associated with the system. Section C.3 contains the fault trees developed for the system/top event models and the RISKMAN reports for each of the system/top event models. The reports are organized under the following system headings: Compressed/Service Air Supply (AIR); Containment Isolation System (CIS); Heavy Water Cooling System (D2O); Emergency Core Cooling System (ECCS); Electric Power System (EPS); Light Water Cooling System (H2O); Helium Gas System (HE); Mains Water System (MW); Miscellaneous Top Events (MISC); Operator Actions (OPER); Reactor Protection System (RPS); Space Conditioner System (SCS); Condition/Status Switch (SWITCH); RCB Ventilation System (VENT); No. 1 Storage Block Cooling System (SB)

  15. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

    Kim, Jae Hee; Eom, Heung Seop; Lee, Jae Cheol; Choi, Yoo Raek; Moon, Soon Seung

    2002-02-01

    Using recent technologies in mobile robotics and computer science, we developed an automatic inspection system for the weld lines of the reactor vessel. The ultrasonic inspection of the reactor pressure vessel is currently performed by commercialized robot manipulators. However, since the conventional fixed-type robot manipulator is very large, heavy and expensive, it requires a long inspection time and is hard to handle and maintain. In order to resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. According to our conceptual design, we developed the reactor inspection system including an underwater inspection robot, a laser position control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on a reactor vessel mockup and on the real reactor vessel prepared for Ulchin nuclear power plant unit 6 at Doosan Heavy Industries in Korea. After this project, we plan to commercialize our inspection system. Using this system, we can expect a substantial reduction in inspection time, performance enhancement, automatic management of inspection history, etc. From the economic point of view, we can also expect import substitution of more than 4 million dollars. The established essential technologies for intelligent control and automation are expected to be applied comprehensively to the automation of similar systems in nuclear power plants

  16. Preventing SQL Injection through Automatic Query Sanitization with ASSIST

    Directory of Open Access Journals (Sweden)

    Raymond Mui

    2010-09-01

    Full Text Available Web applications are becoming an essential part of our everyday lives. Many of our activities are dependent on the functionality and security of these applications. As the scale of these applications grows, injection vulnerabilities such as SQL injection are major security challenges for developers today. This paper presents the technique of automatic query sanitization to automatically remove SQL injection vulnerabilities in code. In our technique, a combination of static analysis and program transformation is used to automatically instrument web applications with sanitization code. We have implemented this technique in a tool named ASSIST (Automatic and Static SQL Injection Sanitization Tool) for protecting Java-based web applications. Our experimental evaluation showed that our technique is effective against SQL injection vulnerabilities and has a low overhead.
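
    ASSIST rewrites Java web applications; the before/after effect of query sanitization can nevertheless be illustrated in a few lines of Python with sqlite3 (an illustration of the concept, not ASSIST's output):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

    name = "alice' OR '1'='1"        # attacker-controlled input

    # Vulnerable: input concatenated into the query string.
    rows = conn.execute(
        "SELECT * FROM users WHERE name = '" + name + "'").fetchall()
    print("concatenated:  ", rows)   # injection succeeds, returns all rows

    # Sanitized: the same query with a bound parameter.
    rows = conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)).fetchall()
    print("parameterized: ", rows)   # [] -- input treated as data only
    ```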

  17. Terminal Sliding Mode Tracking Controller Design for Automatic Guided Vehicle

    Science.gov (United States)

    Chen, Hongbin

    2018-03-01

    Based on sliding mode variable structure control theory, the path tracking problem of the automatic guided vehicle is studied, and a controller design method based on terminal sliding mode is proposed. First, by analyzing the characteristics of automatic guided vehicle movement, the kinematics model is presented. Then, to improve the traditional expression of the terminal sliding mode, a nonlinear sliding mode whose convergence speed is faster than the former is designed; theoretical analysis verifies that the designed sliding mode is stable and converges in finite time. Finally, the Lyapunov method is used to design the tracking control law of the automatic guided vehicle; the controller makes the automatic guided vehicle track the desired trajectory in the global sense as well as in finite time. The simulation results verify the correctness and effectiveness of the control law.
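
    A minimal numerical sketch of the idea, using the widely cited nonsingular terminal sliding surface on a double integrator as a stand-in for the AGV's tracking-error dynamics (the paper's exact sliding mode and kinematics are not reproduced here):

    ```python
    import numpy as np

    # Nonsingular terminal sliding mode for e'' = u; p/q in (1, 2).
    beta, p, q, eta = 1.0, 7, 5, 4.0
    dt, e, de = 1e-3, 1.0, 0.0            # initial tracking error and its rate

    for _ in range(int(4.0 / dt)):
        s = e + (1.0 / beta) * np.abs(de) ** (p / q) * np.sign(de)   # surface
        u = (-beta * (q / p) * np.abs(de) ** (2 - p / q) * np.sign(de)
             - eta * np.sign(s))          # reaching law drives s -> 0 in finite time
        de += u * dt
        e += de * dt

    print(f"tracking error after 4 s: {e:.4f}")
    ```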

  18. An automatic composition model of Chinese folk music

    Science.gov (United States)

    Zheng, Xiaomei; Li, Dongyang; Wang, Lei; Shen, Lin; Gao, Yanyuan; Zhu, Yuanyuan

    2017-03-01

    Automatic composition has achieved rich results in recent decades, covering Western music and some other traditions. However, the automatic composition of Chinese music has received less attention. After thousands of years of development, Chinese folk music offers a wealth of resources, so designing an automatic composition model that learns the characteristics of Chinese folk melody and imitates the creative process of music is of some significance. According to the melodic features of Chinese folk music, a Chinese folk music composition method based on a Markov model is proposed to analyze Chinese traditional music. Folk songs with typical Chinese national characteristics are selected for analysis. In this paper, an example of automatic composition is given. The experimental results show that this composition model can produce music with the characteristics of Chinese folk music.
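
    A toy version of the approach: a first-order Markov chain over scale degrees, trained on a few hand-made pentatonic phrases (illustrative, not the paper's corpus) and then sampled to generate a melody:

    ```python
    import random
    from collections import defaultdict

    # Illustrative pentatonic phrases (scale degrees 1 2 3 5 6).
    phrases = [
        [1, 2, 3, 5, 6, 5, 3, 2, 1],
        [5, 6, 1, 2, 3, 2, 1, 6, 5],
        [3, 5, 6, 5, 3, 2, 1, 2, 3],
    ]

    # Count first-order transitions between successive notes.
    counts = defaultdict(lambda: defaultdict(int))
    for ph in phrases:
        for a, b in zip(ph, ph[1:]):
            counts[a][b] += 1

    def generate(start, length=12):
        """Sample a melody by walking the transition table."""
        note, melody = start, [start]
        for _ in range(length - 1):
            nxt, wts = zip(*counts[note].items())
            note = random.choices(nxt, weights=wts)[0]
            melody.append(note)
        return melody

    print(generate(1))
    ```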

  19. Automatic contact in DYNA3D for vehicle crashworthiness

    International Nuclear Information System (INIS)

    Whirley, R.G.; Engelmann, B.E.

    1994-01-01

    This paper presents a new formulation for the automatic definition and treatment of mechanical contact in explicit, nonlinear, finite element analysis. Automatic contact offers the benefits of significantly reduced model construction time and fewer opportunities for user error, but faces significant challenges in reliability and computational costs. The authors have used a new four-step automatic contact algorithm. Key aspects of the proposed method include (1) automatic identification of adjacent and opposite surfaces in the global search phase, and (2) the use of a smoothly varying surface normal that allows a consistent treatment of shell intersection and corner contact conditions without ad hoc rules. Three examples are given to illustrate the performance of the newly proposed algorithm in the public DYNA3D code

  20. Preliminary comparison with 40 CFR Part 191, Subpart B for the Waste Isolation Pilot Plant, December 1991. Vol. 4: Uncertainty and sensitivity analysis results

    International Nuclear Information System (INIS)

    Helton, J.C.; Garner, J.W.; Rechard, R.P.; Rudeen, D.K.; Swift, P.N.

    1992-04-01

    The most appropriate conceptual model for performance assessment at the Waste Isolation Pilot Plant (WIPP) is believed to include gas generation due to corrosion and microbial action in the repository and a dual-porosity (matrix and fracture porosity) representation for solute transport in the Culebra Dolomite Member of the Rustler Formation. Under these assumptions, complementary cumulative distribution functions (CCDFs) summarizing radionuclide releases to the accessible environment due to both cuttings removal and groundwater transport fall substantially below the release limits promulgated by the Environmental Protection Agency (EPA). This is the case even when the current estimates of the uncertainty in analysis inputs are incorporated into the performance assessment. The best-estimate performance-assessment results are dominated by cuttings removal. The releases to the accessible environment due to groundwater transport make very small contributions to the total release. The variability in the distribution of CCDFs that must be considered in comparisons with the EPA release limits is dominated by the variable LAMBDA (rate constant in the Poisson model for drilling intrusions). The variability in releases to the accessible environment due to individual drilling intrusions is dominated by DBDIAM (drill bit diameter). Most of the imprecisely known variables considered in the 1991 WIPP performance assessment relate to radionuclide releases to the accessible environment due to groundwater transport. For a single borehole (i.e., an E2-type scenario), whether or not a release from the repository to the Culebra even occurs is controlled by the variable SALPERM (Salado permeability), with no releases for small values (i.e., below about 10^-21 m^2) of this variable. When SALPERM is small, the repository never fills with brine and so there is no flow up an intruding borehole that can transport radionuclides to the Culebra. Further, releases that do reach the Culebra for larger values of
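
    The CCDF construction itself is easy to sketch: sample futures (drilling intrusions as a Poisson process over the 10,000-year regulatory period), total a hypothetical normalized release per future, and read off exceedance probabilities; all numbers below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    lam, T = 3.0e-4, 10_000           # illustrative drilling rate (1/yr), period (yr)

    n_futures = 100_000
    n_hits = rng.poisson(lam * T, size=n_futures)          # intrusions per future
    releases = np.array([rng.lognormal(-3.0, 1.0, k).sum() for k in n_hits])

    x = np.logspace(-4, 1, 50)                             # normalized release grid
    ccdf = (releases[None, :] > x[:, None]).mean(axis=1)   # P(release > x)
    print("P(R > 1) =", ccdf[np.searchsorted(x, 1.0)])     # vs. EPA limit point
    ```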

  1. Fully Automatic Cross-Associations

    Science.gov (United States)

    2004-08-01

    Chameleon [7], [8]; see also [9]. ... Several of the clustering methods might suffer from the dimensionality curse (like the ones that require a covariance matrix) ... clusters, irrespective of their size. Figure 5(b) shows GRANTS, which consists of NSF grant proposal abstracts in several disciplines, such as genetics ... [7] G. Karypis, E.-H. Han and V. Kumar, "Chameleon: Hierarchical clustering using dynamic modeling," IEEE Computer, vol. 32, no. 8, pp. 68-75, 1999. [8] A. Hinneburg and D. A. Keim, "An

  2. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics of the automatic beta counting system developed by Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier, then coupled to an amplifier. The amplifier output is discriminated and is the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data, and other information are punched out on a data card. The next sample to be counted is then automatically selected. The beta counter uses the same electronics as the prior count did, the only difference being the sample identification number and the sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card

  3. Automatic Melody Segmentation

    NARCIS (Netherlands)

    Rodríguez López, Marcelo

    2016-01-01

    The work presented in this dissertation investigates music segmentation. In the field of Musicology, segmentation refers to a score analysis technique, whereby notated pieces or passages of these pieces are divided into “units” referred to as sections, periods, phrases, and so on. Segmentation

  4. An automatic micro-sequential injection bead injection lab-on-valve (µSI-BI-LOV) assembly for speciation analysis of ultra trace levels of Cr(III) and Cr(VI) incorporating on-line chemical reduction and employing detection by electrothermal atomic absorption spectrometry (ETAAS)

    DEFF Research Database (Denmark)

    Long, Xiangbao; Miró, Manuel; Hansen, Elo Harald

    2005-01-01

    A novel, miniaturized micro-sequential injection Lab-on-Valve (µSI-LOV) system hyphenated with electrothermal atomic absorption spectrometry (ETAAS) is proposed for the automatic preconcentration and speciation analysis of Cr(III) and Cr(VI) utilizing solid-phase extraction on hydrophilic...

  5. Operational Automatic Remote Sensing Image Understanding Systems: Beyond Geographic Object-Based and Object-Oriented Image Analysis (GEOBIA/GEOOIA). Part 2: Novel System Architecture, Information/Knowledge Representation, Algorithm Design and Implementation

    Directory of Open Access Journals (Sweden)

    Luigi Boschetti

    2012-09-01

    Full Text Available According to the literature and despite their commercial success, state-of-the-art two-stage non-iterative geographic object-based image analysis (GEOBIA) systems and three-stage iterative geographic object-oriented image analysis (GEOOIA) systems, where GEOOIA ⊃ GEOBIA, remain affected by a lack of productivity, general consensus and research. To outperform the Quality Indexes of Operativeness (OQIs) of existing GEOBIA/GEOOIA systems in compliance with the Quality Assurance Framework for Earth Observation (QA4EO) guidelines, this methodological work is split into two parts. Based on an original multi-disciplinary Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis of the GEOBIA/GEOOIA approaches, the first part of this work promotes a shift of learning paradigm in the pre-attentive vision first stage of a remote sensing (RS) image understanding system (RS-IUS), from sub-symbolic statistical model-based (inductive) image segmentation to symbolic physical model-based (deductive) image preliminary classification, capable of accomplishing image sub-symbolic segmentation and image symbolic pre-classification simultaneously. In the present second part of this work, a novel hybrid (combined deductive and inductive) RS-IUS architecture featuring a symbolic deductive pre-attentive vision first stage is proposed and discussed in terms of: (a) computational theory (system design), (b) information/knowledge representation, (c) algorithm design and (d) implementation. As proof of concept of a symbolic physical model-based pre-attentive vision first stage, the spectral knowledge-based, operational, near real-time, multi-sensor, multi-resolution, application-independent Satellite Image Automatic Mapper™ (SIAM™) is selected from the existing literature. To the best of these authors' knowledge, this is the first time a symbolic syntactic inference system, like SIAM™, has been made available to the RS community for operational use in an RS-IUS pre-attentive vision first stage.

  6. Development of an automatic scaler

    International Nuclear Information System (INIS)

    He Yuehong

    2009-04-01

    A self-designed automatic scaler is introduced. A microcontroller, the LPC936, is used as the master chip in the scaler. A counter integrated with the microcontroller is configured to operate as an external pulse counter. The software employed in the scaler is based on an embedded real-time operating system kernel named Small RTOS. Data storage, calculation and some other functions are also provided. The scaler is designed for low-cost, low-power-consumption applications. To date, the automatic scaler has been applied in a surface contamination instrument. (authors)

  7. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler are also considered. Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  8. Traduction automatique et terminologie automatique (Automatic Translation and Automatic Terminology)

    Science.gov (United States)

    Dansereau, Jules

    1978-01-01

    An exposition of reasons why a system of automatic translation could not use a terminology bank except as a source of information. The fundamental difference between the two tools is explained and examples of translation and mistranslation are given as evidence of the limits and possibilities of each process. (Text is in French.) (AMH)

  9. An automatic sample changer for gamma spectrometry

    International Nuclear Information System (INIS)

    Andrews, D.J.

    1984-01-01

    An automatic sample changer for gamma spectrometry is described which is designed for large-volume, low-radioactivity environmental samples of various sizes up to maximum dimensions 100 mm diameter x 60 mm high. The sample changer is suitable for use with most existing gamma spectrometry systems which utilize Ge(Li) or NaI detectors in vertical mode, in conjunction with a pulse height analyzer having auto-cycle and suitable data output facilities; it is linked to a Nuclear Data ND 6620 computer-based analysis system. (U.K.)

  10. New automatic radiation monitoring network in Slovenia

    International Nuclear Information System (INIS)

    Cindro, M.; Vokal Nemec, B.

    2006-01-01

    The Slovenian Nuclear Safety Administration gathers all on-line dose rate data measured by the various automatic networks operating throughout the territory of Slovenia. With the help of the PHARE financing program and in close cooperation with the Environmental Agency of RS, the upgrade of the existing network began in 2005 and was finished in March 2006. The upgrade provided new measuring sites with all relevant data needed in case of a radiological accident. An even bigger improvement was made in the area of data presentation and analysis, which was the main shortcoming of the old system. (author)

  11. Development of an automatic image scanner for dosimetry analysis; Développements d'une analyse automatique d'image pour le comptage de films dosimétriques

    Energy Technology Data Exchange (ETDEWEB)

    Berger, F. [Univ. de Franche-Comté, Lab. de Microanalyses Nucléaires, U.F.R. des Sciences et Techniques, Besançon (France)]; Klein, D. [Laboratoire de Métrologie des Interfaces Techniques, Belfort Cedex (France)]; Barillon, R.; Chambaudet, A. [Univ. de Franche-Comté, Lab. de Microanalyses Nucléaires, U.F.R. des Sciences et Techniques, Besançon (France)]

    1992-07-01

    Solid-state nuclear track detectors are needed for numerous uses in dosimetry. We have developed an image analysis system for scanning and measuring nuclear tracks (alpha, proton and fission fragment) in various detectors. The track density makes it possible to calculate the activity concentration to which the detector has been exposed. Special computer programs enable us to count both low and high track densities. (author)
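
    As a hedged sketch of the final step this abstract describes, converting a counted track density into an activity concentration, the following Python function uses invented efficiency and geometry factors rather than values from the paper.

      # Hypothetical conversion: track density (tracks/cm^2) -> activity (Bq/cm^2).
      # Efficiency and geometry factors are placeholders, not the paper's values.
      def activity_concentration(track_density_cm2, exposure_time_s,
                                 detection_efficiency=0.3, geometry_factor=0.5):
          # tracks = activity * time * efficiency * geometry
          # => activity = tracks / (time * efficiency * geometry)
          return track_density_cm2 / (exposure_time_s * detection_efficiency * geometry_factor)

      # Example: 1.2e4 alpha tracks/cm^2 after a 30-day exposure
      print(activity_concentration(1.2e4, 30 * 24 * 3600))  # ~0.031 Bq/cm^2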

  12. Sādhanā Vol. 27, 2002 Subject Index

    Indian Academy of Sciences (India)

    R. Narasimhan (Krishtel eMaging)

    Automatic recognition of printed Oriya script. 23. Character recognition. Document image analysis: A primer. 3. Classifiers combination. Devnagari numeral recognition by combining decision of multiple connectionist classifiers. 59. Classification. Two-tier architecture for unconstrained handwritten character recognition.

  13. Accuracy of Automatic Cephalometric Software on Landmark Identification

    Science.gov (United States)

    Anuwongnukroh, N.; Dechkunakorn, S.; Damrongsri, S.; Nilwarat, C.; Pudpong, N.; Radomsutthisarn, W.; Kangern, S.

    2017-11-01

    This study assessed the accuracy of an automatic cephalometric analysis software package in the identification of cephalometric landmarks. Thirty randomly selected digital lateral cephalograms of patients undergoing orthodontic treatment were used. Thirteen landmarks (S, N, Or, A-point, U1T, U1A, B-point, Gn, Pog, Me, Go, L1T, and L1A) were identified on the digital image by the automatic cephalometric software and on cephalometric tracings by the manual method. Superimposition of the printed image and the manual tracing was done by registration at the soft tissue profiles. The accuracy of landmarks located by the automatic method was compared with that of the manually identified landmarks by measuring the mean differences of the distances of each landmark on a Cartesian plane whose X and Y coordinate axes passed through the center of the ear rod. A one-sample t-test was used to evaluate the mean differences. Statistically significant mean differences (p < 0.05) were found for 8 of the 13 landmarks, with some differences exceeding 4 mm in the vertical direction. Only 5 of 13 landmarks (38.46%; S, N, Gn, Pog, and Go) showed no significant mean difference between the automatic and manual landmarking methods. It is concluded that if this automatic cephalometric analysis software is used for orthodontic diagnosis, the orthodontist must correct or modify the position of landmarks in order to increase the accuracy of cephalometric analysis.
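
    The comparison procedure described above can be sketched in Python with fabricated coordinates: signed X/Y differences between automatic and manual landmark positions are tested against zero with a one-sample t-test, and a per-landmark Euclidean error is reported.

      # Sketch with fabricated data; the study used 30 cephalograms and 13 landmarks.
      import numpy as np
      from scipy import stats

      landmarks = ["S", "N", "Gn", "Pog", "Go"]
      auto_xy   = np.array([[10.2, 55.1], [72.4, 60.3], [60.1, -5.2], [58.0, -1.1], [18.5, 8.9]])
      manual_xy = np.array([[10.0, 55.0], [72.0, 60.0], [60.0, -5.0], [57.8, -1.0], [18.3, 9.1]])

      diff = auto_xy - manual_xy            # signed differences per axis, mm
      for axis, name in enumerate(("X", "Y")):
          t, p = stats.ttest_1samp(diff[:, axis], popmean=0.0)
          print(f"{name}: mean diff = {diff[:, axis].mean():+.2f} mm, p = {p:.3f}")

      for lm, d in zip(landmarks, np.linalg.norm(diff, axis=1)):
          print(f"{lm}: {d:.2f} mm")        # overall localization error per landmark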

  14. Automatically Preparing Safe SQL Queries

    Science.gov (United States)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
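
    The transformation target of the paper, PREPARE-style parameterized statements, can be sketched in Python with the standard sqlite3 module; the table, rows and payload below are invented for illustration.

      # Unsafe string-built SQL vs. a parameterized (prepared) statement.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
      conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

      user_input = "alice' OR '1'='1"  # classic injection payload

      # UNSAFE: attacker-controlled input is spliced into the query text
      unsafe = f"SELECT role FROM users WHERE name = '{user_input}'"
      print(conn.execute(unsafe).fetchall())   # returns every row

      # SAFE: the ? placeholder keeps the payload as plain data
      print(conn.execute("SELECT role FROM users WHERE name = ?",
                         (user_input,)).fetchall())  # returns []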

  15. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The strategy used is a combination of two methods: the maximum correlation coefficient and correlation in the subpixel range... The interactive software is also part of a computer-assisted learning program on digital photogrammetry.
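
    A minimal sketch (not Höhle's program) of the first of the two methods, locating a target at the maximum of the normalized correlation coefficient between a template and every image patch:

      # Exhaustive normalized cross-correlation; synthetic data for illustration.
      import numpy as np

      def best_match(image, template):
          # Return ((row, col), rho) of the patch most correlated with the template.
          th, tw = template.shape
          t = template - template.mean()
          best, best_pos = -2.0, (0, 0)
          for r in range(image.shape[0] - th + 1):
              for c in range(image.shape[1] - tw + 1):
                  p = image[r:r + th, c:c + tw] - image[r:r + th, c:c + tw].mean()
                  denom = np.sqrt((p * p).sum() * (t * t).sum())
                  rho = (p * t).sum() / denom if denom > 0 else -2.0
                  if rho > best:
                      best, best_pos = rho, (r, c)
          return best_pos, best

      rng = np.random.default_rng(0)
      img = rng.normal(size=(40, 40))
      tpl = img[12:20, 25:33].copy()   # plant a known target
      print(best_match(img, tpl))      # -> ((12, 25), ~1.0)

    The second method, correlation in the subpixel range, is typically realized by fitting a quadratic surface to the correlation values around this integer-pixel peak and taking its maximum.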

  16. Automatic analysis of quality of images from X-ray digital flat detectors; Analyse automatique de la qualité des images issues de détecteurs plats à rayons X

    Energy Technology Data Exchange (ETDEWEB)

    Le Meur, Y.

    2009-04-15

    Over the last decade, medical imaging has grown with the development of new digital imaging techniques. In the field of X-ray radiography, new detectors are progressively replacing older techniques based on film or X-ray image intensifiers. These digital detectors offer higher sensitivity and reduced overall dimensions. This work was prepared with Trixell, the world-leading company in flat detectors for medical radiography, and deals with quality control of the digital images stemming from these detectors. The high quality standards of medical imaging demand a close analysis of the defects that can appear in the images. This work describes a complete process for the quality analysis of such images. A particular focus is placed on the detection of defects, using methods well adapted to our context of spatially correlated defects against a noisy background. (author)
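
    One common way to approach the detection task described here (not necessarily the author's method) is to flag pixels that deviate strongly from a local median estimate of the frame and then group connected detections, so that spatially correlated defects such as lines or clusters are reported as single objects. A sketch on synthetic data:

      # Median-residual defect detection on a synthetic flat-field frame.
      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(1)
      flat = rng.normal(100.0, 2.0, size=(128, 128))  # synthetic flat-field image
      flat[40:43, 60] -= 25.0                         # plant a small line defect

      residual = flat - ndimage.median_filter(flat, size=5)
      defect_mask = np.abs(residual) > 6.0 * residual.std()

      labels, n_defects = ndimage.label(defect_mask)  # merge touching pixels
      print(f"{n_defects} defect object(s) found")    # -> 1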

  17. Solving the AI Planning Plus Scheduling Problem Using Model Checking via Automatic Translation from the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL)

    Science.gov (United States)

    Butler, Ricky W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2007-01-01

    This paper describes a translator from a new planning language named the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL) model checker. This translator has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project sponsored by the Exploration Technology Development Program, which is seeking to mature autonomy technology for the vehicles and operations centers of Project Constellation.

  18. Automatic fault tree construction with RIKKE - a compendium of examples. Vol. 2

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1982-02-01

    This second volume describes the construction of fault trees for systems with loops, including control and safety loops. It also gives a short summary of the event coding scheme used in the FTLIB component model library. (author)

  19. Journal of EAEA, Vol. 11, 1994

    African Journals Online (AJOL)

    ...Signals, IEEE Trans. on Instrum. & Meas., vol. 37, no. 4, pp. 510–514, Dec 1986. George V. & Ramesh R., CAMAC - A Microprocessor Based System for Adaptable Calibration and Linearization of Hall Effect Position Sensor, IEEE Trans.

  20. A Web-Based Tool for Automatic Data Collection, Curation, and Visualization of Complex Healthcare Survey Studies including Social Network Analysis

    Directory of Open Access Journals (Sweden)

    José Alberto Benítez

    2017-01-01

    Full Text Available There is great concern nowadays regarding alcohol consumption and drug abuse, especially among young people. By analyzing the social environment in which these adolescents are immersed, along with a series of measures determining alcohol-abuse risk or personal situation and perception obtained from questionnaires such as AUDIT, FAS, KIDSCREEN, and others, it is possible to gain insight into the current situation of a given individual regarding his/her consumption behavior. This analysis, however, requires tools that ease the processes of questionnaire creation, data gathering, curation and representation, and later analysis and visualization. This research presents the design and construction of a web-based platform that facilitates each of these processes by integrating the different phases into an intuitive system with a graphical user interface, hiding the complexity underlying each of the questionnaires and techniques used and presenting the results in a flexible and visual way, avoiding any manual handling of data during the process. The advantages of this approach are shown and compared to the previous situation, in which some of the tasks were accomplished through time-consuming and error-prone manipulations of data.
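
    As a sketch of the social network analysis step, assuming each respondent names peers in a hypothetical "who do you spend time with?" item (all names invented):

      # Build a directed peer network from survey answers and rank centrality.
      import networkx as nx

      responses = {
          "ana":   ["ben", "carla"],
          "ben":   ["ana"],
          "carla": ["ana", "ben", "dan"],
          "dan":   [],
      }

      G = nx.DiGraph()
      for student, friends in responses.items():
          G.add_node(student)
          G.add_edges_from((student, f) for f in friends)

      # degree centrality highlights who sits at the center of the peer group
      for student, c in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
          print(f"{student}: {c:.2f}")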