WorldWideScience

Sample records for automatic vol analysis

  1. The change of cerebral blood flow after heart transplantation in congestive heart failure: a voxel-based and automatic VOI analysis of Tc-99m ECD SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Hong, I. K.; Kim, J. J.; Lee, C. H.; Lim, K. C.; Moon, D. H.; Rhu, J. S.; Kim, J. S. [Asan Medical Center, Seoul (Korea, Republic of)

    2007-07-01

    To investigate the change of global and regional cerebral blood flow after heart transplantation (HT) in congestive heart failure (CHF) patients. Twenty-one patients with CHF who underwent HT (45±12 yrs, M/F = 19/2) and 10 healthy volunteers (39±13 yrs, M/F = 7/3) were prospectively included. All patients underwent echocardiography and radionuclide angiography of the brain and aorta, with brain SPECT performed after IV bolus injection of Tc-99m ECD (740 MBq) before (175±253 days) and after (129±82 days) HT. Patients were divided into two groups according to the interval between HT and postoperative SPECT [early follow-up (f/u): <6 mo, n=14; late f/u: >6 mo, n=7]. Global CBF (gCBF) of the bilateral hemispheres was calculated by Patlak graphical analysis. An absolute rCBF map was obtained from brain SPECT by Lassen's correction algorithm. Age-corrected voxel-based analysis using SPM2 and automatic VOI analysis were performed to assess the rCBF change. Cardiac ejection fraction of all patients improved after HT (20.8% → 64.0%). gCBF was reduced compared to normal before HT (35.7±3.9 vs. 49.1±3.0 ml/100g/min; p<0.001) and improved postoperatively (46.6±5.4, p<0.001). The preoperative gCBFs of the early and late f/u groups did not differ (34.6±3.2 vs. 38.0±4.4, p=0.149), but the postoperative gCBF of the late f/u group (52.0±4.0) was higher than that of the early f/u group (43.9±3.7) (p<0.001). On voxel-based analysis, preoperative rCBF was reduced in the entire brain but most severely in the bilateral superior and inferior frontal cortex, supplementary motor area, precuneus and anterior cingulum, compared to normals (uncorrected p<0.001). After HT, rCBF of these areas improved more markedly in the late f/u group than in the early f/u group but remained lower than in normals. Global CBF was significantly reduced in CHF patients and improved after HT. rCBFs of the frontal cortex, precuneus and cingulum were most severely reduced and slowly improved after

  2. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstract...
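
    As a hedged illustration of what such a derived time bound can look like (a textbook-style example, not taken from the paper): a list reversal that appends the head to the reversed tail does work proportional to k at step k, giving a recurrence whose closed form is the worst-case time bound.

```latex
T(0) = 1, \qquad T(n) = T(n-1) + n
\quad\Longrightarrow\quad
T(n) = 1 + \frac{n(n+1)}{2} \;\in\; O(n^{2})
```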

  3. Automatic differentiation algorithms in model analysis

    NARCIS (Netherlands)

    Huiskes, M.J.

    2002-01-01

    In this thesis automatic differentiation algorithms and derivative-based methods

  4. A background to risk analysis. Vol. 3

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 3 contains chapters on quantification of risk, failure and accident probability, risk analysis and design, and examples of risk analysis for process plant. (BP)

  5. A background to risk analysis. Vol. 1

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 1 contains a short history of risk analysis, and chapters on risk, failures, errors and accidents, and general procedures for risk analysis. (BP)

  6. A background to risk analysis. Vol. 2

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 2 treats generic methods of qualitative failure analysis. (BP)

  7. A background to risk analysis. Vol. 4

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 4 treats human error in plant operation. (BP)

  8. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.

    2002-01-01

    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of Moessbauer spectroscopy (MBS). But for industrial applications, a tool for fast data analysis is also required, and it should be easy to handle. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. Based on experience, the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster by using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific, and the data analysis can be performed using these algorithms. In combination with automatic analysis, the Moessbauer spectrometer can be used as a probe instrument covering the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications are discussed.

  9. Automatic analysis of ultrasonic data

    International Nuclear Information System (INIS)

    Horteur, P.; Colin, J.; Benoist, P.; Bonis, M.; Paradis, L.

    1986-10-01

    This paper describes an automatic and self-contained data processing system, transportable on site, able to produce images such as "A-scan" and "B-scan" presentations and to display the results of the inspection very quickly. It can be used for pressure vessel inspection. [fr]

  10. Automatic individualized contrast medium dosage during hepatic computed tomography by using computed tomography dose index volume (CTDIvol)

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Anders; Cederlund, Kerstin; Aspelin, Peter; Brismar, Torkel B. [Intervention and Technology at Karolinska Institutet, Department of Clinical Science, Division of Medical Imaging and Technology, Stockholm (Sweden); Karolinska University Hospital in Huddinge, Department of Radiology, Stockholm (Sweden); Bjoerk, Jonas [FoU-centrum Skaane Skaanes Universitetssjukhus i Lund, Lund (Sweden); Nyman, Ulf [University of Lund, Department of Diagnostic Radiology, Lasarettet Trelleborg, Trelleborg (Sweden)

    2014-08-15

    To compare hepatic parenchymal contrast media (CM) enhancement during multi-detector row computed tomography (MDCT) and its correlation with volume pitch-corrected computed tomography dose index (CTDIvol) and body weight (BW). One hundred patients referred for standard three-phase thoraco-abdominal MDCT examination were enrolled. BW was measured in the CT suite. Forty grams of iodine was administered intravenously (iodixanol 320 mg I/ml at 5 ml/s or iomeprol 400 mg I/ml at 4 ml/s) followed by a 50-ml saline flush. The CTDIvol presented by the CT equipment during the parenchymal examination was recorded. The CM enhancement of the liver was defined as the attenuation (HU) of the liver parenchyma during the hepatic parenchymal phase minus the attenuation in the native phase. Liver parenchymal enhancement was negatively correlated with both CTDIvol (r = -0.60) and BW (r = -0.64), but the difference in correlation between the two was not significant. CTDIvol may replace BW when adjusting CM doses to body size. This makes it potentially feasible to automatically individualize CM dosage by CT. (orig.)

  11. Automatic imitation: A meta-analysis.

    Science.gov (United States)

    Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel

    2018-05-01

    Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of g_z = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  12. Automatic emotional expression analysis from eye area

    Science.gov (United States)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features that were automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed through artificial intelligence techniques. As a result of the experimental studies, 6 universal emotions consisting of expressions of happiness, sadness, surprise, disgust, anger and fear were classified at a success rate of 84% using artificial neural networks.
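
    The abstract's pipeline (eye-area features via discrete wavelet transform, then a neural-network classifier) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; `eye_imgs` and `labels` are assumed inputs, and PyWavelets and scikit-learn are assumed library choices.

```python
# Hedged sketch: DWT features from pre-cropped eye images + an ANN classifier.
import numpy as np
import pywt                                    # PyWavelets
from sklearn.neural_network import MLPClassifier

def wavelet_features(eye_img, wavelet="db4", level=2):
    """Flatten the 2-D DWT coefficients of a grayscale eye-area image."""
    coeffs = pywt.wavedec2(eye_img, wavelet, level=level)
    parts = [coeffs[0].ravel()]                # approximation coefficients
    for cH, cV, cD in coeffs[1:]:              # per-level detail coefficients
        parts += [cH.ravel(), cV.ravel(), cD.ravel()]
    return np.concatenate(parts)

# eye_imgs: equally sized grayscale eye crops; labels: one of 6 emotions
X = np.array([wavelet_features(img) for img in eye_imgs])
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500).fit(X, labels)
```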

  13. Microprocessors in automatic chemical analysis

    International Nuclear Information System (INIS)

    Goujon de Beauvivier, M.; Perez, J.-J.

    1979-01-01

    The application of microprocessors to the programming and computation of sequential chemical analysis of solutions is examined. Safety, performance and reliability are compared with those of other methods. An example is given of uranium titration by spectrophotometry. [fr]

  14. Automatic analysis of charged particle spectra

    International Nuclear Information System (INIS)

    Seres, Z.; Kiss, A.

    1975-11-01

    A computer program system is developed for off-line automatic analysis of a series of charged particle spectra measured by solid-state detectors and collected on magnetic tapes. The procedure results in complete angular distributions for the excited levels of the final nucleus up to about 15 MeV. (orig.) [de]

  15. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). Content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between the computerized and manual (traditional) methods. The development of these tools helps to fill existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.

  16. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan; Calo, Victor M.

    2010-01-01

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques

  17. Automatic dirt trail analysis in dermoscopy images.

    Science.gov (United States)

    Cheng, Beibei; Joe Stanley, R; Stoecker, William V; Osterwise, Christopher T P; Stricklin, Sherea M; Hinton, Kristen A; Moss, Randy H; Oliviero, Margaret; Rabinovitz, Harold S

    2013-02-01

    Basal cell carcinoma (BCC) is the most common cancer in the US. Dermatoscopes are devices used by physicians to facilitate the early detection of these cancers based on the identification of skin lesion structures often specific to BCCs. One new lesion structure, referred to as dirt trails, has the appearance of dark gray, brown or black dots and clods of varying sizes distributed in elongated clusters with indistinct borders, often appearing as curvilinear trails. In this research, we explore a dirt trail detection and analysis algorithm for extracting, measuring, and characterizing dirt trails based on size, distribution, and color in dermoscopic skin lesion images. These dirt trails are then used to automatically discriminate BCC from benign skin lesions. For an experimental data set of 35 BCC images with dirt trails and 79 benign lesion images, a neural network-based classifier achieved an area of 0.902 under the receiver operating characteristic curve using a leave-one-out approach. Results obtained from this study show that automatic detection of dirt trails in dermoscopic images of BCC is feasible. This is important because of the large number of these skin cancers seen every year and the challenge of detecting them earlier with instrumentation. © 2011 John Wiley & Sons A/S.
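
    The leave-one-out ROC evaluation described above can be reproduced generically as below; this is a hedged sketch with assumed feature matrix `X` and labels `y` (1 = BCC, 0 = benign), not the study's actual classifier.

```python
# Hedged sketch: leave-one-out AUC for a lesion classifier (scikit-learn).
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

scores = np.zeros(len(y))
for train_idx, test_idx in LeaveOneOut().split(X):
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000)
    clf.fit(X[train_idx], y[train_idx])
    # probability of the positive (BCC) class for the held-out image
    scores[test_idx] = clf.predict_proba(X[test_idx])[:, 1]

print("leave-one-out AUC:", roc_auc_score(y, scores))
```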

  18. Sensitivity analysis and design optimization through automatic differentiation

    International Nuclear Information System (INIS)

    Hovland, Paul D; Norris, Boyana; Strout, Michelle Mills; Bhowmick, Sanjukta; Utke, Jean

    2005-01-01

    Automatic differentiation is a technique for transforming a program or subprogram that computes a function, including arbitrarily complex simulation codes, into one that computes the derivatives of that function. We describe the implementation and application of automatic differentiation tools. We highlight recent advances in the combinatorial algorithms and compiler technology that underlie successful implementation of automatic differentiation tools. We discuss applications of automatic differentiation in design optimization and sensitivity analysis. We also describe ongoing research in the design of language-independent source transformation infrastructures for automatic differentiation algorithms
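
    The core idea, transforming code that computes f into code that also computes f', can be illustrated with the classic forward-mode dual-number construction. This is a generic textbook sketch, not the tools described in the paper.

```python
# Hedged sketch: forward-mode automatic differentiation via dual numbers.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot          # value and derivative parts
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def f(x):                                      # any straight-line code works
    return 3 * x * x + 2 * x + 1

y = f(Dual(2.0, 1.0))                          # seed dx/dx = 1
print(y.val, y.dot)                            # 17.0 and f'(2) = 14.0
```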

  19. Application of automatic image analysis in wood science

    Science.gov (United States)

    Charles W. McMillin

    1982-01-01

    In this paper I describe an image analysis system and illustrate with examples the application of automatic quantitative measurement to wood science. Automatic image analysis, a powerful and relatively new technology, uses optical, video, electronic, and computer components to rapidly derive information from images with minimal operator interaction. Such instruments...

  20. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.
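
    For reference, the L2 projection underlying this interpolation can be stated as follows (a standard formulation consistent with, but not quoted from, the abstract): find the member of the discrete space closest to the data in the L2 norm, which reduces to a linear system in the basis coefficients.

```latex
\text{Find } u_h \in V_h:\quad
\int_\Omega u_h\,v \,\mathrm{d}\Omega = \int_\Omega f\,v \,\mathrm{d}\Omega
\quad \forall v \in V_h,
\qquad
u_h = \sum_j c_j \varphi_j
\;\Longrightarrow\;
\sum_j \Big(\int_\Omega \varphi_i \varphi_j \,\mathrm{d}\Omega\Big)\, c_j
= \int_\Omega f\,\varphi_i \,\mathrm{d}\Omega .
```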

  1. Towards automatic quantitative analysis of cardiac MR perfusion images

    NARCIS (Netherlands)

    Breeuwer, M.; Quist, M.; Spreeuwers, Lieuwe Jan; Paetsch, I.; Al-Saadi, N.; Nagel, E.

    2001-01-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and reliable automatic image analysis methods. This paper focuses on the automatic evaluation of

  2. Dynamic Analysis of a Pendulum Dynamic Automatic Balancer

    Directory of Open Access Journals (Sweden)

    Jin-Seung Sohn

    2007-01-01

    Full Text Available The automatic dynamic balancer is a device to reduce the vibration from the unbalanced mass of rotors. Instead of the prevailing ball-type automatic dynamic balancer, a pendulum automatic dynamic balancer is analyzed. For the analysis of dynamic stability and behavior, the nonlinear equations of motion for the system are derived with respect to polar coordinates by Lagrange's equations. The perturbation method is applied to investigate the dynamic behavior of the system around the equilibrium position. Based on the linearized equations, the dynamic stability of the system around the equilibrium positions is investigated by eigenvalue analysis.
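
    The eigenvalue stability test mentioned at the end amounts to checking the spectrum of the linearized state matrix; a minimal numerical sketch (with a placeholder matrix, not the paper's balancer model) is:

```python
# Hedged sketch: stability of a linearized system x' = A x at an equilibrium.
import numpy as np

A = np.array([[ 0.0,  1.0],
              [-4.0, -0.3]])                   # placeholder state matrix
eigvals = np.linalg.eigvals(A)
stable = np.all(eigvals.real < 0)              # all eigenvalues in left half-plane?
print(eigvals, "stable" if stable else "unstable")
```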

  3. Describing Old Czech Declension Patterns for Automatic Text Analysis

    Czech Academy of Sciences Publication Activity Database

    Jínová, P.; Lehečka, Boris; Oliva jr., Karel

    No. 13 (2014), pp. 7-17, ISSN 1579-8372. Institutional support: RVO:68378092. Keywords: Old Czech morphology * declension patterns * automatic text analysis * i-stems * ja-stems. Subject RIV: AI - Linguistics

  4. Neural network for automatic analysis of motility data

    DEFF Research Database (Denmark)

    Jakobsen, Erik; Kruse-Andersen, S; Kolberg, Jens Godsk

    1994-01-01

    Continuous recording of intraluminal pressures for extended periods of time is currently regarded as a valuable method for detection of esophageal motor abnormalities. A subsequent automatic analysis of the resulting motility data relies on strict mathematical criteria for recognition of pressure … comparable. However, the neural network recognized pressure peaks clearly generated by muscular activity that had escaped detection by the conventional program. In conclusion, we believe that neurocomputing has potential advantages for automatic analysis of gastrointestinal motility data.

  5. Automatic sample changer for neutron activation analysis at CDTN, Brazil

    International Nuclear Information System (INIS)

    Aimore Dutra Neto; Oliveira Pelaes, Ana Clara; Jacimovic, Radojko

    2018-01-01

    An automatic sample changer was recently developed and installed in the Neutron Activation Analysis (NAA) Laboratory. The certified reference material BCR-320R, Channel Sediment, was analysed in order to verify the reliability of the results obtained by NAA with the k0-standardisation method, using this automatic system during the gamma-ray measurement step. The results were compared with those obtained manually. The values indicate that the automatic sample changer is working properly. This changer will increase the productivity of the neutron activation technique applied at the Nuclear Technology Development Centre (CDTN/CNEN), expanding its competitiveness as an analytical technique in relation to other techniques. (author)

  6. Cost-benefit analysis of the ATM automatic deposit service

    Directory of Open Access Journals (Sweden)

    Ivica Županović

    2015-03-01

    Full Text Available Bankers and other financial experts have analyzed the value of automated teller machines (ATMs) in terms of growing consumer demand, rising costs of technology development, decreasing profitability and market share. This paper presents a step-by-step cost-benefit analysis of the ATM automatic deposit service. The first step is to determine user attitudes towards the ATM automatic deposit service by using the Technology Acceptance Model (TAM). The second step is to determine location priorities for ATMs that provide automatic deposit services using the Analytic Hierarchy Process (AHP) model, as sketched below. The results of the previous steps enable a highly efficient application of cost-benefit analysis for evaluating the costs and benefits of automatic deposit services. To move the proposed procedure beyond theoretical terms, a real-world case study is conducted.
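
    The AHP step can be sketched generically: location priorities are the normalized principal eigenvector of a pairwise-comparison matrix. The matrix below is hypothetical, not taken from the paper.

```python
# Hedged sketch: AHP priority weights from a pairwise-comparison matrix.
import numpy as np

# Hypothetical pairwise comparisons of three candidate ATM locations
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(P)
k = np.argmax(vals.real)                       # principal eigenvalue
w = np.abs(vecs[:, k].real)
w /= w.sum()                                   # normalized priority weights
print("location priorities:", w)               # higher weight = higher priority
```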

  7. Development of an automatic identification algorithm for antibiogram analysis

    OpenAIRE

    Costa, LFR; Eduardo Silva; Noronha, VT; Ivone Vaz-Moreira; Olga C Nunes; de Andrade, MM

    2015-01-01

    Routinely, diagnostic and microbiology laboratories perform antibiogram analysis, which can present some difficulties leading to misreadings and intra- and inter-reader deviations. An Automatic Identification Algorithm (AIA) has been proposed as a solution to overcome some issues associated with the disc diffusion method, which is the main goal of this work. AIA allows automatic scanning of inhibition zones obtained by antibiograms. More than 60 environmental isolates were tested using suscepti...

  8. Automatized system of radioactive material analysis

    International Nuclear Information System (INIS)

    Pchelkin, V.A.; Sviderskij, M.F.; Litvinov, V.A.; Lavrikov, S.A.

    1979-01-01

    An automatized system has been developed for the identification of the substance, element and isotope content of radioactive materials on the basis of data obtained by studying the physical-chemical properties of substances (with the help of atomic-absorption spectrometers, an infrared spectrometer, a mass spectrometer, a derivatograph, etc.). The system is based on the following principles: independent operation of each device; the possibility of increasing the number of physical instruments and devices; modularity of engineering and computer means; modularity and standardization of the software; high reliability of the system; continuity of programming languages; the possibility of controlling the devices by means of a high-level language; typification of the system; simple and easy service; low cost. A block diagram of the system is given.

  9. Journal of Computer Science and Its Application - Vol 20, No 2 (2013)

    African Journals Online (AJOL)

    Journal of Computer Science and Its Application - Vol 20, No 2 (2013) ... Fuzzy analysis and adaptive anthropometry model for object identification in surveillance system ... Natural language processing techniques for automatic test questions ...

  10. Automatic size analysis of coated fuel particles

    International Nuclear Information System (INIS)

    Wallisch, K.; Koss, P.

    1977-01-01

    The determination of the diameter, coating thickness, and sphericity of coated fuel particles by conventional methods is very time consuming. Therefore, statistical data can only be obtained with limited accuracy. An alternative method is described that avoids these disadvantages by utilizing a fast optical data-collecting system of high accuracy. This system allows the determination of the diameter of particles in the range between 100 and 1500 μm, with an accuracy of better than ±2 μm and at a rate of 100 particles per second. The density and thickness of coating layers can be determined by comparing the data obtained before and after coating, taking into account the relative increase of weight. A special device allows the automatic determination of the sphericity of single particles as well as the distribution in a batch. This device measures 50 to 100 different diameters of each particle per second. An on-line computer stores the measured data and calculates all parameters required, e.g., number of particles measured, particle diameter, standard deviation, diameter limiting values, average particle volume, average particle surface area, and the distribution of sphericity in absolute and percent form.

  11. Automatic analysis of signals during Eddy currents controls

    International Nuclear Information System (INIS)

    Chiron, D.

    1983-06-01

    A method and the corresponding instrument have been developed for automatic analysis of eddy current testing signals. This apparatus enables the simultaneous analysis, every 2 milliseconds, of two signals at two different frequencies. It can be used either on line with an eddy current testing instrument or with a magnetic tape recorder. [fr]

  12. Automatic analysis of microscopic images of red blood cell aggregates

    Science.gov (United States)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells, commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Frequently, image processing and analysis for the characterization of RBC aggregation have been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be attractive for adoption as a routine in hemorheological and clinical biochemistry laboratories because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  13. Profiling School Shooters: Automatic Text-Based Analysis

    Directory of Open Access Journals (Sweden)

    Yair Neuman

    2015-06-01

    Full Text Available School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their varied characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by six school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters' texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology.

  14. Automatic measurement system for light element isotope analysis

    International Nuclear Information System (INIS)

    Satake, Hiroshi; Ikegami, Kouichi.

    1990-01-01

    The automatic measurement system for light element isotope analysis was developed by installing a specially designed inlet system controlled by a computer. The microcomputer system contains specific interface boards for the inlet system and the mass spectrometer, a Micromass 602 E. All the components of the installed inlet and computer systems are easily available in Japan. A maximum of ten samples can be measured automatically. About 160 minutes are required for 10 measurements of δ18O values of CO2; thus four samples can be measured per hour using this system, compared with about three per hour by manual operation. The automated analysis system clearly has an advantage over the conventional method. This paper describes the details of the automated system, such as the apparatus used, the control procedure and the corrections applied for reliable measurement. (author)

  15. Social Signals, their function, and automatic analysis: A survey

    NARCIS (Netherlands)

    Vinciarelli, Alessandro; Pantic, Maja; Bourlard, Hervé; Pentland, Alex

    2008-01-01

    Social Signal Processing (SSP) aims at the analysis of social behaviour in both Human-Human and Human-Computer interactions. SSP revolves around automatic sensing and interpretation of social signals, complex aggregates of nonverbal behaviours through which individuals express their attitudes

  16. Automatic Online Lecture Highlighting Based on Multimedia Analysis

    Science.gov (United States)

    Che, Xiaoyin; Yang, Haojin; Meinel, Christoph

    2018-01-01

    Textbook highlighting is widely considered to be beneficial for students. In this paper, we propose a comprehensive solution to highlight the online lecture videos in both sentence- and segment-level, just as is done with paper books. The solution is based on automatic analysis of multimedia lecture materials, such as speeches, transcripts, and…

  17. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  18. Trends of Science Education Research: An Automatic Content Analysis

    Science.gov (United States)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-01-01

    This study used scientometric methods to conduct an automatic content analysis on the development trends of science education research from the published articles in the four journals of "International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education" from 1990 to 2007. The…

  19. Automatic Text Analysis Based on Transition Phenomena of Word Occurrences

    Science.gov (United States)

    Pao, Miranda Lee

    1978-01-01

    Describes a method of selecting index terms directly from a word frequency list, an idea originally suggested by Goffman. Results of the analysis of word frequencies of two articles seem to indicate that the automated selection of index terms from a frequency list holds some promise for automatic indexing. (Author/MBR)

  20. A computer program for automatic gamma-ray spectra analysis

    International Nuclear Information System (INIS)

    Hiromura, Kazuyuki

    1975-01-01

    A computer program for automatic analysis of gamma-ray spectra obtained with a Ge(Li) detector is presented. The program uses a method that compares successive values of the experimental data for automatic peak finding, and the method of least squares for peak fitting. The peak shape in the fitting routine is a "modified Gaussian", which consists of two different Gaussians with the same height joined at the centroid. A quadratic form is chosen as the function representing the background. A maximum of four peaks can be treated in the fitting routine. Some improvements under consideration are described. (auth.)
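
    Written out explicitly from the description above (a reconstruction, not copied from the paper), the fitted model joins two Gaussians of common height H at the centroid μ and adds the quadratic background:

```latex
f(x) \;=\; a_0 + a_1 x + a_2 x^2 \;+\;
\begin{cases}
H \exp\!\big(-(x-\mu)^2 / 2\sigma_L^2\big), & x \le \mu,\\
H \exp\!\big(-(x-\mu)^2 / 2\sigma_R^2\big), & x > \mu,
\end{cases}
```

    with separate left and right widths σ_L and σ_R, so the peak may be asymmetric while remaining continuous at the centroid.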

  1. Automatic movie skimming with general tempo analysis

    Science.gov (United States)

    Lee, Shih-Hung; Yeh, Chia-Hung; Kuo, C. C. J.

    2003-11-01

    In this research, story units are extracted by general tempo analysis, including the tempos of audio and visual information. Although many schemes have been proposed to successfully segment video data into shots using basic low-level features, how to group shots into meaningful units called story units is still a challenging problem. By focusing on a certain type of video such as sports or news, one can explore models with specific application domain knowledge. For movie content, many heuristic rules based on audiovisual clues have been proposed with limited success. We propose a method to extract story units using general tempo analysis. Experimental results are given to demonstrate the feasibility and efficiency of the proposed technique.

  2. Automatic circuit analysis based on mask information

    International Nuclear Information System (INIS)

    Preas, B.T.; Lindsay, B.W.; Gwyn, C.W.

    1976-01-01

    The Circuit Mask Translator (CMAT) code has been developed to convert integrated circuit mask information into a circuit schematic. Logical operations, pattern recognition, and special functions are used to identify and interconnect diodes, transistors, capacitors, and resistors. The circuit topology provided by the translator is compatible with the input required for a circuit analysis program.

  3. Handbook of nuclear engineering: vol 1: nuclear engineering fundamentals; vol 2: reactor design; vol 3: reactor analysis; vol 4: reactors of waste disposal and safeguards

    CERN Document Server

    2013-01-01

    The Handbook of Nuclear Engineering is an authoritative compilation of information regarding methods and data used in all phases of nuclear engineering. Addressing nuclear engineers and scientists at all academic levels, this five volume set provides the latest findings in nuclear data and experimental techniques, reactor physics, kinetics, dynamics and control. Readers will also find a detailed description of data assimilation, model validation and calibration, sensitivity and uncertainty analysis, fuel management and cycles, nuclear reactor types and radiation shielding. A discussion of radioactive waste disposal, safeguards and non-proliferation, and fuel processing with partitioning and transmutation is also included. As nuclear technology becomes an important resource of non-polluting sustainable energy in the future, The Handbook of Nuclear Engineering is an excellent reference for practicing engineers, researchers and professionals.

  4. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software package developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers' positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker's coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis.
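
    Kanade-Lucas-Tomasi tracking of the kind DVP builds on is available in OpenCV; the following hedged sketch (placeholder file name, not the DVP implementation) tracks feature points frame to frame with pyramidal Lucas-Kanade optical flow.

```python
# Hedged sketch: KLT-style point tracking with OpenCV.
import cv2

cap = cv2.VideoCapture("underwater.mp4")       # placeholder video file
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                              qualityLevel=0.01, minDistance=10)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # track points from the previous frame into the current one
    new_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    pts = new_pts[status.ravel() == 1].reshape(-1, 1, 2)
    prev_gray = gray
```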

  5. Classification of scintigrams on the base of an automatic analysis

    International Nuclear Information System (INIS)

    Vidyukov, V.I.; Kasatkin, Yu.N.; Kal'nitskaya, E.F.; Mironov, S.P.; Rotenberg, E.M.

    1980-01-01

    The stages of constructing a discriminative system based on self-education for automatic analysis of scintigrams are considered. The results of the classification of 240 scintigrams of the liver into "normal", "diffuse lesions" and "focal lesions" were evaluated by medical experts and by computer. The accuracy of the computerized classification was 91.7%; that of the experts, 85%. The methods for automatic analysis of liver scintigrams were implemented on the specialized MDS data processing system. The quality of the discriminative system was assessed on 125 scintigrams; the classification accuracy was 89.6%. The use of self-education methods made it possible to single out two subclasses depending on the severity of diffuse lesions.

  6. Coherence measures in automatic time-migration velocity analysis

    International Nuclear Information System (INIS)

    Maciel, Jonathas S; Costa, Jessé C; Schleicher, Jörg

    2012-01-01

    Time-migration velocity analysis can be carried out automatically by evaluating the coherence of migrated seismic events in common-image gathers (CIGs). The performance of gradient methods for automatic time-migration velocity analysis depends on the coherence measures used as the objective function. We compare the results of four different coherence measures, being conventional semblance, differential semblance, an extended differential semblance using differences of more distant image traces and the product of the latter with conventional semblance. In our numerical experiments, the objective functions based on conventional semblance and on the product of conventional semblance with extended differential semblance provided the best velocity models, as evaluated by the flatness of the resulting CIGs. The method can be easily extended to anisotropic media. (paper)
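
    For reference, the conventional semblance of N migrated traces u_i(t) in a common-image gather is (a standard definition, consistent with but not quoted from the paper):

```latex
S \;=\; \frac{\displaystyle\sum_{t}\Big(\sum_{i=1}^{N} u_i(t)\Big)^{2}}
             {N\,\displaystyle\sum_{t}\sum_{i=1}^{N} u_i(t)^{2}},
\qquad 0 \le S \le 1,
```

    which approaches 1 when events in the gather are flat and coherent, the condition the velocity analysis seeks to maximize.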

  7. Analysis of Phonetic Transcriptions for Danish Automatic Speech Recognition

    DEFF Research Database (Denmark)

    Kirkedal, Andreas Søeborg

    2013-01-01

    Automatic speech recognition (ASR) relies on three resources: audio, orthographic transcriptions and a pronunciation dictionary. The dictionary or lexicon maps orthographic words to sequences of phones or phonemes that represent the pronunciation of the corresponding word. The quality of a speech … The analysis indicates that transcribing e.g. stress or vowel duration has a negative impact on performance. The best performance is obtained with coarse phonetic annotation, improving performance by 1% word error rate and 3.8% sentence error rate.

  8. Radiation dosimetry by automatic image analysis of dicentric chromosomes

    International Nuclear Information System (INIS)

    Bayley, R.; Carothers, A.; Farrow, S.; Gordon, J.; Ji, L.; Piper, J.; Rutovitz, D.; Stark, M.; Chen, X.; Wald, N.; Pittsburgh Univ., PA

    1991-01-01

    A system for scoring dicentric chromosomes by image analysis comprised fully automatic location of mitotic cells; automatic retrieval, focus and digitisation at high resolution; automatic rejection of nuclei and debris and detection and segmentation of chromosome clusters; automatic centromere location; and subsequent rapid interactive visual review of potential dicentric chromosomes to confirm positives and reject false positives. A calibration set of about 15000 cells was used to establish the quadratic dose response for 60Co γ-irradiation. The dose-response function parameters were established by a maximum likelihood technique, and confidence limits in the dose response and in the corresponding inverse curve, of estimated dose for observed dicentric frequency, were established by Monte Carlo techniques. The system was validated in a blind trial by analysing a test set comprising a total of about 8000 cells irradiated at one of 10 dose levels and estimating the doses from the observed dicentric frequencies. There was a close correspondence between the estimated and true doses. The overall sensitivity of the system, in terms of the proportion of the total population of dicentrics present in the cells analysed that were detected by the system, was measured to be about 40%. This implies that about 2.5 times more cells must be analysed by machine than by visual analysis. Taking this factor into account, the measured review time and false positive rates imply that analysis by the system of sufficient cells to provide the equivalent of a visual analysis of 500 cells would require about 1 h for operator review. (author). 20 refs.; 4 figs.; 5 tabs
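
    The quadratic dose response mentioned above has the standard cytogenetic form, and inverting it yields the dose estimate for an observed dicentric frequency (a textbook formulation; the paper's fitted parameter values are not reproduced here):

```latex
Y(D) \;=\; c + \alpha D + \beta D^{2}
\quad\Longrightarrow\quad
\hat{D} \;=\; \frac{-\alpha + \sqrt{\alpha^{2} + 4\beta\,(Y - c)}}{2\beta},
```

    where Y is the dicentric frequency per cell, D the absorbed dose, and c the background frequency.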

  9. Automatic Video-based Analysis of Human Motion

    DEFF Research Database (Denmark)

    Fihl, Preben

    The human motion contains valuable information in many situations and people frequently perform an unconscious analysis of the motion of other people to understand their actions, intentions, and state of mind. An automatic analysis of human motion will facilitate many applications and thus has received great interest from both industry and research communities. The focus of this thesis is on video-based analysis of human motion and the thesis presents work within three overall topics, namely foreground segmentation, action recognition, and human pose estimation. Foreground segmentation is often the first important step in the analysis of human motion. By separating foreground from background the subsequent analysis can be focused and efficient. This thesis presents a robust background subtraction method that can be initialized with foreground objects in the scene and is capable of handling

  10. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphism) on genes related to drug metabolism. This may allow for instance to find genetic variants in patients which present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack in the development of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow to test the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different response to drugs. The proposed system allows: (i) to automatize the workflow of analysis of DMET-SNP data avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP), (iii) the association of SNP with pathway through the search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different case studies regarding the analysis of

  11. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Full Text Available Abstract Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphism) on genes related to drug metabolism. This may allow for instance to find genetic variants in patients which present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack in the development of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow to test the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer, a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different response to drugs. The proposed system allows: (i) to automatize the workflow of analysis of DMET-SNP data avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNPs with pathways through the search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different

  12. Automatic analysis of trabecular bone structure from knee MRI

    DEFF Research Database (Denmark)

    Marques, Joselene; Granlund, Rabia; Lillholm, Martin

    2012-01-01

    We investigated the feasibility of quantifying osteoarthritis (OA) by analysis of the trabecular bone structure in low-field knee MRI. Generic texture features were extracted from the images and subsequently selected by sequential floating forward selection (SFFS), following a fully automatic, uncommitted machine-learning based framework. Six different classifiers were evaluated in cross-validation schemes and the results showed that the presence of OA can be quantified by a bone structure marker. The performance of the developed marker reached a generalization area-under-the-ROC (AUC) of 0

  13. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J; Cowley, Wendy E; Crow, Vernon L; Cramer, Nicholas O [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
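
    The scoring scheme in this abstract (word score from co-occurrence degree and frequency, keyword score as the sum of member word scores) can be sketched as follows; this is a minimal RAKE-style reconstruction with a toy stop-word list, not the patented implementation.

```python
# Hedged sketch: RAKE-style keyword scoring with deg(w)/freq(w).
import re
from collections import defaultdict

STOP = {"a", "an", "the", "of", "for", "and", "is", "are", "in", "to"}

def rake_keywords(text):
    words = re.findall(r"[a-z0-9]+", text.lower())
    # candidate phrases are maximal runs of non-stop words
    phrases, cur = [], []
    for w in words:
        if w in STOP:
            if cur:
                phrases.append(cur)
            cur = []
        else:
            cur.append(w)
    if cur:
        phrases.append(cur)
    freq, deg = defaultdict(int), defaultdict(int)
    for ph in phrases:
        for w in ph:
            freq[w] += 1
            deg[w] += len(ph)                  # co-occurrence degree
    score = {w: deg[w] / freq[w] for w in freq}
    ranked = sorted(phrases, key=lambda ph: -sum(score[w] for w in ph))
    return [" ".join(ph) for ph in ranked]

print(rake_keywords("rapid automatic keyword extraction for "
                    "information retrieval and analysis"))
```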

  14. ASTRA - an automatic system for transport analysis in a tokamak

    International Nuclear Information System (INIS)

    Pereverzev, G.V.; Yushmanov, P.N.; Dnestrovskii, A.Yu.; Polevoi, A.R.; Tarasjan, K.N.; Zakharov, L.E.

    1991-08-01

    The set of codes described here - ASTRA (Automatic System of Transport Analysis) - is a flexible and effective tool for the study of transport mechanisms in reactor-oriented facilities of the tokamak type. Flexibility is provided within the ASTRA system by a wide choice of standard relationships, functions and subroutines representing various transport coefficients, methods of auxiliary heating and other physical processes in the tokamak plasma, as well as by the possibility of pre-setting transport equations and variables for data output in a simple and conceptually transparent form. The transport code produced by the ASTRA system provides an adequate representation of the discharges for present experimental conditions. (orig.)

  15. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, Miroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all information at one centralized point. Communication between honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses data from each honeypot. Results of this analysis and also other statistical data about malicious activity are easily accessible through a built-in web server. All statistical and analysis reports serve as an information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  16. Automatic particle-size analysis of HTGR nuclear fuel microspheres

    International Nuclear Information System (INIS)

    Mack, J.E.

    1977-01-01

    An automatic particle-size analyzer (PSA) has been developed at ORNL for measuring and counting samples of nuclear fuel microspheres in the diameter range of 300 to 1000 μm at rates in excess of 2000 particles per minute, requiring no sample preparation. A light blockage technique is used in conjunction with a particle singularizer. Each particle in the sample is sized, and the information is accumulated by a multi-channel pulse height analyzer. The data are then transferred automatically to a computer for calculation of mean diameter, standard deviation, kurtosis, and skewness of the distribution. Entering the sample weight and pre-coating data permits calculation of particle density and the mean coating thickness and density. Following this nondestructive analysis, the sample is collected and returned to the process line or used for further analysis. The device has potential as an on-line quality control device in processes dealing with spherical or near-spherical particles where rapid analysis is required for process control

  17. Independent component analysis for automatic note extraction from musical trills

    Science.gov (United States)

    Brown, Judith C.; Smaragdis, Paris

    2004-05-01

    The method of principal component analysis, which is based on second-order statistics (or linear independence), has long been used for redundancy reduction of audio data. The more recent technique of independent component analysis, enforcing much stricter statistical criteria based on higher-order statistical independence, is introduced and shown to be far superior in separating independent musical sources. This theory has been applied to piano trills and a database of trill rates was assembled from experiments with a computer-driven piano, recordings of a professional pianist, and commercially available compact disks. The method of independent component analysis has thus been shown to be an outstanding, effective means of automatically extracting interesting musical information from a sea of redundant data.
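
    The source-separation step described here is directly available in scikit-learn; the hedged sketch below separates two synthetic mixed "notes" with FastICA (illustrative signals, not the study's piano data).

```python
# Hedged sketch: separating two mixed signals with FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 8000)
s1 = np.sin(2 * np.pi * 440 * t)               # sinusoidal "note"
s2 = np.sign(np.sin(2 * np.pi * 494 * t))      # square-wave "note"
S = np.c_[s1, s2] + 0.01 * rng.standard_normal((t.size, 2))

A = np.array([[1.0, 0.6],                      # mixing matrix
              [0.5, 1.0]])
X = S @ A.T                                    # two observed mixtures

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)               # estimated independent sources
```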

  18. Algorithm for automatic analysis of electro-oculographic data.

    Science.gov (United States)

    Pettersson, Kati; Jagadeesan, Sharman; Lukander, Kristian; Henelius, Andreas; Haeggström, Edward; Müller, Kiti

    2013-10-25

    Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined from features of the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and with manually scored blinks. The algorithm achieved 93% detection sensitivity for blinks with a 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was greater than 89%, and for vertical saccades greater than 82%. The durations and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. The developed algorithm enables reliable analysis of EOG data recorded both during EEG measurements and as a separate measurement.
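
    A toy version of the thresholding step is sketched below; the paper derives its thresholds from features of the recorded signal, and the simple percentile rule here is only a stand-in for that auto-calibration:

        import numpy as np

        def detect_blinks(eog, fs, percentile=98):
            """Flag candidate blinks as excursions above an amplitude threshold.

            The percentile rule is a stand-in for the paper's feature-based
            auto-calibration of the threshold.
            """
            signal = np.abs(eog - np.median(eog))
            thr = np.percentile(signal, percentile)
            above = signal > thr
            # Rising/falling edges of the supra-threshold mask (the signal is
            # assumed to start and end below the threshold).
            edges = np.flatnonzero(np.diff(above.astype(int)))
            starts, ends = edges[::2], edges[1::2]
            return [(s / fs, e / fs) for s, e in zip(starts, ends)]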

  19. Automatic Data Logging and Quality Analysis System for Mobile Devices

    Directory of Open Access Journals (Sweden)

    Yong-Yi Fanjiang

    2017-01-01

    Full Text Available The testing phase of mobile device products includes two important test projects that must be completed before shipment: the field trial and the beta user trial. During the field trial, the product is certified based on its integration and stability with the local operator’s system, and, during the beta user trial, the product is certified by multiple users regarding its daily use, where the goal is to detect and solve problems early. In the traditional approach to reporting issues, testers must log into a web site, fill out a problem form, and then go through a browser or FTP to upload logs; however, this is inconvenient, and problems are reported slowly. Therefore, we propose an “automatic logging analysis system” (ALAS) to construct a convenient test environment and, using a record analysis (log parser) program, automate the parsing of log files and have issues automatically sent to the database by the system. Finally, the mean time between failures (MTBF) is used to establish measurement indicators for the beta user trial.
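
    For reference, the MTBF indicator mentioned at the end is simply the total operating time divided by the number of observed failures; a minimal sketch (the fleet figures are illustrative):

        def mtbf(total_operating_hours, n_failures):
            """Mean time between failures for a beta-trial fleet."""
            if n_failures == 0:
                raise ValueError("no failures observed; MTBF is undefined (censored)")
            return total_operating_hours / n_failures

        # Example: 30 beta devices * 240 h of use each, 18 logged failures
        print(mtbf(30 * 240, 18))  # -> 400.0 hours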

  20. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In the previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), i.e. the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute during a period of five frames of a hepatogram after injecting 37 MBq of sup(99m)Tc-Sn-colloid. To analyze this information simply, rapidly and accurately, we developed an automatic quantitative analysis of liver function. Information was obtained every quarter minute during a period of 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow as the disappearance rate multiplied by the percentage of hepatic uptake, given by (liver counts)/(total counts of the field). Our method of analysis automatically recorded the disappearance curve and the uptake curve on the basis of the heart and whole-liver counts, respectively, and was computed in the BASIC language. This method makes it possible to obtain an image of the initial uptake of sup(99m)Tc-Sn-colloid into the liver with a small dose. (author)
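
    The quantity described can be sketched schematically (illustrative only; the paper's frame timings and region-of-interest handling are more involved): the effective hepatic blood flow is the blood disappearance rate coefficient multiplied by the hepatic uptake fraction.

        import numpy as np

        def effective_hepatic_blood_flow(heart_counts, liver_counts, total_counts, t):
            """Disappearance rate coefficient times hepatic uptake fraction.

            The count arguments are ROI time-activity curves sampled at times t
            (early frames); a mono-exponential blood disappearance is assumed
            for this sketch.
            """
            k = -np.polyfit(t, np.log(heart_counts), 1)[0]  # disappearance rate
            uptake_fraction = liver_counts[-1] / total_counts[-1]
            return k * uptake_fraction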

  1. Automatic visual tracking and social behaviour analysis with multiple mice.

    Directory of Open Access Journals (Sweden)

    Luca Giancardo

    Full Text Available Social interactions are made of complex behavioural actions that might be found in all mammals, including humans and rodents. Recently, mouse models have increasingly been used in preclinical research to understand the biological basis of social-related pathologies or abnormalities. However, reliable and flexible automatic systems able to precisely quantify the social behavioural interactions of multiple mice are still missing. Here, we present a system built on two components: a module able to accurately track the position of multiple interacting mice from videos, regardless of their fur colour or light settings, and a module that automatically characterises social and non-social behaviours. The behavioural analysis is obtained by deriving a new set of specialised spatio-temporal features from the tracker output. These features are further employed by a learning-by-example classifier, which predicts, for each frame and for each mouse in the cage, one of the behaviours learnt from the examples given by the experimenters. The system is validated on an extensive set of experimental trials involving multiple mice in an open arena. In a first evaluation we compare the classifier output with the independent evaluation of two human graders, obtaining comparable results. Then, we show the applicability of our technique to multiple-mice settings, using up to four interacting mice. The system is also compared with a solution recently proposed in the literature that, similarly to ours, addresses the problem with a learning-by-examples approach. Finally, we further validated our automatic system by differentiating between C57BL/6J (a commonly used reference inbred strain) and BTBR T+tf/J (a mouse model for autism spectrum disorders). Overall, these data demonstrate the validity and effectiveness of this new machine learning system in the detection of social and non-social behaviours in multiple (>2) interacting mice, and its versatility to deal with different

  2. Modern trends in activation analysis. Vols. 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    Henkelmann, R; Kim, J I; Lux, F; Staerk, H; Zeisler, R [eds.]

    1976-01-01

    Two volumes contain 164 special papers presented at the international conference on activation analysis, including five plenary lectures which give a general survey of modern trends and possibilities of applications in fields where activation analysis is capable of solving problems which are hitherto unsolved. The emphasis during this meeting is put on applications, though other subjects (e.g. sampling, homogeneity of samples, instrumental development, computer evaluation of gamma spectra, comparisons with other analytical methods) are also dealt with. The proceedings contain lectures on applications in the fields of biology, biomedicine, archaeology, the arts, forensic sciences, environmental research, ecology, materials research, geo- and cosmo-sciences, and other individual applications.

  3. Analysis of measurements on wind turbine arrays. Vol. 1

    International Nuclear Information System (INIS)

    Hoeg, E.

    1990-12-01

    In 1989 a Danish electric power company initiated an analysis of eight wind turbine arrays. Data from this project is presented together with the explained results of the analyses and the output variations for individual arrays and for systems within the arrays. The models for prognosis are compared and evaluated in order to find that which is most effective. (AB)

  4. Experiment data acquisition and analysis system. Vol. 1

    International Nuclear Information System (INIS)

    Busch, F.; Croome, D.; Goeringer, H.; Hartmann, V.; Lowsky, J.; Marinescu, D.; Richter, M.; Winkelmann, K.

    1983-08-01

    The Experiment Data Acquisition and Analysis System EDAS was created to acquire and analyze data collected in experiments carried out at the heavy ion accelerator UNILAC. It has been available since 1975 and has become the most frequently used system for evaluating experiments at GSI. EDAS has undergone constant development, and the many enhancements make this completely revised third edition of the EDAS manual necessary. EDAS consists of two sub-systems: GOLDA for experimental data acquisition on PDP-11's and SATAN mainly for off-line analysis in replay mode on large IBM mainframes. The capacity of one IBM 3081 CPU is mainly dedicated to EDAS processing and is almost fully utilized by this application. More than 200 users from GSI as well as from collaborating laboratories and universities use SATAN in more than 100 sessions daily needing 10 to 20 hours of user CPU time. EDAS is designed as an open system. (orig./HSI)

  5. Single Point Vulnerability Analysis of Automatic Seismic Trip System

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Seo Bin; Chung, Soon Il; Lee, Yong Suk [FNC Technology Co., Yongin (Korea, Republic of); Choi, Byung Pil [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    Single Point Vulnerability (SPV) analysis is a process used to identify individual equipment whose failure alone will result in a reactor trip, turbine generator failure, or a power reduction of more than 50%. The Automatic Seismic Trip System (ASTS) is a newly installed system to ensure the safety of the plant when an earthquake occurs. Since this system directly shuts down the reactor, the failure or malfunction of its components can cause a reactor trip more frequently than other systems. Therefore, an SPV analysis of the ASTS is necessary to maintain its essential performance. To analyze the SPV of the ASTS, failure mode and effect analysis (FMEA) and fault tree analysis (FTA) were performed. In this study, the FMEA and FTA methods were used to select the SPV equipment of the ASTS. The D/O, D/I and A/I cards, seismic sensor, and trip relay have an effect on the reactor trip, but no single failure of any of them will cause a reactor trip. In conclusion, the ASTS is excluded as an SPV. These results can be utilized as basis data for ways to enhance facility reliability, such as design modification and improvement of preventive maintenance procedures.

  6. Single Point Vulnerability Analysis of Automatic Seismic Trip System

    International Nuclear Information System (INIS)

    Oh, Seo Bin; Chung, Soon Il; Lee, Yong Suk; Choi, Byung Pil

    2016-01-01

    Single Point Vulnerability (SPV) analysis is a process used to identify individual equipment whose failure alone will result in a reactor trip, turbine generator failure, or a power reduction of more than 50%. The Automatic Seismic Trip System (ASTS) is a newly installed system to ensure the safety of the plant when an earthquake occurs. Since this system directly shuts down the reactor, the failure or malfunction of its components can cause a reactor trip more frequently than other systems. Therefore, an SPV analysis of the ASTS is necessary to maintain its essential performance. To analyze the SPV of the ASTS, failure mode and effect analysis (FMEA) and fault tree analysis (FTA) were performed. In this study, the FMEA and FTA methods were used to select the SPV equipment of the ASTS. The D/O, D/I and A/I cards, seismic sensor, and trip relay have an effect on the reactor trip, but no single failure of any of them will cause a reactor trip. In conclusion, the ASTS is excluded as an SPV. These results can be utilized as basis data for ways to enhance facility reliability, such as design modification and improvement of preventive maintenance procedures

  7. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high-energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and, based on the derived information, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  8. Automatic analysis of the micronucleus test in primary human lymphocytes using image analysis.

    Science.gov (United States)

    Frieauff, W; Martus, H J; Suter, W; Elhajouji, A

    2013-01-01

    The in vitro micronucleus test (MNT) is a well-established test for early screening of new chemical entities in industrial toxicology. For assessing the clastogenic or aneugenic potential of a test compound, micronucleus induction in cells has been shown repeatedly to be a sensitive and a specific parameter. Various automated systems to replace the tedious and time-consuming visual slide analysis procedure, as well as flow cytometric approaches, have been discussed. The ROBIAS (Robotic Image Analysis System) for both automatic cytotoxicity assessment and micronucleus detection in human lymphocytes was developed at Novartis, where the assay has been used to validate positive results obtained in the MNT in TK6 cells, which serves as the primary screening system for genotoxicity profiling in early drug development. In addition, the in vitro MNT has become an accepted alternative to support clinical studies and will be used for regulatory purposes as well. The comparison of visual with automatic analysis results showed a high degree of concordance for 25 independent experiments conducted for the profiling of 12 compounds. For concentration series of cyclophosphamide and carbendazim, a very good correlation between automatic and visual analysis by two examiners could be established, both for the relative division index used as the cytotoxicity parameter and for micronuclei scoring in mono- and binucleated cells. Generally, false-positive micronucleus decisions could be controlled by fast and simple relocation of the automatically detected patterns. The possibility of analysing 24 slides within 65 h by automatic analysis over the weekend and the high reproducibility of the results make automatic image processing a powerful tool for micronucleus analysis in primary human lymphocytes. The automated slide analysis for the MNT in human lymphocytes complements the portfolio of image analysis applications on ROBIAS, which supports various assays at Novartis.

  9. Handbook of nuclear data for neutron activation analysis. Vol. I

    International Nuclear Information System (INIS)

    Hnatowicz, V.

    1986-01-01

    The first part of a two-volume book which is meant for experimentalists working in instrumental activation analysis and related fields, such as nuclear metrology, materials testing and environmental studies. The volume describes the basic processes of gamma-ray interaction with matter as well as the important phenomena affecting gamma-ray spectra formation in semiconductor spectrometers. A brief account is also given of computation methods commonly employed for spectra evaluation. The results rather than detailed derivations are stressed. A great deal of material si divided into five chapters and nine appendices. The inclusion of many tables of significant spectroscopic data should make the text a useful handbook for those dealing with multi-channel gamma-ray spectra. (author) 26 figs., 82 tabs., 334 refs

  10. Automatic adventitious respiratory sound analysis: A systematic review.

    Science.gov (United States)

    Pramono, Renard Xaviero Adhi; Bowyer, Stuart; Rodriguez-Villegas, Esther

    2017-01-01

    Automatic detection or classification of adventitious sounds is useful to assist physicians in diagnosing or monitoring diseases such as asthma, Chronic Obstructive Pulmonary Disease (COPD), and pneumonia. While computerised respiratory sound analysis, specifically for the detection or classification of adventitious sounds, has recently been the focus of an increasing number of studies, a standardised approach and comparison has not been well established. To provide a review of existing algorithms for the detection or classification of adventitious respiratory sounds. This systematic review provides a complete summary of methods used in the literature to give a baseline for future works. A systematic review of English articles published between 1938 and 2016, searched using the Scopus (1938-2016) and IEEExplore (1984-2016) databases. Additional articles were further obtained by references listed in the articles found. Search terms included adventitious sound detection, adventitious sound classification, abnormal respiratory sound detection, abnormal respiratory sound classification, wheeze detection, wheeze classification, crackle detection, crackle classification, rhonchi detection, rhonchi classification, stridor detection, stridor classification, pleural rub detection, pleural rub classification, squawk detection, and squawk classification. Only articles were included that focused on adventitious sound detection or classification, based on respiratory sounds, with performance reported and sufficient information provided to be approximately repeated. Investigators extracted data about the adventitious sound type analysed, approach and level of analysis, instrumentation or data source, location of sensor, amount of data obtained, data management, features, methods, and performance achieved. A total of 77 reports from the literature were included in this review. 55 (71.43%) of the studies focused on wheeze, 40 (51.95%) on crackle, 9 (11.69%) on stridor, 9 (11

  11. [Automatic Extraction and Analysis of Dosimetry Data in Radiotherapy Plans].

    Science.gov (United States)

    Song, Wei; Zhao, Di; Lu, Hong; Zhang, Biyun; Ma, Jun; Yu, Dahai

    To improve the efficiency and accuracy of the extraction and analysis of dosimetry data in radiotherapy plans for a batch of patients. With the interface functions provided by the Matlab platform, a program was written to extract the dosimetry data exported from the treatment planning system in DICOM RT format and to export the dose-volume data to an Excel file in an SPSS-compatible format. This method was compared with manual operation for 14 gastric carcinoma patients to validate its efficiency and accuracy. The output Excel data were compatible with SPSS in format; the dosimetry data errors for the PTV dose interval of 90%-98%, the PTV dose interval of 99%-106% and all OARs were -3.48E-5 ± 3.01E-5, -1.11E-3 ± 7.68E-4 and -7.85E-5 ± 9.91E-5, respectively. Compared with manual operation, the time required was reduced from 5.3 h to 0.19 h and the input error was reduced from 0.002 to 0. The automatic extraction of dosimetry data in DICOM RT format for batches of patients, SPSS-compatible data exportation and quick analysis were achieved in this paper. The efficiency of clinical research based on the dosimetry data analysis of large numbers of patients will be improved with this method.
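
    The same extraction step can be sketched in Python; the paper used Matlab, so pydicom is a stand-in here, and the file name and the grid-wide DVH-style summary are illustrative assumptions:

        import numpy as np
        import pydicom

        def dose_volume_points(rtdose_path):
            """Read a DICOM RT Dose file and return the dose grid in Gy, flattened."""
            ds = pydicom.dcmread(rtdose_path)
            dose = ds.pixel_array * float(ds.DoseGridScaling)  # stored values -> Gy
            return dose.ravel()

        dose = dose_volume_points("patient01_rtdose.dcm")      # hypothetical file
        # Cumulative DVH-style summary: fraction of voxels at or above each level.
        levels = np.linspace(0, dose.max(), 50)
        dvh = [(dose >= d).mean() for d in levels]
        print(list(zip(levels.round(1), np.round(dvh, 3)))[:5])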

  12. Towards the automatic detection and analysis of sunspot rotation

    Science.gov (United States)

    Brown, Daniel S.; Walker, Andrew P.

    2016-10-01

    Torsional rotation of sunspots has been noted by many authors over the past century. Sunspots have been observed to rotate up to the order of 200 degrees over 8-10 days, and these rotations have often been linked with eruptive behaviour such as solar flares and coronal mass ejections. However, most studies in the literature are case studies or small-number studies which suffer from selection bias. In order to better understand sunspot rotation and its impact on the corona, unbiased large-sample statistical studies are required (including both rotating and non-rotating sunspots). While this can be done manually, a better approach is to automate the detection and analysis of rotating sunspots using robust methods with well characterised uncertainties. The SDO/HMI instrument provides long-duration, high-resolution and high-cadence continuum observations suitable for extracting a large number of examples of rotating sunspots. This presentation will outline the analysis of SDO/HMI data to determine the rotation (and non-rotation) profiles of sunspots for the complete duration of their transit across the solar disk, along with how this can be extended to automatically identify sunspots and initiate their analysis.

  13. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    International Nuclear Information System (INIS)

    Shamir, Lior

    2011-01-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ∼10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
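
    The radial intensity analysis at the heart of the method can be approximated as follows (a simplified sketch with an assumed precomputed galaxy centre; Ganalyzer's own peak-slope computation is more involved):

        import numpy as np

        def arm_peak_angles(image, cx, cy, radii, n_theta=360):
            """For each radius, the angle of maximum brightness on that circle.

            The slope of these peak angles against radius reflects the galaxy's
            spirality: a near-zero slope suggests a non-spiral morphology.
            """
            angles = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
            peaks = []
            for r in radii:
                xs = (cx + r * np.cos(angles)).astype(int).clip(0, image.shape[1] - 1)
                ys = (cy + r * np.sin(angles)).astype(int).clip(0, image.shape[0] - 1)
                ring = image[ys, xs]
                peaks.append(np.degrees(angles[np.argmax(ring)]))
            return np.asarray(peaks)

        # Spirality ~ slope of peak angle versus radius (illustrative usage):
        # slope, _ = np.polyfit(radii, arm_peak_angles(img, cx, cy, radii), 1)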

  14. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    Science.gov (United States)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  15. Automatic analysis of the 2015 Gorkha earthquake aftershock sequence.

    Science.gov (United States)

    Baillard, C.; Lyon-Caen, H.; Bollinger, L.; Rietbrock, A.; Letort, J.; Adhikari, L. B.

    2016-12-01

    The Mw 7.8 Gorkha earthquake, which partially ruptured the Main Himalayan Thrust north of Kathmandu on the 25th April 2015, was the largest and most catastrophic earthquake to strike Nepal since the great M8.4 1934 earthquake. This mainshock was followed by multiple aftershocks, among them two notable events that occurred on the 12th May with magnitudes of 7.3 Mw and 6.3 Mw. Due to these recent events, it became essential for the authorities and for the scientific community to better evaluate the seismic risk in the region through a detailed analysis of the earthquake catalog, amongst others the spatio-temporal distribution of the Gorkha aftershock sequence. Here we complement this first study with a microseismic study using seismic data coming from the eastern part of the Nepalese Seismological Center network associated with one broadband station in Everest. Our primary goal is to deliver an accurate catalog of the aftershock sequence. Due to the exceptional number of events detected, we performed an automatic picking/locating procedure which can be split into 4 steps: 1) coarse picking of the onsets using a classical STA/LTA picker; 2) phase association of the picked onsets to detect and declare seismic events; 3) kurtosis-based pick refinement around the theoretical arrival times to increase picking and location accuracy; and 4) local magnitude calculation based on the amplitude of the waveforms. This procedure is time efficient (~1 sec/event), considerably reduces the location uncertainties (~2 to 5 km errors) and increases the number of events detected compared to manual processing. Indeed, the automatic detection rate is 10 times higher than the manual detection rate. By comparing with the USGS catalog we were able to derive a new attenuation law to compute local magnitudes in the region. A detailed analysis of the seismicity shows a clear migration toward the east of the region and a sudden decrease of seismicity ~100 km east of Kathmandu, which may reveal the presence of a tectonic
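
    Step 1 of such a pipeline, coarse STA/LTA onset picking, is standard enough to sketch with ObsPy; the file name and trigger parameters below are illustrative, not those of the study:

        from obspy import read
        from obspy.signal.trigger import classic_sta_lta, trigger_onset

        st = read("station_day.mseed")          # hypothetical day-long record
        tr = st[0]
        df = tr.stats.sampling_rate

        # Characteristic function: 1 s short-term vs 30 s long-term average.
        cft = classic_sta_lta(tr.data, int(1 * df), int(30 * df))

        # Declare onsets where the ratio rises above 3.5 and releases below 1.0.
        picks = trigger_onset(cft, 3.5, 1.0)
        for on, off in picks:
            print("onset at", tr.stats.starttime + on / df)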

  16. Trends of Science Education Research: An Automatic Content Analysis

    Science.gov (United States)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-08-01

    This study used scientometric methods to conduct an automatic content analysis of the development trends of science education research, based on the articles published in the four journals International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education from 1990 to 2007. A multi-stage clustering technique was employed to investigate with what topics, with what development trends, and through whose contributions the journal publications constructed science education as a research field. This study found that the research topic of Conceptual Change & Concept Mapping was the most studied topic, although the number of publications declined slightly in the 2000s. Studies on the themes of Professional Development, Nature of Science and Socio-Scientific Issues, and Conceptual Change and Analogy were found to be gaining attention over the years. This study also found that, embedded in the most cited references, the supporting disciplines and theories of science education research are constructivist learning, cognitive psychology, pedagogy, and philosophy of science.

  17. Automatic comic page image understanding based on edge segment analysis

    Science.gov (United States)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of the comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique to produce the digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is performed on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.

  18. Automatic analysis of gamma spectra using a desk computer

    International Nuclear Information System (INIS)

    Rocca, H.C.

    1976-10-01

    A code for the analysis of gamma spectra obtained with a Ge(Li) detector was developed for use with a desk computer (Hewlett-Packard Model 9810 A). The process is performed in a totally automatic way: the data are conveniently smoothed and the background is generated by a convolutive equation. A calibration of the equipment with well-known standard sources gives the data necessary for fitting a third-degree equation by least squares, relating the energy to the peak position. Criteria are given for determining whether certain groups of values constitute a peak or not, and whether it is a double line. All the peaks are fitted to a Gaussian curve and, if necessary, decomposed into their components. Data entry is by punched tape, ASCII code. An alphanumeric printer provides (a) the position of the peak and its energy, (b) its resolution if it is larger than expected, and (c) the area of the peak with its statistical error determined by the method of Wasson. As an option, the complete spectrum with the determined background can be plotted. (author) [es]
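
    The peak-fitting step generalizes readily; a compact modern equivalent of the Gaussian adjustment (channel and count arrays assumed, scipy used for the least-squares fit):

        import numpy as np
        from scipy.optimize import curve_fit

        def gaussian(x, area, centre, sigma, baseline):
            return baseline + area / (sigma * np.sqrt(2 * np.pi)) * np.exp(
                -0.5 * ((x - centre) / sigma) ** 2)

        def fit_peak(channels, counts, guess_centre):
            """Least-squares Gaussian fit of a single photopeak region."""
            p0 = [counts.max() * 3.0, guess_centre, 2.0, counts.min()]
            popt, pcov = curve_fit(gaussian, channels, counts, p0=p0)
            area_err = np.sqrt(pcov[0, 0])   # statistical error on the peak area
            return popt, area_err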

  19. Automatic analysis of ciliary beat frequency using optical flow

    Science.gov (United States)

    Figl, Michael; Lechner, Manuel; Werther, Tobias; Horak, Fritz; Hummel, Johann; Birkfellner, Wolfgang

    2012-02-01

    Ciliary beat frequency (CBF) can be a useful parameter for the diagnosis of several diseases, such as primary ciliary dyskinesia (PCD). CBF computation is usually done by manual evaluation of high-speed video sequences, a tedious, observer-dependent, and not very accurate procedure. We used OpenCV's pyramidal implementation of the Lucas-Kanade algorithm for optical flow computation and applied it to selected objects to follow their movements. The objects were chosen by their contrast, applying the corner detection of Shi and Tomasi. Discrimination between background/noise and cilia by a frequency histogram allowed the CBF to be computed. Frequency analysis was done using the Fourier transform in Matlab. The correct number of Fourier summands was found from the slope of an approximation curve. The method proved usable for distinguishing between healthy and diseased samples. However, difficulties remain in automatically identifying the cilia, and also in finding enough high-contrast cilia in the image. Furthermore, some of the higher-contrast cilia are lost (and sometimes found again) by the method; an easy way to identify the correct sub-path of a point's path has yet to be found for cases where the slope method does not work.
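
    The tracking-plus-Fourier chain can be sketched with the same OpenCV building blocks the authors name; the video path and parameter values are illustrative:

        import cv2
        import numpy as np

        cap = cv2.VideoCapture("cilia_highspeed.avi")   # hypothetical recording
        fps = cap.get(cv2.CAP_PROP_FPS)

        ok, frame = cap.read()
        prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Shi-Tomasi corners serve as high-contrast points to follow.
        p0 = cv2.goodFeaturesToTrack(prev, maxCorners=50,
                                     qualityLevel=0.01, minDistance=5)

        traces = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            p1, status, _ = cv2.calcOpticalFlowPyrLK(prev, gray, p0, None)
            traces.append(p1[:, 0, 1])          # vertical position of each point
            prev, p0 = gray, p1

        traces = np.array(traces)               # frames x points
        spectrum = np.abs(np.fft.rfft(traces - traces.mean(0), axis=0))
        freqs = np.fft.rfftfreq(len(traces), d=1 / fps)
        cbf = freqs[spectrum.sum(axis=1)[1:].argmax() + 1]  # skip the DC bin
        print(f"dominant beat frequency ~ {cbf:.1f} Hz")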

  20. Automatic telangiectasia analysis in dermoscopy images using adaptive critic design.

    Science.gov (United States)

    Cheng, B; Stanley, R J; Stoecker, W V; Hinton, K

    2012-11-01

    Telangiectasia, tiny skin vessels, are important dermoscopy structures used to discriminate basal cell carcinoma (BCC) from benign skin lesions. This research builds on previously developed image analysis techniques to identify vessels automatically in order to discriminate benign lesions from BCCs. A biologically inspired reinforcement learning approach is investigated in an adaptive critic design framework, applying action-dependent heuristic dynamic programming (ADHDP) for discrimination based on computed features, using different skin lesion contrast variations to promote the discrimination process. Lesion discrimination results for ADHDP are compared with multilayer perceptron backpropagation artificial neural networks. This study uses a data set of 498 dermoscopy skin lesion images of 263 BCCs and 226 competitive benign images as the input sets. This data set is extended from previous research [Cheng et al., Skin Research and Technology, 2011, 17: 278]. Experimental results yielded a diagnostic accuracy as high as 84.6% using the ADHDP approach, providing an 8.03% improvement over a standard multilayer perceptron method. We have chosen BCC detection rather than vessel detection as the endpoint. Although vessel detection is inherently easier, BCC detection has potential direct clinical applications. Small BCCs are detectable early by dermoscopy and potentially detectable by the automated methods described in this research. © 2011 John Wiley & Sons A/S.

  1. Automatic adventitious respiratory sound analysis: A systematic review.

    Directory of Open Access Journals (Sweden)

    Renard Xaviero Adhi Pramono

    Full Text Available Automatic detection or classification of adventitious sounds is useful to assist physicians in diagnosing or monitoring diseases such as asthma, Chronic Obstructive Pulmonary Disease (COPD), and pneumonia. While computerised respiratory sound analysis, specifically for the detection or classification of adventitious sounds, has recently been the focus of an increasing number of studies, a standardised approach and comparison has not been well established. To provide a review of existing algorithms for the detection or classification of adventitious respiratory sounds. This systematic review provides a complete summary of methods used in the literature to give a baseline for future works. A systematic review of English articles published between 1938 and 2016, searched using the Scopus (1938-2016) and IEEExplore (1984-2016) databases. Additional articles were further obtained by references listed in the articles found. Search terms included adventitious sound detection, adventitious sound classification, abnormal respiratory sound detection, abnormal respiratory sound classification, wheeze detection, wheeze classification, crackle detection, crackle classification, rhonchi detection, rhonchi classification, stridor detection, stridor classification, pleural rub detection, pleural rub classification, squawk detection, and squawk classification. Only articles were included that focused on adventitious sound detection or classification, based on respiratory sounds, with performance reported and sufficient information provided to be approximately repeated. Investigators extracted data about the adventitious sound type analysed, approach and level of analysis, instrumentation or data source, location of sensor, amount of data obtained, data management, features, methods, and performance achieved. A total of 77 reports from the literature were included in this review. 55 (71.43%) of the studies focused on wheeze, 40 (51.95%) on crackle, 9 (11.69%) on stridor, 9

  2. Development of an automatic prompt gamma-ray activation analysis system

    International Nuclear Information System (INIS)

    Osawa, Takahito

    2013-01-01

    An automatic prompt gamma-ray activation analysis system was developed and installed at the Japan Research Reactor No. 3 Modified (JRR-3M). The main control software, referred to as AutoPGA, was developed using LabVIEW 2011, and this in-house program can control all functions of the analytical system. The core of the new system is an automatic sample exchanger and measurement system, with several additional automatic control functions integrated into it. Up to fourteen samples can be measured automatically by the system. (author)

  3. CURRENT STATE ANALYSIS OF AUTOMATIC BLOCK SYSTEM DEVICES, METHODS OF ITS SERVICE AND MONITORING

    Directory of Open Access Journals (Sweden)

    A. M. Beznarytnyy

    2014-01-01

    Full Text Available Purpose. Development of a formalized description of the numerical-code automatic block system, based on an analysis of the characteristic failures of the automatic block system and of its maintenance procedures. Methodology. Theoretical and analytical methods were used for this research. Findings. Typical failures of automatic block systems were analyzed and the basic reasons for their occurrence were identified. It was determined that the majority of failures occur due to defects in the maintenance system. The advantages and disadvantages of the current service technology for the automatic block system were analyzed. Tasks that can be automated by means of technical diagnostics were identified. A formal description of the numerical-code automatic block system as a graph in the state space of the system was carried out. Originality. A state graph of the numerical-code automatic block system that takes into account the gradual transition from the serviceable condition to the loss of efficiency was proposed. It allows diagnostic information to be selected according to attributes and increases the effectiveness of recovery operations in the case of a malfunction. Practical value. The obtained results of the analysis and the proposed state graph can be used as the basis for the development of new diagnostic devices for the automatic block system, which in turn will improve the efficiency and service of automatic block system devices in general.

  4. The analysis of slag and silicate samples with a fully automatic sequential x-ray spectrometer

    International Nuclear Information System (INIS)

    Austen, C.E.

    1976-01-01

    The application of a fully automatic Philips PW 1220 X-ray spectrometer to the analysis of slag and silicate materials is described. The controlling software, written in BASIC, and the operational instructions for the automatic spectrometer as applied in this report are available on request

  5. Background approximation in automatic qualitative X-ray-fluorescent analysis

    International Nuclear Information System (INIS)

    Jordanov, J.; Tsanov, T.; Stefanov, R.; Jordanov, N.; Paunov, M.

    1982-01-01

    An empirical method of finding the dependence of the background intensity (I_bg) on the wavelength is proposed, based on the approximation of the experimentally found values for the background in the course of an automatic qualitative X-ray fluorescence analysis with a pre-set curve. It is assumed that the dependence I_bg(λ) is well approximated by a curve of the type I_bg = (λ - λ₀)^f₁(λ) · exp[f₂(λ)], where f₁(λ) and f₂(λ) are linear functions with respect to the sought parameters. This assumption was checked on a 'pure' starch background, in which it is not known beforehand which points belong to the background. It was assumed that the dependence I_bg(λ) can be found from all minima in the spectrum. Three types of minima have been distinguished: 1. the lowest point between two well-resolved X-ray lines; 2. a minimum obtained as a result of statistical fluctuations of the measured signal; 3. the lowest point between two overlapping lines. The minima deviating strongly from the background are removed from the obtained set. The set of remaining minima serves as the basis for the approximation of the dependence I_bg(λ). The unknown parameters are determined by means of the least-squares method. The approximated curve obtained by this method is closer to the real background than the background determined by the method described by Kigaki Denki, as the effect of all recorded minima is taken into account. As an example, the PbTe spectrum recorded with a LiF 220 crystal is shown graphically. The curve describes the background of the spectrum well, even in regions in which there are no minima belonging to the background. (authors)
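
    Because f₁ and f₂ are linear in the unknown parameters, taking logarithms turns the fit into an ordinary linear least-squares problem; a sketch, with λ₀ assumed known and f₁, f₂ taken as first-degree polynomials for illustration:

        import numpy as np

        def fit_background(lam, intensity, lam0):
            """Fit ln I = (a + b*lam)*ln(lam - lam0) + (c + d*lam) by least squares."""
            L = np.log(lam - lam0)
            # Design matrix columns correspond to the parameters a, b, c, d.
            A = np.column_stack([L, lam * L, np.ones_like(lam), lam])
            coeffs, *_ = np.linalg.lstsq(A, np.log(intensity), rcond=None)
            return coeffs  # (a, b, c, d)

        def background(lam, lam0, a, b, c, d):
            """Reconstructed background curve from the fitted parameters."""
            return (lam - lam0) ** (a + b * lam) * np.exp(c + d * lam)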

  6. Automation of chromosomes analysis. Automatic system for image processing

    International Nuclear Information System (INIS)

    Le Go, R.; Cosnac, B. de; Spiwack, A.

    1975-01-01

    The A.S.T.I. is an automatic system for the fast conversational processing of all kinds of images (cells, chromosomes) converted to a numerical data set (120000 points, 16 grey levels stored in a MOS memory) through a fast D.O. analyzer. The system automatically performs the isolation of any individual image, whose area and weighted area are computed. These results are directly displayed on the command panel and can be transferred to a mini-computer for further computations. A bright spot allows parts of an image to be picked out and the results to be displayed. This study is particularly directed towards automatic karyotyping [fr]

  7. Automatic thematic content analysis: finding frames in news

    NARCIS (Netherlands)

    Odijk, D.; Burscher, B.; Vliegenthart, R.; de Rijke, M.; Jatowt, A.; Lim, E.P.; Ding, Y.; Miura, A.; Tezuka, T.; Dias, G.; Tanaka, K.; Flanagin, A.; Dai, B.T.

    2013-01-01

    Framing in news is the way in which journalists depict an issue in terms of a ‘central organizing idea.’ Frames can be a perspective on an issue. We explore the automatic classification of four generic news frames: conflict, human interest, economic consequences, and morality. Complex

  8. Automatic behaviour analysis system for honeybees using computer vision

    DEFF Research Database (Denmark)

    Tu, Gang Jun; Hansen, Mikkel Kragh; Kryger, Per

    2016-01-01

    We present a fully automatic online video system, which is able to detect the behaviour of honeybees at the beehive entrance. Our monitoring system focuses on observing the honeybees as naturally as possible (i.e. without disturbing the honeybees). It is based on the Raspberry Pi that is a low...

  9. Automatic acquisition and shape analysis of metastable peaks

    International Nuclear Information System (INIS)

    Maendli, H.; Robbiani, R.; Kuster, Th.; Seibl, J.

    1979-01-01

    A method for automatic acquisition and evaluation of metastable peaks due to transitions in the first field-free region of a double-focussing mass spectrometer is presented. The data are acquired by computer-controlled repetitive scanning of the accelerating voltage and concomitant accumulation; the evaluation is made by mathematical differentiation of the resulting curve. Examples of the application of the method are given. (Auth.)

  10. The Safeguards analysis applied to the RRP. Automatic sampling authentication system

    International Nuclear Information System (INIS)

    Ono, Sawako; Nakashima, Shinichi; Iwamoto, Tomonori

    2004-01-01

    The sampling for analysis from vessels and columns at the Rokkasho Reprocessing Plant (RRP) is performed mostly by the automatic sampling system. The safeguards samples for verification will also be taken using these sampling systems and transferred to the OSL through the pneumatic transfer network owned and controlled by the operator. In order to maintain sample integrity and continuity of knowledge (CoK) throughout the sample processing, it is essential to develop and establish authentication measures for the automatic sampling system, including the transfer network. We have developed the Automatic Sampling Authentication System (ASAS) in consultation with the IAEA. This paper describes the structure, function and concept of ASAS. (author)

  11. Automatic particle-size analysis of HTGR recycle fuel

    International Nuclear Information System (INIS)

    Mack, J.E.; Pechin, W.H.

    1977-09-01

    An automatic particle-size analyzer was designed, fabricated, tested, and put into operation measuring and counting HTGR recycle fuel particles. The particle-size analyzer can be used for particles in all stages of fabrication, from the loaded, uncarbonized weak acid resin up to fully-coated Biso or Triso particles. The device handles microspheres in the range of 300 to 1000 μm at rates up to 2000 per minute, measuring the diameter of each particle to determine the size distribution of the sample, and simultaneously determining the total number of particles. 10 figures

  12. Statistical analysis of quality control of automatic processor

    International Nuclear Information System (INIS)

    Niu Yantao; Zhao Lei; Zhang Wei; Yan Shulin

    2002-01-01

    Objective: To strengthen the scientific management of the automatic processor and promote QC, based on analyzing the QC management chart for the automatic processor by statistical methods and on evaluating and interpreting the data and trends in the chart. Method: The speed, contrast and minimum density of the step wedge of the film strip were measured every day and recorded on the QC chart. The mean (x-bar), standard deviation (s) and range (R) were calculated. The data and the working trend were evaluated and interpreted for management decisions. Results: Using the relative frequency distribution curve constructed from the measured data, the authors can judge whether it is a symmetric bell-shaped curve or not. If not, it indicates that a few extremes overstepping the control limits may be pulling the curve to the left or right. If it is a normal distribution, the standard deviation (s) is observed. When x-bar ± 2s lies within the upper and lower control limits of the relative performance indexes, it indicates that the processor worked in a stable state during this period. Conclusion: Guided by statistical methods, QC work becomes more scientific and quantified. The authors can deepen the understanding and application of the trend chart, and improve quality management to a new level.
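
    A minimal sketch of the chart logic described (illustrative control limits; the daily sensitometric readings are assumed to be a plain array):

        import numpy as np

        def control_chart_status(readings, lcl, ucl):
            """Check whether x-bar ± 2s of daily sensitometric readings stays in limits."""
            x = np.asarray(readings, dtype=float)
            xbar, s = x.mean(), x.std(ddof=1)
            stable = lcl <= xbar - 2 * s and xbar + 2 * s <= ucl
            return xbar, s, stable

        # Example: 30 daily mid-density readings, limits set at 1.0-1.4 OD
        rng = np.random.default_rng(0)
        print(control_chart_status(rng.normal(1.2, 0.03, 30), 1.0, 1.4))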

  13. Analysis of the oscillation causes in automatic controller of reactor power

    International Nuclear Information System (INIS)

    Aleksakov, A.N.; Nikolaev, E.V.; Podlazov, L.N.

    1991-01-01

    The conditions for the occurrence of oscillations in the automatic controller of reactor power are determined. A graphic-analytical method for calculating the stability of the non-linear system, which makes it possible to reveal the most important factors determining stability, is used. Practical results of the analysis are obtained for the system of local automatic controllers used in RBMK reactors. A simple method of providing the required stability margin is suggested

  14. An automatized frequency analysis for vine plot detection and delineation in remote sensing

    OpenAIRE

    Delenne , Carole; Rabatel , G.; Deshayes , M.

    2008-01-01

    The availability of an automatic tool for vine plot detection, delineation, and characterization would be very useful for management purposes. An automatic and recursive process using frequency analysis (with Fourier transform and Gabor filters) has been developed to meet this need. This results in the determination of vine plot boundary and accurate estimation of interrow width and row orientation. To foster large-scale applications, tests and validation have been carried out on standard ver...
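
    The core frequency idea can be illustrated: vine rows form a near-periodic pattern, so the dominant peak of a patch's 2-D Fourier spectrum gives the row orientation and spacing. This is a simplified sketch; the paper's recursive Gabor-filter process is more elaborate:

        import numpy as np

        def row_orientation_and_spacing(patch, pixel_size_m):
            """Estimate vine row direction (deg) and inter-row width (m) from a patch."""
            f = np.fft.fftshift(np.abs(np.fft.fft2(patch - patch.mean())))
            h, w = f.shape
            f[h // 2, w // 2] = 0.0                       # suppress residual DC
            ky, kx = np.unravel_index(np.argmax(f), f.shape)
            fy, fx = (ky - h // 2) / h, (kx - w // 2) / w  # cycles per pixel
            freq = np.hypot(fx, fy)
            orientation = np.degrees(np.arctan2(fy, fx))  # normal to the rows
            return orientation, pixel_size_m / freq       # spacing = 1/freq, in metres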

  15. Finite element fatigue analysis of rectangular clutch spring of automatic slack adjuster

    Science.gov (United States)

    Xu, Chen-jie; Luo, Zai; Hu, Xiao-feng; Jiang, Wen-song

    2015-02-01

    The failure of the rectangular clutch spring of the automatic slack adjuster directly affects the operation of the automatic slack adjuster. We establish a structural mechanics model of the automatic slack adjuster's rectangular clutch spring based on its working principle and mechanical structure. We then load this structural mechanics model into the ANSYS Workbench FEA system to predict the fatigue life of the rectangular clutch spring. The FEA results show that the fatigue life of the rectangular clutch spring is 2.0403×10⁵ cycles under the effect of braking loads. In addition, fatigue tests of 20 automatic slack adjusters were carried out on a fatigue test bench to verify the conclusions of the structural mechanics model. The experimental results show that the mean fatigue life of the rectangular clutch spring is 1.9101×10⁵ cycles, which agrees with the results of the finite element analysis using the ANSYS Workbench FEA system.

  16. Analysis of automobile’s automatic control systems for the hill climbing start

    Directory of Open Access Journals (Sweden)

    Valeriy I. Klimenko

    2014-12-01

    Full Text Available To improve road safety when driving up a slope and to ease the driver's task, the leaders of the automobile industry are introducing automatic hill-hold control systems into car designs. The purpose of this study is an analysis of the design of existing automatic start control systems. The existing design developments of automatic hill-start assist control systems, applied when starting to drive up a slope, are analyzed. The research carried out makes it possible to select a scheme for the further development of automatic start driving control systems. Further improvement of driving control systems, and primarily of driver-assistance hill-hold control systems, is necessary to increase both driving comfort and traffic safety.

  17. Automatic computer aided analysis algorithms and system for adrenal tumors on CT images.

    Science.gov (United States)

    Chai, Hanchao; Guo, Yi; Wang, Yuanyuan; Zhou, Guohui

    2017-12-04

    The adrenal tumor disturbs the secreting function of adrenocortical cells, leading to many diseases. Different kinds of adrenal tumors require different therapeutic schedules. In practical diagnosis, judging the tumor type by reading hundreds of CT images relies heavily on the doctor's experience. This paper proposes an automatic computer-aided analysis method for adrenal tumor detection and classification. It consists of automatic segmentation algorithms, feature extraction and classification algorithms. These algorithms were integrated into a system operated through a graphical interface built with the MATLAB graphical user interface (GUI) tools. The accuracy of the automatic computer-aided segmentation and classification reached 90% on 436 CT images. The experiments proved the stability and reliability of this automatic computer-aided analytic system.

  18. Automatic analysis of macerals and reflectance; Analisis Automatico de Macerales y Reflectancia

    Energy Technology Data Exchange (ETDEWEB)

    Catalina, J.C.; Alarcon, D.; Gonzalez Prado, J.

    1998-12-01

    A new system has been developed to perform automatically macerals and reflectance analysis of single-seam bituminous coals, improving the interlaboratory accuracy of these types of analyses. The system follows the same steps as the manual method, requiring a human operator for preparation of coal samples and system startup; then, sample scanning, microscope focusing and field centre analysis are fully automatic. The main and most innovative idea of this approach is to coordinate an expert system with an image processing system, using both reflectance and morphological information. In this way, the system tries to reproduce the analysis procedure followed by a human expert in petrography. (Author)

  19. ANALYSIS OF EXISTING AND PROSPECTIVE TECHNICAL CONTROL SYSTEMS OF NUMERIC CODES AUTOMATIC BLOCKING

    Directory of Open Access Journals (Sweden)

    A. M. Beznarytnyy

    2013-09-01

    Full Text Available Purpose. To identify the characteristic features of the engineering control systems for numeric-code automatic blocking, to identify their advantages and disadvantages, and to analyze the possibility of using them for diagnosing the state of automatic blocking devices and setting targets for the development of new diagnostic systems. Methodology. To achieve these targets, theoretical-analytical methods and the method of functional analysis were used. Findings. The analysis of existing and future facilities for the remote control and diagnostics of automatic blocking devices showed that the existing diagnostic systems are not sufficiently informative and are designed primarily to monitor discrete parameters, which in turn does not allow a decision-support subsystem to be built on them. In developing new technical diagnostic systems, it was proposed to use the principle of centralized distributed processing of diagnostic data and to include a decision-support subsystem in the diagnostic system; this will reduce the amount of maintenance work on the blocking devices and reduce the recovery time after a failure occurs. Originality. As a result, the currently existing engineering control facilities of automatic blocking cannot provide a full assessment of the state of the block-section signalling and blocking devices. Criteria for the development of new technical diagnostic systems with increased amounts of diagnostic information and its automatic analysis were proposed. Practical value. These results of the analysis can be used in practice to select the technical controls for automatic blocking devices, as well as for the further development of automatic blocking diagnostic systems, allowing a gradual transition from a planned preventive maintenance model to service based on the actual state of the monitored devices.

  20. The Avenging Females: A Comparative Analysis of Kill Bill Vol.1-2, Death Proof and Sympathy for Lady Vengeance

    Directory of Open Access Journals (Sweden)

    Basak Göksel Demiray

    2012-04-01

    Full Text Available This paper provides a comparative analysis of Quentin Tarantino's Kill Bill Vol.1-2 (2003, 2004) and Death Proof (2007) and Park Chan Wook's Sympathy for Lady Vengeance (Chinjeolhan Geumjassi, 2005). The primary objectives of this study are: (1) to reveal the gender biases inherent in the fundamental discursive structures of the foregoing films; (2) to compare and contrast the films through an analysis of the ‘gaze(s)’ and possible ‘pleasures’ inherent in their narratives, in relation to Laura Mulvey's and Carol Clover's approaches; and (3) to distinguish Kill Bill Vol.1-2 from the foregoing two and from the ‘avenging female’ clichés of other horror/violence movies, in the context of the replaced positionings of its protagonist and antagonist inherent in its distinct narrative style.

  1. Automatic chemical analysis of traces of boron in steel

    International Nuclear Information System (INIS)

    Ono, Akihiro; Yamaguchi, Naoharu; Matsumoto, Ryutaro

    1976-01-01

    The analyzer is composed of a sample changer, reagent addition devices, a distillation vessel, a color reaction vessel, a spectrophotometer, a controller, etc. The automatic procedure is performed according to the predetermined distillation and color reaction programs after dissolving 0.5 g of steel sample in aqua regia and fuming with sulfuric acid-phosphoric acid. The sample solution on the sample changer is transferred into the distillation vessel, where boron is distilled with methyl alcohol by heating and aeration. The distillate is collected in the distillate vessel, and a 1/2 aliquot is transferred into the color reaction vessel with small amounts of water. After the addition of glacial acetic acid and propionic anhydride, the distillate is circulated through the circulating pipe which is composed of an air blowing tube, a bubble remover, a flow cell and a drain valve. Oxalyl chloride (to eliminate water), sulfuric acid, the curcumin reagent (to form the boron complex) and an acetate buffer are added, and the absorbance of the solution is measured at 545 nm. The analytical results of steel samples were in good agreement with those obtained by the conventional method and with certified values. (auth.)

  2. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    Science.gov (United States)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.

  3. Towards automatic music transcription: note extraction based on independent subspace analysis

    Science.gov (United States)

    Wellhausen, Jens; Hoynck, Michael

    2005-01-01

    Due to the increasing amount of music available electronically, the need for automatic search, retrieval and classification systems for music becomes more and more important. In this paper an algorithm for the automatic transcription of polyphonic piano music into MIDI data is presented, which is a very interesting basis for database applications, music analysis and music classification. The first part of the algorithm performs a note-accurate temporal audio segmentation. In the second part, the resulting segments are examined using Independent Subspace Analysis to extract the sounding notes. Finally, the results are used to build a MIDI file as a new representation of the piece of music under examination.
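
    As an illustration of the second stage, the following sketch (not the authors' implementation; the library choices and parameters are assumptions) applies independent component analysis to a magnitude spectrogram, so that each component's activation curve can afterwards be thresholded into note events:

    ```python
    # Minimal ISA sketch: ICA over short-time magnitude spectra.
    import numpy as np
    from scipy.signal import stft
    from sklearn.decomposition import FastICA

    def note_components(audio, fs, n_components=8):
        _, _, Z = stft(audio, fs=fs, nperseg=2048, noverlap=1536)
        S = np.abs(Z)                         # magnitude spectrogram (freq x time)
        ica = FastICA(n_components=n_components, max_iter=1000)
        activations = ica.fit_transform(S.T)  # one activation curve per component
        spectra = ica.mixing_                 # one spectral basis per component
        return spectra, activations           # threshold activations -> note events
    ```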

  4. A full automatic system controlled with IBM-PC/XT micro-computer for neutron activation analysis

    International Nuclear Information System (INIS)

    Song Quanxun

    1992-01-01

    A fully automatic system controlled by a microcomputer for NAA is described. All processes are completed automatically by an IBM-PC/XT microcomputer. The device is stable, reliable, flexible and convenient to use, and has many functions and applications in the automatic analysis of long-, medium- and short-lived nuclides. Owing to the high working efficiency of the instrument and the microcomputer, both time and power are saved. The method can also be applied to other nuclear analysis techniques

  5. Vol draadwerk

    African Journals Online (AJOL)

    Owner

    The motto of Marius Crous's third collection, Vol draadwerk (2012), is taken from the father of psychoanalysis, Sigmund Freud: "Everywhere I go I find a poet has been there before me." Vol draadwerk appears six years after his previous collection, Aan 'n beentjie sit en kluif (2006). For his collection, Brief uit die kolonies ...

  6. Automatic flow analysis of digital subtraction angiography using independent component analysis in patients with carotid stenosis.

    Directory of Open Access Journals (Sweden)

    Han-Jui Lee

    Current time-density curve analysis of digital subtraction angiography (DSA) provides intravascular flow information but requires manual vasculature selection. We developed an angiographic marker that represents cerebral perfusion by using automatic independent component analysis. We retrospectively analyzed the data of 44 patients with unilateral carotid stenosis higher than 70% according to North American Symptomatic Carotid Endarterectomy Trial criteria. For all patients, magnetic resonance perfusion (MRP) was performed one day before DSA. Fixed contrast injection protocols and DSA acquisition parameters were used before stenting. The cerebral circulation time (CCT) was defined as the difference in the time to peak between the parietal vein and the cavernous internal carotid artery in a lateral angiogram. Both anterior-posterior and lateral DSA views were processed using independent component analysis, and the capillary angiogram was extracted automatically. The full width at half maximum of the time-density curve in the capillary phase in the anterior-posterior and lateral DSA views was defined as the angiographic mean transit time (aMTT), i.e., aMTT_AP and aMTT_Lat. The correlations between the degree of stenosis, CCT, aMTT_AP, aMTT_Lat, and MRP parameters were evaluated. The degree of stenosis showed no correlation with CCT, aMTT_AP, aMTT_Lat, or any MRP parameter. CCT showed a strong correlation with aMTT_AP (r = 0.67) and aMTT_Lat (r = 0.72). Among the MRP parameters, CCT showed only a moderate correlation with MTT (r = 0.67) and Tmax (r = 0.40). aMTT_AP showed a moderate correlation with Tmax (r = 0.42) and a strong correlation with MTT (r = 0.77). aMTT_Lat also showed similar correlations with Tmax (r = 0.59) and MTT (r = 0.73). Apart from vascular anatomy, aMTT estimates brain parenchyma hemodynamics from DSA and is concordant with MRP. This process is completely automatic and provides immediate measurement of quantitative peritherapeutic brain parenchyma hemodynamics.
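
    The aMTT measure itself reduces to a full-width-at-half-maximum computation on the capillary-phase time-density curve. A minimal sketch (a hypothetical helper, not the study's code), assuming a sampled curve and linear interpolation at the half-maximum crossings:

    ```python
    import numpy as np

    def fwhm(times, density):
        """Full width at half maximum of a time-density curve."""
        half = density.max() / 2.0
        above = np.where(density >= half)[0]
        i0, i1 = above[0], above[-1]
        # Interpolate crossing times on the rising and falling flanks.
        t_rise = times[i0] if i0 == 0 else np.interp(
            half, [density[i0 - 1], density[i0]], [times[i0 - 1], times[i0]])
        t_fall = times[i1] if i1 == len(times) - 1 else np.interp(
            half, [density[i1 + 1], density[i1]], [times[i1 + 1], times[i1]])
        return t_fall - t_rise
    ```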

  7. ATC Operations Analysis via Automatic Recognition of Clearances, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Recent advances in airport surface surveillance have motivated the creation of new tools for analysis of Air Traffic Control (ATC) operations, such as the Surface...

  8. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    Science.gov (United States)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.
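
    ADIFOR performs source transformation on Fortran, but the underlying idea of automatic design-sensitivity analysis can be sketched in a few lines with a modern AD library (JAX here; the beam response is a hypothetical stand-in for a finite element solve, not the paper's model):

    ```python
    import jax
    import jax.numpy as jnp

    def tip_displacement(design):
        # Toy "structural response": tip displacement of a cantilever beam,
        # with design variables (width b, height h). Illustrative only.
        b, h = design
        E, L, P = 210e9, 2.0, 1_000.0   # Young's modulus, length, end load
        I = b * h**3 / 12.0             # second moment of area
        return P * L**3 / (3.0 * E * I)

    design = jnp.array([0.05, 0.1])
    # Exact derivatives of the response w.r.t. the design variables.
    sensitivities = jax.grad(tip_displacement)(design)
    print(sensitivities)
    ```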

  9. Automatic morphometry of synaptic boutons of cultured cells using granulometric analysis of digital images

    NARCIS (Netherlands)

    Prodanov, D.P.; Heeroma, Joost; Marani, Enrico

    2006-01-01

    Numbers, linear density, and surface area of synaptic boutons can be important parameters in studies on synaptic plasticity in cultured neurons. We present a method for automatic identification and morphometry of boutons based on filtering of digital images using granulometric analysis. Cultures of
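
    A granulometric pattern spectrum of this kind can be sketched with morphological openings of increasing size; the scale at which the spectrum peaks indicates the dominant object size (e.g., boutons). This is a sketch under assumed scipy tooling, not the published pipeline:

    ```python
    import numpy as np
    from scipy import ndimage

    def granulometry(image, sizes):
        # Pattern spectrum: how much image "mass" each larger opening removes;
        # square structuring elements approximate disks for brevity.
        volumes = [ndimage.grey_opening(image, size=(s, s)).sum() for s in sizes]
        return -np.diff(volumes)
    ```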

  10. Automatic analysis of children’s engagement using interactional network features

    NARCIS (Netherlands)

    Kim, Jaebok; Truong, Khiet Phuong

    We explored the automatic analysis of vocal non-verbal cues of a group of children in the context of engagement and collaborative play. For the current study, we defined two types of engagement on groups of children: harmonised and unharmonised. A spontaneous audiovisual corpus with groups of

  11. Alleviating Search Uncertainty through Concept Associations: Automatic Indexing, Co-Occurrence Analysis, and Parallel Computing.

    Science.gov (United States)

    Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.

    1998-01-01

    Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…

  12. Function analysis of automatic power controller in Tianwan Nuclear Power Plant

    International Nuclear Information System (INIS)

    Zhou Haixiang

    2006-01-01

    The hardware and software design characteristics of the automatic power controller (APC) in the Tianwan NPP are introduced. Based on an analysis of the principle and function of the system, both the availability and the reliability of the APC in the Tianwan NPP are demonstrated. The paper also offers some guidance on how to design a digital APC system. (authors)

  13. System for automatic x-ray-image analysis, measurement, and sorting of laser fusion targets

    International Nuclear Information System (INIS)

    Singleton, R.M.; Perkins, D.E.; Willenborg, D.L.

    1980-01-01

    This paper describes the Automatic X-Ray Image Analysis and Sorting (AXIAS) system which is designed to analyze and measure x-ray images of opaque hollow microspheres used as laser fusion targets. The x-ray images are first recorded on a high resolution film plate. The AXIAS system then digitizes and processes the images to accurately measure the target parameters and defects. The primary goals of the AXIAS system are: to provide extremely accurate and rapid measurements, to engineer a practical system for a routine production environment and to furnish the capability of automatically measuring an array of images for sorting and selection

  14. A microprocessor based picture analysis system for automatic track measurements

    International Nuclear Information System (INIS)

    Heinrich, W.; Trakowski, W.; Beer, J.; Schucht, R.

    1982-01-01

    In the last few years picture analysis became a powerful technique for measurements of nuclear tracks in plastic detectors. For this purpose rather expensive commercial systems are available. Two inexpensive microprocessor based systems with different resolution were developed. The video pictures of particles seen through a microscope are digitized in real time and the picture analysis is done by software. The microscopes are equipped with stages driven by stepping motors, which are controlled by separate microprocessors. A PDP 11/03 supervises the operation of all microprocessors and stores the measured data on its mass storage devices. (author)

  15. Capacity analysis of automatic transport systems in an assembly factory

    NARCIS (Netherlands)

    Zijm, W.H.M.; Lenstra, J.K.; Tijms, H.C.; Volgenant, A.

    1989-01-01

    We describe a case study concerning the capacity analysis of a completely automated transport system in a flexible assembly environment. Basically, the system is modelled as a network of queues, however, due to its complex nature, product-form network theory is not applicable. Instead, we present an

  16. Automatic settlement analysis of single-layer armour layers

    NARCIS (Netherlands)

    Hofland, B.; van gent, Marcel

    2016-01-01

    A method to quantify, analyse, and present the settlement of single-layer concrete armour layers of coastal structures is presented. The use of the image processing technique for settlement analysis is discussed based on various modelling studies performed over the years. The accuracy of the

  17. Learning Enterprise Malware Triage from Automatic Dynamic Analysis

    Science.gov (United States)

    2013-03-01

    Kolter and Maloof n-gram method, Dube’s malware target recognition (MaTR) static method performs significantly more accurately at the 95% confidence...from the static method as in Kolter and Maloof. The MIST approach with behavior sequences 9 allows researchers to tailor the level of analysis to the...citations, none publish work that implements it. Only Kolter and Maloof use nearly as long gram structures, although that research uses static grams rather

  18. A framework for automatic heart sound analysis without segmentation

    Directory of Open Access Journals (Sweden)

    Tungpimolrut Kanokvate

    2011-02-01

    Background. A new framework for heart sound analysis is proposed. One of the most difficult processes in heart sound analysis is segmentation, due to interference from murmurs. Method. Equal numbers of cardiac cycles were extracted from heart sounds with different heart rates using information from the envelopes of autocorrelation functions, without the need to label individual fundamental heart sounds (FHS). The complete method consists of envelope detection, calculation of cardiac cycle lengths using autocorrelation of the envelope signals, feature extraction using the discrete wavelet transform, principal component analysis, and classification using neural network bagging predictors. Result. The proposed method was tested on a set of heart sounds obtained from several on-line databases and recorded with an electronic stethoscope. The geometric mean was used as the performance index. Average classification performance using ten-fold cross-validation was 0.92 for the noise-free case, 0.90 under white noise with 10 dB signal-to-noise ratio (SNR), and 0.90 under impulse noise of up to 0.3 s duration. Conclusion. The proposed method showed promising results and high noise robustness across a wide range of heart sounds. However, more tests are needed to address any bias that may have been introduced by different sources of heart sounds in the current training set, and to concretely validate the method. Further work includes building a new training set recorded from actual patients, and then further evaluating the method on this new training set.
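
    The segmentation-free step, estimating the cardiac cycle length from the autocorrelation of the envelope, might look like the following sketch (the signal-processing choices are assumptions, not the paper's code):

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def cycle_length_seconds(pcg, fs, min_rate_hz=0.7, max_rate_hz=3.0):
        envelope = np.abs(hilbert(pcg))          # amplitude envelope of the PCG
        env = envelope - envelope.mean()
        acf = np.correlate(env, env, mode="full")[len(env) - 1:]
        lo = int(fs / max_rate_hz)               # shortest plausible cycle, in samples
        hi = int(fs / min_rate_hz)               # longest plausible cycle
        lag = lo + np.argmax(acf[lo:hi])         # dominant periodicity
        return lag / fs
    ```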

  19. Rhodium in car exhaust tips by total automatic activation analysis

    International Nuclear Information System (INIS)

    Grass, F.; Westphal, G.P.; Lemmel, H.; Sterba, J.

    2007-01-01

    Exhaust systems of modern cars contain catalysts for the reduction of CO, NOx and hydrocarbons. These catalysts are made of ceramic materials with a large surface on which platinum metals catalyse the oxidation. The catalysts contain approximately 2 g of platinum and 0.4 g of rhodium. Recently, platinum is being replaced by palladium. During driving, the platinum-group elements (PGEs) are expelled from the tip in fine particles and are deposited in the environment. For a projected study of emissions from cars driven on streets and highways it is important to know which elements can be measured by short-time activation analysis without any chemical procedure. (author)

  20. Automatic forensic analysis of automotive paints using optical microscopy.

    Science.gov (United States)

    Thoonen, Guy; Nys, Bart; Vander Haeghen, Yves; De Roy, Gilbert; Scheunders, Paul

    2016-02-01

    The timely identification of vehicles involved in an accident, such as a hit-and-run situation, bears great importance in forensics. To this end, procedures have been defined for analyzing car paint samples that combine techniques such as visual analysis and Fourier transform infrared spectroscopy. This work proposes a new methodology in order to automate the visual analysis using image retrieval. Specifically, color and texture information is extracted from a microscopic image of a recovered paint sample, and this information is then compared with the same features for a database of paint types, resulting in a shortlist of candidate paints. In order to demonstrate the operation of the methodology, a test database has been set up and two retrieval experiments have been performed. The first experiment quantifies the performance of the procedure for retrieving exact matches, while the second experiment emulates the real-life situation of paint samples that experience changes in color and texture over time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  1. Reliability analysis of the automatic control of the A-1 power plant coolant temperature

    International Nuclear Information System (INIS)

    Kuklik, B.; Semerad, V.; Chylek, Z.

    A reliability analysis of the automatic control of the A-1 reactor coolant temperature is performed, taking into account the effect of both dependent failures and the routine maintenance of control system components. In a separate supplement, a reliability analysis of the coincidence systems of the A-1 power plant reactor is reported. Both safe and unsafe failures are taken into consideration, as well as the effect of maintenance of the respective branch elements

  2. Automatic detection and analysis of nuclear plant malfunctions

    International Nuclear Information System (INIS)

    Bruschi, R.; Di Porto, P.; Pallottelli, R.

    1985-01-01

    In this paper a system is proposed which dynamically performs the detection and analysis of malfunctions in a nuclear plant. The proposed method was developed and implemented on a reactor simulator, rather than on a real plant, thus allowing a wide range of tests. For each variable under control, a simulation module was identified and implemented on the reactor's on-line computer. In the malfunction identification phase all modules run separately, processing plant input variables and producing their output variables in real time; continuous comparison of the computed variables with the plant variables allows the detection of malfunctions. At that moment the second phase can begin: when a malfunction is detected, all modules are connected, except the module simulating the faulty variable, and a fast simulation is carried out to analyse the consequences. (author)

  3. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications, positioning, navigation and timing, missile warning, and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry information should be valuable for achieving Space Situational Awareness (SSA), these new satellite telemetry data consumers will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure to process satellite telemetry into higher abstraction level symbolic space situational awareness and to initially populate that infrastructure with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of whom control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design which has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach which combines symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current Machine Learning approaches. BTNs are used to represent the process and associated formulas to check telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that represent an investigative process that should be applied to the telemetry in certain circumstances

  4. Automatic localization of cerebral cortical malformations using fractal analysis.

    Science.gov (United States)

    De Luca, A; Arrigoni, F; Romaniello, R; Triulzi, F M; Peruzzo, D; Bertoldo, A

    2016-08-21

    Malformations of cortical development (MCDs) encompass a variety of brain disorders affecting the normal development and organization of the brain cortex. The relatively low incidence and the extreme heterogeneity of these disorders hamper the application of classical group-level approaches for the detection of lesions. Here, we present a geometrical descriptor for a voxel-level analysis based on fractal geometry, then define two similarity measures to detect the lesions at the single-subject level. The pipeline was applied to 15 normal children and nine pediatric patients affected by MCDs following two criteria, maximum accuracy (WACC) and minimization of false positives (FPR), and proved that our lesion detection algorithm is able to detect and locate abnormalities of the brain cortex with high specificity (WACC = 85%, FPR = 96%), sensitivity (WACC = 83%, FPR = 63%) and accuracy (WACC = 85%, FPR = 90%). The combination of global and local features proves to be effective, making the algorithm suitable for the detection of both focal and diffused malformations. Compared to other existing algorithms, this method shows higher accuracy and sensitivity.
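
    The fractal descriptor at the heart of such pipelines is typically estimated by box counting. A minimal 2-D sketch follows (the published method works voxel-wise on 3-D volumes, so this is illustrative only; it assumes a binary mask of at least 32x32 pixels):

    ```python
    import numpy as np

    def box_counting_dimension(mask):
        """Estimate the fractal dimension of a 2-D binary mask by box counting."""
        sizes = [2, 4, 8, 16, 32]
        counts = []
        for s in sizes:
            # Trim so the grid divides evenly, then count boxes touching the mask.
            h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
            boxes = mask[:h, :w].reshape(h // s, s, w // s, s)
            counts.append(np.count_nonzero(boxes.any(axis=(1, 3))))
        # Slope of log(count) vs log(1/size) estimates the dimension.
        slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
        return slope
    ```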

  5. Automatic localization of cerebral cortical malformations using fractal analysis

    Science.gov (United States)

    De Luca, A.; Arrigoni, F.; Romaniello, R.; Triulzi, F. M.; Peruzzo, D.; Bertoldo, A.

    2016-08-01

    Malformations of cortical development (MCDs) encompass a variety of brain disorders affecting the normal development and organization of the brain cortex. The relatively low incidence and the extreme heterogeneity of these disorders hamper the application of classical group-level approaches for the detection of lesions. Here, we present a geometrical descriptor for a voxel-level analysis based on fractal geometry, then define two similarity measures to detect the lesions at the single-subject level. The pipeline was applied to 15 normal children and nine pediatric patients affected by MCDs following two criteria, maximum accuracy (WACC) and minimization of false positives (FPR), and proved that our lesion detection algorithm is able to detect and locate abnormalities of the brain cortex with high specificity (WACC = 85%, FPR = 96%), sensitivity (WACC = 83%, FPR = 63%) and accuracy (WACC = 85%, FPR = 90%). The combination of global and local features proves to be effective, making the algorithm suitable for the detection of both focal and diffused malformations. Compared to other existing algorithms, this method shows higher accuracy and sensitivity.

  6. Towards automatic analysis of dynamic radionuclide studies using principal-components factor analysis

    International Nuclear Information System (INIS)

    Nigran, K.S.; Barber, D.C.

    1985-01-01

    A method is proposed for the automatic analysis of dynamic radionuclide studies using the mathematical technique of principal-components factor analysis. This method is considered as a possible alternative to the conventional manual regions-of-interest method widely used. The method emphasises the importance of introducing a priori information into the analysis about the physiology of at least one of the functional structures in a study. Information is added by using suitable mathematical models to describe the underlying physiological processes. A single physiological factor is extracted representing the particular dynamic structure of interest. Two spaces, 'study space, S' and 'theory space, T', are defined in forming the concept of the intersection of spaces. A one-dimensional intersection space is computed. An example from a dynamic Tc-99m DTPA kidney study is used to demonstrate the principle inherent in the method proposed. The method requires no correction for blood background activity, which is necessary when processing by the manual method. Careful isolation of the kidney by means of a region of interest is not required. The method is therefore less prone to operator influence and can be automated. (author)
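
    The extraction of principal factors from a dynamic study can be sketched as an SVD of the mean-centred frame matrix. This illustrates principal-components factor analysis in general, not the paper's intersection-of-spaces step; the data layout is an assumption:

    ```python
    import numpy as np

    def principal_factors(frames, n_factors=3):
        """frames: array (n_frames, n_pixels) from a dynamic radionuclide study."""
        X = frames - frames.mean(axis=0)          # centre each pixel's time curve
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        time_factors = U[:, :n_factors] * s[:n_factors]   # temporal patterns
        factor_images = Vt[:n_factors]                    # spatial factor images
        return time_factors, factor_images
    ```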

  7. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    International Nuclear Information System (INIS)

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software requiring high reliability. In SFTA, experts predict failures of a system through HAZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees for the failures. The quality and cost of the software fault trees, therefore, depend on the knowledge and experience of the experts. This paper proposes a CASE tool, NuFTA, in order to assist experts in safety analysis. NuFTA automatically generates software fault trees from NuSCR formal requirements specifications. NuSCR is a formal specification language used for specifying the software requirements of the KNICS RPS (Reactor Protection System) in Korea. We used previously proposed SFTA templates in order to generate software fault trees automatically. NuFTA also generates logical formulae summarizing the causes of the failures, and we plan to make use of these formulae through formal verification techniques

  8. Semi-supervised learning based probabilistic latent semantic analysis for automatic image annotation

    Institute of Scientific and Technical Information of China (English)

    Tian Dongping

    2017-01-01

    In recent years, the multimedia annotation problem has been attracting significant research attention in the multimedia and computer vision areas, especially automatic image annotation, whose purpose is to provide an efficient and effective searching environment for users to query their images more easily. In this paper, a semi-supervised learning based probabilistic latent semantic analysis (PLSA) model for automatic image annotation is presented. Since it is often hard to obtain or create labeled images in large quantities while unlabeled ones are easier to collect, a transductive support vector machine (TSVM) is exploited to enhance the quality of the training image data. Then, different image features with different magnitudes will result in different performance for automatic image annotation. To this end, a Gaussian normalization method is utilized to normalize different features extracted from effective image regions segmented by the normalized cuts algorithm so as to preserve the intrinsic content of images as completely as possible. Finally, a PLSA model with asymmetric modalities is constructed based on the expectation maximization (EM) algorithm to predict a candidate set of annotations with confidence scores. Extensive experiments on the general-purpose Corel5k dataset demonstrate that the proposed model can significantly improve the performance of traditional PLSA for the task of automatic image annotation.

  9. Automatic supervision and fault detection of PV systems based on power losses analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chouder, A.; Silvestre, S. [Electronic Engineering Department, Universitat Politecnica de Catalunya, C/Jordi Girona 1-3, Campus Nord UPC, 08034 Barcelona (Spain)

    2010-10-15

    In this work, we present a new automatic supervision and fault detection procedure for PV systems based on power losses analysis. This automatic supervision system has been developed in the Matlab and Simulink environment. It includes parameter extraction techniques to calculate the main PV system parameters from monitoring data under real working conditions, taking into account the environmental irradiance and the module temperature evolution, allowing simulation of the PV system behaviour in real time. The automatic supervision method analyses the output power losses present in the DC side of the PV generator, the capture losses. Two new power loss indicators are defined: thermal capture losses (L_ct) and miscellaneous capture losses (L_cm). The processing of these indicators allows the supervision system to generate a faulty signal as an indicator of fault detection in the PV system operation. Two new indicators of the deviation of the DC variables with respect to the simulated ones have also been defined. These indicators are the current and voltage ratios: R_C and R_V. By analysing both the faulty signal and the current/voltage ratios, the type of fault can be identified. The automatic supervision system has been successfully tested experimentally. (author)
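
    Under assumed definitions of the yields involved (reference yield, measured array yield, and simulated array yield at the actual cell temperature), the indicators and ratios described above reduce to a few array operations; the fault threshold below is hypothetical, chosen only for illustration:

    ```python
    import numpy as np

    def loss_indicators(y_ref, y_array, y_array_tc, i_dc, v_dc, i_sim, v_sim):
        L_c = y_ref - y_array              # total capture losses
        L_ct = y_ref - y_array_tc          # thermal capture losses (L_ct)
        L_cm = L_c - L_ct                  # miscellaneous capture losses (L_cm)
        R_C = i_dc / i_sim                 # current ratio, measured vs simulated
        R_V = v_dc / v_sim                 # voltage ratio, measured vs simulated
        faulty = L_cm > 2.0 * np.std(L_cm)  # hypothetical fault criterion
        return L_ct, L_cm, R_C, R_V, faulty
    ```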

  10. Automatic supervision and fault detection of PV systems based on power losses analysis

    International Nuclear Information System (INIS)

    Chouder, A.; Silvestre, S.

    2010-01-01

    In this work, we present a new automatic supervision and fault detection procedure for PV systems based on power losses analysis. This automatic supervision system has been developed in the Matlab and Simulink environment. It includes parameter extraction techniques to calculate the main PV system parameters from monitoring data under real working conditions, taking into account the environmental irradiance and the module temperature evolution, allowing simulation of the PV system behaviour in real time. The automatic supervision method analyses the output power losses present in the DC side of the PV generator, the capture losses. Two new power loss indicators are defined: thermal capture losses (L_ct) and miscellaneous capture losses (L_cm). The processing of these indicators allows the supervision system to generate a faulty signal as an indicator of fault detection in the PV system operation. Two new indicators of the deviation of the DC variables with respect to the simulated ones have also been defined. These indicators are the current and voltage ratios: R_C and R_V. By analysing both the faulty signal and the current/voltage ratios, the type of fault can be identified. The automatic supervision system has been successfully tested experimentally.

  11. Four-Channel Biosignal Analysis and Feature Extraction for Automatic Emotion Recognition

    Science.gov (United States)

    Kim, Jonghwa; André, Elisabeth

    This paper investigates the potential of physiological signals as a reliable channel for the automatic recognition of a user's emotional state. For emotion recognition, little attention has been paid so far to physiological signals compared to audio-visual emotion channels such as facial expression or speech. All essential stages of an automatic recognition system using biosignals are discussed, from the recording of a physiological dataset up to feature-based multiclass classification. Four-channel biosensors are used to measure electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, multiscale entropy, etc., is proposed in order to search for the best emotion-relevant features and to correlate them with emotional states. The best features extracted are specified in detail and their effectiveness is proven by emotion recognition results.

  12. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    Science.gov (United States)

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique for understanding the dynamics of complex biochemical systems. To promote modeling, we had developed the CADLIVE dynamic simulator, which automatically converts a biochemical map into its associated mathematical model, simulates its dynamic behaviors and analyzes its robustness. To enhance the feasibility of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.

  13. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 2, Analysis Methods

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets (CP-nets). The second volume contains a detailed presentation of the analysis methods for CP-nets. They allow the modeller to investigate dynamic properties of CP-nets. The main ideas behind the analysis methods are described, as well as the mathematics on which they are based and how the methods are supported by computer tools. Some parts of the volume are theoretical while others are application oriented. The purpose of the volume is to teach the reader how to use the formal analysis methods, which does not require a deep understanding of the underlying mathematical theory.

  14. Automatic Derivation of Statistical Data Analysis Algorithms: Planetary Nebulae and Beyond

    OpenAIRE

    Fischer, Bernd; Knuth, Kevin; Hajian, Arsen; Schumann, Johann

    2004-01-01

    AUTOBAYES is a fully automatic program synthesis system for the data analysis domain. Its input is a declarative problem description in form of a statistical model; its output is documented and optimized C/C++ code. The synthesis process relies on the combination of three key techniques. Bayesian networks are used as a compact internal representation mechanism which enables problem decompositions and guides the algorithm derivation. Program schemas are used as independently composable buildin...

  15. The computer system of automatical microscope analysis of mines' individual dosimeters

    International Nuclear Information System (INIS)

    Zorawski, A.; Hawrynski, M.; Kluszczynski, D.

    1988-01-01

    The Institute of Occupational Medicine (IOM) carries out routine investigations of miners' individual exposure to radon and its alpha-radioactive daughters in Polish mines [1]. Evaluation of miners' exposure is based on automatic analysis of track detectors by the computer SYSTEM RADON. The IOM uses detectors of size 2x3 cm cut from Kodak LT115 or LR115 dosimetry foil. The scheme of the system is presented in Fig. 1, whereas Table 1 includes the specification of its elements

  16. Technical characterization by image analysis: an automatic method of mineralogical studies

    International Nuclear Information System (INIS)

    Oliveira, J.F. de

    1988-01-01

    The application of a modern, fully automated method of image analysis for the study of grain size distribution, modal assays, degree of liberation and mineralogical associations is discussed. The image analyser is interfaced with a scanning electron microscope and an energy-dispersive X-ray analyser. The image generated by backscattered electrons is analysed automatically, and the system has been used in assessment studies of applied mineralogy as well as in process control in the mining industry. (author)

  17. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    Science.gov (United States)

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

    This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at transistor level. As analog cells can behave in an unpredictable way when critical areas interact with a particle hit, designers need a software tool that allows an automatic and exhaustive analysis of Single-Event Effect influence. AFTU takes the test-bench SPECTRE design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancellation, and therefore the whole resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.

  18. SAVLOC, computer program for automatic control and analysis of X-ray fluorescence experiments

    Science.gov (United States)

    Leonard, R. F.

    1977-01-01

    A program for a PDP-15 computer is presented which provides for the control and analysis of trace element determinations using X-ray fluorescence. The program simultaneously handles data accumulation for one sample and analysis of data from previous samples. Data accumulation consists of sample changing, timing, and data storage. Analysis requires locating the peaks in X-ray spectra, determining the intensities of the peaks, identifying the origins of the peaks, and determining the areal density of the element responsible for each peak. The program may be run in either a manual (supervised) mode or an automatic (unsupervised) mode.
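
    The analysis steps named here, peak location, intensity determination and background handling, can be sketched with modern tooling (clearly not the original PDP-15 code; scipy and the parameter values are assumptions):

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def peak_intensities(spectrum, prominence=50, width=3):
        # Locate peaks, then integrate each above a linear local background.
        peaks, props = find_peaks(spectrum, prominence=prominence, width=width)
        intensities = []
        for lo, hi in zip(props["left_bases"], props["right_bases"]):
            background = np.linspace(spectrum[lo], spectrum[hi], hi - lo + 1)
            intensities.append(float(np.sum(spectrum[lo:hi + 1] - background)))
        return peaks, intensities
    ```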

  19. Sleep-monitoring, experiment M133. [electronic recording system for automatic analysis of human sleep patterns

    Science.gov (United States)

    Frost, J. D., Jr.; Salamy, J. G.

    1973-01-01

    The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time Awake during first third of night, and decreased total sleep time) during the mission.

  20. Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.

    Science.gov (United States)

    Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D

    2018-05-01

    To study the effect of homeopathic medicines (in higher potencies) on normal subjects, a Peripheral Pulse Analyzer (PPA) was used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data were acquired in seven rounds; placebo was administered in rounds 1 and 2, and medicine in potencies 6, 30, 200, 1 M, and 10 M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to groups of around 40 subjects each. Although processing of the data required human intervention, a software application has been developed to analyze the processed data and detect the response, eliminating undue delay as well as human bias in subjective analysis. This utility, named Automatic Analysis of Intervention in the Field of Homeopathy, is run on the processed PPA data, and its outcome has been compared with manual analysis. The application software uses an adaptive threshold based on statistics for detecting responses, in contrast to the fixed threshold used in manual analysis. The automatic analysis detected 12.96% more responses than subjective analysis. The higher response rates have been manually verified to be true positives. This indicates the robustness of the application software. The automatic analysis software was also run on another set of pulse harmonic parameters derived from the same dataset to study cardiovascular susceptibility, and 385 responses were detected in contrast to 272 for the variability parameters. It was observed that 65% of the subjects eliciting a response were common. This not only validates the software utility for giving consistent yield but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).
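
    The adaptive threshold contrasted with the fixed one might take the following form; the mean-plus-k-standard-deviations rule is an assumption for illustration, not the published statistic:

    ```python
    import numpy as np

    def detect_responses(baseline, rounds, k=2.0):
        """baseline: parameter values from the placebo rounds of one subject;
        rounds: values from the medicine rounds of the same subject."""
        threshold = baseline.mean() + k * baseline.std()  # adaptive, per subject
        return rounds > threshold                         # boolean response mask
    ```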

  1. Automatic analysis of image quality control for Image Guided Radiation Therapy (IGRT) devices in external radiotherapy

    International Nuclear Information System (INIS)

    Torfeh, Tarraf

    2009-01-01

    On-board imagers mounted on a radiotherapy treatment machine are very effective devices that improve the geometric accuracy of radiation delivery. However, a precise and regular quality control program is required in order to achieve this objective. Our purpose consisted of developing software tools dedicated to the automatic image quality control of IGRT devices used in external radiotherapy: the 2D-MV mode, for measuring patient position during treatment using high-energy images; the 2D-kV mode (low-energy images); and the 3D Cone Beam Computed Tomography (CBCT) MV or kV mode, used for patient positioning before treatment. Automated analysis of the Winston-Lutz test is also proposed. This test is used for the evaluation of the mechanical aspects of treatment machines, on which additional constraints are imposed by the extra weight of the on-board imagers. Finally, a technique for generating digital phantoms in order to assess the performance of the proposed software tools is described. Software tools dedicated to the automatic quality control of IGRT devices reduce by a factor of 100 the time spent by the medical physics team analyzing the results of the controls, while improving accuracy through objective and reproducible analysis and offering traceability through the automatic generation of monitoring reports and statistical studies. (author)

  2. Automatic Evaluation for E-Learning Using Latent Semantic Analysis: A Use Case

    Directory of Open Access Journals (Sweden)

    Mireia Farrús

    2013-03-01

    Assessment in education allows for obtaining, organizing, and presenting information about how much and how well the student is learning. The current paper aims at analysing and discussing some state-of-the-art assessment systems in education. Later, this work presents a specific use case developed for the Universitat Oberta de Catalunya, which is an online university. An automatic evaluation tool is proposed that allows students to evaluate themselves anytime and receive instant feedback. This tool is a web-based platform, and it has been designed for engineering subjects (i.e., with math symbols and formulas) in Catalan and Spanish. Particularly, the technique used for automatic assessment is latent semantic analysis. Although the experimental framework of the use case is quite challenging, results are promising.
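
    An LSA-based scorer for free-text answers can be sketched as TF-IDF followed by a truncated SVD and cosine similarity to a reference answer. This is an assumed scikit-learn pipeline for illustration, not the platform's code:

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    def lsa_scores(reference_answer, student_answers, corpus, k=100):
        vec = TfidfVectorizer()
        X = vec.fit_transform(corpus + [reference_answer] + student_answers)
        k = min(k, X.shape[1] - 1)                # latent dimensionality
        Z = TruncatedSVD(n_components=k).fit_transform(X)
        ref = Z[len(corpus)].reshape(1, -1)       # reference in latent space
        return cosine_similarity(Z[len(corpus) + 1:], ref).ravel()
    ```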

  3. Toward automatic regional analysis of pulmonary function using inspiration and expiration thoracic CT

    DEFF Research Database (Denmark)

    Murphy, Keelin; Pluim, Josien P. W.; Rikxoort, Eva M. van

    2012-01-01

    and its results; (b) verify that the quantitative, regional ventilation measurements acquired through CT are meaningful for pulmonary function analysis; (c) identify the most effective of the calculated measurements in predicting pulmonary function; and (d) demonstrate the potential of the system...... disorder). Lungs, fissures, airways, lobes, and vessels are automatically segmented in both scans and the expiration scan is registered with the inspiration scan using a fully automatic nonrigid registration algorithm. Segmentations and registrations are examined and scored by expert observers to analyze...... to have good correlation with spirometry results, with several having correlation coefficients, r, in the range of 0.85–0.90. The best performing kNN classifier succeeded in classifying 67% of subjects into the correct COPD GOLD stage, with a further 29% assigned to a class neighboring the correct one...

  4. Automatic progressive damage detection of rotor bar in induction motor using vibration analysis and multiple classifiers

    International Nuclear Information System (INIS)

    Cruz-Vega, Israel; Rangel-Magdaleno, Jose; Ramirez-Cortes, Juan; Peregrina-Barreto, Hayde

    2017-01-01

    There is an increased interest in developing reliable condition monitoring and fault diagnosis systems for machines like induction motors; such interest is not only in the final phase of the failure but also at early stages. In this paper, several levels of damage of rotor bars under different load conditions are identified by means of vibration signals. The importance of this work relies on a simple but effective automatic detection algorithm for the damage before a break occurs. The feature extraction is based on discrete wavelet analysis and an autocorrelation process. Then, the automatic classification of the fault degree is carried out by a binary classification tree. In each node, comparing the learned levels of bar breakage correctly identifies the fault degree. The best classification results are obtained employing computational intelligence techniques like support vector machines, multilayer perceptrons, and the k-NN algorithm, with a proper selection of their optimal parameters.
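
    The feature/classifier chain described, wavelet sub-band energies plus an autocorrelation feature feeding a decision tree, can be sketched as follows (PyWavelets and scikit-learn are assumed; parameters are illustrative):

    ```python
    import numpy as np
    import pywt
    from sklearn.tree import DecisionTreeClassifier

    def vibration_features(signal, wavelet="db4", level=4):
        # Sub-band energies from the discrete wavelet decomposition.
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        energies = [float(np.sum(c ** 2)) for c in coeffs]
        # Normalized autocorrelation peak (lags 1..99) as a periodicity
        # feature; assumes the record is longer than 100 samples.
        acf = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
        acf = acf / acf[0]
        return np.array(energies + [float(acf[1:100].max())])

    # Hypothetical usage: X stacks feature vectors per vibration record,
    # y labels the damage level (e.g. number of partially broken bars).
    # clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
    ```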

  5. Automatic progressive damage detection of rotor bar in induction motor using vibration analysis and multiple classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Cruz-Vega, Israel; Rangel-Magdaleno, Jose; Ramirez-Cortes, Juan; Peregrina-Barreto, Hayde [Santa María Tonantzintla, Puebla (Mexico)

    2017-06-15

    There is an increased interest in developing reliable condition monitoring and fault diagnosis systems for machines like induction motors; such interest is not only in the final phase of the failure but also at early stages. In this paper, several levels of damage of rotor bars under different load conditions are identified by means of vibration signals. The importance of this work relies on a simple but effective automatic detection algorithm for the damage before a break occurs. The feature extraction is based on discrete wavelet analysis and an autocorrelation process. Then, the automatic classification of the fault degree is carried out by a binary classification tree. In each node, comparing the learned levels of bar breakage correctly identifies the fault degree. The best classification results are obtained employing computational intelligence techniques like support vector machines, multilayer perceptrons, and the k-NN algorithm, with a proper selection of their optimal parameters.

  6. Identification and quantification of coffee volatile components through high-resolution gas chromatography/mass spectrometry using a headspace automatic sampler

    Directory of Open Access Journals (Sweden)

    Leonardo César AMSTALDEN

    2001-01-01

    Employing an automatic headspace sampler, the headspaces of three commercial brands of ground roasted coffee were qualitatively and quantitatively analyzed by gas chromatography/mass spectrometry for the volatile compounds responsible for the aroma. Since the methodology did not involve aroma isolation or concentration, the natural proportions of the volatiles were maintained, providing a more accurate picture of the flavor composition and simplifying sample preparation. The automatic sampler also allowed good resolution of the chromatographic peaks without cryofocusing the samples at the head of the column during injection, reducing analysis time. Ninety-one compounds were identified; some known coffee volatiles, such as dimethyl sulphide, methional and furfuryl mercaptan, were not detected. The more concentrated volatiles could be quantified with the aid of two internal standards. The technique proved viable, both for the characterization and for the quantification of coffee volatiles.

  7. Reliability analysis of the automatic control and power supply of reactor equipment

    International Nuclear Information System (INIS)

    Monori, Pal; Nagy, J.A.; Meszaros, Zoltan; Konkoly, Laszlo; Szabo, Antal; Nagy, Laszlo

    1988-01-01

    Based on reliability analysis, the shortcomings of nuclear facilities are discovered. Fault trees constructed for the automatic control technology and for the power supply serve as input data for the ORCHARD 2 computer code. In order to characterize the reliability of the system, availability, failure rates and time intervals between failures are calculated. The results of the reliability analysis of the feedwater system of the Paks Nuclear Power Plant showed that the system consists of elements of similar reliability. (V.N.) 8 figs.; 3 tabs

  8. Analysis of Social Variables when an Initial Functional Analysis Indicates Automatic Reinforcement as the Maintaining Variable for Self-Injurious Behavior

    Science.gov (United States)

    Kuhn, Stephanie A. Contrucci; Triggs, Mandy

    2009-01-01

    Self-injurious behavior (SIB) that occurs at high rates across all conditions of a functional analysis can suggest automatic or multiple functions. In the current study, we conducted a functional analysis for 1 individual with SIB. Results indicated that SIB was, at least in part, maintained by automatic reinforcement. Further analyses using…

  9. Automatic isotope gas analysis of tritium labelled organic materials Pt. 1

    International Nuclear Information System (INIS)

    Gacs, I.; Mlinko, S.

    1978-01-01

    A new automatic procedure developed to convert tritium in HTO to hydrogen for subsequent on-line gas counting is described. The water containing tritium is introduced into a column prepared from molecular sieve 5A and heated to 550 deg C. The tritium is transferred by isotopic exchange into hydrogen flowing through the column. The radioactive gas is led into an internal detector for radioactivity measurement. The procedure is free of memory effects and provides quantitative recovery with an analytical reproducibility better than 0.5% rel. at a preset number of counts. The experimental and analytical results indicate that isotopic exchange between HTO and hydrogen over a column prepared from alumina or molecular sieve 5A can be successfully applied for the quantitative transfer of tritium from HTO into hydrogen for on-line gas counting. This provides an analytical procedure for the automatic determination of tritium in water with an analytical reproducibility better than 0.5% rel. The exchange process will also be suitable for rapid tritium transfer from water formed during the decomposition of tritium-labelled organic compounds or biological materials. The application of the procedure in the automatic isotope gas analysis of organic materials labelled with tritium will be described in subsequent papers (Parts II and III). (T.G.)

  10. Automatic Derivation of Statistical Data Analysis Algorithms: Planetary Nebulae and Beyond

    Science.gov (United States)

    Fischer, Bernd; Hajian, Arsen; Knuth, Kevin; Schumann, Johann

    2004-04-01

    AUTOBAYES is a fully automatic program synthesis system for the data analysis domain. Its input is a declarative problem description in form of a statistical model; its output is documented and optimized C/C++ code. The synthesis process relies on the combination of three key techniques. Bayesian networks are used as a compact internal representation mechanism which enables problem decompositions and guides the algorithm derivation. Program schemas are used as independently composable building blocks for the algorithm construction; they can encapsulate advanced algorithms and data structures. A symbolic-algebraic system is used to find closed-form solutions for problems and emerging subproblems. In this paper, we describe the application of AUTOBAYES to the analysis of planetary nebulae images taken by the Hubble Space Telescope. We explain the system architecture, and present in detail the automatic derivation of the scientists' original analysis as well as a refined analysis using clustering models. This study demonstrates that AUTOBAYES is now mature enough so that it can be applied to realistic scientific data analysis tasks.

  11. volBrain: An Online MRI Brain Volumetry System

    Science.gov (United States)

    Manjón, José V.; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in a vast amount of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently experienced many advances with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results. PMID:27512372

  12. volBrain: an online MRI brain volumetry system

    Directory of Open Access Journals (Sweden)

    Jose V. Manjon

    2016-07-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in a vast amount of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently experienced many advances with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results.

  13. volBrain: An Online MRI Brain Volumetry System.

    Science.gov (United States)

    Manjón, José V; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in a vast amount of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently experienced many advances with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results.

  14. Fully automatic algorithm for the analysis of vessels in the angiographic image of the eye fundus

    Directory of Open Access Journals (Sweden)

    Koprowski Robert

    2012-06-01

    Background. The available scientific literature contains descriptions of manual, semi-automated and automated methods for analysing angiographic images. The presented algorithms segment vessels, calculating their tortuosity or number in a given area. We describe a statistical analysis of the inclination of the vessels in the fundus as related to their distance from the center of the optic disc. Methods. The paper presents an automated method for analysing vessels found in angiographic images of the eye, using an algorithm implemented in Matlab. It performs filtration and convolution operations with suggested masks. The result is an image containing information on the location of vessels and their inclination angle in relation to the center of the optic disc. This is a new approach to the analysis of vessels, whose usefulness has been confirmed in the diagnosis of hypertension. Results. The proposed algorithm analyzed and processed the images of the eye fundus using a classifier in the form of decision trees. It enabled the proper classification of healthy patients and those with hypertension. The result is a very good separation of healthy subjects from the hypertensive ones: sensitivity 83%, specificity 100%, accuracy 96%. This confirms the practical usefulness of the proposed method. Conclusions. This paper presents an algorithm for the automatic analysis of morphological parameters of the fundus vessels. Such an analysis is performed during fluorescein angiography of the eye. The presented algorithm automatically calculates global statistical features connected with both the tortuosity of vessels and their total area or number.

  15. Automatic generation of stop word lists for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
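
    The adjacency-ratio criterion translates into a short sketch. The interpretation of "adjacency" as immediate left/right neighbourhood, and all names below, are assumptions made for illustration, not the patented implementation:

        from collections import Counter

        def generate_stop_words(documents, keywords, min_ratio=1.0, max_size=200):
            # documents: list of token lists; keywords: list of keyword token lists.
            keyword_tokens = {t for kw in keywords for t in kw}
            frequency = Counter()   # how often a term occurs inside a keyword
            for kw in keywords:
                frequency.update(kw)
            adjacency = Counter()   # how often a term occurs next to a keyword token
            for doc in documents:
                for i, term in enumerate(doc):
                    neighbours = doc[max(0, i - 1):i] + doc[i + 1:i + 2]
                    if any(n in keyword_tokens for n in neighbours):
                        adjacency[term] += 1
            # Exclude terms whose adjacency-to-frequency ratio falls below the
            # threshold, then truncate the surviving list.
            kept = [t for t in adjacency
                    if adjacency[t] / max(frequency[t], 1) >= min_ratio]
            kept.sort(key=lambda t: adjacency[t], reverse=True)
            return kept[:max_size]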

  16. Automatic analysis of digitized TV-images by a computer-driven optical microscope

    International Nuclear Information System (INIS)

    Rosa, G.; Di Bartolomeo, A.; Grella, G.; Romano, G.

    1997-01-01

    New methods of image analysis and three-dimensional pattern recognition were developed in order to perform the automatic scan of nuclear emulsion pellicles. An optical microscope, with a motorized stage, was equipped with a CCD camera and an image digitizer, and interfaced to a personal computer. Selected software routines inspired the design of a dedicated hardware processor. Fast operation, high efficiency and accuracy were achieved. First applications to high-energy physics experiments are reported. Further improvements are in progress, based on a high-resolution fast CCD camera and on programmable digital signal processors. Applications to other research fields are envisaged. (orig.)

  17. Porosity determination on pyrocarbon by means of automatic quantitative image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koizlik, K.; Uhlenbruck, U.; Delle, W.; Hoven, H.; Nickel, H.

    1976-05-01

    Quantitative image analysis has long been established as a method for quantifying the results of materials investigations based on ceramography. The development of automatic image analyzers has made it a fast and elegant evaluation procedure. Since 1975 it has been used at IRW to determine, easily and routinely, the macroporosity and thereby the density of the pyrocarbon coatings of nuclear fuel particles. This report describes the definition of the measuring parameters, the measuring procedure, the mathematical calculations, and first experimental and mathematical results.
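
    The core measurement reduces to an area-fraction computation on a binarized ceramographic section. A minimal sketch, assuming pores image darker than the pyrocarbon matrix (threshold and names are illustrative):

        import numpy as np

        def macroporosity(gray_section, pore_threshold):
            # Area fraction of pore pixels approximates the volume porosity
            # of the coating under the usual stereological assumption.
            pores = gray_section < pore_threshold
            return pores.sum() / pores.size

    The coating density then follows from the porosity via rho = rho_matrix * (1 - porosity), with rho_matrix the density of the pore-free pyrocarbon.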

  18. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    Science.gov (United States)

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

    Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, of the neutral atmospheric gas and of the thermal plasma of planetary atmospheres, mass spectrometers making use of time-of-flight mass analysers are a widely used technique. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially robust automated data analysis with enhanced accuracy is required. In this contribution, an automatic data analysis software package, which allows fast and precise quantitative analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak-finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields isotope ratios with an accuracy of up to 100 ppm for a signal-to-noise ratio (SNR) of 10^4. We show that the accuracy of isotope ratios is in fact proportional to SNR^-1. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution. Additionally, we show that the accuracy of isotope ratios depends on the sample width T_s as T_s^0.5. Copyright © 2017 John Wiley & Sons, Ltd.
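
    The peak-finding and integration stage can be sketched with SciPy. This is an illustrative stand-in for the authors' software, not their code; the MAD-based noise estimate and the SNR threshold are assumptions:

        import numpy as np
        from scipy.signal import find_peaks

        def integrate_peaks(spectrum, min_snr=3.0):
            spectrum = np.asarray(spectrum, dtype=float)
            # Robust noise level from the median absolute deviation.
            noise = 1.4826 * np.median(np.abs(spectrum - np.median(spectrum)))
            peaks, props = find_peaks(spectrum, height=min_snr * noise,
                                      prominence=noise)
            areas = []
            for lb, rb in zip(props["left_bases"], props["right_bases"]):
                # Trapezoidal integration above a simple local baseline.
                baseline = spectrum[[lb, rb]].mean()
                areas.append(np.trapz(spectrum[lb:rb + 1] - baseline))
            return peaks, np.array(areas)

    An isotope ratio is then simply the ratio of two integrated peak areas.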

  19. Multidirectional channeling analysis of epitaxial CdTe layers using an automatic RBS/channeling system

    Energy Technology Data Exchange (ETDEWEB)

    Wielunski, L S; Kenny, M J [CSIRO, Lindfield, NSW (Australia). Applied Physics Div.

    1994-12-31

    Rutherford Backscattering Spectrometry (RBS) is an ion beam analysis technique used in many fields. The high depth and mass resolution of RBS make this technique very useful in semiconductor material analysis [1]. The use of ion channeling in combination with RBS creates a powerful technique which can provide information about crystal quality and structure in addition to mass and depth resolution [2]. The presence of crystal defects such as interstitial atoms, dislocations or dislocation loops can be detected and profiled [3,4]. Semiconductor materials such as CdTe, HgTe and Hg{sub x}Cd{sub 1-x}Te generate considerable interest due to applications as infrared detectors in many technological areas. The present paper demonstrates how automatic RBS and multidirectional channeling analysis can be used to evaluate crystal quality and near-surface defects. 6 refs., 1 fig.

  20. An Exploitability Analysis Technique for Binary Vulnerability Based on Automatic Exception Suppression

    Directory of Open Access Journals (Sweden)

    Zhiyuan Jiang

    2018-01-01

    Full Text Available To quickly verify and fix vulnerabilities, it is necessary to judge the exploitability of the massive number of crashes generated by automated vulnerability mining tools. Manual analysis of crashes is inefficient and time-consuming, while existing automated tools can only handle execute exceptions and some write exceptions, but not the common read exceptions. To address this problem, we propose a method of determining exploitability based on exception type suppression. This method enables the program to continue to execute until an exploitable exception is triggered. The method performs a symbolic replay of the crash sample, constructing and reusing data gadgets to bypass complex exceptions, thereby improving the efficiency and accuracy of vulnerability exploitability analysis. Testing on typical CGC/RHG binary software shows that this method can automatically convert a crash that cannot be judged by existing analysis tools into a different crash type and judge its exploitability successfully.

  1. Multidirectional channeling analysis of epitaxial CdTe layers using an automatic RBS/channeling system

    Energy Technology Data Exchange (ETDEWEB)

    Wielunski, L.S.; Kenny, M.J. [CSIRO, Lindfield, NSW (Australia). Applied Physics Div.

    1993-12-31

    Rutherford Backscattering Spectrometry (RBS) is an ion beam analysis technique used in many fields. The high depth and mass resolution of RBS make this technique very useful in semiconductor material analysis [1]. The use of ion channeling in combination with RBS creates a powerful technique which can provide information about crystal quality and structure in addition to mass and depth resolution [2]. The presence of crystal defects such as interstitial atoms, dislocations or dislocation loops can be detected and profiled [3,4]. Semiconductor materials such as CdTe, HgTe and Hg{sub x}Cd{sub 1-x}Te generate considerable interest due to applications as infrared detectors in many technological areas. The present paper demonstrates how automatic RBS and multidirectional channeling analysis can be used to evaluate crystal quality and near-surface defects. 6 refs., 1 fig.

  2. Development and installation of an automatic sample changer for neutron activation analysis

    International Nuclear Information System (INIS)

    Domienikan, Claudio; Lapolli, Andre L.; Schoueri, Roberto M.; Moreira, Edson G.; Vasconcellos, Marina B.A.

    2013-01-01

    A Programmable and Automatic Sample Changer was built and installed at the Neutron Activation Analysis Laboratory of the Nuclear and Energy Research Institute - IPEN-CNEN/SP, Brazil. This Automatic Sample Changer allows the fully automated measurement of up to 25 samples in one run. Basically, it consists of an electronic circuit and a C++ program that control the positioning of a sample holder along two axes of motion (X and Y). Each sample is transported and positioned, one by one, inside the shielding coupled to a high-purity germanium (HPGe) radiation detector. A Canberra DSA-1000 Multichannel Analyzer coupled to the Genie 2000 software performs the data acquisition for analysis of the samples. When counting is finished, the results are saved to the hard disk of a PC. The sample is brought back by the sample holder to its initial position, and the next sample is carried to the shielding. The Sample Changer was designed and constructed at IPEN-CNEN/SP employing national components and expertise. (author)

  3. Automatic Pedestrian Crossing Detection and Impairment Analysis Based on Mobile Mapping System

    Science.gov (United States)

    Liu, X.; Zhang, Y.; Li, Q.

    2017-09-01

    Pedestrian crossings, as an important part of transportation infrastructure, serve to secure pedestrians' lives and possessions and to keep traffic flow in order. As a prominent feature in the street scene, detection of pedestrian crossings contributes to 3D road marking reconstruction and to diminishing the adverse impact of outliers in 3D street scene reconstruction. Since pedestrian crossings are subject to wear and tear from heavy traffic, it is imperative to monitor their condition. On this account, an approach for automatic pedestrian crossing detection using images from a vehicle-based Mobile Mapping System is put forward, and crossing defilement and impairment are analyzed in this paper. First, a pedestrian crossing classifier is trained with a low recall rate. Then the initial detections are refined by utilizing projection filtering, contour information analysis, and monocular vision. Finally, a pedestrian crossing detection and analysis system with high recall, precision and robustness is achieved. This system works for pedestrian crossing detection under different situations and light conditions. It can also recognize defiled and impaired crossings automatically, which facilitates the monitoring and maintenance of traffic facilities, so as to reduce potential traffic safety problems and secure lives and property.

  4. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    Science.gov (United States)

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-11-28

    At present, new coding sequences (CDS) are being discovered continually, and ever larger CDS datasets are becoming available. Approaches and related tools have been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), together with our own locally deployed web services. The workflow input is a set of CDS in the Fasta format. The workflow supports 1,000 to 20,000 bootstrap replicates. It performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two forms, using Soaplab2 and Apache Axis2; both the SOAP and Java Web Service (JWS) deployments provide WSDL endpoints to Taverna Workbench, a workflow manager. The workflow has been validated, its performance measured, and its results verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This new integrated automatic workflow will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.

  5. AUTOMATIC PEDESTRIAN CROSSING DETECTION AND IMPAIRMENT ANALYSIS BASED ON MOBILE MAPPING SYSTEM

    Directory of Open Access Journals (Sweden)

    X. Liu

    2017-09-01

    Full Text Available Pedestrian crossings, as an important part of transportation infrastructure, serve to secure pedestrians’ lives and possessions and to keep traffic flow in order. As a prominent feature in the street scene, detection of pedestrian crossings contributes to 3D road marking reconstruction and to diminishing the adverse impact of outliers in 3D street scene reconstruction. Since pedestrian crossings are subject to wear and tear from heavy traffic, it is imperative to monitor their condition. On this account, an approach for automatic pedestrian crossing detection using images from a vehicle-based Mobile Mapping System is put forward, and crossing defilement and impairment are analyzed in this paper. First, a pedestrian crossing classifier is trained with a low recall rate. Then the initial detections are refined by utilizing projection filtering, contour information analysis, and monocular vision. Finally, a pedestrian crossing detection and analysis system with high recall, precision and robustness is achieved. This system works for pedestrian crossing detection under different situations and light conditions. It can also recognize defiled and impaired crossings automatically, which facilitates the monitoring and maintenance of traffic facilities, so as to reduce potential traffic safety problems and secure lives and property.

  6. The FAST-DATA System: Fully Automatic Stochastic Technology for Data Acquisition, Transmission, and Analysis

    International Nuclear Information System (INIS)

    Albrecht, R.W.; Crowe, R.D.; McGuire, J.J.

    1978-01-01

    The potential to automatically collect, classify, and report on stochastic data (signals with random, time-varying components) from power plants has long been discussed by utilities, government, industries, national laboratories and universities. It has become clear to all concerned that such signals often contain information about plant conditions which may provide the basis for increased plant availability through early detection and warning of developing malfunctions. Maintenance can then be scheduled at opportune times. Inopportune failures of major and minor power plant components are a major cause of down-time and detract significantly from plant availability. A complete system to realize automatic stochastic data processing has been conceptually designed. Development of the FAST-DATA system has been initiated through a program of periodic measurements performed on the vibration and loose parts monitoring system of the Trojan reactor (1130-MW(e) PWR) operated by Portland General Electric Company. The development plan for the system consists of a six-step procedure. The initial steps depend on a significant level of human involvement. In the course of development of the system, the routine duties of operators and analysts are gradually replaced by computerized automatic data handling procedures. In the final configuration, operators and analysts are completely freed of routine chores by logical machinery. The results achieved to date from actual application of the proof-of-principle system are discussed. The early developmental phases have concentrated on system organization and examination of a representative data base. Preliminary results from the signature analysis program using Trojan data indicate that the performance specifications predicted for the FAST-DATA system are achievable in practice. (author)

  7. Analysis of facial expressions in parkinson's disease through video-based automatic methods.

    Science.gov (United States)

    Bandini, Andrea; Orlandi, Silvia; Escalante, Hugo Jair; Giovannelli, Fabio; Cincotta, Massimo; Reyes-Garcia, Carlos A; Vanni, Paola; Zaccara, Gaetano; Manfredi, Claudia

    2017-04-01

    The automatic analysis of facial expressions is an evolving field with several clinical applications. One of these applications is the study of facial bradykinesia in Parkinson's disease (PD), which is a major motor sign of this neurodegenerative illness. Facial bradykinesia consists of the reduction or loss of facial movements and emotional facial expressions, called hypomimia. In this work we propose an automatic method for studying facial expressions in PD patients relying on video-based methods. METHODS: 17 Parkinsonian patients and 17 healthy control subjects were asked to show basic facial expressions, upon request of the clinician and after imitation of a visual cue on a screen. Using an existing face tracker, the Euclidean distance of the facial model from a neutral baseline was computed in order to quantify the changes in facial expressivity during the tasks. Moreover, an automatic facial expression recognition algorithm was trained in order to study how PD expressions differed from the standard expressions. Results show that control subjects exhibited on average larger distances than PD patients across the tasks. This confirms that control subjects show larger movements during both posed and imitated facial expressions. Moreover, our results demonstrate that anger and disgust are the two most impaired expressions in PD patients. Contactless video-based systems can be important techniques for analyzing facial expressions also in rehabilitation, in particular speech therapy, where patients could get a definite advantage from real-time feedback about the proper facial expressions/movements to perform. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Automatic analysis of force distribution in multi-fingered articulated hands

    Science.gov (United States)

    Amani, Mahyar

    A kinematic and force analysis of multifingered, multijointed dexterous hands is presented. The analysis involves calculation of a basis for the null space of the grasp matrix. An algorithm is developed to perform all necessary analyses of dexterous hands. All equality and inequality constraints are automatically formulated in a constraint matrix. These include unisense constraints, frictional constraints, and joint torque limits. The reduction of the constraint matrix into a compact form is also presented. The algorithm is written in Fortran, is user-friendly, and is capable of handling all practical grasping problems. Different types of contact may be examined, including soft finger contact and point contact with and without friction. The software can easily determine the feasibility of a grasp for a given set of contact forces/torques, and may potentially be combined with other routines to compute optimal contact wrench intensities and joint torque values.
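
    The null-space computation at the heart of the analysis is easy to illustrate. The sketch below (in Python rather than the original Fortran; the planar two-finger grasp and all numbers are illustrative) shows how internal grasp forces live in the null space of the grasp matrix and leave the net wrench on the object unchanged:

        import numpy as np
        from scipy.linalg import null_space

        # Planar two-finger grasp: each point contact contributes a force
        # (fx, fy); the 3 x 4 grasp matrix G maps the stacked contact forces
        # to the net planar wrench (Fx, Fy, Mz) on the object.
        G = np.array([[1.0, 0.0, 1.0, 0.0],    # sum of x-forces
                      [0.0, 1.0, 0.0, 1.0],    # sum of y-forces
                      [0.0, -1.0, 0.0, 1.0]])  # moments: contacts at x = -1, +1
        w_ext = np.array([0.0, 1.0, 0.0])      # wrench the grasp must resist

        N = null_space(G)                              # internal-force basis
        c0 = np.linalg.lstsq(G, w_ext, rcond=None)[0]  # one feasible force set
        c = c0 + N @ np.array([0.5])                   # add internal squeezing
        assert np.allclose(G @ c, w_ext)               # net wrench unchanged

    Here the single null-space direction corresponds to the two fingers squeezing the object against each other, which is exactly the freedom exploited when optimizing contact wrench intensities.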

  9. Automatic scatter detection in fluorescence landscapes by means of spherical principal component analysis

    DEFF Research Database (Denmark)

    Kotwa, Ewelina Katarzyna; Jørgensen, Bo Munk; Brockhoff, Per B.

    2013-01-01

    In this paper, we introduce a new method, based on spherical principal component analysis (S-PCA), for the identification of Rayleigh and Raman scatters in fluorescence excitation-emission data. These scatters should be found and eliminated as a pre-step before fitting parallel factor analysis models to the data, in order to avoid model degeneracies. The work is inspired by and based on previous research, in which scatter removal was automatic (based on a robust version of PCA called ROBPCA) and required no visual data inspection, but appeared to be computationally intensive. To overcome this drawback, we implement the fast S-PCA in the scatter identification routine. Moreover, an additional pattern interpolation step that complements the method, based on robust regression, is applied. In this way, substantial time savings are gained, and the user's engagement is restricted to a minimum.

  10. Automatic simplification of systems of reaction-diffusion equations by a posteriori analysis.

    Science.gov (United States)

    Maybank, Philip J; Whiteley, Jonathan P

    2014-02-01

    Many mathematical models in biology and physiology are represented by systems of nonlinear differential equations. In recent years these models have become increasingly complex in order to explain the enormous volume of data now available. A key role of modellers is to determine which components of the model have the greatest effect on a given observed behaviour. An approach for automatically fulfilling this role, based on a posteriori analysis, has recently been developed for nonlinear initial value ordinary differential equations [J.P. Whiteley, Model reduction using a posteriori analysis, Math. Biosci. 225 (2010) 44-52]. In this paper we extend this model reduction technique for application to both steady-state and time-dependent nonlinear reaction-diffusion systems. Exemplar problems drawn from biology are used to demonstrate the applicability of the technique. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Automatic analysis algorithm for radionuclide pulse-height data from beta-gamma coincidence systems

    International Nuclear Information System (INIS)

    Foltz Biegalski, K.M.

    2001-01-01

    There are two acceptable noble gas monitoring measurement modes for Comprehensive Nuclear-Test-Ban-Treaty (CTBT) verification purposes defined in CTBT/PC/II/WG.B/1. These include beta-gamma coincidence and high-resolution gamma-spectrometry. There are at present no commercial, off-the-shelf (COTS) applications for the analysis of β-γ coincidence data. Development of such software is in progress at the Prototype International Data Centre (PIDC) for eventual deployment at the International Data Centre (IDC). Flowcharts detailing the automatic analysis algorithm for β-γ coincidence data to be coded at the PIDC are included. The program is being written in C with Oracle databasing capabilities. (author)

  12. Automatic yield-line analysis of slabs using discontinuity layout optimization.

    Science.gov (United States)

    Gilbert, Matthew; He, Linwei; Smith, Colin C; Le, Canh V

    2014-08-08

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented.

  13. Automatic Segmenting Structures in MRI's Based on Texture Analysis and Fuzzy Logic

    Science.gov (United States)

    Kaur, Mandeep; Rattan, Munish; Singh, Pushpinder

    2017-12-01

    The purpose of this paper is to present a variational method for geometric contours that keeps the level set function close to a signed distance function, thereby removing the need for an expensive re-initialization procedure. The level set method is applied to magnetic resonance images (MRI) to track irregularities in them, as medical imaging plays a substantial part in the treatment, therapy and diagnosis of various organs, tumors and abnormalities, favoring the patient with speedier and more decisive disease control and fewer side effects. The geometrical shape and size of a tumor and abnormal tissue growth can be calculated by segmentation of the image. Automatic segmentation in medical imaging remains a great challenge for researchers. Based on texture analysis, different images are processed by optimizing the level set segmentation. Traditionally, this optimization was manual for every image, with each parameter selected one after another. By applying fuzzy logic, the segmentation of the image is correlated with the texture features, making it automatic and more effective. There is no initialization of parameters, and it works like an intelligent system: it segments different MRI images without tuning the level set parameters and gives optimized results for all MRIs.

  14. Semi-automatic system for UV images analysis of historical musical instruments

    Science.gov (United States)

    Dondi, Piercarlo; Invernizzi, Claudia; Licchelli, Maurizio; Lombardi, Luca; Malagodi, Marco; Rovetta, Tommaso

    2015-06-01

    The selection of representative areas to be analyzed is a common problem in the study of Cultural Heritage items. UV fluorescence photography is an extensively used technique to highlight specific surface features which cannot be observed in visible light (e.g. restored parts or areas treated with different materials), and it proves to be very effective in the study of historical musical instruments. In this work we propose a new semi-automatic solution for selecting areas with the same perceived color (a simple clue of similar materials) on UV photos, using a specifically designed interactive tool. The proposed method works in two steps: (i) the user selects a small rectangular area of the image; (ii) the program automatically highlights all the areas that have the same color as the selected input. The identification is made by analyzing the image in the HSV color model, the one closest to human perception. The achievable result is more accurate than a manual selection, because it can also detect points that users do not recognize as similar owing to perceptual illusions. The application has been developed following the rules of usability, and the Human Computer Interface has been improved after a series of tests performed by expert and non-expert users. All the experiments were performed on UV imagery of the Stradivari violin collection held by the "Museo del Violino" in Cremona.
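
    The two-step selection can be sketched with OpenCV. This is an illustrative reconstruction of the principle, not the project's code; the tolerance values are assumptions, and hue wrap-around at the red end is ignored for brevity:

        import cv2
        import numpy as np

        def select_similar_color(image_bgr, rect, tol=(6, 40, 40)):
            # Step (i): the user supplies a small rectangle (x, y, w, h).
            x, y, w, h = rect
            hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
            sample = hsv[y:y + h, x:x + w].reshape(-1, 3).mean(axis=0)
            # Step (ii): highlight every pixel within a tolerance band in HSV.
            lower = np.clip(sample - tol, 0, 255).astype(np.uint8)
            upper = np.clip(sample + tol, 0, 255).astype(np.uint8)
            return cv2.inRange(hsv, lower, upper)   # 255 where the color matches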

  15. Automatic variable selection method and a comparison for quantitative analysis in laser-induced breakdown spectroscopy

    Science.gov (United States)

    Duan, Fajie; Fu, Xiao; Jiang, Jiajia; Huang, Tingting; Ma, Ling; Zhang, Cong

    2018-05-01

    In this work, an automatic variable selection method for the quantitative analysis of soil samples using laser-induced breakdown spectroscopy (LIBS) is proposed, based on full spectrum correction (FSC) and modified iterative predictor weighting-partial least squares (mIPW-PLS). The method features automatic selection without manual intervention. To illustrate the feasibility and effectiveness of the method, a comparison with a genetic algorithm (GA) and the successive projections algorithm (SPA) for the detection of different elements (copper, barium and chromium) in soil was implemented. The experimental results showed that all three methods could accomplish variable selection effectively, among which FSC-mIPW-PLS required significantly shorter computation time (approximately 12 s for 40,000 initial variables) than the others. Moreover, improved quantification models were obtained with the variable selection approaches. The root mean square errors of prediction (RMSEP) of models utilizing the new method were 27.47 (copper), 37.15 (barium) and 39.70 (chromium) mg/kg, showing prediction performance comparable to GA and SPA.

  16. Automatic Fault Recognition of Photovoltaic Modules Based on Statistical Analysis of Uav Thermography

    Science.gov (United States)

    Kim, D.; Youn, J.; Kim, C.

    2017-08-01

    As a malfunctioning PV (Photovoltaic) cell has a higher temperature than adjacent normal cells, we can detect it easily with a thermal infrared sensor. However, inspecting large-scale PV power plants with a hand-held thermal infrared sensor is time-consuming. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm uses statistical analysis of the thermal intensity (surface temperature) characteristics of each PV module, with the mean intensity and standard deviation of each panel serving as parameters for fault diagnosis. One of the characteristics of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels is not applicable in the fault detection algorithm. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules from each individual array automatically. The performance of the proposed algorithm was tested on three sample images; this verified a detection accuracy for defective panels of 97% or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection than a global detection rule.
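
    A local detection rule of the kind described above can be sketched as follows. This is illustrative Python, not the authors' implementation; the use of median/MAD statistics in place of the paper's mean/standard deviation and the factor k are assumptions:

        import numpy as np

        def detect_faulty_modules(module_mean_intensities, k=2.0):
            # Work within a single array so the distance-dependent global
            # temperature offset cancels out of the comparison.
            m = np.asarray(module_mean_intensities, dtype=float)
            center = np.median(m)
            spread = 1.4826 * np.median(np.abs(m - center))  # robust sigma
            return np.abs(m - center) > k * max(spread, 1e-9)

    Applied per array, modules flagged True are reported as hot, hence likely defective, candidates.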

  17. AUTOMATIC FAULT RECOGNITION OF PHOTOVOLTAIC MODULES BASED ON STATISTICAL ANALYSIS OF UAV THERMOGRAPHY

    Directory of Open Access Journals (Sweden)

    D. Kim

    2017-08-01

    Full Text Available As a malfunctioning PV (Photovoltaic) cell has a higher temperature than adjacent normal cells, we can detect it easily with a thermal infrared sensor. However, inspecting large-scale PV power plants with a hand-held thermal infrared sensor is time-consuming. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm uses statistical analysis of the thermal intensity (surface temperature) characteristics of each PV module, with the mean intensity and standard deviation of each panel serving as parameters for fault diagnosis. One of the characteristics of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels is not applicable in the fault detection algorithm. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules from each individual array automatically. The performance of the proposed algorithm was tested on three sample images; this verified a detection accuracy for defective panels of 97% or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection than a global detection rule.

  18. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    Science.gov (United States)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems.

  19. Automatic denoising of functional MRI data: combining independent component analysis and hierarchical fusion of classifiers.

    Science.gov (United States)

    Salimi-Khorshidi, Gholamreza; Douaud, Gwenaëlle; Beckmann, Christian F; Glasser, Matthew F; Griffanti, Ludovica; Smith, Stephen M

    2014-04-15

    Many sources of fluctuation contribute to the fMRI signal, and this makes identifying the effects that are truly related to the underlying neuronal activity difficult. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has been shown to be a powerful technique for identifying various sources of neuronally-related and artefactual fluctuation in fMRI data (both with the application of external stimuli and with the subject "at rest"). ICA decomposes fMRI data into patterns of activity (a set of spatial maps and their corresponding time series) that are statistically independent and add linearly to explain voxel-wise time series. Given the set of ICA components, if the components representing "signal" (brain activity) can be distinguished from the "noise" components (effects of motion, non-neuronal physiology, scanner artefacts and other nuisance sources), the latter can then be removed from the data, providing an effective cleanup of structured noise. Manual classification of components is labour intensive and requires expertise; hence, a fully automatic noise detection algorithm that can reliably detect various types of noise sources (in both task and resting fMRI) is desirable. In this paper, we introduce FIX ("FMRIB's ICA-based X-noiseifier"), which provides an automatic solution for denoising fMRI data via accurate classification of ICA components. For each ICA component, FIX generates a large number of distinct spatial and temporal features, each describing a different aspect of the data (e.g., what proportion of temporal fluctuations are at high frequencies). The set of features is then fed into a multi-level classifier (built around several different classifiers). Once trained through the hand-classification of a sufficient number of training datasets, the classifier can then automatically classify new datasets. The noise components can then be subtracted from (or regressed out of) the original data.
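
    One of the temporal features mentioned above, the proportion of fluctuations at high frequencies, is easy to reproduce. The sketch below is an illustrative reimplementation, not FMRIB's code; the 0.1 Hz cutoff is an assumption:

        import numpy as np

        def high_frequency_fraction(timeseries, tr, cutoff_hz=0.1):
            # Fraction of an ICA component's spectral power above the cutoff;
            # structured-noise components typically score high on it.
            ts = np.asarray(timeseries, dtype=float)
            ts = ts - ts.mean()
            power = np.abs(np.fft.rfft(ts)) ** 2
            freqs = np.fft.rfftfreq(ts.size, d=tr)
            return power[freqs > cutoff_hz].sum() / power.sum()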

  20. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process that have been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s are performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time-consuming and inefficient, especially for the sample counting and measurement process. The sample needs to be changed and the measurement software needs to be set up for every one-hour counting period, and both of these procedures are performed manually for every sample. Hence, an automatic sample changer system (ASC) consisting of hardware and software has been developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.

  1. Automatic analysis of flow cytometric DNA histograms from irradiated mouse male germ cells

    International Nuclear Information System (INIS)

    Lampariello, F.; Mauro, F.; Uccelli, R.; Spano, M.

    1989-01-01

    An automatic procedure for recovering the DNA content distribution of irradiated mouse testis cells from flow cytometric histograms is presented. First, a suitable mathematical model is developed to represent the pattern of DNA content and fluorescence distribution in the sample. Then a parameter estimation procedure, based on the maximum likelihood approach, is constructed by means of an optimization technique. This procedure has been applied to a set of DNA histograms corresponding to different doses of 0.4-MeV neutrons and to different time intervals after irradiation. In each case, a good agreement between the measured histograms and the corresponding fits has been obtained. The results indicate that the proposed method for the quantitative analysis of germ cell DNA histograms can be usefully applied to the study of the cytotoxic and mutagenic action of agents of toxicological interest such as ionizing radiation. 18 references
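
    A maximum-likelihood fit of this kind can be sketched as follows. This is illustrative Python, not the authors' model: the three-Gaussian mixture for the 1C/2C/4C ploidy classes, the Poisson likelihood and the parameterization are all assumptions:

        import numpy as np
        from scipy.optimize import minimize

        def fit_dna_histogram(counts, channels, theta0):
            counts = np.asarray(counts, dtype=float)
            channels = np.asarray(channels, dtype=float)

            def expected(theta):
                # theta = (w1, w2, mu1, sigma): class weights, 1C modal
                # channel, and a common Gaussian width.
                w1, w2, mu1, sigma = theta
                sigma = abs(sigma) + 1e-9
                w4 = max(1.0 - w1 - w2, 1e-9)
                comps = [(w1, mu1), (w2, 2 * mu1), (w4, 4 * mu1)]
                lam = sum(w * np.exp(-0.5 * ((channels - mu) / sigma) ** 2)
                          / (sigma * np.sqrt(2.0 * np.pi)) for w, mu in comps)
                return counts.sum() * lam + 1e-12

            def neg_loglik(theta):
                lam = expected(theta)   # Poisson NLL up to a constant
                return np.sum(lam - counts * np.log(lam))

            return minimize(neg_loglik, theta0, method="Nelder-Mead").x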

  2. Comparative Analysis of Automatic Exudate Detection between Machine Learning and Traditional Approaches

    Science.gov (United States)

    Sopharak, Akara; Uyyanonvara, Bunyarit; Barman, Sarah; Williamson, Thomas

    To prevent blindness from diabetic retinopathy, periodic screening and early diagnosis are necessary. Due to the lack of expert ophthalmologists in rural areas, automated early detection of exudates (one of the visible signs of diabetic retinopathy) could help to reduce the number of blind diabetic patients. Traditional automatic exudate detection methods are based on a specific parameter configuration, while the machine learning approaches, which seem more flexible, may be computationally costly. A comparative analysis of traditional and machine learning methods for exudate detection, namely mathematical morphology, fuzzy c-means clustering, naive Bayesian classifier, Support Vector Machine and Nearest Neighbor classifier, is presented. Detected exudates are validated against expert ophthalmologists' hand-drawn ground truths. The sensitivity, specificity, precision, accuracy and time complexity of each method are also compared.

  3. Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders

    CERN Document Server

    Baghai-Ravary, Ladan

    2013-01-01

    Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders provides a survey of methods designed to aid clinicians in the diagnosis and monitoring of speech disorders such as dysarthria and dyspraxia, with an emphasis on the signal processing techniques, statistical validity of the results presented in the literature, and the appropriateness of methods that do not require specialized equipment, rigorously controlled recording procedures or highly skilled personnel to interpret results. Such techniques offer the promise of a simple and cost-effective, yet objective, assessment of a range of medical conditions, which would be of great value to clinicians. The ideal scenario would begin with the collection of examples of the clients’ speech, either over the phone or using portable recording devices operated by non-specialist nursing staff. The recordings could then be analyzed initially to aid diagnosis of conditions, and subsequently to monitor the clients’ progress and res...

  4. Analysis of steranes and triterpanes in geolipid extracts by automatic classification of mass spectra

    Science.gov (United States)

    Wardroper, A. M. K.; Brooks, P. W.; Humberston, M. J.; Maxwell, J. R.

    1977-01-01

    A computer method is described for the automatic classification of triterpanes and steranes into gross structural type from their mass spectral characteristics. The method has been applied to the spectra obtained by gas-chromatographic/mass-spectroscopic analysis of two mixtures of standards and of hydrocarbon fractions isolated from Green River and Messel oil shales. Almost all of the steranes and triterpanes identified previously in both shales were classified, in addition to a number of new components. The results indicate that classification of such alkanes is possible with a laboratory computer system. The method has application to diagenesis and maturation studies as well as to oil/oil and oil/source rock correlations in which rapid screening of large numbers of samples is required.

  5. Control-oriented Automatic System for Transport Analysis (ASTRA)-Matlab integration for Tokamaks

    International Nuclear Information System (INIS)

    Sevillano, M.G.; Garrido, I.; Garrido, A.J.

    2011-01-01

    The exponential growth in energy consumption has led to a renewed interest in the development of alternatives to fossil fuels. Among the unconventional resources that may help to meet this energy demand, nuclear fusion has arisen as a promising source, giving rise to unprecedented interest in solving the different control problems existing in nuclear fusion reactors such as Tokamaks. The aim of this manuscript is to show how one of the most popular codes used to simulate the performance of Tokamaks, the Automatic System For Transport Analysis (ASTRA) code, can be integrated into the Matlab-Simulink tool in order to make the development of suitable controllers for Tokamaks easier and more convenient. As a demonstrative case study to show the feasibility and merit of the proposed ASTRA-Matlab integration, a modified anti-windup Proportional Integral Derivative (PID)-based controller for the loop voltage of a Tokamak has been implemented. The integration achieved represents an original and innovative piece of work in the Tokamak control area, and it provides new possibilities for the development and application of advanced control schemes to the standardized and widely extended ASTRA transport code for Tokamaks. -- Highlights: → The paper presents a useful tool for rapid prototyping of different solutions to deal with the control problems arising in Tokamaks. → The proposed tool embeds the standardized Automatic System For Transport Analysis (ASTRA) code for Tokamaks within the well-known Matlab-Simulink software. → This allows testing and combining diverse control schemes in a unified way, considering the ASTRA as the plant of the system. → A demonstrative Proportional Integral Derivative (PID)-based case study is provided to show the feasibility and capabilities of the proposed integration.

  6. Evaluation of ventricular dysfunction using semi-automatic longitudinal strain analysis of four-chamber cine MR imaging.

    Science.gov (United States)

    Kawakubo, Masateru; Nagao, Michinobu; Kumazawa, Seiji; Yamasaki, Yuzo; Chishaki, Akiko S; Nakamura, Yasuhiko; Honda, Hiroshi; Morishita, Junji

    2016-02-01

    The aim of this study was to evaluate ventricular dysfunction using longitudinal strain analysis of 4-chamber (4CH) cine MR imaging, and to investigate the agreement between the semi-automatic and manual measurements in the analysis. Fifty-two consecutive patients with ischemic or non-ischemic cardiomyopathy and repaired tetralogy of Fallot who underwent a cardiac MR examination incorporating cine MR imaging were retrospectively enrolled. The LV and RV longitudinal strain values were obtained both semi-automatically and manually. Receiver operating characteristic (ROC) analysis was performed to determine the optimal cutoff of the minimum longitudinal strain value for the detection of patients with cardiac dysfunction. The correlations between manual and semi-automatic measurements for the LV and RV walls were analyzed by Pearson coefficient analysis. ROC analysis demonstrated optimal cut-offs of the minimum longitudinal strain value (εL_min) for diagnosing LV and RV dysfunction with high accuracy (LV εL_min = -7.8%: area under the curve, 0.89; sensitivity, 83%; specificity, 91%; RV εL_min = -15.7%: area under the curve, 0.82; sensitivity, 92%; specificity, 68%). Excellent correlations between manual and semi-automatic measurements for the LV and RV free wall were observed (LV, r = 0.97, p < 0.001). Semi-automatic longitudinal strain analysis of 4CH cine MR imaging can evaluate LV and RV dysfunction with simple and easy measurements. The strain analysis could have extensive application in cardiac imaging for various clinical cases.
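
    The strain measure itself is simple: for a ventricular wall contour of length L(t) tracked over the cardiac cycle, the longitudinal strain is the relative length change with respect to the first (end-diastolic) frame, and εL_min is its minimum over the cycle. A minimal sketch under that assumption (names are illustrative):

        import numpy as np

        def longitudinal_strain(contour_lengths):
            # contour_lengths: wall contour length per cine frame, L(t).
            L = np.asarray(contour_lengths, dtype=float)
            strain = 100.0 * (L - L[0]) / L[0]   # percent strain per frame
            return strain, strain.min()          # strain curve and eps_L_min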

  7. Modelling of nuclear power plant control and instrumentation elements for automatic disturbance and reliability analysis

    International Nuclear Information System (INIS)

    Hollo, E.

    1985-08-01

    This Final Report summarizes the results of R&D work done within IAEA-VEIKI (Institute for Electrical Power Research, Budapest, Hungary) Research Contract No. 3210 during the 3-year period 01.08.1982 - 31.08.1985. Chapter 1 lists the main research objectives of the project. The main results obtained are summarized in Chapters 2 and 3. Outcomes from the development of failure modelling methodologies and their application to C/I components of WWER-440 units are as follows (Chapter 2): improvement of available "failure mode and effect analysis" methods and mini-fault tree structures usable for automatic disturbance (DAS) and reliability (RAS) analysis; general classification and determination of functional failure modes of WWER-440 NPP C/I components; set-up of logic models for motor-operated control valves and the rod control/drive mechanism. Results of the development of methods and their application to reliability modelling of NPP components and systems cover (Chapter 3): development of an algorithm (computer code COMPREL) for component-related failure and reliability parameter calculation; reliability analysis of the PAKS II NPP diesel system; and definition of functional requirements for a reliability data bank (RDB) in WWER-440 units, with determination of the RDB input/output data structure and data manipulation services. The methods used are a priori failure mode and effect analysis, combined fault tree/event tree modelling, structural computer programming, and application of probability theory to the nuclear field.

  8. Automatic facial pore analysis system using multi-scale pore detection.

    Science.gov (United States)

    Sun, J Y; Kim, S W; Lee, S H; Choi, J E; Ko, S J

    2017-08-01

    As facial pore widening and its treatments have become common concerns in the beauty care field, the need for an objective pore-analyzing system has increased. Conventional apparatuses lack usability, requiring strong light sources and a cumbersome photographing process, and they often yield unsatisfactory analysis results. This study was conducted to develop an image processing technique for automatic facial pore analysis. The proposed method detects facial pores using a multi-scale detection and optimal scale selection scheme and then extracts pore-related features such as total area, average size, depth, and the number of pores. Facial photographs of 50 subjects were graded by two expert dermatologists, and correlation analyses between the features and the clinical grading were conducted. We also compared our analysis results with those of conventional pore-analyzing devices. The number of large pores and the average pore size were highly correlated with the severity of pore enlargement. In comparison with the conventional devices, the proposed analysis system achieved better performance, showing stronger correlation with the clinical grading. The proposed system is highly accurate and reliable for measuring the severity of skin pore enlargement. It can be suitably used for objective assessment of pore tightening treatments. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. Automatic T1 bladder tumor detection by using wavelet analysis in cystoscopy images

    Science.gov (United States)

    Freitas, Nuno R.; Vieira, Pedro M.; Lima, Estevão; Lima, Carlos S.

    2018-02-01

    Correct classification of cystoscopy images depends on the interpreter's experience. Bladder cancer is a common lesion that can only be confirmed by biopsying the tissue; therefore, the automatic identification of tumors plays a significant role in early-stage diagnosis and its accuracy. To the best of our knowledge, the use of white light cystoscopy images for bladder tumor diagnosis has not been reported so far. In this paper, a texture analysis based approach is proposed for bladder tumor diagnosis, presuming that tumors change the tissue texture. As is well accepted by the scientific community, texture information is more present in the medium to high frequency range, which can be selected by using a discrete wavelet transform (DWT). Tumor enhancement can be improved by using automatic segmentation, since mixing with normal tissue is avoided under ideal conditions. The segmentation module proposed in this paper takes advantage of the wavelet decomposition tree to discard poor texture information, in such a way that both steps of the proposed algorithm, segmentation and classification, share the same focus on texture. A multilayer perceptron and a support vector machine with a stratified ten-fold cross-validation procedure were used for classification purposes, using the hue-saturation-value (HSV), red-green-blue, and CIELab color spaces. Performances of 91% in sensitivity and 92.9% in specificity were obtained for the HSV color space, using both preprocessing and classification steps based on the DWT. The proposed method can achieve good performance in identifying bladder tumor frames. These promising results open the path towards a deeper study of the applicability of this algorithm in computer aided diagnosis.
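
    The texture-feature stage can be sketched with PyWavelets. This is an illustrative reconstruction, not the authors' code; the db4 wavelet, the three decomposition levels and the energy statistic are assumptions:

        import numpy as np
        import pywt

        def dwt_texture_features(gray_patch, wavelet="db4", levels=3):
            # Energies of the detail subbands, i.e. the medium/high-frequency
            # range where texture information concentrates.
            coeffs = pywt.wavedec2(np.asarray(gray_patch, dtype=float),
                                   wavelet, level=levels)
            features = []
            for detail_level in coeffs[1:]:      # skip the approximation band
                for band in detail_level:        # horizontal, vertical, diagonal
                    features.append(np.mean(np.square(band)))
            return np.array(features)

    The resulting feature vectors would then be fed to a classifier such as the multilayer perceptron or support vector machine mentioned above.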

  10. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only covers company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data mining approach. The AutoMealRecord data were examined to determine whether they could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses; in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R^2 = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross-validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord system. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.

  11. An automatic granular structure generation and finite element analysis of heterogeneous semi-solid materials

    International Nuclear Information System (INIS)

    Sharifi, Hamid; Larouche, Daniel

    2015-01-01

    The quality of cast metal products depends on the capacity of the semi-solid metal to sustain the stresses generated during casting. Predicting the evolution of these stresses with accuracy over the solidification interval would be highly helpful for avoiding the formation of defects like hot tearing. This task is, however, very difficult because of the heterogeneous nature of the material. In this paper, we propose to evaluate the mechanical behaviour of a metal during solidification using a mesh generation technique for the heterogeneous semi-solid material in a finite element analysis at the microscopic level. This is done on a two-dimensional (2D) domain in which the granular structure of the solid phase is generated, surrounded by an intergranular and interdendritic liquid phase. Some basic solid grains are first constructed and projected into the 2D domain with random orientations and scale factors. Depending on their orientation, the basic grains are combined to produce larger grains or separated by a liquid film. Different basic grain shapes can produce different granular structures of the mushy zone. As a result, using this automatic grain generation procedure, we can investigate the effect of grain shapes and sizes on the thermo-mechanical behaviour of the semi-solid material. The granular models are automatically converted to finite element meshes. The solid grains and the liquid phase are meshed properly using quadrilateral elements. This method has been used to simulate the microstructure of a binary aluminium–copper alloy (Al–5.8 wt% Cu) at a solid fraction of 0.92. Using the finite element method and the Mie–Grüneisen equation of state for the liquid phase, the transient mechanical behaviour of the mushy zone under tensile loading has been investigated. The stress distribution and the bridges that form during tensile loading have been identified. (paper)

  12. A computer program for automatic gamma-ray spectra analysis with isotope identification for the purpose of activation analysis

    International Nuclear Information System (INIS)

    Weigel, H.; Dauk, J.

    1974-01-01

    A FORTRAN IV program for a PDP-9 computer with 16K storage capacity has been developed that performs automatic analysis of complex gamma-ray spectra taken with Ge(Li) detectors. It searches for full-energy peaks and evaluates the peak areas. The program features automatically performed isotope identification. It is written in such a flexible manner that, after reactor irradiation, spectra from samples of any composition can be evaluated for activation analysis. The peak search routine is based on the following criteria: the counting rate has to increase for two successive channels, and the amplitude of the corresponding maximum has to be greater than or equal to F1 times the statistical error of the counting rate in the valley just before the maximum. In order to detect superimposed peaks, it is assumed that the dependence of the FWHM on channel number is roughly approximated by a linear function, and the actual and "theoretical" FWHM values are compared. To determine the net peak area, a Gaussian-based function is fitted to each peak. The isotope identification is based on the procedure developed by ADAMS and DAMS. (T.G.)
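
    The stated peak-search criterion translates almost literally into code. The sketch below is illustrative Python rather than the original FORTRAN IV; the explicit local-maximum check and the 10-channel valley window are assumptions added to make the fragment self-contained:

        import numpy as np

        def find_peak_candidates(counts, f1=2.0):
            counts = np.asarray(counts, dtype=float)
            candidates = []
            for i in range(2, len(counts) - 1):
                rising = counts[i - 1] > counts[i - 2] and counts[i] > counts[i - 1]
                local_max = counts[i + 1] < counts[i]
                valley = counts[max(0, i - 10):i - 1].min()
                # Criterion: the rise above the preceding valley exceeds F1
                # times the statistical (Poisson) error of the valley rate.
                if rising and local_max and \
                        counts[i] - valley >= f1 * np.sqrt(max(valley, 1.0)):
                    candidates.append(i)
            return candidates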

  13. Writ Large on Your Face: Observing Emotions Using Automatic Facial Analysis

    Directory of Open Access Journals (Sweden)

    Dieckmann Anja

    2014-05-01

    Full Text Available Emotions affect all of our daily decisions and, of course, they also influence our evaluations of brands, products and advertisements. But what exactly do consumers feel when they watch a TV commercial, visit a website or when they interact with a brand in different ways? Measuring such emotions is not an easy task. In the past, the effectiveness of marketing material was evaluated mostly by subsequent surveys. Now, with the emergence of neuroscientific approaches like EEG, the measurement of real-time reactions is possible, for instance, when watching a commercial. However, most neuroscientific procedures are fairly invasive and irritating. For an EEG, for instance, numerous electrodes need to be placed on the participant's scalp. Furthermore, data analysis is highly complex. Scientific expertise is necessary for interpretation, so the procedure remains a black box to most practitioners and the results are still rather controversial. By contrast, automatic facial analysis provides similar information without having to wire study participants. In addition, the results of such analyses are intuitive and easy to interpret even for laypeople. These convincing advantages led GfK Company to decide on facial analysis and to develop a tool suitable for measuring emotional responses to marketing stimuli, making it easily applicable in marketing research practice.

  14. Fractal Analysis of Elastographic Images for Automatic Detection of Diffuse Diseases of Salivary Glands: Preliminary Results

    Directory of Open Access Journals (Sweden)

    Alexandru Florin Badea

    2013-01-01

    Full Text Available The geometry of some medical images of tissues, obtained by elastography and ultrasonography, is characterized in terms of complexity parameters such as the fractal dimension (FD). It is well known that in any image there are very subtle details that are not easily detectable by the human eye. However, in many cases, as in medical imaging diagnosis, these details are very important since they might contain some hidden information about the possible existence of certain pathological lesions like tissue degeneration, inflammation, or tumors. Therefore, an automatic method of analysis could be an expedient tool for physicians to give a faultless diagnosis. Fractal analysis is of great importance in relation to a quantitative evaluation of “real-time” elastography, a procedure considered to be operator-dependent in current clinical practice. Mathematical analysis reveals significant discrepancies between normal and pathological image patterns. The main objective of our work is to demonstrate the clinical utility of this procedure on an ultrasound image corresponding to a submandibular diffuse pathology.
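
    For orientation, the FD of a binarized image is commonly estimated by box counting, as the slope of log(box count) against log(1/box size); this generic sketch is not the paper's exact implementation.

    ```python
    # Box-counting estimate of fractal dimension for a binary image.
    import numpy as np

    def box_counting_dimension(binary_img):
        sizes, counts = [2, 4, 8, 16, 32, 64], []
        h, w = binary_img.shape
        for s in sizes:
            # count boxes of side s containing at least one foreground pixel
            n = sum(binary_img[i:i + s, j:j + s].any()
                    for i in range(0, h, s) for j in range(0, w, s))
            counts.append(n)
        # FD is the slope of log(count) vs log(1/size)
        slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
        return slope

    img = np.random.default_rng(1).random((128, 128)) > 0.7
    print("estimated FD:", round(box_counting_dimension(img), 2))
    ```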

  15. Control characteristics and heating performance analysis of automatic thermostatic valves for radiant slab heating system in residential apartments

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Byung-Cheon [Department of Building Equipment System Engineering, Kyungwon University, Seongnam City (Korea); Song, Jae-Yeob [Graduate School, Building Equipment System Engineering, Kyungwon University, Seongnam City (Korea)

    2010-04-15

    Computer simulations and experiments were carried out to investigate the control characteristics and heating performance of a radiant slab heating system with automatic thermostatic valves in residential apartments. An electrical-equivalent R-C circuit is applied to analyze the unsteady heat transfer in the house. In addition, the radiant heat transfer between slabs, ceilings and walls in the room is evaluated by the enclosure analysis method. Heating performance and control characteristics were determined for control methods such as automatic thermostatic valves, the room-air-temperature-sensing method, the water-temperature-sensing method, the proportional control method, and the on-off control method. (author)
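
    The electrical-equivalent R-C idea reduces the house to thermal capacitances exchanging heat through thermal resistances; the one-node forward-Euler toy below (all parameter values are illustrative assumptions) shows the principle.

    ```python
    # Toy R-C thermal network: one capacitance (room air) coupled to a warm
    # slab and the cold outdoors through thermal resistances.
    C_room = 5e6                 # J/K, room thermal capacitance (assumed)
    R_slab = 0.01                # K/W, slab surface -> room air (assumed)
    R_env = 0.05                 # K/W, room air -> outdoors (assumed)
    T_slab, T_out = 35.0, -5.0   # degC boundary temperatures (assumed)

    dt, T = 60.0, 10.0           # time step [s], initial room temperature [degC]
    for step in range(24 * 60):              # simulate 24 h
        q_in = (T_slab - T) / R_slab         # heat gain from the radiant slab [W]
        q_out = (T - T_out) / R_env          # heat loss to the environment [W]
        T += dt * (q_in - q_out) / C_room    # forward-Euler update of the R-C node

    print(f"room temperature after 24 h: {T:.1f} degC")
    ```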

  16. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.
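
    Two of the simplest indices in this family, mean word frequency and mean word range against a reference corpus, fit in a few lines; the miniature corpus below is made up purely for illustration.

    ```python
    # Toy lexical-sophistication indices: mean word frequency and mean word
    # range (share of reference documents containing the word).
    from collections import Counter

    corpus_docs = [
        "the cat sat on the mat".split(),
        "the dog ran in the park".split(),
        "a cat and a dog met".split(),
    ]
    freq = Counter(w for doc in corpus_docs for w in doc)
    n_tokens = sum(freq.values())
    range_count = Counter(w for doc in corpus_docs for w in set(doc))

    def lexical_indices(text):
        words = text.lower().split()
        mean_freq = sum(freq[w] / n_tokens for w in words) / len(words)
        mean_range = sum(range_count[w] / len(corpus_docs) for w in words) / len(words)
        return mean_freq, mean_range

    print(lexical_indices("the cat ran"))
    ```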

  17. ANALYSIS OF RELIABILITY OF RESERVED AUTOMATIC CONTROL SYSTEMS OF INDUSTRIAL POWER PROCESSES

    Directory of Open Access Journals (Sweden)

    V. A. Anishchenko

    2014-01-01

    Full Text Available This paper presents a comparative analysis of the main structural schemes for redundant automatic control and regulation devices of important power supply objects with increased reliability requirements. The schemes analyzed were passive and active duplication with a control device, passive and active triplication, combined redundancy, and majority redundancy according to the schemes "two out of three" and "three out of five". Based on the calculations performed, these schemes were compared under the assumption of ideal built-in control devices and ideal majority elements. Preference rankings of the systems were constructed according to the criteria of maximum mean time to failure and mean probability of failure-free operation. These rankings are not fixed: they depend on the interval in which the parameter given by the product of failure rate and time lies, and the preference order changes at the points where the systems' mean no-failure probability curves cross. The analysis of the calculation results showed the reliability advantages of triplication and combined redundancy, which are achieved at the price of considerably greater expense for creating these systems. Under certain conditions, the reliability of a passive triplication system is higher than that of an active duplication system. Majority schemes make it possible to detect not only complete but also single (metrological) failures. A boundary value for the unreliability of the built-in control device is determined, which enables a well-founded choice between active and passive redundancy.
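
    The majority-redundancy comparison rests on standard k-out-of-n reliability formulas; this sketch, assuming independent units with constant failure rate and an ideal voting element, reproduces the qualitative point that the preference order depends on the product of failure rate and time.

    ```python
    # k-out-of-n reliability under the exponential failure model R = exp(-lam*t).
    import math

    def r_2_of_3(r):
        return 3 * r**2 - 2 * r**3          # "two out of three" majority scheme

    def r_3_of_5(r):
        # survival if at least 3 of 5 independent units survive
        return sum(math.comb(5, k) * r**k * (1 - r)**(5 - k) for k in range(3, 6))

    for lt in (0.1, 0.5, 1.0):              # lam * t, the dimensionless parameter
        r = math.exp(-lt)
        print(f"lam*t={lt}: unit={r:.3f}  2oo3={r_2_of_3(r):.3f}  3oo5={r_3_of_5(r):.3f}")
    # For small lam*t the majority schemes beat a single unit; for lam*t >~ 0.7
    # the order reverses, i.e. the preference scale depends on the interval.
    ```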

  18. Balance of the uranium market. Contribution to an economic analysis. Vol. 2

    International Nuclear Information System (INIS)

    Ehsani, J.

    1983-11-01

    The second volume of this thesis on the economic analysis of the uranium market contains all the technical descriptions concerning reactors and the fuel cycle, as well as detailed results of the two models of uranium supply and demand [fr

  19. Automatic registration of multi-modal microscopy images for integrative analysis of prostate tissue sections

    International Nuclear Information System (INIS)

    Lippolis, Giuseppe; Edsjö, Anders; Helczynski, Leszek; Bjartell, Anders; Overgaard, Niels Chr

    2013-01-01

    Prostate cancer is one of the leading causes of cancer-related deaths. For diagnosis, predicting the outcome of the disease, and for assessing potential new biomarkers, pathologists and researchers routinely analyze histological samples. Morphological and molecular information may be integrated by aligning microscopic histological images in a multiplex fashion. This process is usually time-consuming and results in intra- and inter-user variability. The aim of this study was to investigate the feasibility of using modern image analysis methods for the automated alignment of microscopic images from differently stained adjacent paraffin sections of prostatic tissue specimens. Tissue samples, obtained from biopsy or radical prostatectomy, were sectioned and stained with either hematoxylin & eosin (H&E), immunohistochemistry for p63 and AMACR, or Time Resolved Fluorescence (TRF) for the androgen receptor (AR). Image pairs were aligned allowing for translation, rotation and scaling. The registration was performed automatically by first detecting landmarks in both images using the scale-invariant feature transform (SIFT), followed by the well-known RANSAC protocol for finding point correspondences, and finally a Procrustes fit for the alignment. The registration results were evaluated using both visual and quantitative criteria as defined in the text. Three experiments were carried out. First, images of consecutive tissue sections stained with H&E and p63/AMACR were successfully aligned in 85 of 88 cases (96.6%). The failures occurred in 3 out of 13 cores with highly aggressive cancer (Gleason score ≥ 8). Second, TRF and H&E image pairs were aligned correctly in 103 out of 106 cases (97%). The third experiment considered the alignment of image pairs with the same staining (H&E) coming from a stack of 4 sections. The success rate for alignment dropped from 93.8% in adjacent sections to 22% for the sections furthest away. The proposed method is both reliable and fast and therefore well suited
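
    The described landmark pipeline maps naturally onto OpenCV primitives; the sketch below uses SIFT keypoints, Lowe's ratio test and a RANSAC-estimated similarity transform (translation, rotation, scale) as a stand-in for the paper's Procrustes fit, with placeholder file names.

    ```python
    # SIFT landmarks + RANSAC correspondences + similarity alignment (OpenCV).
    import cv2
    import numpy as np

    img1 = cv2.imread("he_section.png", cv2.IMREAD_GRAYSCALE)    # placeholder
    img2 = cv2.imread("ihc_section.png", cv2.IMREAD_GRAYSCALE)   # placeholder

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Keep unambiguous descriptor matches (Lowe's ratio test)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]

    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # RANSAC-estimated similarity transform (rotation + uniform scale + shift)
    M, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC,
                                             ransacReprojThreshold=5.0)
    aligned = cv2.warpAffine(img1, M, (img2.shape[1], img2.shape[0]))
    print(f"{int(inliers.sum())} inlier correspondences")
    ```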

  20. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    Directory of Open Access Journals (Sweden)

    Rhodri eCusack

    2015-01-01

    Full Text Available Recent years have seen neuroimaging data becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complex to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  1. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    Science.gov (United States)

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.
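
    The core idea of the processing engine, modules declaring inputs and outputs with only stale modules re-executed, can be caricatured in a few lines; this is a hypothetical miniature in Python, not aa's actual Matlab/XML interface.

    ```python
    # Hypothetical miniature of dependency-tracked pipeline execution: a module
    # is (re)run only if its output is missing or an upstream input changed.
    store, stamps = {}, {}   # output -> value, output -> input stamp at last run

    modules = [
        # (name, inputs, output, work function)
        ("realign",  [],            "realigned", lambda s: "realigned-data"),
        ("smooth",   ["realigned"], "smoothed",  lambda s: s["realigned"] + "+sm"),
        ("firstlvl", ["smoothed"],  "stats",     lambda s: s["smoothed"] + "+stats"),
    ]

    def run_pipeline():
        for name, inputs, output, fn in modules:
            stamp = tuple(store[i] for i in inputs)  # upstream state this run sees
            if output not in store or stamps.get(output) != stamp:
                print(f"running  {name}")
                store[output] = fn(store)
                stamps[output] = stamp
            else:
                print(f"skipping {name} (up to date)")

    run_pipeline()   # first call runs everything
    run_pipeline()   # second call skips all modules
    ```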

  2. The development of an automatic sample-changer and control instrumentation for isotope-source neutron-activation analysis

    International Nuclear Information System (INIS)

    Andeweg, A.H.; Watterson, J.I.W.

    1983-01-01

    An automatic sample-changer was developed at the Council for Mineral Technology for use in isotope-source neutron-activation analysis. Tests show that the sample-changer can transfer a sample of up to 3 kg in mass over a distance of 3 m within 5 s. In addition, instrumentation in the form of a three-stage sequential timer was developed to control the sequence of irradiation transfer and analysis

  3. The interlaboratory experiment IDA-72 on mass spectrometric isotope dilution analysis. Vol. 2

    International Nuclear Information System (INIS)

    Beyrich, W.; Drosselmeyer, E.

    1975-07-01

    Volume II of the report on the IDA-72 experiment contains papers written by different authors on a number of special topics connected with the preparation, performance and evaluation of the interlaboratory test. In detail, the sampling procedures for active samples of the reprocessing plant and the preparation of inactive reference and spike solutions from standard material are described, as well as new methods of sample conditioning by evaporation. An extra chapter is devoted to the chemical sample treatment in preparation for the mass spectrometric analysis of the U and Pu content of the solutions. Further special topics are methods for mass discrimination corrections, α-spectrometric measurements as a supplement for the determination of Pu-238, and the comparison of concentration determinations by mass spectrometric isotope dilution analysis with those performed by X-ray fluorescence spectrometry. The last part of this volume contains papers connected with the computerized statistical evaluation of the large number of data. (orig.) [de

  4. Quantitative analysis of retina layer elasticity based on automatic 3D segmentation (Conference Presentation)

    Science.gov (United States)

    He, Youmin; Qu, Yueqiao; Zhang, Yi; Ma, Teng; Zhu, Jiang; Miao, Yusi; Humayun, Mark; Zhou, Qifa; Chen, Zhongping

    2017-02-01

    Age-related macular degeneration (AMD) is an eye condition that is considered to be one of the leading causes of blindness among people over 50. Recent studies suggest that the mechanical properties in retina layers are affected during the early onset of disease. Therefore, it is necessary to identify such changes in the individual layers of the retina so as to provide useful information for disease diagnosis. In this study, we propose using an acoustic radiation force optical coherence elastography (ARF-OCE) system to dynamically excite the porcine retina and detect the vibrational displacement with phase resolved Doppler optical coherence tomography. Due to the vibrational mechanism of the tissue response, the image quality is compromised during elastogram acquisition. In order to properly analyze the images, all signals, including the trigger and control signals for excitation, as well as detection and scanning signals, are synchronized within the OCE software and are kept consistent between frames, making it possible for easy phase unwrapping and elasticity analysis. In addition, a combination of segmentation algorithms is used to accommodate the compromised image quality. An automatic 3D segmentation method has been developed to isolate and measure the relative elasticity of every individual retinal layer. Two different segmentation schemes based on random walker and dynamic programming are implemented. The algorithm has been validated using a 3D region of the porcine retina, where individual layers have been isolated and analyzed using statistical methods. The errors compared to manual segmentation will be calculated.
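
    Of the two segmentation schemes mentioned, dynamic programming is the easier to sketch: a layer boundary is traced as the minimum-cost left-to-right path through a cost image (here the negative vertical gradient); a generic illustration, not the authors' code.

    ```python
    # Dynamic-programming trace of a layer boundary through a B-scan cost image.
    import numpy as np

    def dp_layer_boundary(cost):
        h, w = cost.shape
        acc = cost.copy()
        for x in range(1, w):          # accumulate minimal path cost per column
            for y in range(h):
                lo, hi = max(0, y - 1), min(h, y + 2)
                acc[y, x] += acc[lo:hi, x - 1].min()
        # backtrack from the cheapest end pixel
        path = [int(np.argmin(acc[:, -1]))]
        for x in range(w - 2, -1, -1):
            y = path[-1]
            lo = max(0, y - 1)
            path.append(lo + int(np.argmin(acc[lo:min(h, y + 2), x])))
        return path[::-1]              # boundary row index for each column

    bscan = np.random.default_rng(0).random((64, 256))
    boundary = dp_layer_boundary(-np.gradient(bscan, axis=0))
    print(len(boundary), "columns traced")
    ```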

  5. Automatic identification of mobile and rigid substructures in molecular dynamics simulations and fractional structural fluctuation analysis.

    Directory of Open Access Journals (Sweden)

    Leandro Martínez

    Full Text Available The analysis of structural mobility in molecular dynamics plays a key role in data interpretation, particularly in the simulation of biomolecules. The most common mobility measures computed from simulations are the Root Mean Square Deviation (RMSD) and Root Mean Square Fluctuations (RMSF) of the structures. These are computed after the alignment of atomic coordinates in each trajectory step to a reference structure. This rigid-body alignment is not robust, in the sense that if a small portion of the structure is highly mobile, the RMSD and RMSF increase for all atoms, possibly resulting in poor quantification of the structural fluctuations and, often, in overlooking important fluctuations associated with biological function. The motivation of this work is to provide a robust measure of structural mobility that is practical and easy to interpret. We propose a Low-Order-Value-Optimization (LOVO) strategy for the robust alignment of the least mobile substructures in a simulation. These substructures are automatically identified by the method. The algorithm consists of the iterative superposition of the fraction of the structure displaying the smallest displacements. Therefore, the least mobile substructures are identified, providing a clearer picture of the overall structural fluctuations. Examples are given to illustrate the interpretative advantages of this strategy. The software for performing the alignments was named MDLovoFit and is available as free software at: http://leandro.iqm.unicamp.br/mdlovofit.
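
    The iterative superposition can be sketched as alternating a Kabsch fit on the currently selected atoms with re-selection of the least-mobile fraction; the fraction and iteration count below are illustrative, and this is not MDLovoFit itself.

    ```python
    # LOVO-style robust alignment: fit on the least-mobile fraction, re-select,
    # repeat. Uses the standard SVD-based Kabsch rotation.
    import numpy as np

    def kabsch(P, Q):
        """Rotation matrix minimizing RMSD of centered P onto centered Q."""
        H = P.T @ Q
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        return Vt.T @ np.diag([1, 1, d]) @ U.T

    def lovo_align(mobile, ref, frac=0.7, iters=10):
        sel = np.arange(len(ref))                        # start with all atoms
        for _ in range(iters):
            P = mobile[sel] - mobile[sel].mean(axis=0)
            Q = ref[sel] - ref[sel].mean(axis=0)
            R = kabsch(P, Q)
            aligned = (mobile - mobile[sel].mean(axis=0)) @ R.T + ref[sel].mean(axis=0)
            disp = np.linalg.norm(aligned - ref, axis=1)
            sel = np.argsort(disp)[: int(frac * len(ref))]  # least-mobile fraction
        return aligned, sel

    rng = np.random.default_rng(0)
    ref = rng.normal(size=(200, 3))
    mobile = ref + rng.normal(scale=0.05, size=(200, 3))
    mobile[150:] += 3.0                                  # a highly mobile tail
    aligned, core = lovo_align(mobile, ref)
    print("rigid core size:", len(core))
    ```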

  6. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis.

    Science.gov (United States)

    Held, Christian; Nattkemper, Tim; Palmisano, Ralf; Wittenberg, Thomas

    2013-01-01

    Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as they can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often get stuck in a local maximum.
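
    A coordinate-descent fit of the kind mentioned sweeps one pipeline parameter at a time over a grid while holding the others fixed; in this sketch the objective is a stand-in for a real segmentation quality score (e.g., Dice against annotated micrographs).

    ```python
    # Coordinate descent over two hypothetical pipeline parameters.
    import numpy as np

    def score(params):                 # placeholder objective with local maxima
        x, y = params["threshold"], params["min_size"]
        return -(x - 0.6) ** 2 - 0.001 * (y - 40) ** 2 + 0.3 * np.sin(5 * x)

    grids = {"threshold": np.linspace(0.1, 0.9, 17),
             "min_size": np.arange(10, 100, 5)}
    params = {"threshold": 0.5, "min_size": 50}

    for sweep in range(5):             # a few passes over all coordinates
        for name, grid in grids.items():
            trials = [dict(params, **{name: v}) for v in grid]
            params[name] = max(trials, key=score)[name]

    print(params, round(score(params), 3))
    ```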

  7. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis

    Directory of Open Access Journals (Sweden)

    Christian Held

    2013-01-01

    Full Text Available Introduction: Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. Methods: In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. Results: This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. Conclusion: The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as they can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often get stuck in a local maximum.

  8. Preoperative automatic visual behavioural analysis as a tool for intraocular lens choice in cataract surgery

    Directory of Open Access Journals (Sweden)

    Heloisa Neumann Nogueira

    2015-04-01

    Full Text Available Purpose: Cataract is the main cause of blindness, affecting 18 million people worldwide, with the highest incidence in the population above 50 years of age. Low visual acuity caused by cataract may have a negative impact on patient quality of life. The current treatment is surgery to replace the natural lens with an artificial intraocular lens (IOL), which can be mono- or multifocal. However, due to potential side effects, IOLs must be carefully chosen to ensure higher patient satisfaction. Thus, studies on the visual behaviour of these patients may be an important tool to determine the best type of IOL implantation. This study proposed an anamnestic add-on for optimizing the choice of IOL. Methods: We used a camera that automatically takes pictures, documenting the patient's visual routine in order to obtain additional information about the frequency of distant, intermediate, and near sights. Results: The results indicated an estimated frequency percentage, suggesting that visual analysis of routine photographic records of a patient with cataract may be useful for understanding gaze behaviour and for choosing a visual management strategy after cataract surgery, simultaneously stimulating interest in customized IOL manufacturing according to individual needs.

  9. Sensitivity Analysis Based SVM Application on Automatic Incident Detection of Rural Road in China

    Directory of Open Access Journals (Sweden)

    Xingliang Liu

    2018-01-01

    Full Text Available Traditional automatic incident detection methods, such as artificial neural networks, backpropagation neural networks, and Markov chains, are not well suited to the incident detection problem of rural roads in China, which have a relatively high accident rate and a low reaction speed owing to their small traffic volumes. This study applies the support vector machine (SVM) and parameter sensitivity analysis methods to build an accident detection algorithm for rural road conditions, based on real-time data collected in a field experiment. The sensitivity of four parameters (speed, front distance, vehicle group time interval, and free driving ratio) is analyzed, and the data sets of the two parameters with significant sensitivity are chosen to form the traffic state feature vector. The SVM and k-fold cross-validation (K-CV) methods are used to build the accident detection algorithm, which shows excellent detection accuracy (98.15% on the training data set and 87.5% on the testing data set). Therefore, the problem of low incident reaction speed on rural roads in China could be solved to some extent.
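
    With scikit-learn, the SVM-plus-k-fold construction takes a few lines; the two features and the synthetic traffic-state data below are assumptions for illustration, not the study's field data.

    ```python
    # SVM incident detector with k-fold cross-validation on synthetic features.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # columns: speed [km/h], vehicle-group time interval [s] (assumed features)
    normal = np.column_stack([rng.normal(70, 8, 300), rng.normal(4, 1, 300)])
    incident = np.column_stack([rng.normal(35, 10, 60), rng.normal(9, 2, 60)])
    X = np.vstack([normal, incident])
    y = np.array([0] * 300 + [1] * 60)     # 1 = incident state

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
    scores = cross_val_score(clf, X, y, cv=5)   # k-fold cross-validation
    print("fold accuracies:", np.round(scores, 3))
    ```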

  10. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications

    Directory of Open Access Journals (Sweden)

    Francesco Nex

    2009-05-01

    Full Text Available In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not satisfy normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for badly textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performance of the SIFT operator has been compared with that provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performance of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A2 SIFT) has been validated on several aerial images, with particular attention to large-scale aerial images acquired using mini-UAV systems.

  11. Making computers noble. An experiment in automatic analysis of medieval texts

    Directory of Open Access Journals (Sweden)

    Andrea Colli

    2016-02-01

    Full Text Available Computer analysis of texts and the creation of databases, hypertexts and digital editions are not the final frontier of research anymore; quite the contrary, for many years they have represented a significant contribution to medieval studies. The aim, therefore, is not to make computers able to grasp the meaning of human language and penetrate its secrets, but rather to improve their tools so that they become an even more efficient equipment employed in research activities. This paper is conceived as a technical report whose task is to verify whether an automatic identification of word associations within a selected group of medieval writings produces suggestions on the subject of the processed texts that can be used in a theoretical inquiry.
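
    A bare-bones version of such word-association mining counts co-occurrences within a context and ranks pairs by pointwise mutual information (PMI); the Latin snippet corpus below is a placeholder, not from the processed texts.

    ```python
    # Rank word pairs by PMI over within-document co-occurrences.
    import math
    from collections import Counter
    from itertools import combinations

    docs = ["anima intellectus forma corporis",
            "intellectus agens et anima rationalis",
            "forma et materia corporis"]
    pairs, unigrams = Counter(), Counter()
    for doc in docs:
        words = doc.split()
        unigrams.update(words)
        pairs.update(frozenset(p) for p in combinations(words, 2) if p[0] != p[1])

    n = sum(unigrams.values())
    def pmi(a, b):
        # log2( P(a,b) / (P(a) * P(b)) ), with counts normalized by n
        return math.log2(pairs[frozenset((a, b))] * n / (unigrams[a] * unigrams[b]))

    ranked = sorted(pairs, key=lambda p: -pmi(*tuple(p)))
    print([tuple(p) for p in ranked[:3]])   # strongest lexical associations
    ```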

  12. Risk analysis of Leksell Gamma Knife Model C with automatic positioning system

    International Nuclear Information System (INIS)

    Goetsch, Steven J.

    2002-01-01

    Purpose: This study was conducted to evaluate the decrease in risk from misadministration of the new Leksell Gamma Knife Model C with Automatic Positioning System compared with previous models. Methods and Materials: Elekta Instruments, A.B. of Stockholm has introduced a new computer-controlled Leksell Gamma Knife Model C which uses motor-driven trunnions to reposition the patient between isocenters (shots) without human intervention. Previous models required the operators to manually set coordinates from a printed list, permitting opportunities for coordinate transposition, incorrect helmet size, incorrect treatment times, missing shots, or repeated shots. Results: A risk analysis was conducted comparing craniotomy, which involves hospital admission, with outpatient Gamma Knife radiosurgery. A report of the Institute of Medicine of the National Academies dated November 29, 1999 estimated that medical errors kill between 44,000 and 98,000 people each year in the United States. Another report, from the National Nosocomial Infections Surveillance System, estimates that 2.1 million nosocomial infections occur annually in the United States in acute care hospitals alone, with 31 million total admissions. Conclusions: All medical procedures have attendant risks of morbidity and possibly mortality. Each patient should be counseled as to the risk of adverse effects as well as the likelihood of good results for alternative treatment strategies. This paper seeks to fill a gap in the existing medical literature, which has a paucity of data involving risk estimates for stereotactic radiosurgery

  13. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis

    Science.gov (United States)

    Liu, Chanjuan; van Netten, Jaap J.; van Baal, Jeff G.; Bus, Sicco A.; van der Heijden, Ferdi

    2015-02-01

    Early identification of diabetic foot complications and their precursors is essential in preventing their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost effective. Infrared thermography is a promising modality for such a system. The temperature differences between corresponding areas on contralateral feet are the clinically significant parameters. This asymmetric analysis is hindered by (1) foot segmentation errors, especially when the foot temperature and the ambient temperature are comparable, and by (2) different shapes and sizes between contralateral feet due to deformities or minor amputations. To circumvent the first problem, we used a color image and a thermal image acquired synchronously. Foot regions, detected in the color image, were rigidly registered to the thermal image. This resulted in 97.8%±1.1% sensitivity and 98.4%±0.5% specificity over 76 high-risk diabetic patients with manual annotation as a reference. Nonrigid landmark-based registration with B-splines solved the second problem. Corresponding points in the two feet could be found regardless of the shapes and sizes of the feet. With that, the temperature difference of the left and right feet could be obtained.

  14. Automatic analysis and characterization of the hummingbird wings motion using dense optical flow features

    International Nuclear Information System (INIS)

    Martínez, Fabio; Romero, Eduardo; Manzanera, Antoine

    2015-01-01

    A new method for the automatic analysis and characterization of recorded hummingbird wing motion is proposed. The method starts by computing a multiscale dense optical flow field, which is used to segment the wings, i.e., the pixels with larger velocities. Then, the kinematics and deformation of the wings are characterized as a temporal set of global and local measures: a global angular acceleration as a time function of each wing, and a local acceleration profile that approximates the dynamics of the different wing segments. Additionally, the variance of the apparent velocity orientation estimates those wing foci with larger deformation. Finally, a local measure of the orientation highlights those regions with maximal deformation. The approach was evaluated on a total of 91 flight cycles, captured using three different setups. The proposed measures follow the yaw-turn hummingbird flight dynamics, with a strong correlation of all computed paths, reporting a standard deviation of 0.31 rad/frame² and 1.9 (rad/frame)² for the global angular acceleration and the global wing deformation, respectively. (paper)
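
    The first step, a dense flow field thresholded by velocity magnitude to isolate the wings, can be approximated with OpenCV's Farneback optical flow; the video file name and the 95th-percentile threshold are illustrative assumptions.

    ```python
    # Dense optical flow per frame pair; fastest pixels approximate the wings.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("hummingbird.avi")    # placeholder file name
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        wings = mag > np.percentile(mag, 95)     # highest-velocity pixels
        print("wing pixels:", int(wings.sum()))
        prev_gray = gray
    ```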

  15. System for the automatic analysis of defects in X-ray imaging

    International Nuclear Information System (INIS)

    Favier, C.; Thomas, G.; Brebant, C.; Mogavero, R.

    1984-05-01

    A radiological device was developed to obtain directly digitized views. A set of algorithms has been developed and demonstrated for the automatic evaluation of welds. Some results concerning electronuclear fuel pin welds are presented [fr

  16. Information manager-2011-Vol 11

    African Journals Online (AJOL)

    Library _info_Sc_ 1

    The Information Manager Vol. ... extent while carrying out their duties on a daily basis. ... It concluded by admonishing librarians to brace up and keep pace ... Proper training should be given to .... used to information and communication technologies .... proficiency Skills .... analysis of the workplace and implementation of.

  17. Automatic Imitation

    Science.gov (United States)

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  18. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 1, Basic Concepts

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets. These CP-nets are shown to be a full-fledged language for the design, specification, simulation, validation and implementation of large software systems. The introductory first volume contains the formal definition of CP-nets and the mathematical theory behind their analysis methods. It gives a detailed presentation of many small examples and a brief overview of some industrial applications. The purpose of the book is to teach the reader how to construct CP-net models...

  19. Balance of the uranium market. Contribution to an economic analysis. Vol. 1

    International Nuclear Information System (INIS)

    Ehsani, J.

    1983-11-01

    The evolution of the world economic and energy situation over the last 40 years is first described, with emphasis on electricity production. The place of nuclear energy in the energy market is analyzed, taking socio-political factors into account. In the second part, the costs of electricity production from different sources (coal, nuclear power, fuel oil) are compared, and two models for world uranium supply and demand are developed. In the third part, the balance of the uranium market is examined. The analysis includes three steps: determination of the total uranium demand from 1983 to 2000, determination of the total uranium supply by production-cost bracket, and comparison of supply and demand for the future evolution under different hypotheses [fr

  20. The interlaboratory experiment IDA-72 on mass spectrometric isotope dilution analysis. Vol. 1

    International Nuclear Information System (INIS)

    Beyrich, W.; Drosselmeyer, E.

    1975-07-01

    Within the framework of the Safeguards Project of the Federal Republic of Germany at the Nuclear Research Center Karlsruhe an analytical intercomparison program was carried out in cooperation with 22 laboratories of 13 countries or international organizations. The main objective was the acquisition of basic data on the errors involved in the mass spectrometric isotope dilution analysis if it is applied to the determination of uranium and plutonium in diluted active feed solutions of reprocessing plants in routine operation. The results were evaluated by statistical methods mainly in regard to the calculation of the estimates of the variances for the different error components contributing to the total error of this analytical technique. Furthermore, the performance of two new methods for sample conditioning suggested by the International Atomic Energy Agency, Vienna, and the European Institute for Transuranium Elements (EURATOM), Karlsruhe, was successfully tested. The results of some investigations on the stability of diluted high active feed solutions and on comparison analysis by X-ray fluorescence spectrometry are also included. Data on the analytical efforts (manhours) invested in this study are reported as well as general experiences made in the organization and performance of an experiment on such an extended international level. (orig.) [de

  1. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. [comp.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
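
    To make the "neither symbolic nor numeric" point concrete, here is the classic forward-mode construction with dual numbers, a generic textbook sketch not tied to any entry in the bibliography.

    ```python
    # Forward-mode automatic differentiation via dual numbers (value, derivative).
    import math

    class Dual:
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der
        def __add__(self, other):
            o = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + o.val, self.der + o.der)
        __radd__ = __add__
        def __mul__(self, other):
            o = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val * o.val,
                        self.der * o.val + self.val * o.der)  # product rule
        __rmul__ = __mul__

    def sin(x):
        return Dual(math.sin(x.val), math.cos(x.val) * x.der)  # chain rule

    # d/dx [ x*sin(x) + 3x ] at x = 2: f'(2) = sin(2) + 2*cos(2) + 3
    x = Dual(2.0, 1.0)            # seed derivative dx/dx = 1
    f = x * sin(x) + 3 * x
    print(f.val, f.der)
    ```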

  2. Neutron flux calculations for criticality safety analysis using the narrow resonance approximations. Vol. 2

    Energy Technology Data Exchange (ETDEWEB)

    Hathout, A M [National Center for Nuclear Safety and Radiation Control, NC-NSRC, Atomic Energy Authority, Cairo (Egypt)

    1996-03-01

    The narrow resonance approximation is applicable for all low-energy resonances and the heaviest nuclides. It is of great importance in neutron calculations, since fertile isotopes do not undergo fission at resonance energies. The effect of overestimating the self-shielded, group-averaged cross-section data for a given resonance nuclide can be fairly serious. In the present work, a detailed study and derivation of the self-shielding problem are carried out using the Hansen-Roach library, which is used for criticality safety analysis. The intermediate neutron flux spectrum is analyzed using the narrow resonance approximation, and the resonance self-shielded values of various cross sections are determined. 4 figs., 3 tabs.
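
    For orientation, a standard textbook statement of the NR flux and the resulting flux-weighted (self-shielded) group cross section is given below; the notation, with sigma_0 as the background cross section per absorber atom, follows the usual Bondarenko convention rather than anything specific to this report.

    ```latex
    % NR approximation: the flux dips where the total cross section peaks,
    % and group data follow by flux weighting over the group [E_g, E_{g-1}].
    \[
      \phi_{\mathrm{NR}}(E) \;\propto\; \frac{1}{\bigl(\sigma_t(E) + \sigma_0\bigr)\,E},
      \qquad
      \sigma_{g,\mathrm{eff}}
      = \frac{\displaystyle\int_{E_g}^{E_{g-1}} \sigma(E)\,\phi_{\mathrm{NR}}(E)\,\mathrm{d}E}
             {\displaystyle\int_{E_g}^{E_{g-1}} \phi_{\mathrm{NR}}(E)\,\mathrm{d}E}
    \]
    ```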

  3. Automatic extraction of soft tissues from 3D MRI head images using model driven analysis

    International Nuclear Information System (INIS)

    Jiang, Hao; Yamamoto, Shinji; Imao, Masanao.

    1995-01-01

    This paper presents an automatic extraction system (called TOPS-3D: Top-Down Parallel Pattern Recognition System for 3D Images) for soft tissues from 3D MRI head images, using a model-driven analysis algorithm. Following the construction of the system TOPS we developed earlier, two concepts were considered in the design of TOPS-3D. One is a hierarchical reasoning structure that uses model information at the higher level; the other is a parallel image-processing structure used to extract plural candidate regions for a target entity. The new points of TOPS-3D are as follows. (1) TOPS-3D is a three-dimensional image analysis system including 3D model construction and 3D image-processing techniques. (2) A technique is proposed to increase the connectivity between knowledge processing at the higher level and image processing at the lower level. The technique is realized by applying the opening operation of mathematical morphology, in which a structural model function defined at the higher level by knowledge representation is used directly as the filter function of the opening operation at the lower image-processing level. The system TOPS-3D applied to 3D MRI head images consists of three levels. The first and second levels are the reasoning part, and the third level is the image-processing part. In the experiments, we applied 5 samples of 3D MRI head images of size 128 x 128 x 128 pixels to TOPS-3D to extract the regions of soft tissues such as the cerebrum, cerebellum and brain stem. The experimental results show that the system is robust to variation of the input data thanks to the model information, and that the position and shape of the soft tissues are extracted in correspondence with the anatomical structure. (author)
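
    The bridge between the two levels, a shape model used directly as the filter (interpreted here as the structuring element) of a morphological opening, can be illustrated with SciPy; the ellipsoidal element and its size are assumptions.

    ```python
    # A model-derived structuring element driving a 3D binary opening.
    import numpy as np
    from scipy import ndimage

    # "model function": an ellipsoid approximating the target organ's shape
    z, y, x = np.ogrid[-3:4, -5:6, -5:6]
    model_se = (z**2 / 9 + y**2 / 25 + x**2 / 25) <= 1.0

    volume = np.random.default_rng(0).random((32, 64, 64)) > 0.4  # toy 3D mask
    opened = ndimage.binary_opening(volume, structure=model_se)
    print("voxels kept:", int(opened.sum()), "of", int(volume.sum()))
    ```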

  4. Automatic Decision Support for Clinical Diagnostic Literature Using Link Analysis in a Weighted Keyword Network.

    Science.gov (United States)

    Li, Shuqing; Sun, Ying; Soergel, Dagobert

    2017-12-23

    We present a novel approach to recommending articles from the medical literature that support clinical diagnostic decision-making, giving detailed descriptions of the associated ideas and principles. The specific goal is to retrieve biomedical articles that help answer questions of a specified type about a particular case. Based on the filtered keywords, the MeSH (Medical Subject Headings) lexicon, and automatically extracted acronyms, the relationship between keywords and articles was built. The paper gives a detailed description of the process by which keywords were weighted and relevant articles identified, based on link analysis in a weighted keyword network. Important challenges identified in this study include the extraction of diagnosis-related keywords and the collection of valid sentences, based on keyword co-occurrence analysis and existing descriptions of symptoms. All data were taken from the medical articles provided in the TREC (Text Retrieval Conference) clinical decision support track 2015. Ten standard topics and one demonstration topic were tested. In each case, a maximum of five articles with the highest relevance were returned. The total user satisfaction of 3.98 was 33% higher than average. The results also suggested that the smaller the number of results, the higher the average satisfaction. However, a few shortcomings were also revealed, since medical literature recommendation for clinical diagnostic decision support is so complex a topic that it cannot be fully addressed through the semantic information carried solely by keywords in existing descriptions of symptoms. Nevertheless, the fact that these articles are actually relevant will no doubt inspire future research.

   5. A locally designed mobile laboratory for radiation analysis and monitoring in Qatar. Vol. 4

    International Nuclear Information System (INIS)

    Abou-Leila, H.; El-Samman, H.; Mahmoud, H.

    1996-01-01

    A description is given of a mobile laboratory for radiation analysis and monitoring, completely designed in Qatar and equipped at Qatar University. It consists of a van equipped with three scintillation detectors mounted on the front bumper. The detectors can monitor gamma radiation along the path of the laboratory over an angle range of 120 degrees. One Eberline radiation monitoring station is mounted on the roof. The laboratory is also equipped with several gamma and neutron survey meters in addition to some sampling equipment. All equipment is powered by solar panels. The characteristics and performance of the solar-power to stabilized-AC conversion are given. Data acquisition from the three scintillation detectors is performed by adding the outputs of the three detectors and storing the total as a function of time in a computer-based multi-channel analyzer (MCA) operated in the MCS mode. The acquisition can be switched easily to the PHA mode to analyze gamma spectra from any possible contamination source. The laboratory has been used in several environmental and possible-contamination missions. Some results obtained during some of these missions are given. 4 figs

   6. A locally designed mobile laboratory for radiation analysis and monitoring in Qatar. Vol. 4

    Energy Technology Data Exchange (ETDEWEB)

    Abou-Leila, H; El-Samman, H; Mahmoud, H [Physics Department, University of Qatar, Doha (Qatar)

    1996-03-01

    A description is given of a mobile laboratory for radiation analysis and monitoring, completely designed in Qatar and equipped at Qatar University. It consists of a van equipped with three scintillation detectors mounted on the front bumper. The detectors can monitor gamma radiation along the path of the laboratory over an angle range of 120 degrees. One Eberline radiation monitoring station is mounted on the roof. The laboratory is also equipped with several gamma and neutron survey meters in addition to some sampling equipment. All equipment is powered by solar panels. The characteristics and performance of the solar-power to stabilized-AC conversion are given. Data acquisition from the three scintillation detectors is performed by adding the outputs of the three detectors and storing the total as a function of time in a computer-based multi-channel analyzer (MCA) operated in the MCS mode. The acquisition can be switched easily to the PHA mode to analyze gamma spectra from any possible contamination source. The laboratory has been used in several environmental and possible-contamination missions. Some results obtained during some of these missions are given. 4 figs.

  7. Analytical tool for the periodic safety analysis of NPP according to the PSA guideline. Vol. 1

    International Nuclear Information System (INIS)

    Balfanz, H.P.; Boehme, E.; Musekamp, W.; Hussels, U.; Becker, G.; Behr, H.; Luettgert, H.

    1994-01-01

    The SAIS (Safety Analysis and Information System) programme system is based on an integrated data base, which consists of a plant-data part and a PSA-related data part. Using SAIS, analyses can be performed by special tools which are connected directly to the data base. Two main editors, RISA+ and DEDIT, are used for data base management. Access to the data base is via different types of pages, which are displayed on a computer screen. The pages are called data sheets. Sets of input and output data sheets were implemented, such as system or component data sheets, fault trees or event trees. All input information, models and results needed for updated results of a PSA (Living PSA) can be stored in the SAIS. The programme system contains the editor KVIEW, which guarantees consistency of the stored data, e.g. with respect to names and codes of components and events. The information contained in the data base is called up by a standardized user guide programme, called Page Editor. (Brunsbuettel as reference NPP). (orig./HP) [de

  8. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, the state space, state-space analysis of electric systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus and the procedure to draw it, frequency response, and the design of control systems.

  9. Automatization of the neutron activation analysis method in the nuclear analysis laboratory

    International Nuclear Information System (INIS)

    Gonzalez, N.R.; Rivero, D del C.; Gonzalez, M.A.; Larramendi, F.

    1993-01-01

    In the present paper, the work done to automate the neutron activation analysis technique with a neutron generator is described. An interface between an IBM-compatible microcomputer and the equipment used for this kind of measurement was developed, including the specialized software for the system.

  10. Quantitative analysis of the patellofemoral motion pattern using semi-automatic processing of 4D CT data.

    Science.gov (United States)

    Forsberg, Daniel; Lindblom, Maria; Quick, Petter; Gauffin, Håkan

    2016-09-01

    To present a semi-automatic method with minimal user interaction for quantitative analysis of the patellofemoral motion pattern. 4D CT data capturing the patellofemoral motion pattern of a continuous flexion and extension were collected for five patients prone to patellar luxation both pre- and post-surgically. For the proposed method, an observer would place landmarks in a single 3D volume, which then are automatically propagated to the other volumes in a time sequence. From the landmarks in each volume, the measures patellar displacement, patellar tilt and angle between femur and tibia were computed. Evaluation of the observer variability showed the proposed semi-automatic method to be favorable over a fully manual counterpart, with an observer variability of approximately 1.5° for the angle between femur and tibia, 1.5 mm for the patellar displacement, and 4.0°-5.0° for the patellar tilt. The proposed method showed that surgery reduced the patellar displacement and tilt at maximum extension with approximately 10-15 mm and 15°-20° for three patients but with less evident differences for two of the patients. A semi-automatic method suitable for quantification of the patellofemoral motion pattern as captured by 4D CT data has been presented. Its observer variability is on par with that of other methods but with the distinct advantage to support continuous motions during the image acquisition.

  11. The Swiss-Army-Knife Approach to the Nearly Automatic Analysis for Microearthquake Sequences.

    Science.gov (United States)

    Kraft, T.; Simon, V.; Tormann, T.; Diehl, T.; Herrmann, M.

    2017-12-01

    Many Swiss earthquake sequences have been studied using relative location techniques, which often made it possible to constrain the active fault planes and shed light on the tectonic processes that drove the seismicity. Yet, in the majority of cases the number of located earthquakes was too small to infer the details of the space-time evolution of the sequences, or their statistical properties. Therefore, it has mostly been impossible to resolve clear patterns in the seismicity of individual sequences, which are needed to improve our understanding of the mechanisms behind them. Here we present a nearly automatic workflow that combines well-established seismological analysis techniques and significantly improves the completeness of detected and located earthquakes of a sequence. We start from the manually timed routine catalog of the Swiss Seismological Service (SED), which contains the larger events of a sequence. From these well-analyzed earthquakes we dynamically assemble a template set and perform a matched-filter analysis on the station with the best SNR for the sequence and a recording history of at least 10-15 years, our typical analysis period. This usually allows us to detect events several orders of magnitude below the SED catalog detection threshold. The waveform similarity of the events is then further exploited to derive accurate and consistent magnitudes. The enhanced catalog is then analyzed statistically to derive high-resolution timelines of the a- and b-values and consequently the occurrence probability of larger events. Many of the detected events are strong enough to be located using double differences. No further manual interaction is needed; we simply time-shift the arrival-time pattern of the detecting template to the associated detection. Waveform similarity assures a good approximation of the expected arrival times, which we use to calculate event-pair arrival-time differences by cross-correlation. After a SNR and cycle-skipping quality
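
    The matched-filter step is conceptually simple: slide each template over the continuous trace and flag windows whose normalized cross-correlation exceeds a threshold. The sketch below uses synthetic data; a real workflow would operate on, e.g., ObsPy streams.

    ```python
    # Matched-filter detection by sliding normalized cross-correlation.
    import numpy as np

    def matched_filter(trace, template, threshold=0.6):
        n = len(template)
        t = (template - template.mean()) / template.std()
        cc = np.empty(len(trace) - n + 1)
        for i in range(len(cc)):
            w = trace[i:i + n]
            cc[i] = np.dot(t, (w - w.mean()) / w.std()) / n  # in [-1, 1]
        return np.where(cc > threshold)[0], cc

    rng = np.random.default_rng(0)
    tmpl = np.sin(np.linspace(0, 6 * np.pi, 120)) * np.hanning(120)
    trace = rng.normal(0, 0.3, 5000)
    trace[1000:1120] += tmpl             # buried repeat of the template
    trace[3200:3320] += 0.7 * tmpl       # weaker repeat, still detectable
    hits, cc = matched_filter(trace, tmpl)
    print("detection indices:", hits)    # clusters near samples 1000 and 3200
    ```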

  12. Regulatory analysis for the resolution of Generic Issue 125.II.7 ''Reevaluate Provision to Automatically Isolate Feedwater from Steam Generator During a Line Break''

    International Nuclear Information System (INIS)

    Basdekas, D.L.

    1988-09-01

    Generic Issue 125.II.7 addresses the concern related to the automatic isolation of auxiliary feedwater (AFW) to a steam generator with a broken steam or feedwater line. This regulatory analysis provides a quantitative assessment of the costs and benefits associated with the removal of the AFW automatic isolation and concludes that no new regulatory requirements are warranted. 21 refs., 7 tabs

  13. Automatic detection of outlines. Application to the quantitative analysis of renal scintiscanning pictures

    International Nuclear Information System (INIS)

    Morcos-Ghrab, Nadia.

    1979-01-01

    The purpose of the work described is to finalize a method for automatically extracting the significant outlines from a renal scintiscanning picture. The algorithms must be simple and high-performing, and their routine execution on a mini-computer must be fast enough to compete effectively with human performance. However, the method that has been developed is general enough to be adapted, with slight modifications, to other types of pictures. The first chapter is a brief introduction to the principle of scintiscanning, the equipment used and the type of picture obtained from it. In the second chapter the various approaches used for pattern recognition and scene analysis are briefly described with the help of examples. The third chapter deals with the pre-processing techniques (particularly the machine operators) used for segmenting the pictures. Chapter four presents techniques which segment the picture by parallel processing of all its points. Chapter five describes the sequential search techniques for outline elements, drawing inspiration from the methods used in artificial intelligence for solving the optimization problem. The sixth chapter shows the difficulties encountered in extracting the renal outlines and the planning stages adopted to overcome these difficulties. Chapter seven describes in detail the two search methods employed for generating the plan. In chapter eight, the methods used for extending the areas obtained from the plan and for refining the outlines that bound them are dealt with. Chapter nine is a short presentation of the organization of the programs and of their data structure. Finally, examples of results are given in chapter ten [fr

  14. An analysis of line-drawings based upon automatically inferred grammar and its application to chest x-ray images

    International Nuclear Information System (INIS)

    Nakayama, Akira; Yoshida, Yuuji; Fukumura, Teruo

    1984-01-01

    Grammar inference can be used as an image-structure analysis technique. This technique involves a few problems when applied to naturally obtained images, as no practical grammatical technique for two-dimensional images has been established. The authors developed a technique that solves the above problems, with the main purpose of automated structure analysis of naturally obtained images. The first half of this paper describes the automatic inference of line-drawing generation grammars and line-drawing analysis based on that inference. The second half of the paper reports on an actual analysis. The proposed technique extracts object line drawings from line drawings containing noise. The technique was evaluated for its effectiveness on the example of extracting rib center lines from thin-line chest X-ray images of practical scale and complexity. In this example, the total number of characteristic points (ends, branch points and intersections) composing the line drawings was 377 per image, and the total number of line segments composing the line drawings was 566 on average per sheet. The extraction ratio was 86.6%, which seems reasonable considering the complexity of the input line drawings. Further, the result was compared with the rib center lines identified by the automatic screening system AISCR-V3, for comparison with the conventional processing technique, and it was satisfactory considering the versatility of this method. (Wakatsuki, Y.)

  15. Development of user interface to support automatic program generation of nuclear power plant analysis by module-based simulation system

    International Nuclear Information System (INIS)

    Yoshikawa, Hidekazu; Mizutani, Naoki; Nakaya, Ken-ichiro; Wakabayashi, Jiro

    1988-01-01

    The Module-based Simulation System (MSS) has been developed to realize a new software work environment enabling versatile dynamic simulation of a complex nuclear power system in a flexible way. The MSS makes full use of modern software technology to replace a large fraction of human software work in complex, large-scale program development with computer automation. The fundamental methods utilized in MSS, and a developmental study on the human interface system SESS-1, which helps users generate integrated simulation programs automatically, are summarized as follows: (1) To enhance the usability and 'communality' of program resources, the basic mathematical models in common usage in nuclear power plant analysis are programmed as 'modules' and stored in a module library. The information on the usage of individual modules is stored in a module database, with easy registration, update and retrieval through the interactive management system. (2) Target simulation programs and the input/output files are automatically generated with simple block-wise languages by a precompiler system for module integration purposes. (3) The working time for program development and analysis in an example study of an LMFBR plant thermal-hydraulic transient analysis was demonstrated to be remarkably shortened with the introduction of the interface system SESS-1, developed as an automatic program generation environment. (author)

  16. Comparative Analysis of Music Recordings from Western and Non-Western traditions by Automatic Tonal Feature Extraction

    Directory of Open Access Journals (Sweden)

    Emilia Gómez

    2008-09-01

    Full Text Available The automatic analysis of large musical corpora by means of computational models overcomes some limitations of manual analysis, and the unavailability of scores for most existing music makes it necessary to work with audio recordings. Until now, research in this area has focused on music from the Western tradition. Nevertheless, we might ask whether the available methods are suitable for analyzing music from other cultures. We present an empirical approach to the comparative analysis of audio recordings, focusing on tonal features and data mining techniques. Tonal features are related to the pitch class distribution, pitch range and employed scale, gamut and tuning system. We provide our initial but promising results obtained when trying to automatically distinguish music from Western and non-Western traditions; we analyze which descriptors are most relevant and study their distribution over 1500 pieces from different traditions and styles. As a result, some feature distributions differ for Western and non-Western music, and the obtained classification accuracy is higher than 80% for different classification algorithms and an independent test set. These results show that automatic description of audio signals together with data mining techniques provides a means to characterize huge music collections from different traditions and complement manual musicological analyses.
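
    The pitch class distribution underlying these tonal descriptors is easy to illustrate. Below is a minimal numpy sketch of a 12-bin pitch-class profile built from magnitude-weighted spectral votes; the frame size, frequency range and A440 reference tuning are our assumptions, not the paper's descriptor settings.

    ```python
    import numpy as np

    def pitch_class_profile(audio, fs, n_fft=4096, hop=2048, fref=440.0):
        """12-bin pitch class distribution from spectral peaks.

        Each retained spectral bin votes, weighted by magnitude, for the
        pitch class of its frequency relative to the reference tuning fref.
        """
        pcp = np.zeros(12)
        window = np.hanning(n_fft)
        freqs = np.fft.rfftfreq(n_fft, 1.0 / fs)
        keep = (freqs > 60.0) & (freqs < 5000.0)      # melodic range only
        # pitch class of each retained bin, in semitones relative to fref
        classes = np.round(12 * np.log2(freqs[keep] / fref)).astype(int) % 12
        for start in range(0, len(audio) - n_fft, hop):
            spec = np.abs(np.fft.rfft(audio[start:start + n_fft] * window))
            np.add.at(pcp, classes, spec[keep])       # magnitude-weighted votes
        return pcp / max(pcp.sum(), 1e-12)            # normalized distribution
    ```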

  17. ATMAD: robust image analysis for Automatic Tissue MicroArray De-arraying.

    Science.gov (United States)

    Nguyen, Hoai Nam; Paveau, Vincent; Cauchois, Cyril; Kervrann, Charles

    2018-04-19

    Over the last two decades, an innovative technology called Tissue Microarray (TMA), which combines multi-tissue and DNA microarray concepts, has been widely used in the field of histology. It consists of a collection of several (up to 1000 or more) tissue samples that are assembled onto a single support - typically a glass slide - according to a design grid (array) layout, in order to allow multiplex analysis by treating numerous samples under identical and standardized conditions. However, during the TMA manufacturing process, the sample positions can be highly distorted from the design grid due to imprecision when assembling tissue samples and deformation of the embedding waxes. Consequently, these distortions may lead to severe errors in (histological) assay results when the sample identities are mismatched between the design and its manufactured output. The development of a robust method for de-arraying TMA, which localizes and matches TMA samples with their design grid, is therefore crucial to overcome the bottleneck of this prominent technology. In this paper, we propose an Automatic, fast and robust TMA De-arraying (ATMAD) approach dedicated to images acquired with brightfield and fluorescence microscopes (or scanners). First, tissue samples are localized in the large image by applying a locally adaptive thresholding to the isotropic wavelet transform of the input TMA image. To reduce false detections, a parametric shape model is considered for segmenting ellipse-shaped objects at each detected position. Segmented objects that do not meet the size and roundness criteria are discarded from the list of tissue samples before matching with the design grid. Sample matching is performed by estimating the TMA grid deformation under the thin-plate model. Finally, thanks to the estimated deformation, the true tissue samples that were preliminarily rejected in the early image processing step are recognized by running a second segmentation step.
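
    The first detection step lends itself to a compact illustration. A minimal scipy sketch of locally adaptive thresholding, under stated assumptions: ATMAD applies it to the isotropic wavelet transform of the TMA image, whereas for brevity this sketch operates on the raw image, and the window size and factor k are our choices.

    ```python
    import numpy as np
    from scipy import ndimage

    def locally_adaptive_threshold(image, win=101, k=1.0):
        """Binary mask of bright sample candidates via a local mean + k*std rule."""
        img = image.astype(float)
        mean = ndimage.uniform_filter(img, size=win)        # local mean
        sq = ndimage.uniform_filter(img ** 2, size=win)     # local E[x^2]
        std = np.sqrt(np.maximum(sq - mean ** 2, 0.0))      # local std deviation
        return img > mean + k * std
    ```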

  18. Fast automatic analysis of antenatal dexamethasone on micro-seizure activity in the EEG

    International Nuclear Information System (INIS)

    Rastin, S.J.; Unsworth, C.P.; Bennet, L.

    2010-01-01

    Full text: In this work we develop an automatic scheme for studying the effect of antenatal dexamethasone on EEG activity. An FFT (Fast Fourier Transform) based detector was designed and applied to EEG recordings obtained from two groups of fetal sheep. Both groups received two injections 24 h apart, but the administered substance differed between the groups (dexamethasone and saline). The detector was used to automatically identify and classify micro-seizures occurring in the frequency bands corresponding to the EEG transients known as slow waves (2.5-14 Hz). For each second of the recordings the spectrum was computed, and a rise of the energy in each predefined frequency band was counted when the energy level exceeded a predefined corresponding threshold level (the threshold level was obtained from the long-term average of the spectral points in each band). Our results demonstrate that it was possible to automatically count the micro-seizures for the three different bands in a time-effective manner. The number of transients was found not to depend strongly on the nature of the injected substance, which was consistent with the results obtained manually by an EEG expert. In conclusion, the automatic detection scheme presented here would allow rapid micro-seizure event identification in hours of highly sampled EEG data, thus providing a valuable time-saving device.
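
    A minimal numpy sketch of the counting scheme described above, assuming one-second FFT windows and a threshold set as a multiple k of each band's long-term mean energy; k and the band edges are illustrative, not the study's values.

    ```python
    import numpy as np

    def count_microseizures(eeg, fs, bands, k=3.0):
        """Count per-band spectral-energy threshold crossings, one FFT per second."""
        n = int(fs)                                       # one-second windows
        nwin = len(eeg) // n
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        energy = np.empty((nwin, len(bands)))
        for w in range(nwin):
            spec = np.abs(np.fft.rfft(eeg[w * n:(w + 1) * n])) ** 2
            for b, (lo, hi) in enumerate(bands):
                energy[w, b] = spec[(freqs >= lo) & (freqs < hi)].sum()
        thresh = k * energy.mean(axis=0)                  # long-term average per band
        return (energy > thresh).sum(axis=0)              # event count per band
    ```

    For instance, count_microseizures(eeg, fs=256, bands=[(2.5, 14.0)]) would count slow-wave-band events at an assumed 256 Hz sampling rate.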

  19. Analysis of individual classification of lameness using automatic measurement of back posture in dairy cattle

    NARCIS (Netherlands)

    Viazzi, S.; Schlageter Tello, A.A.; Hertem, van T.; Romanini, C.E.B.; Pluk, A.; Halachmi, I.; Lokhorst, C.; Berckmans, D.

    2013-01-01

    Currently, diagnosis of lameness at an early stage in dairy cows relies on visual observation by the farmer, which is time consuming and often omitted. Many studies have tried to develop automatic cow lameness detection systems. However, those studies apply thresholds to the whole population to

  20. Automatic and objective oral cancer diagnosis by Raman spectroscopic detection of keratin with multivariate curve resolution analysis

    Science.gov (United States)

    Chen, Po-Hsiung; Shimada, Rintaro; Yabumoto, Sohshi; Okajima, Hajime; Ando, Masahiro; Chang, Chiou-Tzu; Lee, Li-Tzu; Wong, Yong-Kie; Chiou, Arthur; Hamaguchi, Hiro-O.

    2016-01-01

    We have developed an automatic and objective method for detecting human oral squamous cell carcinoma (OSCC) tissues with Raman microspectroscopy. We measure 196 independent Raman spectra from 196 different points of one oral tissue sample and globally analyze these spectra using Multivariate Curve Resolution (MCR) analysis. Discrimination of OSCC tissues is made automatically and objectively by spectral matching comparison of the MCR-decomposed Raman spectra against the standard Raman spectrum of keratin, a well-established molecular marker of OSCC. We use a total of 24 tissue samples: 10 OSCC and 10 normal tissues from the same 10 patients, and 3 OSCC and 1 normal tissue from different patients. Following the newly developed protocol presented here, we have been able to detect OSCC tissues with 77 to 92% sensitivity (depending on how positivity is defined) and 100% specificity. The present approach lends itself to a reliable clinical diagnosis of OSCC substantiated by the "molecular fingerprint" of keratin.
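
    The spectral matching comparison can be as simple as a normalized correlation between each MCR-decomposed spectrum and the keratin reference. A minimal sketch under that assumption; the paper's exact matching criterion and cutoff are not given in this abstract, so the 0.9 threshold below is hypothetical.

    ```python
    import numpy as np

    def spectral_match(component, reference):
        """Cosine similarity of two mean-centered spectra on the same axis."""
        c = component - component.mean()
        r = reference - reference.mean()
        return float(np.dot(c, r) / (np.linalg.norm(c) * np.linalg.norm(r)))

    # e.g. flag a tissue as OSCC-positive if any MCR component matches the
    # keratin reference above a cutoff such as 0.9 (hypothetical value):
    # positive = any(spectral_match(s, keratin_ref) > 0.9 for s in mcr_spectra)
    ```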

  1. Analysis and synthesis of a system for optimal automatic regulation of the process of mechanical cutting by a combine

    Energy Technology Data Exchange (ETDEWEB)

    Pop, E.; Coroescu, T.; Poanta, A.; Pop, M.

    1978-01-01

    An uncontrolled dynamic operating regime of a combine has several negative effects. Uncontrolled changes in productivity and cutting rate reduce total productivity; the cutters of the cutting mechanism wear out prematurely; the quality of the coal decreases; and complications with combine control reduce productivity further. The motor is exposed to maximum loads, its service life decreases, and electricity is consumed inefficiently. Optimal automatic regulation of the cutting process was studied by model-based analysis on digital and analog computers. The method uses an electronic automatic device with integrated circuits of domestic production (A-741, A-723), which monitors and regulates the current parameters of the driving motor. The device includes, primarily, an information element of the Hall TH transducer type; the regulating elements are an electronic relay, an electronic power distributor, etc.

  2. Automatic classification of retinal three-dimensional optical coherence tomography images using principal component analysis network with composite kernels.

    Science.gov (United States)

    Fang, Leyuan; Wang, Chong; Li, Shutao; Yan, Jun; Chen, Xiangdong; Rabbani, Hossein

    2017-11-01

    We present an automatic method, termed the principal component analysis network with composite kernel (PCANet-CK), for the classification of three-dimensional (3-D) retinal optical coherence tomography (OCT) images. Specifically, the proposed PCANet-CK method first utilizes the PCANet to automatically learn features from each B-scan of the 3-D retinal OCT images. Then, multiple kernels are separately applied to a set of very important features of the B-scans, and these kernels are fused together, which can jointly exploit the correlations among features of the 3-D OCT images. Finally, the fused (composite) kernel is incorporated into an extreme learning machine for OCT image classification. We tested our proposed algorithm on two real 3-D spectral domain OCT (SD-OCT) datasets (of normal subjects and subjects with macular edema and age-related macular degeneration), which demonstrated its effectiveness.

  3. Automatic teeth axes calculation for well-aligned teeth using cost profile analysis along teeth center arch.

    Science.gov (United States)

    Kim, Gyehyun; Lee, Jeongjin; Seo, Jinwook; Lee, Wooshik; Shin, Yeong-Gil; Kim, Bohyoung

    2012-04-01

    In dental implantology and virtual dental surgery planning using computed tomography (CT) images, the examination of the axes of neighboring and/or biting teeth is important for improving the performance of the masticatory system as well as the aesthetic result. However, due to its high connectivity to neighboring teeth and jawbones, a tooth and its axis are very elusive to identify automatically in dental CT images. This paper presents a novel method of automatically calculating individual teeth axes. The planes separating the individual teeth are automatically calculated using cost profile analysis along the teeth center arch. In this calculation, a novel plane cost function, which considers the intensity and the gradient, is proposed to favor teeth separation planes crossing the teeth interstice and to suppress inappropriately detected separation planes crossing the soft pulp. The soft pulp and dentine of each individually separated tooth are then segmented by a fast marching method with two newly proposed speed functions that consider their specific anatomical characteristics. The axis of each tooth is finally calculated using principal component analysis on the segmented soft pulp and dentine. In experiments using 20 clinical datasets, the average angle and minimum distance differences between the teeth axes manually specified by two dentists and those automatically calculated by the proposed method were 1.94° ± 0.61° and 1.13 ± 0.56 mm, respectively. The proposed method identified the individual teeth axes accurately, demonstrating that it can give dentists substantial assistance during dental surgery such as dental implant placement and orthognathic surgery.
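
    The final PCA step reduces to an eigendecomposition of the segmented point cloud's covariance matrix. A minimal numpy sketch; the function name and conventions are ours, not the paper's.

    ```python
    import numpy as np

    def tooth_axis(voxels):
        """Principal axis of a segmented tooth (soft pulp + dentine).

        voxels : (N, 3) array of voxel coordinates from the segmentation.
        Returns the unit vector of the first principal component, i.e. the
        direction of greatest spatial extent, taken here as the tooth axis.
        """
        X = voxels - voxels.mean(axis=0)            # center the point cloud
        cov = X.T @ X / len(X)                      # 3x3 covariance matrix
        eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
        return eigvecs[:, -1]                       # eigenvector of largest eigenvalue
    ```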

  4. Automatic stress-relieving music recommendation system based on photoplethysmography-derived heart rate variability analysis.

    Science.gov (United States)

    Shin, Il-Hyung; Cha, Jaepyeong; Cheon, Gyeong Woo; Lee, Choonghee; Lee, Seung Yup; Yoon, Hyung-Jin; Kim, Hee Chan

    2014-01-01

    This paper presents an automatic stress-relieving music recommendation system (ASMRS) for individual music listeners. The ASMRS uses a portable, wireless photoplethysmography module with a finger-type sensor, and a program that translates heartbeat signals from the sensor to the stress index. The sympathovagal balance index (SVI) was calculated from heart rate variability to assess the user's stress levels while listening to music. Twenty-two healthy volunteers participated in the experiment. The results have shown that the participants' SVI values are highly correlated with their prespecified music preferences. The sensitivity and specificity of the favorable music classification also improved as the number of music repetitions increased to 20 times. Based on the SVI values, the system automatically recommends favorable music lists to relieve stress for individuals.
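
    The sympathovagal balance index is conventionally derived from the ratio of low-frequency to high-frequency power in heart rate variability; the abstract does not spell out the exact formula, so the sketch below assumes the standard LF/HF definition and band edges.

    ```python
    import numpy as np
    from scipy.signal import welch

    def sympathovagal_index(rr_ms, fs_interp=4.0):
        """LF/HF power ratio of heart rate variability, a common SVI definition.

        rr_ms : RR intervals in milliseconds. The series is resampled to a
        uniform grid before Welch's PSD estimate; band edges follow the usual
        HRV conventions (LF 0.04-0.15 Hz, HF 0.15-0.4 Hz).
        """
        t = np.cumsum(rr_ms) / 1000.0                   # beat times in seconds
        grid = np.arange(t[0], t[-1], 1.0 / fs_interp)
        rr_u = np.interp(grid, t, rr_ms)                # uniformly resampled RR
        f, pxx = welch(rr_u - rr_u.mean(), fs=fs_interp, nperseg=256)
        df = f[1] - f[0]
        lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df   # low-frequency power
        hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df   # high-frequency power
        return lf / hf
    ```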

  5. A novel automatic quantification method for high-content screening analysis of DNA double strand-break response.

    Science.gov (United States)

    Feng, Jingwen; Lin, Jie; Zhang, Pengquan; Yang, Songnan; Sa, Yu; Feng, Yuanming

    2017-08-29

    High-content screening is commonly used in studies of the DNA damage response. The double-strand break (DSB) is one of the most harmful types of DNA damage lesions. The conventional method used to quantify DSBs is γH2AX foci counting, which requires manual adjustment and preset parameters and is usually regarded as imprecise, time-consuming, poorly reproducible, and inaccurate. Therefore, a robust automatic alternative method is highly desired. In this manuscript, we present a new method for quantifying DSBs which involves automatic image cropping, automatic foci-segmentation and fluorescent intensity measurement. Furthermore, an additional function was added for standardizing the measurement of DSB response inhibition based on co-localization analysis. We tested the method with a well-known inhibitor of DSB response. The new method requires only one preset parameter, which effectively minimizes operator-dependent variations. Compared with conventional methods, the new method detected a higher percentage difference of foci formation between different cells, which can improve measurement accuracy. The effects of the inhibitor on DSB response were successfully quantified with the new method (p = 0.000). The advantages of this method in terms of reliability, automation and simplicity show its potential in quantitative fluorescence imaging studies and high-content screening for compounds and factors involved in DSB response.

  6. [Automated analysis of bacterial preparations manufactured on automatic heat fixation and staining equipment].

    Science.gov (United States)

    2012-01-01

    Heat fixation of preparations was performed in a fixation bath designed by EMKO (Russia). The programmable "Emkosteiner" (EMKO, Russia) was used for trial staining. The Micko-GRAM-NITsF reagent set was applied for Gram staining. It was demonstrated that automatic smear fixation equipment and programmable staining ensure high-quality imaging (1% chromaticity variation), good enough for standardization of Gram staining of microbial preparations.

  7. A novel automatic film changer for high-speed analysis of nuclear emulsions

    International Nuclear Information System (INIS)

    Borer, K.; Damet, J.; Hess, M.; Kreslo, I.; Moser, U.; Pretzl, K.; Savvinov, N.; Schuetz, H.-U.; Waelchli, T.; Weber, M.

    2006-01-01

    This paper describes the recent development of a novel automatic computer-controlled manipulator for emulsion sheet placement and removal at the microscope object table (also called the stage). The manipulator is designed for mass scanning of emulsions for the OPERA neutrino oscillation experiment and provides an emulsion changing time shorter than 30 s with an emulsion sheet positioning accuracy as good as 20 μm RMS

  8. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    OpenAIRE

    Shang-Liang Chen; Yin-Ting Cheng; Chin-Fa Su

    2015-01-01

    Recently, intelligent systems technology has become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal-condition warnings and the improvement of cutting efficiency. During processing, the motion of the spindle unit determines the most frequent and important operations, such as automatic tool changing. The vibration detection system includes the development of hardware and software, such as ...

  9. Vibration analysis on automatic take-up device of belt conveyor

    Science.gov (United States)

    Qin, Tailong; Wei, Jin

    2008-10-01

    After introducing the application conditions of belt conveyors in the modern mining industry, the paper shows that, during the dynamic course of starting, braking or loading, a conveyor produces a moving tension and an elastic wave. The factors that cause the automatic take-up device of a belt conveyor to vibrate are then analyzed: the structure of the take-up device and the elastic wave. Finally, the paper proposes measures to reduce vibration and carries out modeling and simulation of the tension buffer device.

  10. Automatic segmentation in three-dimensional analysis of fibrovascular pigmentepithelial detachment using high-definition optical coherence tomography.

    Science.gov (United States)

    Ahlers, C; Simader, C; Geitzenauer, W; Stock, G; Stetson, P; Dastmalchi, S; Schmidt-Erfurth, U

    2008-02-01

    A limited number of scans compromises the ability of conventional optical coherence tomography (OCT) to track chorioretinal disease in its full extension. Failures in edge-detection algorithms falsify the results of retinal mapping even further. High-definition OCT (HD-OCT) is based on raster scanning and was used to visualise the localisation and volume of intra- and sub-pigment-epithelial (RPE) changes in fibrovascular pigment epithelial detachments (fPED). Two different scanning patterns were evaluated. 22 eyes with fPED were imaged using a frequency-domain, high-speed prototype of the Cirrus HD-OCT. The axial resolution was 6 μm, and the scanning speed was 25,000 A-scans/s. Two different scanning patterns covering an area of 6 × 6 mm in the macular retina were compared. Three-dimensional topographic reconstructions and volume calculations were performed using MATLAB-based automatic segmentation software. Detailed information about the layer-specific distribution of fluid accumulation and volumetric measurements can be obtained for retinal and sub-RPE volumes. Both raster scans show a high correlation (≥0.89) of the measured values, that is, PED volume/area, retinal volume and mean retinal thickness. Quality control of the automatic segmentation revealed reasonable results in over 90% of the examinations. Automatic segmentation allows detailed quantitative and topographic analysis of the RPE and the overlying retina. In fPED, the 128 × 512 scanning pattern shows mild advantages when compared with the 256 × 256 scan. Together with the ability for automatic segmentation, HD-OCT clearly improves the clinical monitoring of chorioretinal disease by adding relevant new parameters. HD-OCT is likely capable of enhancing the understanding of pathophysiology and the benefits of treatment for current anti-CNV strategies in future.

  11. Automatic Condition Monitoring of Industrial Rolling-Element Bearings Using Motor’s Vibration and Current Analysis

    DEFF Research Database (Denmark)

    Yang, Zhenyu

    2015-01-01

    An automatic condition monitoring system for a class of industrial rolling-element bearings is developed based on vibration as well as stator current analysis. The considered fault scenarios include a single-point defect, multiple-point defects, and a type of distributed defect. Technical issues such as characteristic frequencies, sideband effects, time-averaging of spectra, and the selection of fault indices and thresholds are also discussed. The monitoring is extensively studied under diverse operating conditions: different sensor locations, motor speeds, loading conditions, and data samples from different time segments. The experimental results showed the powerful capability of vibration analysis in bearing point-defect fault diagnosis, and a huge potential for using some simple methods for successful diagnosis of industrial bearing systems.

  12. Elemental analysis of soil and plant samples at El-Manzala lake neutron activation analysis technique. Vol. 4

    Energy Technology Data Exchange (ETDEWEB)

    Eissa, E A; Rofail, N B; Hassan, A M [Reactor and Neutron Physics Department, Nuclear Research Center, Atomic Energy Authority, Cairo (Egypt); Abd El-Haleem, A S [Radiactive Environment Pollution Department, Hot Laboratories Center, Nuclear Research Center, Atomic Energy Authority, Cairo (Egypt); El-Abbady, W H [Phsics Department, Faculty of Science , Al-Azhar University, (Egypt)

    1996-03-01

    Soil and plant samples were collected from two locations, Bahr El-Baker and Bahr Kados, at Lake Manzala, where high pollution is expected. The samples were specially treated and prepared for investigation by thermal neutron activation analysis (NAA). The irradiation facilities of the first Egyptian research reactor (ET-RR-1) and the hyper-pure germanium (HPGe) detection system were used for the analysis. Among the 34 identified elements, Fe, Co, As, Cd, Te, La, Sm, Rb, Hg, Th and U are of special significance because of their toxic, deleterious impact on living organisms. This work is part of a research project concerning pollution studies on the river Nile and some lakes of Egypt. The data obtained in the present work stand as a basic reference record for any future follow-up of contamination levels.

  13. Elemental analysis of soil and plant samples at El-Manzala lake neutron activation analysis technique. Vol. 4

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; Abd El-Haleem, A.S.; El-Abbady, W.H.

    1996-01-01

    Soil and plant samples were collected from two locations, Bahr El-Baker and Bahr Kados, at Lake Manzala, where high pollution is expected. The samples were specially treated and prepared for investigation by thermal neutron activation analysis (NAA). The irradiation facilities of the first Egyptian research reactor (ET-RR-1) and the hyper-pure germanium (HPGe) detection system were used for the analysis. Among the 34 identified elements, Fe, Co, As, Cd, Te, La, Sm, Rb, Hg, Th and U are of special significance because of their toxic, deleterious impact on living organisms. This work is part of a research project concerning pollution studies on the river Nile and some lakes of Egypt. The data obtained in the present work stand as a basic reference record for any future follow-up of contamination levels.

  14. AUTOCASK (AUTOmatic Generation of 3-D CASK models). A microcomputer based system for shipping cask design review analysis

    International Nuclear Information System (INIS)

    Gerhard, M.A.; Sommer, S.C.

    1995-04-01

    AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests

  15. AUTOCASK (AUTOmatic Generation of 3-D CASK models). A microcomputer based system for shipping cask design review analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gerhard, M.A.; Sommer, S.C. [Lawrence Livermore National Lab., CA (United States)

    1995-04-01

    AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.

  16. Texture analysis of automatic graph cuts segmentations for detection of lung cancer recurrence after stereotactic radiotherapy

    Science.gov (United States)

    Mattonen, Sarah A.; Palma, David A.; Haasbeek, Cornelis J. A.; Senan, Suresh; Ward, Aaron D.

    2015-03-01

    Stereotactic ablative radiotherapy (SABR) is a treatment for early-stage lung cancer with local control rates comparable to surgery. After SABR, benign radiation induced lung injury (RILI) results in tumour-mimicking changes on computed tomography (CT) imaging. Distinguishing recurrence from RILI is a critical clinical decision determining the need for potentially life-saving salvage therapies whose high risks in this population dictate their use only for true recurrences. Current approaches do not reliably detect recurrence within a year post-SABR. We measured the detection accuracy of texture features within automatically determined regions of interest, with the only operator input being the single line segment measuring tumour diameter, normally taken during the clinical workflow. Our leave-one-out cross validation on images taken 2-5 months post-SABR showed robustness of the entropy measure, with classification error of 26% and area under the receiver operating characteristic curve (AUC) of 0.77 using automatic segmentation; the results using manual segmentation were 24% and 0.75, respectively. AUCs for this feature increased to 0.82 and 0.93 at 8-14 months and 14-20 months post SABR, respectively, suggesting even better performance nearer to the date of clinical diagnosis of recurrence; thus this system could also be used to support and reinforce the physician's decision at that time. Based on our ongoing validation of this automatic approach on a larger sample, we aim to develop a computer-aided diagnosis system which will support the physician's decision to apply timely salvage therapies and prevent patients with RILI from undergoing invasive and risky procedures.
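
    The entropy feature that proved robust here is a first-order statistic of the ROI's intensity histogram. A minimal numpy sketch, with the bin count as our assumption (the paper's binning is not given in this abstract):

    ```python
    import numpy as np

    def roi_entropy(pixels, nbins=64):
        """First-order entropy of a region of interest, in bits.

        pixels : 1-D array of intensity values inside the segmented ROI.
        """
        hist, _ = np.histogram(pixels, bins=nbins)
        p = hist / hist.sum()
        p = p[p > 0]                       # ignore empty bins (0 log 0 := 0)
        return float(-(p * np.log2(p)).sum())
    ```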

  17. Computational text analysis and reading comprehension exam complexity towards automatic text classification

    CERN Document Server

    Liontou, Trisevgeni

    2014-01-01

    This book delineates a range of linguistic features that characterise the reading texts used at the B2 (Independent User) and C1 (Proficient User) levels of the Greek State Certificate of English Language Proficiency exams in order to help define text difficulty per level of competence. In addition, it examines whether specific reader variables influence test takers' perceptions of reading comprehension difficulty. The end product is a Text Classification Profile per level of competence and a formula for automatically estimating text difficulty and assigning levels to texts consistently and re

  18. Design of Low Power Algorithms for Automatic Embedded Analysis of Patch ECG Signals

    DEFF Research Database (Denmark)

    Saadi, Dorthe Bodholt

    Several different cable-free wireless patch-type ECG recorders have recently reached the market; one of these recorders is the ePatch, designed by the Danish company DELTA. The extended monitoring period available with the patch recorders has been demonstrated to increase the diagnostic yield of outpatient ECG monitoring. Embedded analysis algorithms could allow the real-time transmission of clinically relevant information to a central monitoring station. The first step in embedded ECG interpretation is the automatic detection of each individual heartbeat. An important part of this project was therefore to design a novel algorithm for heartbeat detection.
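
    As an illustration of that first step, below is a generic R-peak detector sketch (band-pass filtering, squaring, and peak picking with an adaptive threshold and a refractory period). It is not the thesis's low-power algorithm, and all parameter values are assumptions.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def detect_beats(ecg, fs):
        """Return sample indices of R peaks in a single-lead ECG."""
        b, a = butter(2, [5 / (fs / 2), 25 / (fs / 2)], btype="band")
        x = filtfilt(b, a, ecg) ** 2                  # emphasized QRS energy
        thresh = 0.4 * x.max()                        # crude adaptive threshold
        refractory = int(0.25 * fs)                   # 250 ms between beats
        peaks, last = [], -refractory
        for i in range(1, len(x) - 1):
            if x[i] > thresh and x[i] >= x[i - 1] and x[i] >= x[i + 1]:
                if i - last >= refractory:
                    peaks.append(i)
                    last = i
        return np.asarray(peaks)
    ```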

  19. Development of new techniques and enhancement of automatic capability of neutron activation analysis at the Dalat Research Reactor

    International Nuclear Information System (INIS)

    Ho Manh Dung; Ho Van Doanh; Tran Quang Thien; Pham Ngoc Tuan; Pham Ngoc Son; Tran Quoc Duong; Nguyen Van Cuong; Nguyen Minh Tuan; Nguyen Giang; Nguyen Thi Sy

    2017-01-01

    The techniques of neutron activation analysis (NAA), including cyclic, epithermal and prompt-gamma NAA (CNAA, ENAA and PGNAA, respectively), have been developed at the Dalat research reactor (DRR). In addition, effort has been spent on improving the automatic capability of irradiation, measurement and data processing for NAA. The renewal of the necessary devices/tools for sample preparation has also been carried out. As a result, the performance and utility, in terms of sensitivity, accuracy and stability, of the analytical results generated by NAA at DRR have been significantly improved. The main results of the project are: 1) upgrading of the fast irradiation system on Channel 13-2/TC to allow cyclic irradiations; 2) development of CNAA; 3) development of ENAA; 4) application of the k0-method for PGNAA; 5) investigation of the automatic sample changer (ASC2); 6) upgrading of the Ko-DALAT software for ENAA and modification of the k0-IAEA software for CNAA and PGNAA; and 7) optimization of the irradiation and measurement facilities as well as the sample preparation devices/tools. A set of procedures for the relevant techniques developed in the project was established. The procedures have been evaluated by analysis of reference materials, for which they meet the requirements of multi-element analysis for the intended applications. (author)

  20. Automatic three-dimensional quantitative analysis for evaluation of facial movement.

    Science.gov (United States)

    Hontanilla, B; Aubá, C

    2008-01-01

    The aim of this study is to present a new 3D capture system for facial movements called FACIAL CLIMA. It is an automatic optical motion system that involves placing special reflecting dots on the subject's face and video-recording, with three infrared-light cameras, the subject performing several facial movements such as smiling, mouth puckering, eye closure and forehead elevation. Images from the cameras are automatically processed with a software program that generates customised information such as 3D data on velocities and areas. The study was performed in 20 healthy volunteers. The accuracy of the measurement process and the intrarater and interrater reliabilities were evaluated. Comparison of a known distance and angle with those obtained by FACIAL CLIMA shows that this system is accurate to within 0.13 mm and 0.41°. In conclusion, the accuracy of the FACIAL CLIMA system for the evaluation of facial movements is demonstrated, as is its high intrarater and interrater reliability. It has advantages with respect to other systems that have been developed for the evaluation of facial movements, such as a short calibration time, a short measuring time and ease of use, and it provides not only distances but also velocities and areas. The FACIAL CLIMA system could thus be considered an adequate tool to assess the outcome of facial paralysis reanimation surgery, so that patients with facial paralysis could be compared between surgical centres and the effectiveness of facial reanimation operations could be evaluated.

  1. Automatic identification of motion artifacts in EHG recording for robust analysis of uterine contractions.

    Science.gov (United States)

    Ye-Lin, Yiyao; Garcia-Casado, Javier; Prats-Boluda, Gema; Alberola-Rubio, José; Perales, Alfredo

    2014-01-01

    Electrohysterography (EHG) is a noninvasive technique for monitoring uterine electrical activity. However, the presence of artifacts in the EHG signal may give rise to erroneous interpretations and make it difficult to extract useful information from these recordings. The aim of this work was to develop an automatic system of segmenting EHG recordings that distinguishes between uterine contractions and artifacts. Firstly, the segmentation is performed using an algorithm that generates the TOCO-like signal derived from the EHG and detects windows with significant changes in amplitude. After that, these segments are classified in two groups: artifacted and nonartifacted signals. To develop a classifier, a total of eleven spectral, temporal, and nonlinear features were calculated from EHG signal windows from 12 women in the first stage of labor that had previously been classified by experts. The combination of characteristics that led to the highest degree of accuracy in detecting artifacts was then determined. The results showed that it is possible to obtain automatic detection of motion artifacts in segmented EHG recordings with a precision of 92.2% using only seven features. The proposed algorithm and classifier together compose a useful tool for analyzing EHG signals and would help to promote clinical applications of this technique.

  2. Automatic Identification of Motion Artifacts in EHG Recording for Robust Analysis of Uterine Contractions

    Directory of Open Access Journals (Sweden)

    Yiyao Ye-Lin

    2014-01-01

    Full Text Available Electrohysterography (EHG) is a noninvasive technique for monitoring uterine electrical activity. However, the presence of artifacts in the EHG signal may give rise to erroneous interpretations and make it difficult to extract useful information from these recordings. The aim of this work was to develop an automatic system of segmenting EHG recordings that distinguishes between uterine contractions and artifacts. Firstly, the segmentation is performed using an algorithm that generates the TOCO-like signal derived from the EHG and detects windows with significant changes in amplitude. After that, these segments are classified in two groups: artifacted and nonartifacted signals. To develop a classifier, a total of eleven spectral, temporal, and nonlinear features were calculated from EHG signal windows from 12 women in the first stage of labor that had previously been classified by experts. The combination of characteristics that led to the highest degree of accuracy in detecting artifacts was then determined. The results showed that it is possible to obtain automatic detection of motion artifacts in segmented EHG recordings with a precision of 92.2% using only seven features. The proposed algorithm and classifier together compose a useful tool for analyzing EHG signals and would help to promote clinical applications of this technique.

  3. Automatic detection and quantitative analysis of cells in the mouse primary motor cortex

    Science.gov (United States)

    Meng, Yunlong; He, Yong; Wu, Jingpeng; Chen, Shangbin; Li, Anan; Gong, Hui

    2014-09-01

    Neuronal cells play a very important role in metabolism regulation and mechanism control, so cell number is a fundamental determinant of brain function. By combining suitable cell-labeling approaches with recently proposed three-dimensional optical imaging techniques, whole mouse brain coronal sections can be acquired with 1-μm voxel resolution. We have developed a completely automatic pipeline to perform cell centroid detection, providing three-dimensional quantitative information on cells in the primary motor cortex of the C57BL/6 mouse. It involves four principal steps: i) preprocessing; ii) image binarization; iii) cell centroid extraction and contour segmentation; iv) laminar density estimation. Investigations of the presented method reveal promising detection accuracy in terms of recall and precision, with an average recall rate of 92.1% and an average precision rate of 86.2%. We also analyze the laminar density distribution of cells from the pial surface to the corpus callosum from the output vectorizations of detected cell centroids in the mouse primary motor cortex, and find significant variations of cellular density distribution across layers. This automatic cell centroid detection approach will be beneficial for fast cell counting and accurate density estimation, as time-consuming and error-prone manual identification is avoided.
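
    Steps ii) and iii) can be illustrated with standard tools. A minimal scipy sketch of binarization followed by connected-component centroid extraction; the smoothing scale and threshold rule are our assumptions, not the pipeline's.

    ```python
    import numpy as np
    from scipy import ndimage

    def cell_centroids(image, sigma=2.0, thresh=None):
        """Detect cell centroids in a 2-D section image."""
        smoothed = ndimage.gaussian_filter(image.astype(float), sigma)
        if thresh is None:
            thresh = smoothed.mean() + 2 * smoothed.std()   # global binarization
        labels, n = ndimage.label(smoothed > thresh)        # connected components
        # intensity-weighted centroid of each labeled component
        return np.array(ndimage.center_of_mass(smoothed, labels, range(1, n + 1)))
    ```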

  4. Semi-automatic measures of activity in selected south polar regions of Mars using morphological image analysis

    Science.gov (United States)

    Aye, Klaus-Michael; Portyankina, Ganna; Pommerol, Antoine; Thomas, Nicolas

    We present the results of these semi-automatically determined seasonal fan count evolutions for the Inca City, Ithaca and Manhattan ROIs, and compare these evolutionary patterns with each other and with the surface reflectance evolutions of both HiRISE and CRISM for the same locations. References: Aye, K.-M. et al. (2010), LPSC 2010, 2707; Hansen, C. et al. (2010), Icarus, 205(1), 283-295; Kieffer, H.H. (2007), JGR, 112; Portyankina, G. et al. (2010), Icarus, 205(1), 311-320; Thomas, N. et al. (2009), Vol. 4, EPSC2009-478.

  5. ODM Data Analysis-A tool for the automatic validation, monitoring and generation of generic descriptive statistics of patient data.

    Science.gov (United States)

    Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin

    2018-01-01

    A required step for presenting results of clinical studies is the declaration of participants' demographic and baseline characteristics, as required by FDAAA 801. The common workflow to accomplish this task is to export the clinical data from the electronic data capture system used and import it into statistical software like SAS or IBM SPSS. This software requires trained users, who have to implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement and evaluate an open source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After a file is uploaded, its syntax and the data-type conformity of the collected data are validated. The completeness of the study data is determined, and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies have been used to evaluate the application's performance and functionality. The system is implemented as an open source web application (available at https://odmanalysis.uni-muenster.de) and is also provided as a Docker image, which enables easy distribution and installation on local systems. Study data is only stored in the application as long as the calculations are performed, which is compliant with data protection requirements. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Medical experts have confirmed the usefulness of this application for gaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analysis of statisticians, but it can be used as a starting point for their examination and reporting.

  6. Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques

    Directory of Open Access Journals (Sweden)

    Kemal Akyol

    2016-01-01

    Full Text Available With the advances in the computer field, methods and techniques of automatic image processing and analysis provide the opportunity to detect automatically the change and degeneration in retinal images. Localization of the optic disc is extremely important for determining hard exudate lesions or neovascularization, the later phase of diabetic retinopathy, in computer-aided eye disease diagnosis systems. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region in retinal images affected by diabetic retinopathy may be difficult. Sometimes the information related to the optic disc and to hard exudates may look the same in terms of machine learning. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested our proposed technique on 3 public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved, respectively, on the following public datasets: DIARETDB1, DRIVE, and ROC.

  7. Automatic system for quantification and visualization of lung aeration on chest computed tomography images: the Lung Image System Analysis - LISA

    Energy Technology Data Exchange (ETDEWEB)

    Felix, John Hebert da Silva; Cortez, Paulo Cesar, E-mail: jhsfelix@gmail.co [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil). Dept. de Engenharia de Teleinformatica; Holanda, Marcelo Alcantara [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil). Hospital Universitario Walter Cantidio. Dept. de Medicina Clinica

    2010-12-15

    High Resolution Computed Tomography (HRCT) is the exam of choice for the diagnostic evaluation of lung parenchyma diseases. There is an increasing interest in computational systems able to automatically analyze the radiological densities of the lungs in CT images. The main objective of this study is to present a system for the automatic quantification and visualization of lung aeration in HRCT images with different degrees of aeration, called Lung Image System Analysis (LISA). The secondary objective is to compare LISA with the Osiris system and with a specific lung segmentation algorithm (ALS) on the accuracy of lung segmentation. The LISA system automatically extracts the following image attributes: lung perimeter, cross-sectional area, volume, radiological density histograms, mean lung density (MLD) in Hounsfield units (HU), relative area of the lungs with voxel density values lower than -950 HU (RA950), and the 15th percentile of the least dense voxels (PERC15). Furthermore, LISA has a colored-mask algorithm that applies pseudo-colors to the lung parenchyma according to the predefined radiological density chosen by the system user. The lung segmentations of 102 images of 8 healthy volunteers and 141 images of 11 patients with Chronic Obstructive Pulmonary Disease (COPD) were compared for accuracy and concordance among the three methods. LISA was more effective at lung segmentation than the other two methods. LISA's color mask tool improves the spatial visualization of the degrees of lung aeration, and the various attributes of the image that can be extracted may help physicians and researchers to better assess lung aeration both quantitatively and qualitatively. LISA may have important clinical and research applications in the assessment of global and regional lung aeration and therefore deserves further development and validation studies. (author)

  8. Automatic system for quantification and visualization of lung aeration on chest computed tomography images: the Lung Image System Analysis - LISA

    International Nuclear Information System (INIS)

    Felix, John Hebert da Silva; Cortez, Paulo Cesar; Holanda, Marcelo Alcantara

    2010-01-01

    High Resolution Computed Tomography (HRCT) is the exam of choice for the diagnostic evaluation of lung parenchyma diseases. There is an increasing interest in computational systems able to automatically analyze the radiological densities of the lungs in CT images. The main objective of this study is to present a system for the automatic quantification and visualization of lung aeration in HRCT images with different degrees of aeration, called Lung Image System Analysis (LISA). The secondary objective is to compare LISA with the Osiris system and with a specific lung segmentation algorithm (ALS) on the accuracy of lung segmentation. The LISA system automatically extracts the following image attributes: lung perimeter, cross-sectional area, volume, radiological density histograms, mean lung density (MLD) in Hounsfield units (HU), relative area of the lungs with voxel density values lower than -950 HU (RA950), and the 15th percentile of the least dense voxels (PERC15). Furthermore, LISA has a colored-mask algorithm that applies pseudo-colors to the lung parenchyma according to the predefined radiological density chosen by the system user. The lung segmentations of 102 images of 8 healthy volunteers and 141 images of 11 patients with Chronic Obstructive Pulmonary Disease (COPD) were compared for accuracy and concordance among the three methods. LISA was more effective at lung segmentation than the other two methods. LISA's color mask tool improves the spatial visualization of the degrees of lung aeration, and the various attributes of the image that can be extracted may help physicians and researchers to better assess lung aeration both quantitatively and qualitatively. LISA may have important clinical and research applications in the assessment of global and regional lung aeration and therefore deserves further development and validation studies. (author)
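
    The densitometric indices LISA reports reduce to simple operations on the histogram of lung voxel HU values. A minimal numpy sketch (the function name is ours):

    ```python
    import numpy as np

    def lung_aeration_indices(hu):
        """Standard densitometric indices from lung voxel HU values.

        hu : 1-D array of Hounsfield units for voxels inside the segmented lungs.
        Returns MLD, RA950 (% of voxels below -950 HU) and PERC15 (15th
        percentile of the density histogram), as described for LISA above.
        """
        mld = float(hu.mean())                        # mean lung density
        ra950 = float((hu < -950).mean() * 100.0)     # emphysema-like fraction, %
        perc15 = float(np.percentile(hu, 15))         # 15th percentile density
        return mld, ra950, perc15
    ```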

  9. Statistical Analysis of Automatic Seed Word Acquisition to Improve Harmful Expression Extraction in Cyberbullying Detection

    Directory of Open Access Journals (Sweden)

    Suzuha Hatakeyama

    2016-04-01

    Full Text Available We study the social problem of cyberbullying, defined as a new form of bullying that takes place in the Internet space. This paper proposes a method for the automatic acquisition of seed words to improve the performance of the original method for cyberbullying detection by Nitta et al. [1]. We conduct an experiment in exactly the same settings and find that the method, which is based on a Web mining technique, has lost over 30 percentage points of its performance since being proposed in 2013. We therefore hypothesize on the reasons for the decrease in performance and propose a number of improvements, from which we experimentally choose the best one. Furthermore, we collect several seed word sets using different approaches and evaluate their precision. We find that the influential factor in the extraction of harmful expressions is not the number of seed words, but the way the seed words were collected and filtered.

  10. Analysis of separation test for automatic brake adjuster based on linear radon transformation

    Science.gov (United States)

    Luo, Zai; Jiang, Wensong; Guo, Bin; Fan, Weijun; Lu, Yi

    2015-01-01

    The linear Radon transformation is applied to extract inflection points for an online test system under noisy conditions. The linear Radon transformation is strongly resistant to noise and interference because it fits the online test curve piecewise, which makes it easy to handle consecutive inflection points. We applied the linear Radon transformation to the separation test system to solve for the separating clearance of an automatic brake adjuster. The experimental results show that the feature-point extraction error of the gradient-maximum optimal method is approximately ±0.100, while the feature-point extraction error of the linear Radon transformation method can reach ±0.010, a lower error than the former. In addition, the linear Radon transformation is robust.

  11. Automatic isotope gas analysis of tritium labelled organic materials Pt. 3

    International Nuclear Information System (INIS)

    Gacs, I.; Mlinko, S.; Payer, K.; Otvos, L.; Banfi, D.; Palagyi, T.

    1978-01-01

    An isotope analytical procedure and an automatic instrument developed for the determination of tritium in organic compounds and biological materials by internal gas counting are described. The sample is burnt in a stream of oxygen and the combustion products including water vapour carrying the tritium are led onto a column of molecular sieve-5A heated to 550 deg C. Tritium is retained temporarily on the column, then transferred into a stream of hydrogen by isotope exchange. After addition of butane, the tritiated hydrogen is led into an internal detector and enclosed there for radioactivity measurement. The procedure, providing quantitative recovery, is completed in five minutes. It is free of memory effect and suitable for the determination of tritium in a wide range of organic compounds and samples of biological origin. (author)

  12. Micron system for automatization and analysis of measurements in nuclear photoemulsion

    International Nuclear Information System (INIS)

    Dajon, M.I.; Kotel'nikov, K.A.; Martynov, A.G.; Rappoport, V.M.; Smirnitskij, V.A.; Ozerskij, M.A.

    1987-01-01

    The automated ''Micron'' system, designed for measuring, processing and analyzing events in nuclear photoemulsion, is described. The flowsheets of the device and the program packages for searching for neutrino interactions in nuclear photoemulsion and for plotting target diagrams in X-ray emulsion chambers are presented. The ''Micron'' system consists of the following functional units: a three-coordinate measuring microscope MPEh-11, combined with a coordinate recording unit, for measuring the coordinates of grains in the emulsion and displaying them on a peripheral; a control unit based on an ''Elektronika-60'' microcomputer; a KK-60 controller for connection to the CAMAC highway; and an analog-to-digital display with keyboard. The PDP-11/70 is the basic computer. An event of charmed Λc+ baryon production, followed by the Λc+ → Σ+ π+ π- decay, observed in nuclear photoemulsion is described

  13. Automatic measurement and analysis of neonatal O2 consumption and CO2 production

    Science.gov (United States)

    Chang, Jyh-Liang; Luo, Ching-Hsing; Yeh, Tsu-Fuh

    1996-02-01

    It is difficult to estimate daily energy expenditure unless continuous O2 consumption (VO2) and CO2 production (VCO2) can be measured. This study describes a simple method for calculating daily and interim changes in O2 consumption and CO2 production for neonates, especially for premature infants. Oxygen consumption and CO2 production are measured using a flow-through technique in which the total VO2 and VCO2 over a given period of time are determined through a computerized system. This system can automatically calculate VO2 and VCO2 not only minute to minute but also over a period of time, e.g., 24 h. As a result, it provides a better indirect reflection of the accurate energy expenditure in an infant's daily life and can be used at the bedside of infants during their ongoing nursery care.
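
    The flow-through principle can be sketched with a simplified open-circuit calculation (no Haldane correction); the function and variable names below are ours, not the authors'.

    ```python
    def flow_through_exchange(flow_lpm, fio2, feo2, fico2, feco2):
        """Per-minute gas exchange from a flow-through measurement.

        flow_lpm : carrier gas flow in L/min
        fio2/feo2, fico2/feco2 : inlet/outlet O2 and CO2 fractions
        Returns (VO2, VCO2) in mL/min.
        """
        vo2 = flow_lpm * (fio2 - feo2) * 1000.0      # mL O2 consumed per min
        vco2 = flow_lpm * (feco2 - fico2) * 1000.0   # mL CO2 produced per min
        return vo2, vco2

    # Daily totals are then just the sum of the per-minute values, as the
    # system described above accumulates VO2 and VCO2 over e.g. 24 h.
    ```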

  14. Development of an Algorithm for Automatic Analysis of the Impedance Spectrum Based on a Measurement Model

    Science.gov (United States)

    Kobayashi, Kiyoshi; Suzuki, Tohru S.

    2018-03-01

    A new algorithm for the automatic estimation of an equivalent circuit and the subsequent parameter optimization is developed by combining the data-mining concept with the complex least-squares method. In this algorithm, the program generates an initial equivalent-circuit model based on the sampled data and then attempts to optimize the parameters. The basic hypothesis is that the measured impedance spectrum can be reproduced by the sum of the partial impedance spectra presented by a resistor, an inductor, a resistor connected in parallel with a capacitor, and a resistor connected in parallel with an inductor. The adequacy of the model is determined by a simple artificial-intelligence function, which is applied to the output of the Levenberg-Marquardt module. Through iterated model modifications, the program finds an adequate equivalent-circuit model without any user-specified model as input.
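
    Fitting one candidate model by complex least squares can be sketched as follows, with a single parallel R-C element plus series R and L for brevity, using scipy's Levenberg-Marquardt solver; the AI-driven model-modification loop described above is not shown.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def z_model(p, w):
        """Series R and L plus one parallel R-C element (one candidate model)."""
        r0, l, r1, c1 = p
        return r0 + 1j * w * l + r1 / (1 + 1j * w * r1 * c1)

    def fit_impedance(w, z_meas, p0):
        """Complex least squares: stack real and imaginary parts of the residual."""
        def residual(p):
            d = z_model(p, w) - z_meas
            return np.concatenate([d.real, d.imag])
        return least_squares(residual, p0, method="lm").x
    ```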

  15. Automatic airway-artery analysis on lung CT to quantify airway wall thickening and bronchiectasis

    DEFF Research Database (Denmark)

    Perez-Rovira, Adria; Kuo, Wieying; Petersen, Jens

    2016-01-01

    Purpose: Bronchiectasis and airway wall thickening are commonly assessed in computed tomography (CT) by comparing the airway size with the size of the accompanying artery. Thus, in order to automate the quantification of bronchiectasis and wall thickening following a similar principle, the method pairs airway branches with the accompanying artery and quantifies airway wall thickening and bronchiectasis by measuring the wall-artery ratio (WAR) and the lumen and outer-wall airway-artery ratios (AAR). Measurements that do not use the artery size for normalization are also extracted, including wall area percentage (WAP), wall thickness ratio (WTR), and airway diameters. Results: The method was thoroughly evaluated using 8000 manual annotations of airway-artery pairs from 24 full-inspiration pediatric CT scans (12 diseased and 12 controls). Limits of agreement between the automatically and the manually obtained measurements were assessed.
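
    Once an airway-artery pair is matched, the ratio measurements are straightforward. A minimal sketch assuming circular cross-sections and diameters in consistent units (deriving WAP from diameters is our simplification):

    ```python
    def airway_artery_ratios(lumen_d, outer_d, artery_d):
        """Airway-artery pair measurements for one matched pair.

        WAR normalizes wall thickness by artery size; the two AARs normalize
        the lumen and outer airway diameters by artery size.
        """
        wall_thickness = (outer_d - lumen_d) / 2.0
        war = wall_thickness / artery_d             # wall-artery ratio
        aar_lumen = lumen_d / artery_d              # lumen airway-artery ratio
        aar_outer = outer_d / artery_d              # outer wall airway-artery ratio
        wap = 100.0 * (outer_d**2 - lumen_d**2) / outer_d**2   # wall area %
        return war, aar_lumen, aar_outer, wap
    ```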

  16. Automatic vision system for analysis of microscopic behavior of flow and transport in porous media

    Science.gov (United States)

    Rashidi, Mehdi; Dehmeshki, Jamshid; Dickenson, Eric; Daemi, M. Farhang

    1997-10-01

    This paper describes the development of a novel automated and efficient vision system to obtain velocity and concentration measurements within a porous medium. An aqueous fluid, laced with a fluorescent dye or microspheres, flows through a transparent, refractive-index-matched column packed with transparent crystals. For illumination purposes, a planar laser sheet passes through the column while a CCD camera records the laser-illuminated planes. Detailed microscopic velocity and concentration fields have been computed within a 3D volume of the column. For measuring velocities, while the aqueous fluid, laced with fluorescent microspheres, flows through the transparent medium, a CCD camera records the motions of the fluorescing particles on a video cassette recorder. The recorded images are acquired automatically, frame by frame, and transferred to the computer for processing, using a frame grabber and purpose-written algorithms, through an RS-232 interface. Since the grabbed image is poor at this stage, some preprocessing is used to enhance the particles within the images. Finally, these enhanced particles are tracked to calculate velocity vectors in the plane of the beam. For concentration measurements, while the aqueous fluid, laced with a fluorescent organic dye, flows through the transparent medium, a CCD camera sweeps back and forth across the column and records concentration slices on the planes illuminated by the laser beam traveling simultaneously with the camera. Subsequently, these recorded images are transferred to the computer for processing in a similar fashion to the velocity measurement. In order to have a fully automatic vision system, several detailed image processing techniques were developed to match exact images that have different intensity values but the same topological characteristics. This results in normalized interstitial chemical concentrations as a function of time within the porous column.

  17. Automatic analysis of altered gait in arylsulphatase A-deficient mice in the open field.

    Science.gov (United States)

    Leroy, Toon; Stroobants, Stijn; Aerts, Jean-Marie; D'Hooge, Rudi; Berckmans, Daniel

    2009-08-01

    In current research with laboratory animals, observing their dynamic behavior or locomotion is a labor-intensive task. Automatic continuous monitoring can provide quantitative data on each animal's condition and coordination ability. The objective of the present work is to develop an automated mouse observation system integrated with a conventional open-field test for motor function evaluation. Data were acquired from 86 mice having a targeted disruption of the arylsulphatase A (ASA) gene and having lowered coordinated locomotion abilities as a symptom. The mice used were 36 heterozygotes (12 females) and 50 knockout mice (30 females) at the age of 6 months. The mice were placed one at a time into the test setup, which consisted of a Plexiglas cage (53x34.5x26 cm) and two fluorescent bulbs for proper illumination. The transparent cage allowed images to be captured from underneath the cage, so image information could be obtained about the dynamic variation of the positions of the limbs of the mice for gait reconstruction. Every mouse was recorded for 10 min. Background subtraction and color filtering were used to measure and calculate image features, which are variables that contain crucial information, such as the mouse's position, orientation, body outline, and possible locations for the mouse's paws. A set of heuristic rules was used to prune implausible paw features and label the remaining ones as front/hind and left/right. Paw features that were consistent over subsequent images were then matched to footprints. Finally, from the measured footprint sequence, eight parameters were calculated in order to quantify the gait of the mouse. This automatic observation technique can be integrated with a regular open-field test, where the trajectory and motor function of a free-moving mouse are measured simultaneously.

  18. Quantitative right and left ventricular functional analysis during gated whole-chest MDCT: A feasibility study comparing automatic segmentation to semi-manual contouring

    International Nuclear Information System (INIS)

    Coche, Emmanuel; Walker, Matthew J.; Zech, Francis; Crombrugghe, Rodolphe de; Vlassenbroek, Alain

    2010-01-01

    Purpose: To evaluate the feasibility of an automatic, whole-heart segmentation algorithm for measuring global heart function from gated, whole-chest MDCT images. Material and methods: 15 patients with suspected pulmonary embolism (PE) underwent whole-chest contrast-enhanced MDCT with retrospective ECG synchronization. Two observers computed right and left ventricular functional indices using a semi-manual and an automatic whole-heart segmentation algorithm. The two techniques were compared using Bland-Altman analysis and the paired Student's t-test. Measurement reproducibility was calculated using the intraclass correlation coefficient. Results: Ventricular analysis with automatic segmentation was successful in 13/15 (86%) and 15/15 (100%) patients for the right and left ventricle, respectively. Reproducibility was perfect for the automatic measurements (ICC: 1.00) and very good for the semi-manual measurements. Right ventricular volumes and functional indices, except right ventricular ejection fraction, were significantly higher with the automatic method than with the semi-manual method. Conclusions: The automatic, whole-heart segmentation algorithm enabled highly reproducible global heart function to be rapidly obtained in patients undergoing gated whole-chest MDCT for assessment of acute chest pain with suspected pulmonary embolism.

  19. Automatic classification for mammogram backgrounds based on bi-rads complexity definition and on a multi content analysis framework

    Science.gov (United States)

    Wu, Jie; Besnehard, Quentin; Marchessoux, Cédric

    2011-03-01

    Clinical studies for the validation of new medical imaging devices require hundreds of images. An important step in creating and tuning the study protocol is the classification of images into "difficult" and "easy" cases. This consists of classifying each image based on features such as the complexity of the background and the visibility of the disease (lesions). An automatic background classification tool for mammograms would therefore help in such clinical studies. This classification tool is based on a multi-content analysis (MCA) framework, which was first developed to recognize the image content of computer screenshots. With the implementation of new texture features and a defined breast-density scale, the MCA framework is able to classify digital mammograms automatically with satisfying accuracy. The BI-RADS (Breast Imaging Reporting and Data System) density scale, which standardizes mammography reporting terminology and assessment and recommendation categories, is used for grouping the mammograms. Selected features are input into a decision-tree classification scheme in the MCA framework, the so-called "weak classifier" (any classifier with a global error rate below 50%). With the AdaBoost iteration algorithm, these "weak classifiers" are combined into a "strong classifier" (a classifier with a low global error rate) for classifying one category. The classification results for one "strong classifier" show good accuracy with high true-positive rates. For the four categories the results are: TP=90.38%, TN=67.88%, FP=32.12% and FN=9.62%.
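
    A minimal sketch of the weak-to-strong boosting idea, assuming shallow decision trees as weak classifiers and a placeholder feature matrix standing in for the texture and density measures:

```python
# Sketch: AdaBoost combines shallow decision trees ("weak classifiers")
# into a "strong classifier", as in the MCA framework described above.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 12))                  # texture features per mammogram (simulated)
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # "difficult" vs "easy" background (simulated)

weak = DecisionTreeClassifier(max_depth=2)      # any error rate below 50% suffices
# note: scikit-learn versions before 1.2 spell the argument base_estimator
strong = AdaBoostClassifier(estimator=weak, n_estimators=100)
print("CV accuracy:", cross_val_score(strong, X, y, cv=5).mean())
```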

  20. Development on quantitative safety analysis method of accident scenario. The automatic scenario generator development for event sequence construction of accident

    International Nuclear Information System (INIS)

    Kojima, Shigeo; Onoue, Akira; Kawai, Katsunori

    1998-01-01

    This study intends to develop a more sophisticated tool that advances the current event-tree method used in all PSA, and to focus on non-catastrophic events, specifically non-core-melt sequence scenarios not included in an ordinary PSA. In a non-catastrophic-event PSA, it is necessary to consider various end states and failure combinations for the purpose of multiple scenario construction. The analysis workload must therefore be reduced, and an automated method and tool are required. A scenario generator was developed that can automatically handle scenario-construction logic and generate the enormous number of sequences logically identified by state-of-the-art methodology. To realize scenario generation as a practical tool, a simulation model combining AI techniques with a graphical interface was introduced. The AI simulation model in this study was verified for its capability to evaluate actual systems. In this feasibility study, a spurious SI signal was selected to test the model's applicability. As a result, the basic capability of the scenario generator could be demonstrated and important scenarios were generated. The human interface with a system and its operation, as well as time-dependent factors and their quantification in scenario modeling, were added using the human scenario generator concept. The feasibility of the improved scenario generator was then tested for actual use. Automatic scenario generation with a certain level of credibility was achieved by this study. (author)
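
    A minimal sketch of automatic sequence enumeration over event-tree headings; the headings, probabilities, and independence assumption are illustrative, not the paper's model:

```python
# Sketch: enumerate success/failure branches of event-tree headings and
# keep every sequence, not only core-damage paths.
from itertools import product

headings = [("ReactorTrip", 0.999), ("SI_Signal", 0.99), ("AuxFeedwater", 0.98)]

sequences = []
for outcome in product([True, False], repeat=len(headings)):
    p = 1.0
    for (name, p_success), ok in zip(headings, outcome):
        p *= p_success if ok else (1.0 - p_success)
    sequences.append((outcome, p))

for outcome, p in sorted(sequences, key=lambda s: -s[1]):
    labels = [n if ok else f"-{n}" for (n, _), ok in zip(headings, outcome)]
    print(" / ".join(labels), f"p = {p:.3e}")
```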

  1. Automatic analysis and reduction of reaction mechanisms for complex fuel combustion

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, Daniel

    2001-05-01

    This work concentrates on automatic procedures for simplifying chemical models for realistic fuels using skeletal mechanism construction and the Quasi Steady-State Approximation (QSSA) applied to detailed reaction mechanisms. To automate the selection of species for removal or approximation, different indices for species ranking have been proposed. Reaction flow rates are combined with sensitivity information for targeting a certain quantity, and used to determine a level of redundancy for automatic skeletal mechanism construction by exclusion of redundant species. For QSSA reduction, a measure of species lifetime can be used for species ranking as-is, weighted by concentrations or molecular transport timescales, and/or combined with species sensitivity. Maximum values of the indices are accumulated over ranges of parameters (e.g. fuel-air ratio and octane number), and species with low accumulated index values are selected for removal or steady-state approximation. In the case of QSSA, a model with a certain degree of reduction is automatically implemented as FORTRAN code by setting a certain index limit. The code calculates source terms of explicitly handled species from reaction rates and the steady-state concentrations by internal iteration. Homogeneous-reactor and one-dimensional laminar-flame models were used as test cases. A staged combustor fuelled by ethylene with monomethylamine addition is modelled by two homogeneous reactors in sequence, i.e. a PSR (Perfectly Stirred Reactor) followed by a PFR (Plug Flow Reactor). A modified PFR model was applied for simulation of a Homogeneous Charge Compression Ignition (HCCI) engine fuelled with four-component natural gas, whereas a two-zone model was required for a knocking Spark Ignition (SI) engine powered by Primary Reference Fuel (PRF). Finally, a laminar one-dimensional model was used to simulate premixed flames burning methane and an aeroturbine kerosene surrogate consisting of n-decane and toluene. In
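
    A minimal sketch of lifetime-based species ranking for QSSA selection, assuming a three-species toy mechanism and approximating the lifetime of species i as the negative inverse of the Jacobian diagonal of the source-term function:

```python
# Sketch: rank species by chemical lifetime; species with the shortest
# lifetimes become steady-state candidates.
import numpy as np

def source_terms(c):
    # toy mechanism (assumed): A -> B (k1), B -> C (k2, fast), so B is short-lived
    k1, k2 = 1.0, 1e4
    a, b, _ = c
    return np.array([-k1 * a, k1 * a - k2 * b, k2 * b])

def lifetimes(c, eps=1e-8):
    n = len(c)
    jac_diag = np.empty(n)
    for i in range(n):                     # numerical Jacobian diagonal
        dc = np.zeros(n); dc[i] = eps
        jac_diag[i] = (source_terms(c + dc)[i] - source_terms(c)[i]) / eps
    return -1.0 / np.minimum(jac_diag, -1e-300)   # guard against division by zero

c = np.array([1.0, 1e-4, 0.0])
tau = lifetimes(c)
print("QSS candidates (shortest lifetime first):", np.argsort(tau))
```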

  2. Design and fabrication of an automatic dual axis solar tracker by ...

    African Journals Online (AJOL)

    International Journal of Engineering, Science and Technology. Vol. 9, No. 2, 2017, pp. ... Design and fabrication of an automatic dual axis solar tracker by using LDR ..... It may be like a wiper of the car. Nomenclature. LDR.

  3. On-line measurement with automatic emulsion analysis system and off-line data processing (E531 neutrino experiment)

    International Nuclear Information System (INIS)

    Miyanishi, Motoaki

    1984-01-01

    The automatic emulsion analysis system developed by the Nagoya cosmic-ray observation group was used practically, for the first time in the world, in the experiment (FNAL-E531) determining the lifetime of charm particles, and achieved great success. The system consists of four large precise coordinate-measuring stages capable of simultaneous measurement and multiple (currently four) DOMs (digitized on-line microscopes), supported by one mini-computer (ECLIPS S/130). The purpose of the E531 experiment was the determination of charm-particle lifetimes. The experiment was carried out at FNAL, USA, with irradiation by a wide-band ν_μ beam equivalent to 7 × 10^18 protons at 350 GeV/c. The detector was a hybrid system of emulsions and a counter spectrometer. The scan for neutrino reactions, the scan for charm particles, and charm-event measurement were performed in emulsions, and on-line programs for the respective analyses were created. The Nagoya group found 726 neutrino reactions in the first run, obtaining 37 charm-particle candidates, and 1442 neutrino reactions in the second run, obtaining 56 charm-particle candidates. The capability of the automatic emulsion analysis system in terms of the time required for analysis is in total 3.5 hours per event: 15 minutes for C.S. scan, 15 minutes for coupling to module, 20 minutes for tracing to vertex, 1 hour for neutrino-reaction measurement, 10 minutes for off-line data processing and 1.5 hours for charm-particle scanning. (Wakatsuki, Y.)

  4. A Comparison of Physiological Signal Analysis Techniques and Classifiers for Automatic Emotional Evaluation of Audiovisual Contents.

    Science.gov (United States)

    Colomer Granero, Adrián; Fuentes-Hurtado, Félix; Naranjo Ornedo, Valery; Guixeres Provinciale, Jaime; Ausín, Jose M; Alcañiz Raya, Mariano

    2016-01-01

    This work focuses on finding the most discriminatory or representative features that allow commercials to be classified as having negative, neutral or positive effectiveness based on the Ace Score index. For this purpose, an experiment involving forty-seven participants was carried out. In this experiment, electroencephalography (EEG), electrocardiography (ECG), galvanic skin response (GSR) and respiration data were acquired while subjects were watching a 30-min audiovisual content. This content was composed of a submarine documentary and nine commercials (one of them the ad under evaluation). After signal pre-processing, four sets of features were extracted from the physiological signals using different state-of-the-art metrics. These features, computed in the time and frequency domains, are the inputs to several basic and advanced classifiers. An average of 89.76% of the instances was correctly classified according to the Ace Score index. The best results were obtained by a classifier consisting of a combination of AdaBoost and Random Forest with automatic feature selection. The selected features were those extracted from the GSR and HRV signals. These results are promising for the audiovisual content evaluation field by means of physiological signal processing.
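
    A minimal sketch of the best-performing configuration named above, assuming a placeholder feature matrix for the physiological features; univariate selection stands in for whatever selection scheme the study used:

```python
# Sketch: automatic feature selection followed by AdaBoost over
# random-forest base learners, for three effectiveness classes.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 40))                       # GSR/HRV/EEG-derived features (simulated)
y = np.digitize(X[:, 0] + 0.5 * X[:, 1], [-0.5, 0.5])  # negative / neutral / positive

clf = make_pipeline(
    SelectKBest(f_classif, k=10),                    # automatic feature selection
    # scikit-learn versions before 1.2 spell the argument base_estimator
    AdaBoostClassifier(estimator=RandomForestClassifier(n_estimators=50)),
)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```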

  5. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Full Text Available Recently, intelligent systems technology has become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the motion of the spindle unit, as in the automatic tool changer, is among the most frequent and important operations. The vibration detection system includes the development of hardware and software, such as the vibration meter, signal acquisition card, data processing platform, and machine control program. Because of differences between mechanical configurations and the desired characteristics, it is difficult for a vibration detection system to be assembled directly from commercially available kits. For this reason, it was selected as an item for self-development research, along with the exploration of a significant parametric study sufficient to represent the machine characteristics and states. The development of the functional parts of the system was launched simultaneously. Finally, the conditions and parameters generated from both the states and the characteristics were entered into the developed system to verify its feasibility.

  6. Bus Travel Time Deviation Analysis Using Automatic Vehicle Location Data and Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Xiaolin Gong

    2015-01-01

    Full Text Available To investigate the influences of causes of unreliability and the bus schedule recovery phenomenon on microscopic segment-level travel time variance, this study adopts Structural Equation Modeling (SEM) to specify, estimate, and measure the theoretically proposed models. The SEM model establishes and verifies hypotheses for interrelationships among travel time deviations, departure delays, segment lengths, dwell times, and the number of traffic signals and access connections. The finally accepted model demonstrates excellent goodness of fit. Most of the hypotheses are supported by the sample dataset from a bus Automatic Vehicle Location system. The SEM model confirms the bus schedule recovery phenomenon. The departure delays at bus terminals and upstream travel time deviations indeed have negative impacts on the travel time fluctuation of buses en route. Meanwhile, the segment length directly and negatively impacts travel time variability while positively contributing to the schedule recovery process; this exogenous variable also indirectly and positively influences travel times through the existence of signalized intersections and access connections. This study offers a rational approach to analyzing travel time deviation features. The SEM model structure and estimation results facilitate the understanding of bus service performance characteristics and provide several implications for bus service planning, management, and operation.
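
    A simplified path-analysis stand-in for the SEM (not the study's model), using two chained OLS regressions to separate a direct effect from an indirect effect mediated by the number of signals; the variable names, coefficients, and data are assumptions:

```python
# Sketch: direct vs. indirect effect of segment length on travel time
# deviation via a mediator (number of signals), estimated with OLS.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 500
seg_len = rng.uniform(0.2, 3.0, n)                       # km (simulated)
n_signals = 2.0 * seg_len + rng.normal(0, 0.5, n)        # mediator (simulated)
tt_dev = -1.5 * seg_len + 0.8 * n_signals + rng.normal(0, 1.0, n)

m1 = sm.OLS(n_signals, sm.add_constant(seg_len)).fit()   # a-path: length -> signals
m2 = sm.OLS(tt_dev, sm.add_constant(np.column_stack([seg_len, n_signals]))).fit()

a, (direct, b) = m1.params[1], m2.params[1:]
print("direct effect:", direct, " indirect effect:", a * b)
```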

  7. Analysis of electric energy consumption of automatic milking systems in different configurations and operative conditions.

    Science.gov (United States)

    Calcante, Aldo; Tangorra, Francesco M; Oberti, Roberto

    2016-05-01

    Automatic milking systems (AMS) have been a revolutionary innovation in dairy cow farming. Currently, more than 10,000 dairy cow farms worldwide use AMS to milk their cows. Electric consumption is one of the most relevant and uncontrollable operational costs of AMS, ranging between 35 and 40% of their total annual operational costs. The aim of the present study was to measure and analyze the electric energy consumption of 4 AMS with different configurations: single box, and central unit featuring a central vacuum system for 1 cow unit and for 2 cow units. The electrical consumption (daily consumption, daily consumption per cow milked, consumption per milking, and consumption per 100 L of milk) of each AMS (milking unit + air compressor) was measured using 2 energy analyzers. The measurement period lasted 24 h with a sampling frequency of 0.2 Hz. The daily total energy consumption (milking unit + air compressor) ranged between 45.4 and 81.3 kWh; the consumption per cow milked ranged between 0.59 and 0.99 kWh; the consumption per milking ranged between 0.21 and 0.33 kWh; and the consumption per 100 L of milk ranged between 1.80 and 2.44 kWh, according to the different configurations and operational contexts considered. Results showed that AMS electric consumption was mainly conditioned by farm management rather than by machine characteristics/architectures. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  8. An Automatic Multi-Target Independent Analysis Framework for Non-Planar Infrared-Visible Registration.

    Science.gov (United States)

    Sun, Xinglong; Xu, Tingfa; Zhang, Jizhou; Zhao, Zishu; Li, Yuankun

    2017-07-26

    In this paper, we propose a novel automatic multi-target registration framework for non-planar infrared-visible videos. Previous approaches usually analyzed multiple targets together and then estimated a global homography for the whole scene; however, these cannot achieve precise multi-target registration when the scenes are non-planar. Our framework is devoted to solving the problem using feature matching and multi-target tracking. The key idea is to analyze and register each target independently. We present a fast and robust feature matching strategy, where only the features on the corresponding foreground pairs are matched. Besides, new reservoirs based on the Gaussian criterion are created for all targets, and a multi-target tracking method is adopted to determine the relationships between the reservoirs and foreground blobs. With the matches in the corresponding reservoir, the homography of each target is computed according to its moving state. We tested our framework on both public near-planar and non-planar datasets. The results demonstrate that the proposed framework outperforms the state-of-the-art global registration method and the manual global registration matrix in all tested datasets.
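
    A minimal sketch of the per-target idea, assuming ORB features and RANSAC (the paper's own descriptors and matching strategy may differ); images and foreground masks are assumed inputs:

```python
# Sketch: estimate a homography for a single target from matched features
# restricted to its foreground region, instead of one global homography
# for the whole (non-planar) scene.
import cv2
import numpy as np

def target_homography(img_ir, img_vis, mask_ir, mask_vis):
    # masks are uint8 foreground masks for this target only
    orb = cv2.ORB_create(500)
    k1, d1 = orb.detectAndCompute(img_ir, mask_ir)
    k2, d2 = orb.detectAndCompute(img_vis, mask_vis)
    if d1 is None or d2 is None:
        return None
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    if len(matches) < 4:
        return None
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H

# one call per tracked target, each with its own foreground mask pair
```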

  9. Automatic parameterization and analysis of stellar atmospheres: a study of the DA white dwarfs

    International Nuclear Information System (INIS)

    McMahan, R.K. Jr.

    1986-01-01

    A method for automatically calculating atmospheric parameters of hydrogen-rich degenerate stars from low-resolution spectra is advanced and then applied to the spectra of 53 DA white dwarfs. All data were taken using the Mark II spectrograph on the McGraw-Hill 1.3 m telescope and cover the spectral range λλ4100-7000 at a resolution of eight Angstroms. The model grid was generated at Dartmouth using the atmosphere code LUCIFER; it contained over 275 synthetic spectra extending from 6000 to 100,000 K in effective temperature and 7.4-9.3 in log g. A new value for the width of the DA mass distribution was obtained using the techniques presented here. Accuracies in the atmospheric parameters more than twice those previously published were obtained. These results place strict constraints on the magnitude of mass loss in stars in the red giant phase, as well as on the mechanisms responsible for the loss
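
    A minimal sketch of automatic parameterization by chi-square minimization against a model grid in (Teff, log g); the grid resolution and the placeholder spectra are assumptions standing in for the LUCIFER models:

```python
# Sketch: grid-search chi-square fit of an observed spectrum.
import numpy as np

rng = np.random.default_rng(4)
wave = np.linspace(4100, 7000, 300)
teff_grid = np.linspace(6000, 100000, 40)
logg_grid = np.linspace(7.4, 9.3, 20)

def model_spectrum(teff, logg):
    # placeholder continuum shape; a real grid would hold Balmer-line profiles
    return (wave / 5000.0) ** (-teff / 3e4) * (1 + 0.01 * logg)

obs = model_spectrum(30000, 8.0) + rng.normal(0, 0.01, wave.size)
sigma = 0.01

chi2 = np.array([[np.sum((obs - model_spectrum(t, g)) ** 2 / sigma**2)
                  for g in logg_grid] for t in teff_grid])
i, j = np.unravel_index(np.argmin(chi2), chi2.shape)
print("best fit: Teff =", teff_grid[i], " log g =", logg_grid[j])
```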

  10. Semi-automatic analysis of standard uptake values in serial PET/CT studies in patients with lung cancer and lymphoma

    Directory of Open Access Journals (Sweden)

    Ly John

    2012-04-01

    Full Text Available Abstract Background Changes in maximum standardised uptake values (SUVmax) between serial PET/CT studies are used to determine disease progression or regression in oncologic patients. Measuring these changes manually can be time consuming in clinical routine. A semi-automatic method for calculation of SUVmax in serial PET/CT studies was developed and compared to a conventional manual method. The semi-automatic method first aligns the serial PET/CT studies based on the CT images. Thereafter, the reader selects an abnormal lesion in one of the PET studies. After this manual step, the program automatically detects the corresponding lesion in the other PET study, segments the two lesions and calculates the SUVmax in both studies as well as the difference between the SUVmax values. The results of the semi-automatic analysis were compared to those of a manual SUVmax analysis using a Philips PET/CT workstation. Three readers did the SUVmax readings with both methods. Sixteen patients with lung cancer or lymphoma who had undergone two PET/CT studies were included. There were a total of 26 lesions. Results Linear regression analysis of changes in SUVmax shows that intercepts and slopes are close to the line of identity for all readers (reader 1: intercept = 1.02, R2 = 0.96; reader 2: intercept = 0.97, R2 = 0.98; reader 3: intercept = 0.99, R2 = 0.98). Manual and semi-automatic methods agreed in all cases on whether SUVmax had increased or decreased between the serial studies. The average time to measure SUVmax changes in two serial PET/CT examinations was four to five times longer for the manual method compared to the semi-automatic method for all readers (reader 1: 53.7 vs. 10.5 s; reader 2: 27.3 vs. 6.9 s; reader 3: 47.5 vs. 9.5 s; p < …). Conclusions Good agreement was shown in the assessment of SUVmax changes between the manual and semi-automatic methods. The semi-automatic analysis was four to five times faster to perform than the manual analysis. These findings show the

  11. Automatic analysis of slips of the tongue: Insights into the cognitive architecture of speech production.

    Science.gov (United States)

    Goldrick, Matthew; Keshet, Joseph; Gustafson, Erin; Heller, Jordana; Needle, Jeremy

    2016-04-01

    Traces of the cognitive mechanisms underlying speaking can be found within subtle variations in how we pronounce sounds. While speech errors have traditionally been seen as categorical substitutions of one sound for another, acoustic/articulatory analyses show they partially reflect the intended sound. When "pig" is mispronounced as "big," the resulting /b/ sound differs from correct productions of "big," moving towards the intended "pig," revealing the role of graded sound representations in speech production. Investigating the origins of such phenomena requires detailed estimation of speech sound distributions; this has been hampered by reliance on subjective, labor-intensive manual annotation. Computational methods can address these issues by providing for objective, automatic measurements. We develop a novel high-precision computational approach, based on a set of machine learning algorithms, for measurement of elicited speech. The algorithms are trained on existing manually labeled data to detect and locate linguistically relevant acoustic properties with high accuracy. Our approach is robust, is designed to handle mis-productions, and overall matches the performance of expert coders. It allows us to analyze a very large dataset of speech errors (containing far more errors than the total in the existing literature), illuminating properties of speech sound distributions previously impossible to reliably observe. We argue that this provides novel evidence that two sources both contribute to deviations in speech errors: planning processes specifying the targets of articulation and articulatory processes specifying the motor movements that execute this plan. These findings illustrate how a much richer picture of speech provides an opportunity to gain novel insights into language processing. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Automatic Monitoring System Design and Failure Probability Analysis for River Dikes on Steep Channel

    Science.gov (United States)

    Chang, Yin-Lung; Lin, Yi-Jun; Tung, Yeou-Koung

    2017-04-01

    The purposes of this study include: (1) designing an automatic monitoring system for river dikes; and (2) developing a framework that enables the determination of dike failure probabilities for various failure modes during a rainstorm. The historical dike failure data collected in this study indicate that most dikes in Taiwan collapsed under the 20-year return-period discharge, which means the probability of dike failure is much higher than that of overtopping. We installed the dike monitoring system on the Chiu-She Dike, located on the middle reach of the Dajia River, Taiwan. The system includes: (1) vertically distributed pore-water-pressure sensors in front of and behind the dike; (2) Time Domain Reflectometry (TDR) to measure the displacement of the dike; (3) a wireless floating device to measure the scouring depth at the toe of the dike; and (4) a water level gauge. The monitoring system recorded the variation of pore pressure inside the Chiu-She Dike and the scouring depth during Typhoon Megi. The recorded data showed that the highest groundwater level inside the dike occurred 15 hours after the peak discharge. We developed a framework that accounts for the uncertainties in return-period discharge, Manning's n, scouring depth, soil cohesion, and friction angle, and enables the determination of dike failure probabilities for various failure modes such as overtopping, surface erosion, mass failure, toe sliding and overturning. The framework was applied to the Chiu-She, Feng-Chou, and Ke-Chuang Dikes on the Dajia River. The results indicate that toe sliding or overturning has a higher probability than the other failure modes. Furthermore, the overall failure probability (integrating the different failure modes) reaches 50% under the 10-year return-period flood, which agrees with the historical failure data for the study reaches.
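
    A minimal sketch of propagating the named uncertainties through a limit-state function by Monte Carlo; the distributions and the schematic sliding limit state are assumptions, not the study's mechanical model:

```python
# Sketch: Monte Carlo estimate of a dike failure probability from
# uncertain discharge, Manning's n, scour depth, cohesion, and friction angle.
import numpy as np

rng = np.random.default_rng(5)
n = 200_000
discharge = rng.lognormal(mean=6.0, sigma=0.3, size=n)   # flood discharge, m^3/s (assumed)
manning_n = rng.normal(0.035, 0.005, n)
scour = rng.normal(1.5, 0.5, n)                          # m
cohesion = rng.normal(10.0, 3.0, n)                      # kPa
phi = rng.normal(30.0, 4.0, n)                           # deg

# schematic sliding limit state: resistance minus load (> 0 is safe)
resistance = cohesion + 2.0 * np.tan(np.radians(phi)) * (3.0 - scour)
load = 0.02 * discharge * manning_n / 0.035
p_fail = np.mean(resistance - load < 0.0)
print("toe-sliding failure probability ~", p_fail)
```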

  13. Automatic Approach to Morphological Classification of Galaxies With Analysis of Galaxy Populations in Clusters

    Science.gov (United States)

    Sultanova, Madina; Barkhouse, Wayne; Rude, Cody

    2018-01-01

    The classification of galaxies based on their morphology is a field in astrophysics that aims to understand galaxy formation and evolution from their physical differences. Whether structural differences are due to internal factors or a result of the local environment, the dominant mechanism that determines galaxy type needs to be robustly quantified in order to have a thorough grasp of the origin of the different types of galaxies. The main subject of my Ph.D. dissertation is to explore the use of computers to automatically classify and analyze large numbers of galaxies according to their morphology, and to analyze sub-samples of galaxies selected by type to understand galaxy formation in various environments. I have developed a computer code to classify galaxies by measuring five parameters from their images in FITS format. The code was trained and tested using visually classified SDSS galaxies from Galaxy Zoo and the EFIGI data set. I apply my morphology software to numerous galaxies from diverse data sets. Among the data analyzed are 15 Abell galaxy clusters (0.03 < z < …) and the Frontier Field galaxy clusters. The high resolution of HST allows me to compare distant clusters with those nearby to look for evolutionary changes in the galaxy cluster population. I use the results from the software to examine the properties (e.g. luminosity functions, radial dependencies, star formation rates) of selected galaxies. Due to the large amount of data that will be available from wide-area surveys in the future, the use of computer software to classify and analyze the morphology of galaxies will be extremely important in terms of efficiency. This research aims to contribute to the solution of this problem.

  14. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

    This thesis focuses on the development of image analysis methods for ultra-high-content analysis of high-throughput screens in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis

  15. Automatic knowledge extraction in sequencing analysis with multiagent system and grid computing.

    Science.gov (United States)

    González, Roberto; Zato, Carolina; Benito, Rocío; Bajo, Javier; Hernández, Jesús M; De Paz, Juan F; Vera, Vicente; Corchado, Juan M

    2012-12-01

    Advances in bioinformatics have contributed towards a significant increase in available information. Information analysis requires the use of distributed computing systems to best engage the process of data analysis. This study proposes a multiagent system that incorporates grid technology to facilitate distributed data analysis by dynamically incorporating the roles associated to each specific case study. The system was applied to genetic sequencing data to extract relevant information about insertions, deletions or polymorphisms.

  16. Automatic knowledge extraction in sequencing analysis with multiagent system and grid computing

    Directory of Open Access Journals (Sweden)

    González Roberto

    2012-12-01

    Full Text Available Advances in bioinformatics have contributed towards a significant increase in available information. Information analysis requires the use of distributed computing systems to best engage the process of data analysis. This study proposes a multiagent system that incorporates grid technology to facilitate distributed data analysis by dynamically incorporating the roles associated to each specific case study. The system was applied to genetic sequencing data to extract relevant information about insertions, deletions or polymorphisms.

  17. Temporal Analysis and Automatic Calibration of the Velodyne HDL-32E LiDAR System

    Directory of Open Access Journals (Sweden)

    T. O. Chan

    2013-10-01

    Full Text Available At the end of the first quarter of 2012, more than 600 Velodyne LiDAR systems had been sold worldwide for various robotic and high-accuracy survey applications. The ultra-compact Velodyne HDL-32E LiDAR has become a predominant sensor for many applications that require lower sensor size/weight and cost. For high-accuracy applications, cost-effective calibration methods with minimal manual intervention are always desired by users. However, the calibrations are complicated by the Velodyne LiDAR's narrow vertical field of view and the very highly time-variant nature of its measurements. In the paper, the temporal stability of the HDL-32E is first analysed as the motivation for developing a new, automated calibration method. This is followed by a detailed description of the calibration method, which is driven by a novel segmentation method for extracting vertical cylindrical features from the Velodyne point clouds. The proposed segmentation method utilizes the Velodyne point cloud's slice-like nature and first decomposes the point clouds into 2D layers. The layers are then treated as 2D images and processed with the Generalized Hough Transform, which extracts the points distributed in circular patterns from the point cloud layers. Subsequently, the vertical cylindrical features can be readily extracted from the whole point clouds based on the previously extracted points. The points are passed to the calibration, which estimates the cylinder parameters and the LiDAR's additional parameters simultaneously by constraining the segmented points to fit the cylindrical geometric model in such a way that the weighted sum of the adjustment residuals is minimized. The proposed calibration is highly automatic, and this allows end users to obtain the time-variant additional parameters instantly and frequently whenever vertical cylindrical features are present in the scene. The methods were verified with two different real datasets, and the results suggest
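
    A minimal sketch of the layer-wise idea: slice the cloud into thin horizontal layers and estimate a circle per layer. An algebraic (Kåsa) least-squares circle fit stands in for the paper's Generalized Hough Transform; the synthetic cylinder is an assumption:

```python
# Sketch: decompose a point cloud into 2D layers and fit a circle to each.
import numpy as np

def fit_circle(xy):
    # solve x^2 + y^2 + a*x + b*y + c = 0 in least squares (Kasa fit)
    A = np.column_stack([xy[:, 0], xy[:, 1], np.ones(len(xy))])
    rhs = -(xy[:, 0] ** 2 + xy[:, 1] ** 2)
    a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
    cx, cy = -a / 2, -b / 2
    r = np.sqrt(cx**2 + cy**2 - c)
    return cx, cy, r

rng = np.random.default_rng(6)
theta = rng.uniform(0, 2 * np.pi, 2000)
z = rng.uniform(0, 3, 2000)
pts = np.column_stack([2 + 0.5 * np.cos(theta), 1 + 0.5 * np.sin(theta), z])
pts[:, :2] += rng.normal(0, 0.01, (2000, 2))          # measurement noise

for z0 in np.arange(0, 3, 0.5):                       # decompose into 2D layers
    layer = pts[(pts[:, 2] >= z0) & (pts[:, 2] < z0 + 0.5)]
    print(f"layer {z0:.1f}: centre/radius =", fit_circle(layer[:, :2]))
```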

  18. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    International Nuclear Information System (INIS)

    Fang, Y; Huang, H; Su, T

    2015-01-01

    Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts to apply such techniques to cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring image heterogeneity. Clinical data were used to evaluate the preliminary performance of these methods. Methods: Myocardial perfusion images from Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia, defined as more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the left ventricle (LV). The segmented voxel intensities were then carried into the texture analysis with our open-source software Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of the image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity, as well as the area under the curve (AUC). These indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%) and an AUC of 0.82. Such performance is similar to that obtained from the semi-automatic QPS software, which gives a sensitivity of 71% and specificity of 77%. Conclusion: Based on fully automatic procedures of data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination
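
    A minimal sketch of the ROC step for a single heterogeneity index against the PCI gold standard; the scores and labels are simulated placeholders, not study data:

```python
# Sketch: ROC analysis of a texture-based heterogeneity index.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(7)
stenosis = rng.integers(0, 2, 293)                        # PCI result (>70% stenosis), simulated
heterogeneity = stenosis * 0.8 + rng.normal(0, 0.5, 293)  # texture index per scan, simulated

auc = roc_auc_score(stenosis, heterogeneity)
fpr, tpr, thr = roc_curve(stenosis, heterogeneity)
youden = np.argmax(tpr - fpr)                             # operating point
print(f"AUC = {auc:.2f}, sens = {tpr[youden]:.2f}, spec = {1 - fpr[youden]:.2f}")
```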

  19. Interconnecting smartphone, image analysis server, and case report forms for automatic skin lesion tracking in clinical trials

    Science.gov (United States)

    Haak, Daniel; Doma, Aliaa; Gombert, Alexander; Deserno, Thomas M.

    2016-03-01

    Today, subjects' medical data in controlled clinical trials are captured digitally in electronic case report forms (eCRFs). However, eCRFs only insufficiently support the integration of subjects' image data, although medical imaging is looming large in studies today. For bed-side image integration, we present a mobile application (App) that utilizes the smartphone-integrated camera. To ensure high image quality with this inexpensive consumer hardware, color reference cards are placed in the camera's field of view next to the lesion. The cards are used for automatic calibration of geometry, color, and contrast. In addition, a personalized code is read from the cards that allows subject identification. For data integration, the App is connected to a communication and image analysis server that also holds the code-study-subject relation. In a second system interconnection, web services are used to connect the smartphone with OpenClinica, an open-source, Food and Drug Administration (FDA)-approved electronic data capture (EDC) system for clinical trials. Once the photographs have been securely stored on the server, they are automatically released from the mobile device. The workflow of the system is demonstrated by an ongoing clinical trial, in which photographic documentation is frequently performed to measure the effect of wound incision management systems. All 205 images collected in the study so far have been correctly identified and successfully integrated into the corresponding subjects' eCRFs. Using this system, manual steps for the study personnel are reduced, and therefore errors, latency and costs are decreased. Our approach also increases data security and privacy.

  20. Application of recurrence quantification analysis to automatically estimate infant sleep states using a single channel of respiratory data.

    Science.gov (United States)

    Terrill, Philip I; Wilson, Stephen J; Suresh, Sadasivam; Cooper, David M; Dakin, Carolyn

    2012-08-01

    Previous work has identified that non-linear variables calculated from respiratory data vary between sleep states, and that variables derived from the non-linear analytical tool recurrence quantification analysis (RQA) are accurate infant sleep state discriminators. This study aims to apply these discriminators to automatically classify 30 s epochs of infant sleep as REM, non-REM and wake. Polysomnograms were obtained from 25 healthy infants at 2 weeks, 3, 6 and 12 months of age, and manually sleep staged as wake, REM and non-REM. Inter-breath interval data were extracted from the respiratory inductive plethysmograph, and RQA applied to calculate radius, determinism and laminarity. Time-series statistics and spectral analysis variables were also calculated. A nested cross-validation method was used to identify the optimal feature subset, and to train and evaluate a linear discriminant analysis-based classifier. The RQA features radius and laminarity were reliably selected. Mean agreement was 79.7, 84.9, 84.0 and 79.2% at 2 weeks, 3, 6 and 12 months, and the classifier performed better than a comparison classifier not including RQA variables. The performance of this sleep-staging tool compares favourably with inter-human agreement rates, and improves upon previous systems using only respiratory data. Applications include diagnostic screening and population-based sleep research.
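
    A minimal sketch of two RQA quantities (recurrence rate and determinism) computed from an inter-breath-interval series; the embedding dimension, delay, radius, and minimum line length are illustrative, not the study's tuned values:

```python
# Sketch: recurrence matrix and basic RQA measures from a 1D series.
import numpy as np

def rqa(x, dim=3, tau=1, radius=0.2, lmin=2):
    # time-delay embedding
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    R = (dist < radius).astype(int)
    rr = R.mean()                                 # recurrence rate
    # determinism: fraction of recurrent points on diagonals of length >= lmin
    diag_pts = 0
    for k in range(-(n - 1), n):
        d = np.diagonal(R, k)
        run = 0
        for v in np.append(d, 0):                 # sentinel flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_pts += run
                run = 0
    det = diag_pts / max(R.sum(), 1)
    return rr, det

ibi = np.sin(np.linspace(0, 20, 300)) + np.random.default_rng(8).normal(0, 0.1, 300)
print("recurrence rate, determinism:", rqa(ibi))
```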

  1. Automatic Condition Monitoring of Industrial Rolling-Element Bearings Using Motor’s Vibration and Current Analysis

    Directory of Open Access Journals (Sweden)

    Zhenyu Yang

    2015-01-01

    Full Text Available An automatic condition monitoring method for a class of industrial rolling-element bearings is developed based on vibration as well as stator current analysis. The considered fault scenarios include a single-point defect, multiple-point defects, and a type of distributed defect. Motivated by potential commercialization, the developed system mainly uses off-the-shelf techniques, that is, the high-frequency resonance technique with envelope detection and the average of the short-time Fourier transform. In order to test flexibility and robustness, the monitoring performance is extensively studied under diverse operating conditions: different sensor locations, motor speeds, loading conditions, and data samples from different time segments. The experimental results showed the powerful capability of vibration analysis in bearing point-defect fault diagnosis. The current analysis also showed a moderate diagnostic capability for point-defect faults, depending on the type of fault, the severity of the fault, and the operational condition. The temporal feature indicated the feasibility of detecting generalized-roughness faults. Practical issues, such as deviations of predicted characteristic frequencies, sideband effects, time-averaging of spectra, and the selection of fault indices and thresholds, are also discussed. The experimental work shows strong potential for using these relatively simple methods for successful diagnosis of industrial bearing systems.
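
    A minimal sketch of the high-frequency resonance technique with envelope detection; the sampling rate, resonance band, and synthetic defect signal are assumptions:

```python
# Sketch: band-pass around a structural resonance, take the Hilbert
# envelope, and inspect its spectrum for bearing defect frequencies.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 20_000                                    # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
bpfo = 87.0                                    # outer-race defect frequency, Hz (assumed)
carrier = np.sin(2 * np.pi * 4000 * t)         # structural resonance
impacts = (np.sin(2 * np.pi * bpfo * t) > 0.99).astype(float)
signal = impacts * carrier + 0.1 * np.random.default_rng(9).normal(size=t.size)

b, a = butter(4, [3000, 5000], btype="band", fs=fs)
env = np.abs(hilbert(filtfilt(b, a, signal)))  # envelope of the resonance band

spec = np.abs(np.fft.rfft(env - env.mean()))
freqs = np.fft.rfftfreq(env.size, 1 / fs)
band = (freqs > 10) & (freqs < 300)
print("envelope-spectrum peak (Hz):", freqs[band][np.argmax(spec[band])])
```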

  2. Independent component analysis-based algorithm for automatic identification of Raman spectra applied to artistic pigments and pigment mixtures.

    Science.gov (United States)

    González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio

    2015-03-01

    A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
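
    A minimal sketch of the unmixing step with FastICA (which performs the PCA whitening stage internally), scoring recovered sources against a reference library by correlation; the Gaussian-band "pigment" spectra and mixing ratios are synthetic assumptions:

```python
# Sketch: blind separation of a two-component Raman mixture, followed by
# correlation-based matching to a reference library.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(10)
x = np.linspace(0, 2000, 1000)                        # Raman shift axis
band = lambda c, w: np.exp(-0.5 * ((x - c) / w) ** 2)
ref = {"pigmentA": band(450, 15) + band(1300, 20), "pigmentB": band(800, 10)}

mix = np.array([[0.7, 0.3], [0.2, 0.8], [0.5, 0.5]])  # three measured mixtures
X = mix @ np.array(list(ref.values())) + rng.normal(0, 0.01, (3, x.size))

sources = FastICA(n_components=2, random_state=0).fit_transform(X.T).T
for s in sources:
    scores = {k: abs(np.corrcoef(s, v)[0, 1]) for k, v in ref.items()}
    print("best match:", max(scores, key=scores.get), scores)
```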

  3. Automatic NAA. Saturation activities

    International Nuclear Information System (INIS)

    Westphal, G.P.; Grass, F.; Kuhnert, M.

    2008-01-01

    A system for Automatic NAA is based on a list of specific saturation activities determined for one irradiation position at a given neutron flux and a single detector geometry. Originally compiled from measurements of standard reference materials, the list may be extended also by the calculation of saturation activities from k0 and Q0 factors, and f and α values of the irradiation position. A systematic improvement of the SRM approach is currently being performed by pseudo-cyclic activation analysis, to reduce counting errors. From these measurements, the list of saturation activities is recalculated in an automatic procedure. (author)

  4. Clarifying Inconclusive Functional Analysis Results: Assessment and Treatment of Automatically Reinforced Aggression

    Science.gov (United States)

    Saini, Valdeep; Greer, Brian D.; Fisher, Wayne W.

    2016-01-01

    We conducted a series of studies in which multiple strategies were used to clarify the inconclusive results of one boy’s functional analysis of aggression. Specifically, we (a) evaluated individual response topographies to determine the composition of aggregated response rates, (b) conducted a separate functional analysis of aggression after high rates of disruption masked the consequences maintaining aggression during the initial functional analysis, (c) modified the experimental design used during the functional analysis of aggression to improve discrimination and decrease interaction effects between conditions, and (d) evaluated a treatment matched to the reinforcer hypothesized to maintain aggression. An effective yet practical intervention for aggression was developed based on the results of these analyses and from data collected during the matched-treatment evaluation. PMID:25891269

  5. Analysis of DGNB-DK criteria for BIM-based Model Checking automatization

    DEFF Research Database (Denmark)

    Gade, Peter Nørkjær; Svidt, Kjeld; Jensen, Rasmus Lund

    This report includes the results of an analysis of the automation potential of the Danish edition of the building sustainability assessment method Deutsche Gesellschaft für Nachhaltiges Bauen (DGNB) for office buildings, version 2014 1.1. The analysis investigates the criteria related to DGNB-DK and whether they would be suited for automation through the technological concept BIM-based Model Checking (BMC).

  6. SplitRacer - a semi-automatic tool for the analysis and interpretation of teleseismic shear-wave splitting

    Science.gov (United States)

    Reiss, Miriam Christina; Rümpker, Georg

    2017-04-01

    We present a semi-automatic, graphical-user-interface tool for the analysis and interpretation of teleseismic shear-wave splitting in MATLAB. Shear-wave splitting analysis is a standard tool to infer seismic anisotropy, which is often interpreted as due to lattice-preferred orientation of e.g. mantle minerals or shape-preferred orientation caused by cracks or alternating layers in the lithosphere, and hence provides a direct link to the earth's kinematic processes. The increasing number of permanent stations and temporary experiments results in comprehensive studies of seismic anisotropy world-wide. Their successive comparison with a growing number of global models of mantle flow further advances our understanding of the earth's interior. However, increasingly large data sets pose the inevitable question as to how to process them. Well-established routines and programs are accurate but often slow and impractical for analyzing a large amount of data. Additionally, shear-wave splitting results are seldom evaluated using the same quality criteria, which complicates a straightforward comparison. SplitRacer consists of several processing steps: i) download of data per FDSNWS; ii) direct reading of miniSEED files and an initial screening and categorizing of XKS waveforms using a pre-set SNR threshold; iii) an analysis of the particle motion of selected phases and successive correction of the sensor misalignment based on the long axis of the particle motion; iv) splitting analysis of selected events: seismograms are first rotated into radial and transverse components, then the energy-minimization method is applied, which provides the polarization and delay time of the phase. To estimate errors, the analysis is done for different randomly-chosen time windows; v) joint splitting analysis for all events at one station, where the energy content of all phases is inverted simultaneously. This decreases the influence of noise and increases the robustness of the measurement
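
    A minimal sketch of the energy-minimization step: grid-search the fast-axis angle and delay time that minimize transverse energy after undoing a trial splitting. The synthetic wavelet, polarization, and grid spacing are assumptions, not SplitRacer's internals:

```python
# Sketch: shear-wave splitting parameter estimation by transverse-energy
# minimization on synthetic two-component data.
import numpy as np

fs = 20.0                                            # samples per second (assumed)
t = np.arange(0, 30, 1 / fs)
wavelet = np.exp(-((t - 15) ** 2) / 2.0) * np.sin(2 * np.pi * 0.2 * t)

def split(n_e, phi_deg, dt_s):
    # rotate N/E into fast/slow, delay the slow trace, rotate back
    phi = np.radians(phi_deg)
    rot = np.array([[np.cos(phi), np.sin(phi)], [-np.sin(phi), np.cos(phi)]])
    fast, slow = rot @ n_e
    slow = np.roll(slow, int(round(dt_s * fs)))
    return rot.T @ np.vstack([fast, slow])

pol = np.array([1.0, 0.3]); pol /= np.linalg.norm(pol)   # incoming polarization
trans = np.array([-pol[1], pol[0]])                      # transverse direction
n_e = np.vstack([pol[0] * wavelet, pol[1] * wavelet])
observed = split(n_e, phi_deg=40, dt_s=1.2)              # anisotropy applied

def trans_energy(phi, dt):
    corrected = split(observed, phi, -dt)                # undo the trial splitting
    return np.sum((trans @ corrected) ** 2)

best = min(((phi, dt) for phi in range(-90, 90, 2)
            for dt in np.arange(0.0, 4.0, 0.05)),
           key=lambda p: trans_energy(*p))
print("fast axis (deg), delay time (s):", best)
```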

  7. MSL: Facilitating automatic and physical analysis of published scientific literature in PDF format.

    Science.gov (United States)

    Ahmed, Zeeshan; Dandekar, Thomas

    2015-01-01

    Published scientific literature contains millions of figures, including information about the results obtained from different scientific experiments, e.g. PCR-ELISA data, microarray analysis, gel electrophoresis, mass spectrometry data, DNA/RNA sequencing, diagnostic imaging (CT/MRI and ultrasound scans), and medical imaging like electroencephalography (EEG), magnetoencephalography (MEG), electrocardiography (ECG), and positron-emission tomography (PET) images. The importance of biomedical figures has been widely recognized in the scientific and medical communities, as they play a vital role in providing major original data and experimental and computational results in concise form. One major challenge in implementing a system for scientific literature analysis is extracting and analyzing text and figures from published PDF files by physical and logical document analysis. Here we present a product-line-architecture-based bioinformatics tool, 'Mining Scientific Literature (MSL)', which supports the extraction of text and images by interpreting all kinds of published PDF files using advanced data mining and image processing techniques. It provides modules for the marginalization of extracted text based on different coordinates and keywords, visualization of extracted figures, and extraction of embedded text from all kinds of biological and biomedical figures using applied Optical Character Recognition (OCR). Moreover, for further analysis and usage, it generates the system's output in different formats, including text, PDF, XML and image files. Hence, MSL is an easy-to-install and easy-to-use analysis tool to interpret published scientific literature in PDF format.

  8. Vol 41 No 2

    African Journals Online (AJOL)

    Esem

    3 Centre for Primary Care Research. Medical Journal of Zambia, Vol. 41, No. 2: 59 - 64 (2014) ... pollutants by inhaling second-hand tobacco smoke are at risk of adverse health ..... To put the measured PM levels into perspective, ...

  9. Chi-squared Automatic Interaction Detection Decision Tree Analysis of Risk Factors for Infant Anemia in Beijing, China.

    Science.gov (United States)

    Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin

    2016-05-20

    In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of the population in recent years, available information on infant anemia is inconclusive in large cities of China, especially regarding comparisons between native residents and the floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6-12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. The prevalence of anemia was 12.60%, with a range of 3.47%-40.00% across different subgroup characteristics. The CHAID decision tree model demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model, namely maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population, the CHAID decision tree analysis also identified a fourth risk factor, the maternal educational level, with higher overall classification accuracy and a larger area below the receiver operating characteristic curve. The infant anemic status in a metropolis is complex and should be carefully considered by basic health care practitioners. CHAID decision tree analysis demonstrated better performance in the hierarchical analysis of populations with great heterogeneity. Risk factors identified by this study might be meaningful for the early detection and prompt treatment of infant anemia in large cities.
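
    A minimal sketch of a hierarchical decision tree over the four reported risk factors, using scikit-learn's CART tree as a stand-in for CHAID (CHAID itself is not in scikit-learn; dedicated packages exist); the data are simulated, not the Beijing survey:

```python
# Sketch: decision-tree splits over simulated infant-anemia risk factors.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(11)
n = 1091
X = np.column_stack([
    rng.integers(0, 2, n),   # maternal anemia during pregnancy
    rng.integers(0, 2, n),   # exclusive breastfeeding in first 6 months
    rng.integers(0, 2, n),   # floating population
    rng.integers(0, 4, n),   # maternal educational level
])
logit = 0.9 * X[:, 0] + 0.5 * X[:, 1] + 0.6 * X[:, 2] - 0.3 * X[:, 3] - 1.8
y = rng.random(n) < 1 / (1 + np.exp(-logit))          # anemia yes/no (simulated)

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X, y)
print(export_text(tree, feature_names=[
    "maternal_anemia", "excl_breastfeeding", "floating_pop", "edu_level"]))
```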

  10. Numerical analysis of resonances induced by s wave neutrons in transmission time-of-flight experiments with a computer IBM 7094 II

    Energy Technology Data Exchange (ETDEWEB)

    Corge, Ch [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1969-01-01

    Numerical analysis of transmission resonances induced by s wave neutrons in time-of-flight experiments can be achieved in a fairly automatic way on an IBM 7094/II computer. The computations involved are carried out following a four-step scheme: 1 - experimental raw data are processed to obtain the resonant transmissions; 2 - values of experimental quantities for each resonance are derived from the above transmissions; 3 - resonance parameters are determined using a least-squares method to solve the overdetermined system obtained by equating theoretical functions to the corresponding experimental values (four analysis methods are gathered in the same code); 4 - graphical control of the results is performed. (author)

  12. Automatic mechanical fault assessment of small wind energy systems in microgrids using electric signature analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Marhadi, Kun Saptohartyadi; Jensen, Bogi Bech

    2013-01-01

    of islanded operation. In this paper, the fault assessment is achieved efficiently and consistently via electric signature analysis (ESA). In ESA, the fault-related frequency components are manifested as sidebands of the existing current and voltage time harmonics. The energy content between the fundamental, 5

  13. Exploiting automatically generated databases of traffic signs and road markings for contextual co-occurrence analysis

    Science.gov (United States)

    Hazelhoff, Lykele; Creusen, Ivo M.; Woudsma, Thomas; de With, Peter H. N.

    2015-11-01

    Combined databases of road markings and traffic signs provide a complete and full description of the present traffic legislation and instructions. Such databases contribute to efficient signage maintenance, improve navigation, and benefit autonomous driving vehicles. A system is presented for the automated creation of such combined databases, which additionally investigates the benefit of this combination for automated contextual placement analysis. This analysis involves verification of the co-occurrence of traffic signs and road markings to retrieve a list of potentially incorrectly signaled (and thus potentially unsafe) road situations. This co-occurrence verification is specifically explored for both pedestrian crossings and yield situations. Evaluations on 420 km of road have shown that individual detection of traffic signs and road markings denoting these road situations can be performed with accuracies of 98% and 85%, respectively. Combining both approaches shows that over 95% of the pedestrian crossings and give-way situations can be identified. An exploration toward additional co-occurrence analysis of signs and markings shows that inconsistently signaled situations can successfully be extracted, such that specific safety actions can be directed toward cases lacking signs or markings, while most consistently signaled situations can be omitted from this analysis.
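
    A minimal sketch of the co-occurrence verification: pair each detected marking with the nearest corresponding sign within a radius and flag unpaired detections; the coordinates and the 50 m radius are illustrative assumptions:

```python
# Sketch: flag pedestrian-crossing markings without a nearby sign, and
# vice versa, from two lists of georeferenced detections.
import numpy as np

signs = np.array([[10.0, 5.0], [200.0, 40.0]])        # crossing signs (x, y in m), assumed
markings = np.array([[12.0, 6.0], [500.0, 80.0]])     # zebra markings, assumed

def unmatched(a, b, radius=50.0):
    if len(b) == 0:
        return list(range(len(a)))
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return [i for i in range(len(a)) if d[i].min() > radius]

print("markings lacking a sign:", unmatched(markings, signs))
print("signs lacking a marking:", unmatched(signs, markings))
```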

  14. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    Science.gov (United States)

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-07-28

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
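
    The central step of such a pipeline, turning the point-cloud "stacking of point sections" into voxel elements, can be approximated in a few lines of numpy; the voxel size and occupancy rule below are illustrative choices, not the calibrated values of the paper:

        import numpy as np

        def voxelize(points, voxel_size=0.25):
            """Map an (N, 3) point cloud onto a boolean occupancy grid.

            Each occupied cell would become one hexahedral (voxel) finite element.
            """
            mins = points.min(axis=0)
            idx = np.floor((points - mins) / voxel_size).astype(int)
            grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)
            grid[tuple(idx.T)] = True
            return grid, mins

        points = np.random.rand(10000, 3) * [10.0, 8.0, 12.0]  # stand-in for a survey
        grid, origin = voxelize(points)
        print(grid.shape, int(grid.sum()), "candidate voxel elements")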

  15. Quantitative assessment of intermetallic phase precipitation in a super duplex stainless steel weld metal using automatic image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gregori, A. [AB Sandvik Steel, Sandviken (Sweden). R and D Centre; Nilsson, J.-O. [AB Sandvik Steel, R and D Centre, Sandviken (Sweden); Bonollo, F. [Univ. di Padova, DTGSI, Vicenza (Italy)

    1999-07-01

    The microstructure of weld metal of the type 25%Cr-10%Ni-4%Mo-0.28%N in both as-welded and isothermally heat treated (temperature range: 700-1050 C; time range: 10 s-72 h) conditions has been investigated. Multipass welding was performed in Ar+2%N{sub 2} atmosphere using GTAW. By means of the electron diffraction technique, {sigma}-phase and {chi}-phase were detected and investigated. {chi}-phase precipitated more readily than {sigma}-phase and was found to be a precursor to {sigma}-phase by providing suitable nucleation sites. Quantitative image analysis of ferrite and intermetallic phases was performed, as well as manual point counting (ISO 9042). Automatic image analysis was found to be more accurate. The results were used to assess the TTT-diagram with respect to intermetallic phase formation. On the basis of these results a CCT-diagram was computed, considering the intermetallic phase formation described by an Avrami type equation and adopting the additivity rule. (orig.)
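
    The kinetics invoked in the last sentence are standard: an Avrami (Johnson-Mehl-Avrami-Kolmogorov) equation describes the isothermally transformed fraction, and the additivity (Scheil) rule carries the TTT data over to continuous cooling. In generic textbook notation (the symbols are not taken from the paper):

        % Avrami equation: fraction X transformed after time t at temperature T
        X(t, T) = 1 - \exp\!\left[ -k(T)\, t^{\,n} \right]

        % Additivity rule: a cooling path T(t') reaches fraction X when the
        % accumulated fractional times sum to unity,
        \int_0^{t} \frac{dt'}{\tau\!\left( X, T(t') \right)} = 1

        % where \tau(X, T) is the isothermal time to reach fraction X at
        % temperature T (i.e., the TTT curve).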

  16. Parameter design and performance analysis of shift actuator for a two-speed automatic mechanical transmission for pure electric vehicles

    Directory of Open Access Journals (Sweden)

    Jianjun Hu

    2016-08-01

    Full Text Available Recent developments of pure electric vehicles have shown that pure electric vehicles equipped with two-speed or multi-speed gearbox possess higher energy efficiency by ensuring the drive motor operates at its peak performance range. This article presents the design, analysis, and control of a two-speed automatic mechanical transmission for pure electric vehicles. The shift actuator is based on a motor-controlled camshaft where a special geometric groove is machined, and the camshaft realizes the axial positions of the synchronizer sleeve for gear engaging, disengaging, and speed control of the drive motor. Based on the force analysis of shift process, the parameters of shift actuator and shift motor are designed. The drive motor’s torque control strategy before shifting, speed governing control strategy before engaging, shift actuator’s control strategy during gear engaging, and drive motor’s torque recovery strategy after shift process are proposed and implemented with a prototype. To validate the performance of the two-speed gearbox, a test bed was developed based on dSPACE that emulates various operation conditions. The experimental results indicate that the shift process with the proposed shift actuator and control strategy could be accomplished within 1 s under various operation conditions, with shift smoothness up to passenger car standard.
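
    The shift sequence described (torque ramp-down, synchronizer disengagement, speed synchronization of the drive motor, engagement, torque recovery) maps naturally onto a small state machine. A schematic sketch (state names, thresholds and sensor inputs are illustrative, not taken from the prototype controller):

        import enum

        class Shift(enum.Enum):
            TORQUE_DOWN = 1   # unload the drive motor before disengaging
            DISENGAGE = 2     # camshaft moves the sleeve to neutral
            SYNC_SPEED = 3    # speed governing of the drive motor
            ENGAGE = 4        # camshaft pushes the sleeve into the target gear
            TORQUE_UP = 5     # ramp torque back to the driver demand
            DONE = 6

        def step(state, motor_rpm, target_rpm, motor_torque, sleeve_in_gear):
            """One control tick of the gear-shift sequence (illustrative logic)."""
            if state is Shift.TORQUE_DOWN and abs(motor_torque) < 1.0:
                return Shift.DISENGAGE
            if state is Shift.DISENGAGE and not sleeve_in_gear:
                return Shift.SYNC_SPEED
            if state is Shift.SYNC_SPEED and abs(motor_rpm - target_rpm) < 30:
                return Shift.ENGAGE
            if state is Shift.ENGAGE and sleeve_in_gear:
                return Shift.TORQUE_UP
            if state is Shift.TORQUE_UP:
                return Shift.DONE
            return state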

  17. Big Data Analysis for Personalized Health Activities: Machine Learning Processing for Automatic Keyword Extraction Approach

    Directory of Open Access Journals (Sweden)

    Jun-Ho Huh

    2018-04-01

    Full Text Available The obese population is increasing rapidly due to changes in lifestyle and diet habits. Obesity can cause various complications and is becoming a social disease. Nonetheless, many obese patients are unaware of the medical treatments that are right for them. Although a variety of online and offline obesity management services have been introduced, they are still not enough to attract the attention of users and are of little help in solving the problem. Obesity healthcare and personalized health activities are therefore important factors. Since obesity is related to lifestyle habits, eating habits, and interests, I concluded that big data analysis of these factors could characterize the problem. Therefore, I collected big data by applying machine learning and crawling methods to unstructured citizen health data in Korea and to the search data of Naver, which is a Korean portal company, and Google, for keyword analysis for personalized health activities. The big data were visualized using text mining and word clouds. This study collected and analyzed data concerning interests related to obesity, changes of interest in obesity, and treatment articles. The analysis showed a wide range of seasonal factors across spring, summer, fall, and winter. It also visualized and completed the process of extracting keywords appropriate for the treatment of abdominal obesity and lower-body obesity. The keyword big data analysis technique for personalized health activities proposed in this paper is based on an individual’s interests, level of interest, and body type. A user interface (UI) that visualizes the big data is compatible with Android and Apple iOS, so users can see the data on the app screen. Many graphs and pictures can be seen via the menu, and the significant data values are visualized through machine learning. Therefore, I expect that the big data analysis using various keywords specific to a person will result in measures for personalized

  18. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    International Nuclear Information System (INIS)

    Wei, J; Yuan, A; Li, G

    2014-01-01

    Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload from the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully-automated 3D lung segmentation algorithm was designed and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, lung and its air component, and local volume changes at the diaphragm and chest wall to characterize breathing pattern. Segmented lung volumes in 12 patients are compared with those from a treatment planning system (TPS). Voxel conversion was introduced from CT# to other physical parameters, such as gravity-induced pressure, to create a secondary 4D image. A demon algorithm was applied in deformable image registration and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as Naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to and will be integrated with the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this and the TPS is <±2% and the time saving is by 1–2 orders of magnitude. Conclusion: A framework of 4D toolkit has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational
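
    The voxel counting used for the volume variations is simply the number of segmented voxels times the physical voxel volume. A minimal sketch (array names and voxel spacing are hypothetical):

        import numpy as np

        def volume_ml(mask, spacing_mm=(0.98, 0.98, 3.0)):
            """Volume of a binary segmentation mask in millilitres.

            mask: 3-D boolean array for one 4DCT phase; spacing_mm: voxel edge
            lengths in mm. 1 ml = 1000 mm^3.
            """
            voxel_mm3 = float(np.prod(spacing_mm))
            return mask.sum() * voxel_mm3 / 1000.0

        # Breathing pattern: lung volume variation across the 4DCT phases.
        # phases = [mask_phase_0, mask_phase_1, ...]   # hypothetical masks
        # curve = [volume_ml(m) for m in phases]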

  19. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    Energy Technology Data Exchange (ETDEWEB)

    Wei, J [City College of New York, New York, NY (United States); Yuan, A; Li, G [Memorial Sloan Kettering Cancer Center, New York, NY (United States)

    2014-06-15

    Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload from the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully-automated 3D lung segmentation algorithm was designed and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, lung and its air component, and local volume changes at the diaphragm and chest wall to characterize breathing pattern. Segmented lung volumes in 12 patients are compared with those from a treatment planning system (TPS). Voxel conversion was introduced from CT# to other physical parameters, such as gravity-induced pressure, to create a secondary 4D image. A demon algorithm was applied in deformable image registration and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as Naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to and will be integrated with the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this and the TPS is <±2% and the time saving is by 1–2 orders of magnitude. Conclusion: A framework of 4D toolkit has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational

  20. Automatic analysis of quality of images from X-ray digital flat detectors

    International Nuclear Information System (INIS)

    Le Meur, Y.

    2009-04-01

    Over the last decade, medical imaging has grown with the development of new digital imaging techniques. In the field of X-ray radiography, new detectors are progressively replacing older techniques based on film or X-ray intensifiers. These digital detectors offer a higher sensitivity and reduced overall dimensions. This work was prepared with Trixell, the world-leading company in flat detectors for medical radiography. It deals with quality control of the digital images stemming from these detectors. The high quality standards of medical imaging impose a close analysis of the defects that can appear on the images. This work describes a complete process for the quality analysis of such images. A particular focus is given to the detection of defects, using methods well adapted to our context of spatially correlated defects in a noise background. (author)

  1. Sensitivity analysis of automatic flight control systems using singular value concepts

    Science.gov (United States)

    Herrera-Vaillard, A.; Paduano, J.; Downing, D.

    1985-01-01

    A sensitivity analysis is presented that can be used to judge the impact of vehicle dynamic model variations on the relative stability of multivariable continuous closed-loop control systems. The sensitivity analysis uses and extends the singular-value concept by developing expressions for the gradients of the singular value with respect to variations in the vehicle dynamic model and the controller design. Combined with a priori estimates of the accuracy of the model, the gradients are used to identify the elements in the vehicle dynamic model and controller that could severely impact the system's relative stability. The technique is demonstrated for a yaw/roll damper stability augmentation designed for a business jet.
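
    The analytic fact underlying such gradients is the classical first-order sensitivity of a simple (non-repeated) singular value: if A = U Σ V^T and σ_i has left and right singular vectors u_i and v_i, then (this is the standard result; whether the paper's expressions reduce exactly to this form is not stated in the abstract):

        \frac{\partial \sigma_i}{\partial a_{kl}} = u_{ki}\, v_{li}
        \qquad \text{equivalently} \qquad
        \frac{\partial \sigma_i}{\partial A} = u_i v_i^{T}

        % so a model perturbation \Delta A shifts \sigma_i, to first order, by
        % \delta \sigma_i = u_i^{T} \, \Delta A \, v_i .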

  2. Automatic data acquisition and on-line analysis of trace element concentration in serum samples

    International Nuclear Information System (INIS)

    Lecomte, R.; Paradis, P.; Monaro, S.

    1978-01-01

    A completely automated system has been developed to determine the trace element concentration in biological samples by measuring charged particle induced X-rays. A CDC-3100 computer with ADC and CAMAC interface is employed to control the data collection apparatus, acquire data and perform simultaneously the analysis. The experimental set-up consists of a large square plexiglass chamber in which a commercially available 750H Kodak Carousel is suitably arranged as a computer controlled sample changer. A method of extracting trace element concentrations using reference spectra is presented and an on-line program has been developed to easily and conveniently obtain final results at the end of each run. (Auth.)

  3. A human error taxonomy and its application to an automatic method of accident analysis

    International Nuclear Information System (INIS)

    Matthews, R.H.; Winter, P.W.

    1983-01-01

    Commentary is provided on the quantification aspects of human factors analysis in risk assessment. Methods for quantifying human error in a plant environment are discussed and their application to system quantification explored. Such a programme entails consideration of the data base and a taxonomy of factors contributing to human error. A multi-levelled approach to system quantification is proposed, each level being treated differently drawing on the advantages of different techniques within the fault/event tree framework. Management, as controller of organization, planning and procedure, is assigned a dominant role. (author)

  4. SMASH - semi-automatic muscle analysis using segmentation of histology: a MATLAB application.

    Science.gov (United States)

    Smith, Lucas R; Barton, Elisabeth R

    2014-01-01

    Histological assessment of skeletal muscle tissue is commonly applied to many areas of skeletal muscle physiological research. Histological parameters including fiber distribution, fiber type, centrally nucleated fibers, and capillary density are all frequently quantified measures of skeletal muscle. These parameters reflect functional properties of muscle and undergo adaptation in many muscle diseases and injuries. While standard operating procedures have been developed to guide analysis of many of these parameters, the software to freely, efficiently, and consistently analyze them is not readily available. In order to provide this service to the muscle research community we developed an open source MATLAB script to analyze immunofluorescent muscle sections incorporating user controls for muscle histological analysis. The software consists of multiple functions designed to provide tools for the analysis selected. Initial segmentation and fiber filter functions segment the image and remove non-fiber elements based on user-defined parameters to create a fiber mask. Establishing parameters set by the user, the software outputs data on fiber size and type, centrally nucleated fibers, and other structures. These functions were evaluated on stained soleus muscle sections from 1-year-old wild-type and mdx mice, a model of Duchenne muscular dystrophy. In accordance with previously published data, fiber size was not different between groups, but mdx muscles had much higher fiber size variability. The mdx muscle had a significantly greater proportion of type I fibers, but type I fibers did not change in size relative to type II fibers. Centrally nucleated fibers were highly prevalent in mdx muscle and were significantly larger than peripherally nucleated fibers. The MATLAB code described and provided along with this manuscript is designed for image processing of skeletal muscle immunofluorescent histological sections. The program allows for semi-automated fiber detection

  5. Developing Automatic Form and Design System Using Integrated Grey Relational Analysis and Affective Engineering

    Directory of Open Access Journals (Sweden)

    Chen-Yuan Liu

    2018-01-01

    Full Text Available In the modern highly competitive marketplace and global market environment, product quality improvements that shorten development time and reduce production costs are effective methods for promoting the business competitiveness of a product over shorter lifecycles. Since the design process is the best time to control such parameters, systematically designing the processes to develop a product that more closely fits the demand requirements of the market is a key factor for developing a successful product. In this paper, a combined affective engineering method and grey relational analysis are used to develop a product design process. First, design image scale technology is used to acquire the best design criteria factors; then, affective engineering methods are used to set the relationships between customer needs and production factors. Finally, grey relational analysis is used to select the optimal design strategy. Using this systematic design method, a higher-quality product can be developed with a shorter lead time, improving business competitiveness.
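
    The grey relational step has a compact numeric form: normalize each criterion, measure each alternative's deviation from the ideal reference, and rank by the grey relational grade. A generic sketch (the candidate matrix and the distinguishing coefficient rho = 0.5 are illustrative defaults, not the paper's data):

        import numpy as np

        def grey_relational_grade(X, rho=0.5):
            """Grade of each row (design alternative) against the ideal reference.

            X: (m alternatives) x (n criteria); all criteria larger-the-better.
            """
            Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
            delta = np.abs(1.0 - Xn)                 # deviation from the reference
            dmin, dmax = delta.min(), delta.max()
            xi = (dmin + rho * dmax) / (delta + rho * dmax)  # relational coefficients
            return xi.mean(axis=1)                   # equal-weight grade per design

        X = np.array([[0.8, 30.0, 5.1],
                      [0.6, 42.0, 4.7],
                      [0.9, 25.0, 5.5]])
        print(grey_relational_grade(X))  # the highest grade marks the preferred design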

  6. Automatic analysis at the commissioning of the LHC superconducting electrical circuits

    International Nuclear Information System (INIS)

    Reymond, H.; Andreassen, O.O.; Charrondiere, C.; Rijllart, A.; Zerlauth, M.

    2012-01-01

    Since the beginning of 2010 the LHC has been operating in a routine manner, starting with a commissioning phase followed by an operation-for-physics phase. The commissioning of the superconducting electrical circuits requires rigorous test procedures before entering into operation. To maximize the beam operation time of the LHC, these tests should be done as fast as procedures allow. A full commissioning needs 12000 tests and is required after circuits have been warmed above liquid nitrogen temperature. Below this temperature, after an end-of-year break of two months, commissioning needs about 6000 tests. As the manual analysis of the tests takes a major part of the commissioning time, we automated the existing analysis tools. We present here how these LabVIEW™ applications were automated, an evaluation of the gain in commissioning time, and the reduction in experts on night shift observed during the LHC hardware commissioning campaign of 2011 compared to 2010. We end with an outlook on what can be further optimized. (authors)

  7. Automatic Analysis at the Commissioning of the LHC Superconducting Electrical Circuits

    CERN Document Server

    Reymond, H; Charrondiere, C; Rijllart, A; Zerlauth, M

    2011-01-01

    Since the beginning of 2010 the LHC has been operating in a routine manner, starting with a commissioning phase followed by an operation-for-physics phase. The commissioning of the superconducting electrical circuits requires rigorous test procedures before entering into operation. To maximize the beam operation time of the LHC, these tests should be done as fast as procedures allow. A full commissioning needs 12000 tests and is required after circuits have been warmed above liquid nitrogen temperature. Below this temperature, after an end-of-year break of two months, commissioning needs about 6000 tests. As the manual analysis of the tests takes a major part of the commissioning time, we automated the existing analysis tools. We present here how these LabVIEW™ applications were automated, an evaluation of the gain in commissioning time, and the reduction in experts on night shift observed during the LHC hardware commissioning campaign of 2011 compared to 2010. We end with an outlook on what can be further optimized.

  8. Automatized Assessment of Protective Group Reactivity: A Step Toward Big Reaction Data Analysis.

    Science.gov (United States)

    Lin, Arkadii I; Madzhidov, Timur I; Klimchuk, Olga; Nugmanov, Ramil I; Antipin, Igor S; Varnek, Alexandre

    2016-11-28

    We report a new method to assess protective group (PG) reactivity as a function of reaction conditions (catalyst, solvent) using raw reaction data. It is based on an intuitive similarity principle for chemical reactions: similar reactions proceed under similar conditions. Technically, reaction similarity can be assessed using the Condensed Graph of Reaction (CGR) approach, representing an ensemble of reactants and products as a single molecular graph, i.e., as a pseudomolecule for which molecular descriptors or fingerprints can be calculated. CGR-based in-house tools were used to process data for 142,111 catalytic hydrogenation reactions extracted from the Reaxys database. Our results reveal some contradictions with the well-known Greene's Reactivity Charts, which are based on manual expert analysis. Models developed in this study show high accuracy (ca. 90%) for predicting the optimal experimental conditions of protective group deprotection.
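
    The similarity principle needs a numeric reaction similarity to act on. The paper uses Condensed Graphs of Reaction; as a crude, openly different stand-in, one can compare the reactant sides and product sides of two reactions with Morgan fingerprints in RDKit (the averaging scheme and the example SMILES below are mine, not the authors' CGR method):

        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem

        def side_fp(smiles):
            """2048-bit Morgan fingerprint of one side of a reaction."""
            mol = Chem.MolFromSmiles(smiles)
            return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

        def reaction_similarity(rxn1, rxn2):
            """Average Tanimoto similarity of reactant sides and of product sides.

            rxn = (reactants_smiles, products_smiles); '.' joins multiple species.
            """
            t_r = DataStructs.TanimotoSimilarity(side_fp(rxn1[0]), side_fp(rxn2[0]))
            t_p = DataStructs.TanimotoSimilarity(side_fp(rxn1[1]), side_fp(rxn2[1]))
            return 0.5 * (t_r + t_p)

        r1 = ("C=CCO", "CCCO")   # allyl alcohol -> 1-propanol (hydrogenation)
        r2 = ("C=CC", "CCC")     # propene -> propane
        print(reaction_similarity(r1, r2))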

  9. Automatic neuron segmentation and neural network analysis method for phase contrast microscopy images.

    Science.gov (United States)

    Pang, Jincheng; Özkucur, Nurdan; Ren, Michael; Kaplan, David L; Levin, Michael; Miller, Eric L

    2015-11-01

    Phase Contrast Microscopy (PCM) is an important tool for the long term study of living cells. Unlike fluorescence methods which suffer from photobleaching of fluorophore or dye molecules, PCM image contrast is generated by the natural variations in optical index of refraction. Unfortunately, the same physical principles which allow for these studies give rise to complex artifacts in the raw PCM imagery. Of particular interest in this paper are neuron images where these image imperfections manifest in very different ways for the two structures of specific interest: cell bodies (somas) and dendrites. To address these challenges, we introduce a novel parametric image model using the level set framework and an associated variational approach which simultaneously restores and segments this class of images. Using this technique as the basis for an automated image analysis pipeline, results for both the synthetic and real images validate and demonstrate the advantages of our approach.
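
    The joint restoration-and-segmentation model is beyond a short sketch, but the level-set segmentation half can be approximated with the morphological Chan-Vese solver in scikit-image (a generic stand-in for the authors' variational method; the synthetic image and parameter values are illustrative):

        import numpy as np
        from skimage.segmentation import morphological_chan_vese

        # Synthetic stand-in for a restored PCM frame: bright soma on noisy background.
        img = np.zeros((128, 128))
        yy, xx = np.mgrid[:128, :128]
        img[(yy - 64) ** 2 + (xx - 64) ** 2 < 15 ** 2] = 1.0
        img += 0.2 * np.random.rand(128, 128)

        # Evolve a level set for 80 iterations from the default checkerboard init.
        mask = morphological_chan_vese(img, 80, smoothing=2)
        print(int(mask.sum()), "pixels labeled as soma")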

  10. Stiffness analysis of spring mechanism for semi automatic gripper motion of tendon driven remote manipulator

    International Nuclear Information System (INIS)

    Yu, Seung Nam; Lee, Jong Kwang

    2012-01-01

    Remote handling manipulators are widely used for performing hazardous tasks, and it is essential to ensure the reliable performance of such systems. Toward this end, tendon-driven mechanisms are adopted in such systems to reduce the weight of the distal parts of the manipulator while maintaining the handling performance. In this study, several approaches for the design of a gripper system for a tendon-driven remote handling system are introduced. Basically, this gripper has an underactuated spring mechanism that is combined with a slave manipulator triggered by a master operator. Based on the requirements of the specified tendon-driven mechanism, the connecting position of the spring system on the gripper mechanism is determined and a kinematic influence coefficient (KIC) analysis is performed. As a result, a suitable combination of components for the proper design of the target system is presented and verified.

  11. Automatic CT-based finite element model generation for temperature-based death time estimation: feasibility study and sensitivity analysis.

    Science.gov (United States)

    Schenkl, Sebastian; Muggenthaler, Holger; Hubig, Michael; Erdmann, Bodo; Weiser, Martin; Zachow, Stefan; Heinrich, Andreas; Güttler, Felix Victor; Teichgräber, Ulf; Mall, Gita

    2017-05-01

    Temperature-based death time estimation is based either on simple phenomenological models of corpse cooling or on detailed physical heat transfer models. The latter are much more complex but allow a higher accuracy of death time estimation, as in principle all relevant cooling mechanisms can be taken into account. Here, a complete workflow for finite element-based cooling simulation is presented. The following steps are demonstrated on a CT phantom: (1) computed tomography (CT) scan; (2) segmentation of the CT images for thermodynamically relevant features of individual geometries and compilation in a geometric computer-aided design (CAD) model; (3) conversion of the segmentation result into a finite element (FE) simulation model; (4) computation of the model cooling curve (MOD); (5) calculation of the cooling time (CTE). For the first time in FE-based cooling time estimation, the steps from the CT image over segmentation to FE model generation are performed semi-automatically. The cooling time calculation results are compared to cooling measurements performed on the phantoms under controlled conditions. In this context, the method is validated using a CT phantom. Some of the phantoms' thermodynamic material parameters had to be determined via independent experiments. Moreover, the impact of geometry and material parameter uncertainties on the estimated cooling time is investigated by a sensitivity analysis.

  12. An automatic flow injection analysis procedure for photometric determination of ethanol in red wine without using a chromogenic reagent.

    Science.gov (United States)

    Borges, Sivanildo S; Frizzarin, Rejane M; Reis, Boaventura F

    2006-05-01

    An automatic reagentless photometric procedure for the determination of ethanol in red wine is described. The procedure was based on a falling drop system that was implemented by employing a flow injection analysis manifold. The detection system comprised an infrared LED and a phototransistor. The experimental arrangement was designed to ensure that the wine drop grew between these devices, thus causing a decrease in the intensity of the radiation beam coming from the LED. Since ethanol content affected the size of the wine drop this feature was exploited to develop an analytical procedure for the photometric determination of ethanol in red wine without using a chromogenic reagent. In an attempt to prove the usefulness of the proposed procedure, a set of red wines were analysed. No significant difference between our results and those obtained with a reference method was observed at the 95% confidence level. Other advantages of our method were a linear response ranging from 0.17 up to 5.14 mol L(-1) (1.0 up to 30.0%) ethanol (R = 0.999); a limit of detection of 0.05 mol L(-1) (0.3%) ethanol; a relative standard deviation of 2.5% (n = 10) using typical wine sample containing 2.14 mol L(-1) (12.5%) ethanol; and a sampling rate of 50 determinations per hour.
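
    Because the drop size, and hence the transmitted-light signal, varies monotonically with ethanol content, the analytical step reduces to a calibration curve over the stated linear range. A minimal sketch (the standard concentrations and detector readings are invented for illustration):

        import numpy as np

        # Hypothetical calibration: ethanol standards (mol/L) vs. phototransistor signal.
        conc = np.array([0.17, 1.0, 2.0, 3.0, 4.0, 5.14])
        signal = np.array([0.92, 0.81, 0.68, 0.55, 0.41, 0.27])

        slope, intercept = np.polyfit(signal, conc, 1)  # fit within the linear range

        def ethanol_mol_per_l(s):
            return slope * s + intercept

        print(round(ethanol_mol_per_l(0.60), 2), "mol/L in the wine sample")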

  13. A new code for automatic detection and analysis of the lineament patterns for geophysical and geological purposes (ADALGEO)

    Science.gov (United States)

    Soto-Pinto, C.; Arellano-Baeza, A.; Sánchez, G.

    2013-08-01

    We present a new numerical method for automatic detection and analysis of changes in lineament patterns caused by seismic and volcanic activities. The method is implemented as a series of modules: (i) normalization of the image contrast, (ii) extraction of small linear features (stripes) through convolution of the image in the vicinity of each pixel with a circular mask, or through the Canny algorithm, and (iii) subsequent detection of the main lineaments using the Hough transform. We demonstrate that our code reliably detects changes in lineament patterns related to the stress evolution in the Earth's crust: specifically, a significant number of new lineaments appear approximately one month before an earthquake, while one month after the earthquake the lineament configuration returns to its initial state. Application of our software to the deformations caused by volcanic activity yields the opposite result: the number of lineaments decreases with the onset of microseismicity. This discrepancy can be explained by assuming that plate-tectonic earthquakes are caused by the compression and accumulation of stress in the Earth's crust due to the subduction of tectonic plates, whereas in the case of volcanic activity we deal with the inflation of a volcano edifice due to elevated pressure and magma intrusion and the resulting stretching of the surface.
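
    Modules (ii) and (iii) correspond directly to standard routines; with OpenCV the Canny-plus-Hough stage looks roughly as follows (the thresholds and file name are illustrative, and the code is a generic reconstruction rather than the ADALGEO source):

        import cv2
        import numpy as np

        img = cv2.imread("scene.tif", cv2.IMREAD_GRAYSCALE)  # hypothetical image tile
        img = cv2.equalizeHist(img)                          # (i) contrast normalization

        edges = cv2.Canny(img, 50, 150)                      # (ii) small linear features

        # (iii) main lineaments via the probabilistic Hough transform
        lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                                minLineLength=40, maxLineGap=5)
        print(0 if lines is None else len(lines), "lineaments detected")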

  14. Automatic moment segmentation and peak detection analysis of heart sound pattern via short-time modified Hilbert transform.

    Science.gov (United States)

    Sun, Shuping; Jiang, Zhongwei; Wang, Haibin; Fang, Yu

    2014-05-01

    This paper proposes a novel automatic method for the moment segmentation and peak detection analysis of heart sound (HS) patterns, with special attention to the characteristics of the envelopes of HS and considering the properties of the Hilbert transform (HT). The moment segmentation and peak location are accomplished in two steps. First, by applying the Viola integral waveform method in the time domain, the envelope (E(T)) of the HS signal is obtained with an emphasis on the first heart sound (S1) and the second heart sound (S2). Then, based on the characteristics of the E(T) and the properties of the HT of convex and concave functions, a novel method, the short-time modified Hilbert transform (STMHT), is proposed to automatically locate the moment segmentation and peak points for the HS by the zero crossing points of the STMHT. A fast algorithm for calculating the STMHT of E(T) can be expressed by multiplying the E(T) by an equivalent window (W(E)). According to the range of heart beats and based on the numerical experiments and the important parameters of the STMHT, a moving window width of N = 1 s is validated for locating the moment segmentation and peak points for HS. The proposed moment segmentation and peak location procedure is validated on sounds from the Michigan HS database and sounds from clinical heart diseases, such as a ventricular septal defect (VSD), an aortic septal defect (ASD), Tetralogy of Fallot (TOF), rheumatic heart disease (RHD), and so on. As a result, for the sounds where S2 can be separated from S1, the average accuracies achieved for the peak of S1 (AP₁), the peak of S2 (AP₂), the moment segmentation points from S1 to S2 (AT₁₂) and the cardiac cycle (ACC) are 98.53%, 98.31%, 98.36% and 97.37%, respectively. For the sounds where S1 cannot be separated from S2, the average accuracies achieved for the peak of S1 and S2 (AP₁₂) and the cardiac cycle (ACC) are 100% and 96.69%, respectively.
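
    The envelope-plus-transform chain can be approximated with SciPy: an analytic-signal envelope stands in for the Viola integral E(T), and ordinary peak picking stands in for the STMHT zero-crossing rule (both substitutions are mine, for illustration only):

        import numpy as np
        from scipy.signal import hilbert, find_peaks

        def hs_peaks(pcg, fs):
            """Locate candidate S1/S2 peaks in a phonocardiogram.

            Envelope via the analytic signal, smoothed over 20 ms; peaks at
            least 0.2 s apart, well inside the ~1 s beat window of the paper.
            """
            env = np.abs(hilbert(pcg))
            win = int(0.02 * fs)
            env = np.convolve(env, np.ones(win) / win, mode="same")
            peaks, _ = find_peaks(env, distance=int(0.2 * fs),
                                  height=0.3 * env.max())
            return peaks, env

        # pcg, fs = load_heart_sound()        # hypothetical loader
        # s_idx, envelope = hs_peaks(pcg, fs)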

  15. Automatic analysis of online image data for law enforcement agencies by concept detection and instance search

    Science.gov (United States)

    de Boer, Maaike H. T.; Bouma, Henri; Kruithof, Maarten C.; ter Haar, Frank B.; Fischer, Noëlle M.; Hagendoorn, Laurens K.; Joosten, Bart; Raaijmakers, Stephan

    2017-10-01

    The information available on-line and off-line, from open as well as from private sources, is growing at an exponential rate and places an increasing demand on the limited resources of Law Enforcement Agencies (LEAs). The absence of appropriate tools and techniques to collect, process, and analyze the volumes of complex and heterogeneous data has created a severe information overload. If a solution is not found, the impact on law enforcement will be dramatic, e.g. because important evidence is missed or the investigation time is too long. Furthermore, there is an uneven level of capabilities to deal with the large volumes of complex and heterogeneous data that come from multiple open and private sources at national level across the EU, which hinders cooperation and information sharing. Consequently, there is a pertinent need to develop tools, systems and processes which expedite online investigations. In this paper, we describe a suite of analysis tools to identify and localize generic concepts, instances of objects and logos in images, which constitutes a significant portion of everyday law enforcement data. We describe how incremental learning based on only a few examples and large-scale indexing are addressed in both concept detection and instance search. Our search technology allows querying of the database by visual examples and by keywords. Our tools are packaged in a Docker container to guarantee easy deployment on a system and our tools exploit possibilities provided by open source toolboxes, contributing to the technical autonomy of LEAs.

  16. Automatic classification of singular elements for the electrostatic analysis of microelectromechanical systems

    Science.gov (United States)

    Su, Y.; Ong, E. T.; Lee, K. H.

    2002-05-01

    The past decade has seen an accelerated growth of technology in the field of microelectromechanical systems (MEMS). The development of MEMS products has generated the need for efficient analytical and simulation methods for minimizing the requirement for actual prototyping. The boundary element method is widely used in the electrostatic analysis for MEMS devices. However, singular elements are needed to accurately capture the behavior at singular regions, such as sharp corners and edges, where standard elements fail to give an accurate result. The manual classification of boundary elements based on their singularity conditions is an immensely laborious task, especially when the boundary element model is large. This process can be automated by querying the geometric model of the MEMS device for convex edges based on geometric information of the model. The associated nodes of the boundary elements on these edges can then be retrieved. The whole process is implemented in the MSC/PATRAN platform using the Patran Command Language (the source code is available as supplementary data in the electronic version of this journal issue).

  17. Automatic Wave Equation Migration Velocity Analysis by Focusing Subsurface Virtual Sources

    KAUST Repository

    Sun, Bingbing

    2017-11-03

    Macro velocity model building is important for subsequent pre-stack depth migration and full waveform inversion. Wave equation migration velocity analysis (WEMVA) utilizes the band-limited waveform to invert for the velocity. Normally, inversion would be implemented by focusing the subsurface offset common image gathers (SOCIGs). We re-examine this concept with a different perspective: In subsurface offset domain, using extended Born modeling, the recorded data can be considered as invariant with respect to the perturbation of the position of the virtual sources and velocity at the same time. A linear system connecting the perturbation of the position of those virtual sources and velocity is derived and solved subsequently by Conjugate Gradient method. In theory, the perturbation of the position of the virtual sources is given by the Rytov approximation. Thus, compared to the Born approximation, it relaxes the dependency on amplitude and makes the proposed method more applicable for real data. We demonstrate the effectiveness of the approach by applying the proposed method on both isotropic and anisotropic VTI synthetic data. A real dataset example verifies the robustness of the proposed method.

  18. Elucidating the genotype–phenotype map by automatic enumeration and analysis of the phenotypic repertoire

    Science.gov (United States)

    Lomnitz, Jason G; Savageau, Michael A

    2015-01-01

    Background: The gap between genotype and phenotype is filled by complex biochemical systems most of which are poorly understood. Because these systems are complex, it is widely appreciated that quantitative understanding can only be achieved with the aid of mathematical models. However, formulating models and measuring or estimating their numerous rate constants and binding constants is daunting. Here we present a strategy for automating difficult aspects of the process. Methods: The strategy, based on a system design space methodology, is applied to a class of 16 designs for a synthetic gene oscillator that includes seven designs previously formulated on the basis of experimentally measured and estimated parameters. Results: Our strategy provides four important innovations by automating: (1) enumeration of the repertoire of qualitatively distinct phenotypes for a system; (2) generation of parameter values for any particular phenotype; (3) simultaneous realization of parameter values for several phenotypes to aid visualization of transitions from one phenotype to another, in critical cases from functional to dysfunctional; and (4) identification of ensembles of phenotypes whose expression can be phased to achieve a specific sequence of functions for rationally engineering synthetic constructs. Our strategy, applied to the 16 designs, reproduced previous results and identified two additional designs capable of sustained oscillations that were previously missed. Conclusions: Starting with a system’s relatively fixed aspects, its architectural features, our method enables automated analysis of nonlinear biochemical systems from a global perspective, without first specifying parameter values. The examples presented demonstrate the efficiency and power of this automated strategy. PMID:26998346

  19. Elucidating the genotype-phenotype map by automatic enumeration and analysis of the phenotypic repertoire.

    Science.gov (United States)

    Lomnitz, Jason G; Savageau, Michael A

    The gap between genotype and phenotype is filled by complex biochemical systems most of which are poorly understood. Because these systems are complex, it is widely appreciated that quantitative understanding can only be achieved with the aid of mathematical models. However, formulating models and measuring or estimating their numerous rate constants and binding constants is daunting. Here we present a strategy for automating difficult aspects of the process. The strategy, based on a system design space methodology, is applied to a class of 16 designs for a synthetic gene oscillator that includes seven designs previously formulated on the basis of experimentally measured and estimated parameters. Our strategy provides four important innovations by automating: (1) enumeration of the repertoire of qualitatively distinct phenotypes for a system; (2) generation of parameter values for any particular phenotype; (3) simultaneous realization of parameter values for several phenotypes to aid visualization of transitions from one phenotype to another, in critical cases from functional to dysfunctional; and (4) identification of ensembles of phenotypes whose expression can be phased to achieve a specific sequence of functions for rationally engineering synthetic constructs. Our strategy, applied to the 16 designs, reproduced previous results and identified two additional designs capable of sustained oscillations that were previously missed. Starting with a system's relatively fixed aspects, its architectural features, our method enables automated analysis of nonlinear biochemical systems from a global perspective, without first specifying parameter values. The examples presented demonstrate the efficiency and power of this automated strategy.

  20. Comparative Analysis between LDR and HDR Images for Automatic Fruit Recognition and Counting

    Directory of Open Access Journals (Sweden)

    Tatiana M. Pinho

    2017-01-01

    Full Text Available Precision agriculture is gaining increasing interest in the current farming paradigm. This new production concept relies on the use of information technology (IT) to provide a control and supervising structure that can lead to better management policies. In this framework, imaging techniques that provide visual information over the farming area play an important role in production status monitoring. As such, accurate representation of the gathered production images is a major concern, especially if those images are used in detection and classification tasks. Real scenes, observed in natural environments, present high dynamic ranges that cannot be represented by common LDR (Low Dynamic Range) devices. However, this issue can be handled by High Dynamic Range (HDR) images, since they have the ability to store luminance information similarly to the human visual system. In order to prove their advantage in image processing, a comparative analysis between LDR and HDR images for fruit detection and counting was carried out. The obtained results show that the use of HDR images improves the detection performance by more than 30% when compared to LDR.

  1. Automatic evaluation of intrapartum fetal heart rate recordings: a comprehensive analysis of useful features.

    Science.gov (United States)

    Chudáček, V; Spilka, J; Janků, P; Koucký, M; Lhotská, L; Huptych, M

    2011-08-01

    Cardiotocography is the monitoring of fetal heart rate (FHR) and uterine contractions (TOCO), used routinely since the 1960s by obstetricians to detect fetal hypoxia. The evaluation of the FHR in clinical settings is based on an evaluation of macroscopic morphological features and so far has managed to avoid adopting any achievements from the HRV research field. In this work, most of the features utilized for FHR characterization, including FIGO, HRV, nonlinear, wavelet, and time and frequency domain features, are investigated and assessed based on their statistical significance in the task of distinguishing the FHR into three FIGO classes. We assess the features on a large data set (552 records) and unlike in other published papers we use three-class expert evaluation of the records instead of the pH values. We conclude the paper by presenting the best uncorrelated features and their individual rank of importance according to the meta-analysis of three different ranking methods. The number of accelerations and decelerations, interval index, as well as Lempel-Ziv complexity and Higuchi's fractal dimension are among the top five features.
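
    Of the top-ranked features, Higuchi's fractal dimension is the least standard but short to implement. A direct transcription of Higuchi's definition (kmax = 10 is a common but arbitrary choice):

        import numpy as np

        def higuchi_fd(x, kmax=10):
            """Higuchi's fractal dimension of a 1-D series x."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            log_lk, log_inv_k = [], []
            for k in range(1, kmax + 1):
                lengths = []
                for m in range(k):
                    idx = np.arange(m, n, k)
                    if len(idx) < 2:
                        continue
                    # normalized curve length of the subsampled series
                    l = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((len(idx) - 1) * k)
                    lengths.append(l / k)
                log_lk.append(np.log(np.mean(lengths)))
                log_inv_k.append(np.log(1.0 / k))
            # the fractal dimension is the slope of log L(k) vs. log(1/k)
            return np.polyfit(log_inv_k, log_lk, 1)[0]

        print(higuchi_fd(np.random.randn(1000)))  # white noise: close to 2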

  2. Assessment of features for automatic CTG analysis based on expert annotation.

    Science.gov (United States)

    Chudáček, Václav; Spilka, Jiří; Lhotská, Lenka; Janků, Petr; Koucký, Michal; Huptych, Michal; Burša, Miroslav

    2011-01-01

    Cardiotocography (CTG) is the monitoring of fetal heart rate (FHR) and uterine contractions (TOCO), used routinely since the 1960s by obstetricians to detect fetal hypoxia. The evaluation of the FHR in clinical settings is based on an evaluation of macroscopic morphological features and so far has managed to avoid adopting any achievements from the HRV research field. In this work, most of the features ever used for FHR characterization, including FIGO, HRV, nonlinear, wavelet, and time and frequency domain features, are investigated and assessed based on their statistical significance in the task of distinguishing the FHR into three FIGO classes. Annotation derived from a panel of experts, instead of the commonly utilized pH values, was used for evaluation of the features on a large data set (552 records). We conclude the paper by presenting the best uncorrelated features and their individual ranks of importance according to the meta-analysis of three different ranking methods. The number of accelerations and decelerations, the interval index, as well as Lempel-Ziv complexity and Higuchi's fractal dimension are among the top five features.

  3. Automatic evaluation of intrapartum fetal heart rate recordings: a comprehensive analysis of useful features

    International Nuclear Information System (INIS)

    Chudáček, V; Spilka, J; Lhotská, L; Huptych, M; Janků, P; Koucký, M

    2011-01-01

    Cardiotocography is the monitoring of fetal heart rate (FHR) and uterine contractions (TOCO), used routinely since the 1960s by obstetricians to detect fetal hypoxia. The evaluation of the FHR in clinical settings is based on an evaluation of macroscopic morphological features and so far has managed to avoid adopting any achievements from the HRV research field. In this work, most of the features utilized for FHR characterization, including FIGO, HRV, nonlinear, wavelet, and time and frequency domain features, are investigated and assessed based on their statistical significance in the task of distinguishing the FHR into three FIGO classes. We assess the features on a large data set (552 records) and unlike in other published papers we use three-class expert evaluation of the records instead of the pH values. We conclude the paper by presenting the best uncorrelated features and their individual rank of importance according to the meta-analysis of three different ranking methods. The number of accelerations and decelerations, interval index, as well as Lempel–Ziv complexity and Higuchi's fractal dimension are among the top five features

  4. Automatic Wave Equation Migration Velocity Analysis by Focusing Subsurface Virtual Sources

    KAUST Repository

    Sun, Bingbing; Alkhalifah, Tariq Ali

    2017-01-01

    Macro velocity model building is important for subsequent pre-stack depth migration and full waveform inversion. Wave equation migration velocity analysis (WEMVA) utilizes the band-limited waveform to invert for the velocity. Normally, inversion would be implemented by focusing the subsurface offset common image gathers (SOCIGs). We re-examine this concept with a different perspective: In subsurface offset domain, using extended Born modeling, the recorded data can be considered as invariant with respect to the perturbation of the position of the virtual sources and velocity at the same time. A linear system connecting the perturbation of the position of those virtual sources and velocity is derived and solved subsequently by Conjugate Gradient method. In theory, the perturbation of the position of the virtual sources is given by the Rytov approximation. Thus, compared to the Born approximation, it relaxes the dependency on amplitude and makes the proposed method more applicable for real data. We demonstrate the effectiveness of the approach by applying the proposed method on both isotropic and anisotropic VTI synthetic data. A real dataset example verifies the robustness of the proposed method.

  5. Genetic analysis of seasonal runoff based on automatic techniques of hydrometeorological data processing

    Science.gov (United States)

    Kireeva, Maria; Sazonov, Alexey; Rets, Ekaterina; Ezerova, Natalia; Frolova, Natalia; Samsonov, Timofey

    2017-04-01

    Detection of a river's feeding type is a complex and multifactor task. Such partitioning should be based, on the one hand, on the genesis of the feeding water and, on the other hand, on its physical path. At the same time it should consider the relationship of the feeding type with the corresponding phase of the water regime. Due to the above difficulties and the complexity of the approach, there are many different variants of separating the flow hydrograph into feeding types. The most common method is extraction of the so-called basic component, which in one way or another reflects the groundwater feeding of the river. In this case, the selection is most often based on the principle of local minima or on graphic separation of this component. However, neither the origin of the water nor the corresponding phase of the water regime is considered. In this paper, the authors offer a method for complex automated analysis of the genetic components of a river's feeding together with the separation of specific phases of the water regime. The objects of the study are medium and large rivers of European Russia having a pronounced spring flood, formed by melt water, and summer-autumn and winter low water which is periodically interrupted by rain or thaw flooding. The method is based on the genetic hydrograph separation proposed in the 1960s by B. I. Kudelin. This technique is considered for large rivers having a hydraulic connection with groundwater horizons during floods. For better detection of flood genesis, the analysis involves reanalysis data on temperature and precipitation. Separation is based on the following fundamental graphic-analytical principles:
    • Ground feeding during the passage of the flood peak tends to zero.
    • The beginning of a flood is determined as the exceedance of a critical low-water discharge.
    • Flood periods are determined on the basis of exceeding the critical low-water discharge; they are attributed to thaw in the case of above-zero temperatures.
    • During thaw and rain floods

  6. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Mevludin Memedi

    2015-09-01

    well as had good test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  7. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson's Disease.

    Science.gov (United States)

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-09-17

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  8. Is automatic CPAP titration as effective as manual CPAP titration in OSAHS patients? A meta-analysis.

    Science.gov (United States)

    Gao, Weijie; Jin, Yinghui; Wang, Yan; Sun, Mei; Chen, Baoyuan; Zhou, Ning; Deng, Yuan

    2012-06-01

    It is costly and time-consuming to conduct the standard manual titration to identify an effective pressure before continuous positive airway pressure (CPAP) treatment for obstructive sleep apnea (OSA) patients. Automatic titration is cheaper and more easily available than manual titration. The purpose of this systematic review was to evaluate the effect of automatic titration in identifying a pressure and on the improvement of the apnea/hypopnea index (AHI) and somnolence, the change of sleep quality, and the acceptance and compliance of CPAP treatment, compared with manual titration. A systematic search was made of the PubMed, EMBASE, Cochrane Library, SCI, China Academic Journals Full-text Databases, Chinese Biomedical Literature Database, Chinese Scientific Journals Databases and Chinese Medical Association Journals. Randomized controlled trials comparing automatic titration and manual titration were reviewed. Studies were pooled to yield odds ratios (OR) or mean differences (MD) with 95% confidence intervals (CI). Ten trials involving 849 patients met the inclusion criteria. It is hard to identify a trend in the pressures determined by either automatic or manual titration. Automatic titration can improve the AHI (MD = 0.03/h, 95% CI = -4.48 to 4.53) and the Epworth sleepiness scale (SMD = -0.02, 95% CI = -0.34 to 0.31) as effectively as manual titration. There is no difference in sleep architecture under automatic titration or manual titration. The acceptance of CPAP treatment (OR = 0.96, 95% CI = 0.60 to 1.55) and the compliance with treatment (MD = -0.04, 95% CI = -0.17 to 0.10) after automatic titration are not different from manual titration. Automatic titration is as effective as standard manual titration in improving AHI and somnolence while maintaining sleep quality similar to the standard method. In addition, automatic titration has the same effect on the acceptance and compliance of CPAP treatment as manual titration. With the potential advantage

  9. Method for optical 15N analysis of small amounts of nitrogen gas released from an automatic nitrogen analyzer

    International Nuclear Information System (INIS)

    Arima, Yasuhiro

    1981-01-01

    A method of optical 15N analysis is proposed for application to small amounts of nitrogen gas released from an automatic nitrogen analyzer (model ANA-1300, Carlo Erba, Milano) subjected to certain set modifications. The ANA-1300 was combined with a vacuum line attached to a molecular sieve 13X column. The nitrogen gas released from the ANA-1300 was introduced with a helium carrier gas into the molecular sieve column, which was pre-evacuated at 10^-4 Torr and cooled with outer liquid nitrogen. After removal of the helium by evacuation, the nitrogen gas fixed on the molecular sieve was released by warming the column, and then it was sealed into pre-evacuated Pyrex glass tubes at 4.5-5.0 Torr. In the preparation of discharge tubes, contamination with unlabelled nitrogen occurred from the standard-grade helium carrier gas, and the relative lowering of the 15N value by it was estimated to be less than 1% when over 700 μg nitrogen was charged on the ANA-1300; when 200 μg nitrogen was charged, it was about 3.5%. However, the effect of the contamination could be corrected for by knowing the amount of contaminant nitrogen. In the analysis of plant materials by the proposed method, the coefficient of variation was less than 2%, and no significant difference was observed between results given by the present method and by the ordinary method in which samples were directly pyrolyzed in the discharge tubes by the Dumas method. The present method revealed about 1.5 μg of cross-contaminated nitrogen and was applicable to more than 200 μg of sample nitrogen. (author)
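
    The contamination correction mentioned is ordinary two-component mixing: if the measured atom fraction x_m arises from an amount N_s of sample nitrogen (true fraction x_s) diluted by an amount N_c of contaminant nitrogen at natural abundance x_c, then (my notation, not the paper's):

        x_m = \frac{x_s N_s + x_c N_c}{N_s + N_c}
        \quad \Longrightarrow \quad
        x_s = \frac{x_m (N_s + N_c) - x_c N_c}{N_s}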

  10. Automatic structural scene digitalization.

    Science.gov (United States)

    Tang, Rui; Wang, Yuhan; Cosker, Darren; Li, Wenbin

    2017-01-01

    In this paper, we present an automatic system for the analysis and labeling of structural scenes, i.e., floor plan drawings in Computer-Aided Design (CAD) format. The proposed system applies a fusion strategy to detect and recognize the various components of CAD floor plans, such as walls, doors, windows and other ambiguous assets. Technically, a general rule-based filter parsing method is first adopted to extract effective information from the original floor plan. Then, an image-processing-based recovery method is employed to correct the information extracted in the first step. Our proposed method is fully automatic and real-time. The analysis system provides high accuracy and has been evaluated on a public website that, on average, records more than ten thousand effective uses per day and reaches a relatively high satisfaction rate.

  11. Advances in boundary elements. Vol. 1-3

    International Nuclear Information System (INIS)

    Brebbia, C.A.; Connor, J.J.

    1989-01-01

    This book contains some of the edited papers presented at the 11th Boundary Element Conference, held in Cambridge, Massachusetts, during August 1989. The papers are arranged in three different books comprising the following topics: Vol. 1: Computations and Fundamentals - comprises sections on fundamentals, adaptive techniques, error and convergence, numerical methods and computational aspects. (283 p.). Vol. 2: Field and fluid flow solutions - includes the following topics: potential problems, thermal studies, electrical and electromagnetic problems, wave propagation, acoustics and fluid flow. (484 p.). Vol. 3: Stress analysis - deals with advances in linear problems, nonlinear problems, fracture mechanics, contact mechanics, optimization, geomechanics, plates and shells, vibrations and industrial applications. (450 p). (orig./HP)

  12. Automatic generation and simulation of urban building energy models based on city datasets for city-scale building retrofit analysis

    International Nuclear Information System (INIS)

    Chen, Yixing; Hong, Tianzhen; Piette, Mary Ann

    2017-01-01

    Highlights:
    • Developed methods and used data models to integrate city’s public building records.
    • Shading from neighborhood buildings strongly influences urban building performance.
    • A case study demonstrated the workflow, simulation and analysis of building retrofits.
    • CityBES retrofit analysis feature provides actionable information for decision making.
    • Discussed significance and challenges of urban building energy modeling.
    Abstract: Buildings in cities consume 30–70% of total primary energy, and improving building energy efficiency is one of the key strategies towards sustainable urbanization. Urban building energy models (UBEM) can support city managers to evaluate and prioritize energy conservation measures (ECMs) for investment and the design of incentive and rebate programs. This paper presents the retrofit analysis feature of City Building Energy Saver (CityBES) to automatically generate and simulate UBEM using EnergyPlus based on cities’ building datasets and user-selected ECMs. CityBES is a new open web-based tool to support city-scale building energy efficiency strategic plans and programs. The technical details of using CityBES for UBEM generation and simulation are introduced, including the workflow, key assumptions, and major databases. Also presented is a case study that analyzes the potential retrofit energy use and energy cost savings of five individual ECMs and two measure packages for 940 office and retail buildings in six city districts in northeast San Francisco, United States. The results show that: (1) all five measures together can save 23–38% of site energy per building; (2) replacing lighting with light-emitting diode lamps and adding air economizers to existing heating, ventilation and air-conditioning (HVAC) systems are most cost-effective with an average payback of 2.0 and 4.3 years, respectively; and (3) it is not economical to upgrade HVAC systems or replace windows in San Francisco due to the city’s mild
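
    At its core, the batch side of such a workflow is one EnergyPlus run per generated building model. A minimal driver sketch (the directory layout, file names, and the assumption that the energyplus executable is on PATH are all mine; the real CityBES pipeline is a web service):

        import pathlib
        import subprocess

        models = pathlib.Path("ubem_idf")   # hypothetical folder of generated IDF files
        weather = "SF_TMY3.epw"             # hypothetical weather file

        for idf in sorted(models.glob("*.idf")):
            out = pathlib.Path("results") / idf.stem
            out.mkdir(parents=True, exist_ok=True)
            # Standard EnergyPlus CLI: -w weather file, -d output dir, then the model.
            subprocess.run(["energyplus", "-w", weather, "-d", str(out), str(idf)],
                           check=True)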

  13. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    Science.gov (United States)

    Serata, S.

    2006-12-01

    The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. The development work carried out over the past 25 years has established the Stressmeter as an automatic stress measurement system for studying the timing of forthcoming major earthquakes, in support of current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area over a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress condition of the mine, the direct cause of the earthquakes was disclosed, and the cause was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development and the Stressmeter was introduced into the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of maximum and minimum lateral stresses in excellent agreement with the theoretical value, i.e., G = 0.25. All the conventional methods of overcoring, hydrofracturing and deformation, which were introduced to compete with the Serata method, failed, demonstrating the fundamental difficulties of the conventional methods. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be the same as in all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained by the major international stress measurement work in the Hot Dry Rock Projects conducted in the USA, England and Germany are in good agreement with the Stressmeter results obtained in Japan. Based on this broad agreement, a solid geomechanical

  14. Study of medical isotope production facility stack emissions and noble gas isotopic signature using automatic gamma-spectra analysis platform

    Science.gov (United States)

    Zhang, Weihua; Hoffmann, Emmy; Ungar, Kurt; Dolinar, George; Miley, Harry; Mekarski, Pawel; Schrom, Brian; Hoffman, Ian; Lawrie, Ryan; Loosz, Tom

    2013-04-01

    The nuclear industry emissions of the four CTBT (Comprehensive Nuclear-Test-Ban Treaty) relevant radioxenon isotopes are unavoidably detected by the IMS along with possible treaty violations. Another civil source of radioxenon emissions which contributes to the global background is radiopharmaceutical production companies. To better understand the source terms of these background emissions, a joint project between HC, ANSTO, PNNL and CRL was formed to install real-time detection systems to support 135Xe, 133Xe, 131mXe and 133mXe measurements at the ANSTO and CRL 99Mo production facility stacks as well as the CANDU (CANada Deuterium Uranium) primary coolant monitoring system at CRL. At each site, high resolution gamma spectra were collected every 15 minutes using a HPGe detector to continuously monitor a bypass feed from the stack or CANDU primary coolant system as it passed through a sampling cell. HC also conducted atmospheric monitoring for radioxenon at approximately 200 km distant from CRL. A program was written to transfer each spectrum into a text file format suitable for the automatic gamma-spectra analysis platform and then email the file to a server. Once the email was received by the server, it was automatically analysed with the gamma-spectrum software UniSampo/Shaman to perform radionuclide identification and activity calculation for a large number of gamma-spectra in a short period of time (less than 10 seconds per spectrum). The results of nuclide activity together with other spectrum parameters were saved into the Linssi database. This database contains a large amount of radionuclide information which is a valuable resource for the analysis of radionuclide distribution within the noble gas fission product emissions. The results could be useful to identify the specific mechanisms of the activity release. The isotopic signatures of the various radioxenon species can be determined as a function of release time. Comparison of 133mXe and 133Xe activity
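
    The transfer step described (write each 15-minute spectrum to a text file, then email it to the analysis server) can be sketched with the Python standard library alone; host names, addresses and the one-count-per-line format below are placeholders, and the UniSampo/Shaman analysis side is not reproduced:

```python
# Spectrum hand-off: serialize channel counts to text and email the file.
import smtplib
from email.message import EmailMessage
from pathlib import Path

def spectrum_to_text(channels, outfile):
    # one channel count per line -- a deliberately simple stand-in for
    # whatever format the analysis platform actually expects
    Path(outfile).write_text("\n".join(str(c) for c in channels))

def email_spectrum(outfile, smtp_host="smtp.example.org",
                   sender="station@example.org",
                   recipient="server@example.org"):
    msg = EmailMessage()
    msg["Subject"] = f"spectrum {Path(outfile).name}"
    msg["From"], msg["To"] = sender, recipient
    msg.set_content("Automated spectrum transfer.")
    msg.add_attachment(Path(outfile).read_bytes(),
                       maintype="text", subtype="plain",
                       filename=Path(outfile).name)
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)
```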

  15. Automatic Functional Harmonic Analysis

    NARCIS (Netherlands)

    de Haas, W.B.; Magalhães, J.P.; Wiering, F.; Veltkamp, R.C.

    2013-01-01

    Music scholars have been studying tonal harmony intensively for centuries, yielding numerous theories and models. Unfortunately, a large number of these theories are formulated in a rather informal fashion and lack mathematical precision. In this article we present HarmTrace, a functional model of

  16. Comparison of automatic procedures in the selection of peaks over threshold in flood frequency analysis: A Canadian case study in the context of climate change

    Science.gov (United States)

    Durocher, M.; Mostofi Zadeh, S.; Burn, D. H.; Ashkar, F.

    2017-12-01

    Floods are one of the most costly hazards and frequency analysis of river discharges is an important part of the tools at our disposal to evaluate their inherent risks and to provide an adequate response. In comparison to the common examination of annual streamflow maximums, peaks over threshold (POT) is an interesting alternative that makes better use of the available information by including more than one flood event per year (on average). However, a major challenge is the selection of a satisfactory threshold above which peaks are assumed to respect certain conditions necessary for an adequate estimation of the risk. Additionally, studies have shown that POT is also a valuable approach to investigate the evolution of flood regimes in the context of climate change. Recently, automatic procedures for the selection of the threshold were suggested to guide that important choice, which otherwise relies on graphical tools and expert judgment. Furthermore, having an objective automatic procedure allows the analysis to be quickly repeated on a large number of samples, which is useful in the context of large databases or for uncertainty analysis based on a resampling approach. This study investigates the impact of considering such procedures in a case study including many sites across Canada. A simulation study is conducted to evaluate the bias and predictive power of the automatic procedures in similar conditions as well as investigating the power of derived nonstationarity tests. The results obtained are also evaluated in the light of expert judgments established in a previous study. Ultimately, this study provides a thorough examination of the considerations that need to be addressed when conducting POT analysis using automatic threshold selection.
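
    As an illustration of what such an automatic procedure can look like, here is a sketch of one common rule: scan candidate thresholds upward and keep the lowest one whose generalized Pareto fit to the exceedances is not rejected by a goodness-of-fit test. This is a generic stand-in, not any of the specific procedures compared in the study:

```python
# Automatic POT threshold selection via GPD fit plus a screening KS test.
import numpy as np
from scipy import stats

def select_threshold(peaks, candidates, alpha=0.05, min_exceedances=30):
    peaks = np.asarray(peaks, float)
    for u in np.sort(candidates):
        exc = peaks[peaks > u] - u
        if exc.size < min_exceedances:
            break                      # too few exceedances to fit reliably
        c, loc, scale = stats.genpareto.fit(exc, floc=0.0)
        # KS test with fitted parameters is only approximate (p-values
        # are conservative), but serves for a first screening
        pval = stats.kstest(exc, "genpareto", args=(c, loc, scale)).pvalue
        if pval > alpha:
            return u, (c, scale)       # lowest acceptable threshold
    return None
```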

  17. A comparison of conscious and automatic memory processes for picture and word stimuli: a process dissociation analysis.

    Science.gov (United States)

    McBride, Dawn M; Anne Dosher, Barbara

    2002-09-01

    Four experiments were conducted to evaluate explanations of picture superiority effects previously found for several tasks. In a process dissociation procedure (Jacoby, 1991) with word stem completion, picture fragment completion, and category production tasks, conscious and automatic memory processes were compared for studied pictures and words with an independent retrieval model and a generate-source model. The predictions of a transfer appropriate processing account of picture superiority were tested and validated in "process pure" latent measures of conscious and unconscious, or automatic and source, memory processes. Results from both model fits verified that pictures had a conceptual (conscious/source) processing advantage over words for all tasks. The effects of perceptual (automatic/word generation) compatibility depended on task type, with pictorial tasks favoring pictures and linguistic tasks favoring words. Results show support for an explanation of the picture superiority effect that involves an interaction of encoding and retrieval processes.
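
    The core process dissociation logic can be stated compactly: with inclusion performance I = C + A(1 - C) and exclusion performance E = A(1 - C), the estimates are C = I - E and A = E/(1 - C). A sketch of the textbook Jacoby (1991) case follows; the paper itself fits richer independent-retrieval and generate-source models:

```python
# Standard process dissociation estimates: conscious (C) and automatic (A)
# components from inclusion (I) and exclusion (E) task performance.
def process_dissociation(inclusion, exclusion):
    C = inclusion - exclusion                  # I = C + A(1-C), E = A(1-C)
    A = exclusion / (1.0 - C) if C < 1.0 else float("nan")
    return C, A

C, A = process_dissociation(inclusion=0.61, exclusion=0.28)  # illustrative
print(f"conscious: {C:.2f}, automatic: {A:.2f}")
```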

  18. The IDA-80 measurement evaluation programme on mass spectrometric isotope dilution analysis of uranium and plutonium. Vol. 1

    International Nuclear Information System (INIS)

    Beyrich, W.; Golly, W.; Spannagel, G.; Kernforschungszentrum Karlsruhe G.m.b.H.; Bievre, P. de; Wolters, W.

    1984-12-01

    The main objective was the acquisition of basic data on the uncertainties involved in the mass spectrometric isotope dilution analysis as applied to the determination of uranium and plutonium in active feed solutions of reprocessing plants. The element concentrations and isotopic compositions of all test materials used were determined by CBNM and NBS with high accuracy. The more than 60000 analytical data reported by the participating laboratories were evaluated by statistical methods applied mainly to the calculation of estimates of the variances for the different uncertainty components contributing to the total uncertainty of this analytical technique. Attention was given to such topics as sample ageing, influence of fission products, spike calibration, ion fractionation, Pu-241 decay correction, minor isotope measurement and errors in data transfer. Furthermore, the performance of the 'dried sample' technique and the 'in-situ' spiking method of undiluted samples of reprocessing fuel solution with U-235/Pu-242 metal alloy spikes, were tested successfully. Considerable improvement of isotope dilution analysis in this safeguards relevant application during the last decade is shown as compared to the results obtained in the IDA-72 interlaboratory experiment, organized by KfK in 1972 on the same subject. (orig./HP) [de

  19. A Noise-Assisted Data Analysis Method for Automatic EOG-Based Sleep Stage Classification Using Ensemble Learning.

    Science.gov (United States)

    Olesen, Alexander Neergaard; Christensen, Julie A E; Sorensen, Helge B D; Jennum, Poul J

    2016-08-01

    Reducing the number of recording modalities for sleep staging research can benefit both researchers and patients, under the condition that they provide as accurate results as conventional systems. This paper investigates the possibility of exploiting the multisource nature of the electrooculography (EOG) signals by presenting a method for automatic sleep staging using the complete ensemble empirical mode decomposition with adaptive noise algorithm, and a random forest classifier. It achieves a high overall accuracy of 82% and a Cohen's kappa of 0.74 indicating substantial agreement between automatic and manual scoring.
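
    A sketch of the pipeline as described, assuming the PyEMD package (installed as EMD-signal) for CEEMDAN and scikit-learn for the random forest; the per-IMF energy/variance features are illustrative stand-ins for the paper's feature set:

```python
# CEEMDAN decomposition of EOG epochs + random forest sleep staging.
import numpy as np
from PyEMD import CEEMDAN
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score

def epoch_features(epoch, max_imfs=5):
    # assumes at least max_imfs IMFs are produced; pad in practice
    imfs = CEEMDAN()(np.asarray(epoch, float))[:max_imfs]
    return np.concatenate([(imfs ** 2).sum(axis=1), imfs.var(axis=1)])

def train_and_score(train_epochs, y_train, test_epochs, y_test):
    X_train = np.array([epoch_features(e) for e in train_epochs])
    X_test = np.array([epoch_features(e) for e in test_epochs])
    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    clf.fit(X_train, y_train)
    pred = clf.predict(X_test)
    return accuracy_score(y_test, pred), cohen_kappa_score(y_test, pred)
```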

  20. Geomatic Methods for the Analysis of Data in the Earth Sciences: Lecture Notes in Earth Sciences, Vol. 95

    Science.gov (United States)

    Pavlis, Nikolaos K.

    Geomatics is a trendy term that has been used in recent years to describe academic departments that teach and research theories, methods, algorithms, and practices used in processing and analyzing data related to the Earth and other planets. Naming trends aside, geomatics could be considered as the mathematical and statistical “toolbox” that allows Earth scientists to extract information about physically relevant parameters from the available data and accompany such information with some measure of its reliability. This book is an attempt to present the mathematical-statistical methods used in data analysis within various disciplines—geodesy, geophysics, photogrammetry and remote sensing—from a unifying perspective that inverse problem formalism permits. At the same time, it allows us to stretch the relevance of statistical methods in achieving an optimal solution.

  1. Comparison of fabric analysis of snow samples by Computer-Integrated Polarization Microscopy and Automatic Ice Texture Analyzer

    Science.gov (United States)

    Leisinger, Sabine; Montagnat, Maurine; Heilbronner, Renée; Schneebeli, Martin

    2014-05-01

    Accurate knowledge of fabric anisotropy is crucial to understand the mechanical behavior of snow and firn, but is also important for understanding metamorphism. Computer-Integrated Polarization Microscopy (CIP) method used for the fabric analysis was developed by Heilbronner and Pauli in the early 1990ies and uses a slightly modified traditional polarization microscope for the fabric analysis. First developed for quartz, it can be applied to other uniaxial minerals. Up to now this method was mainly used in structural geology. However, it is also well suited for the fabric analysis of snow, firn and ice. The method is based on the analysis of first- order interference colors images by a slightly modified optical polarization microscope, a grayscale camera and a computer. The optical polarization microscope is featured with high quality objectives, a rotating table and two polarizers that can be introduced above and below the thin section, as well as a full wave plate. Additionally, two quarter-wave plates for circular polarization are needed. Otherwise it is also possible to create circular polarization from a set of crossed polarized images through image processing. A narrow band interference filter transmitting a wavelength between 660 and 700 nm is also required. Finally a monochrome digital camera is used to capture the input images. The idea is to record the change of interference colors while the thin section is being rotated once through 180°. The azimuth and inclination of the c-axis are defined by the color change. Recording the color change through a red filter produces a signal with a well-defined amplitude and phase angle. An advantage of this method lies in the simple conversion of an ordinary optical microscope to a fabric analyzer. The Automatic Ice Texture Analyzer (AITA) as the first fully functional instrument to measure c-axis orientation was developed by Wilson and other (2003). Most recent fabric analysis of snow and firn samples was carried

  2. Combining Recurrence Analysis and Automatic Movement Extraction from Video Recordings to Study Behavioral Coupling in Face-to-Face Parent-Child Interactions.

    Science.gov (United States)

    López Pérez, David; Leonardi, Giuseppe; Niedźwiecka, Alicja; Radkowska, Alicja; Rączaszek-Leonardi, Joanna; Tomalski, Przemysław

    2017-01-01

    The analysis of parent-child interactions is crucial for the understanding of early human development. Manual coding of interactions is a time-consuming task, which is a limitation in many projects. This becomes especially demanding if a frame-by-frame categorization of movement needs to be achieved. To overcome this, we present a computational approach for studying movement coupling in natural settings, which is a combination of a state-of-the-art automatic tracker, Tracking-Learning-Detection (TLD), and nonlinear time-series analysis, Cross-Recurrence Quantification Analysis (CRQA). We investigated the use of TLD to extract and automatically classify movement of each partner from 21 video recordings of interactions, where 5.5-month-old infants and mothers engaged in free play in laboratory settings. As a proof of concept, we focused on those face-to-face episodes, where the mother animated an object in front of the infant, in order to measure the coordination between the infants' head movement and the mothers' hand movement. We also tested the feasibility of using such movement data to study behavioral coupling between partners with CRQA. We demonstrate that movement can be extracted automatically from standard definition video recordings and used in subsequent CRQA to quantify the coupling between movement of the parent and the infant. Finally, we assess the quality of this coupling using an extension of CRQA called anisotropic CRQA and show asymmetric dynamics between the movement of the parent and the infant. When combined these methods allow automatic coding and classification of behaviors, which results in a more efficient manner of analyzing movements than manual coding.
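
    The CRQA step can be sketched directly: delay-embed the two movement series, threshold pairwise distances into a cross-recurrence matrix, and report the recurrence rate. The embedding parameters and radius below are placeholders; anisotropic CRQA and the TLD tracking stage are not reproduced:

```python
# Minimal cross-recurrence sketch for two movement time series
# (e.g., infant head movement vs. mother hand movement).
import numpy as np

def embed(x, dim=3, lag=2):
    # time-delay embedding into dim-dimensional state vectors
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag:i * lag + n] for i in range(dim)])

def cross_recurrence_rate(x, y, dim=3, lag=2, radius=0.1):
    X = embed(np.asarray(x, float), dim, lag)
    Y = embed(np.asarray(y, float), dim, lag)
    n = min(len(X), len(Y))
    d = np.linalg.norm(X[:n, None, :] - Y[None, :n, :], axis=-1)
    R = d < radius          # cross-recurrence plot as a boolean matrix
    return R.mean()         # %REC: fraction of recurrent points
```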

  3. Petroleum product refining: plant level analysis of costs and competitiveness. Implications of greenhouse gas emission reductions. Vol 1

    International Nuclear Information System (INIS)

    Kelly, S.J.; Crandall, G.R.; Houlton, G.A.; Kromm, R.B.

    1999-01-01

    Implications for the Canadian refining industry of reducing greenhouse gas (GHG) emissions to meet Canada's Kyoto commitment are assessed, based on a plant-level analysis of costs, benefits and economic and competitive impacts. It was determined, on the basis of demand estimates prepared by Natural Resources Canada, that refining industry carbon dioxide emissions could be as much as 38 per cent higher than 1990 levels in 2010. A six per cent reduction below 1990 levels from this business-as-usual case is considered a very difficult target to achieve unless refinery shutdowns occur. This would require higher imports to meet Canada's petroleum products demand, leaving total carbon dioxide emissions virtually unchanged. A range of options, classified as (1) low capital, operating efficiency projects, (2) medium capital, process/utility optimization projects, (3) high capital, refinery specific projects, and (4) high operating cost GHG projects, were evaluated. Of these four alternatives, the low capital or operating efficiency projects were the only ones judged to have the potential to be economically viable. Energy efficiency projects in these four groups were evaluated under several policy initiatives including accelerated depreciation and a $200 per tonne carbon tax. Results showed that an accelerated depreciation policy would lower the hurdle rate for refinery investments, and could achieve a four per cent reduction in GHG emissions below 1990 levels, assuming no further shutdown of refinery capacity. The carbon tax was judged to be potentially damaging to the Canadian refinery industry since it would penalize cracking refineries (most Canadian refineries are of this type); it would introduce further uncertainty and risk, such that industry might not be able to justify investments to reduce emissions. The overall assessment is that the Canadian refinery industry could not meet the pro-rata Kyoto GHG reduction target through implementation of economically

  4. Trend analysis of nuclear reactor automatic trip events subjected to operator's human error at United States nuclear power plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi

    2009-01-01

    Trends in nuclear reactor automatic trip events due to human errors during plant operating mode have been analyzed by extracting 20 events that took place in the United States during the seven years from 2002 to 2008, cited in the LERs (Licensee Event Reports) submitted to the US Nuclear Regulatory Commission (NRC). The yearly number of events was relatively large before 2005 and decreased thereafter. A period of stable operation, in which the yearly number remained very small, continued for about three years, and then the yearly number began to increase again. Before 2005, automatic trip events occurred more frequently during periodic inspections or start-up/shut-down operations. The recent trends, however, indicate that trip events became more frequent due to human errors during daily operations. Throughout the whole period, human errors were mostly caused by the overconfidence and carelessness of operators. The aforementioned trends in the yearly number of events might be explained as follows. The decrease in automatic trip events is attributed to the sharing of trouble information, leading to improvement of the manuals and training for the operations that carry a higher potential risk of automatic trip. Then, while the period of stable operation continued, some operators came to pay less attention to preventing human errors and lost interest in the training, eventually leading to automatic trip events due to mis-operation. From these analyses of trouble experiences in the US, we learned the following lessons for preventing similar troubles in Japan: operators should be thoroughly skilled in the basic actions that prevent human errors, and it should be further emphasized that they should prepare by imagining actual plant operations even though simulator training gives them successful experiences. (author)

  5. Development and validation of automatic tools for interactive recurrence analysis in radiation therapy: optimization of treatment algorithms for locally advanced pancreatic cancer.

    Science.gov (United States)

    Kessel, Kerstin A; Habermehl, Daniel; Jäger, Andreas; Floca, Ralf O; Zhang, Lanlan; Bendl, Rolf; Debus, Jürgen; Combs, Stephanie E

    2013-06-07

    In radiation oncology recurrence analysis is an important part in the evaluation process and clinical quality assurance of treatment concepts. With the example of 9 patients with locally advanced pancreatic cancer we developed and validated interactive analysis tools to support the evaluation workflow. After an automatic registration of the radiation planning CTs with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence and the distance between the boost and recurrence volume. We calculated the percentage of the recurrence volume within the 80%-isodose volume and compared it to the location of the recurrence within the boost volume, boost + 1 cm, boost + 1.5 cm and boost + 2 cm volumes. Recurrence analysis of 9 patients demonstrated that all recurrences except one occurred within the defined GTV/boost volume; one recurrence developed beyond the field border/outfield. With the defined distance volumes in relation to the recurrences, we could show that 7 recurrent lesions were within the 2 cm radius of the primary tumor. Two large recurrences extended beyond the 2 cm, however, this might be due to very rapid growth and/or late detection of the tumor progression. The main goal of using automatic analysis tools is to reduce time and effort conducting clinical analyses. We showed a first approach and use of a semi-automated workflow for recurrence analysis, which will be continuously optimized. In conclusion, despite the limitations of the automatic calculations we contributed to in-house optimization of subsequent study concepts based on an improved and validated target volume definition.
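
    The overlap metric described (percentage of the recurrence volume lying inside the 80%-isodose volume) reduces to boolean-mask arithmetic once the planning CT and follow-up images are co-registered; a sketch:

```python
# Fraction of a recurrence volume covered by the 80%-isodose volume,
# for co-registered masks and dose on the same voxel grid.
import numpy as np

def recurrence_in_isodose(recurrence_mask, dose, prescription, level=0.8):
    isodose = dose >= level * prescription       # 80%-isodose volume
    overlap = np.logical_and(recurrence_mask, isodose)
    return 100.0 * overlap.sum() / recurrence_mask.sum()
```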

  6. Microcomputer-based systems for automatic control of sample irradiation and chemical analysis of short-lived isotopes

    International Nuclear Information System (INIS)

    Bourret, S.C.

    1974-01-01

    Two systems resulted from the need for the study of the nuclear decay of short-lived radionuclides. Automation was required for better repeatability, speed of chemical separation after irradiation and for protection from the high radiation fields of the samples. A MCS-8 computer was used as the nucleus of the automatic sample irradiation system because the control system required an extensive multiple-sequential circuit. This approach reduced the sequential problem to a computer program. The automatic chemistry control system is a mixture of a fixed and a computer-based programmable control system. The fixed control receives the irradiated liquid sample from the reactor, extracts the liquid and disposes of the used sample container. The programmable control executes the chemistry program that the user has entered through the teletype. (U.S.)

  7. Contribution to automatic speech recognition. Analysis of the direct acoustical signal. Recognition of isolated words and phoneme identification

    International Nuclear Information System (INIS)

    Dupeyrat, Benoit

    1981-01-01

    This report deals with the acoustical-phonetic step of automatic speech recognition. The parameters used are the extrema of the acoustical signal (coded in amplitude and duration). This coding method, whose properties are described, is simple and well adapted to digital processing. The quality and intelligibility of the coded signal after reconstruction are particularly satisfactory. An experiment on the automatic recognition of isolated words has been carried out using this coding system. We have designed a filtering algorithm operating on the parameters of the coding; thus the characteristics of the formants can be derived under certain conditions, which are discussed. Using these characteristics, the identification of a large part of the phonemes for a given speaker was achieved. Carrying on these studies required the development of a particular real-time processing methodology which allowed immediate evaluation of program improvements. Such processing on temporal coding of the acoustical signal is extremely powerful and, used in connection with other methods, could represent an efficient tool for the automatic processing of speech. (author) [fr
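
    The extrema coding idea can be sketched in a few lines: keep only the local extrema of the waveform, each coded as an (amplitude, duration since the previous extremum) pair. The filtering stage and sampling-rate bookkeeping are omitted:

```python
# Reduce a waveform to its local extrema, coded in amplitude and duration.
import numpy as np

def extrema_code(signal):
    s = np.asarray(signal, dtype=float)
    d = np.diff(s)
    # indices where the slope changes sign: local maxima and minima
    idx = np.where(np.sign(d[:-1]) * np.sign(d[1:]) < 0)[0] + 1
    durations = np.diff(np.concatenate(([0], idx)))  # samples since last
    return list(zip(s[idx], durations))              # (amplitude, duration)
```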

  8. Proceedings of the Seventh Conference of Nuclear Sciences and Applications. Vol.1,2,3

    International Nuclear Information System (INIS)

    Aly, H.F.

    2000-01-01

    The publication has been set up as a textbook for nuclear sciences and applications vol.1: (1) radiochemistry; (2) radiation chemistry; (3) isotope production; (4) waste management; vol.2: (1) nuclear and reactor; (2) physics; (3) plasma physics; (4) instrumentation and devices; (5) trace and ultra trace analysis; (6) environmental; vol.3: (1) radiation protection; (2) radiation health hazards; (3) nuclear safety; (4) biology; (5) agriculture

  9. Proceedings of the Seventh Conference of Nuclear Sciences and Applications. Vol.1,2,3

    Energy Technology Data Exchange (ETDEWEB)

    Aly, H F [ed.

    2000-07-01

    The publication has been set up as a textbook for nuclear sciences and applications vol.1: (1) radiochemistry; (2) radiation chemistry; (3) isotope production; (4) waste management; vol.2: (1) nuclear and reactor; (2) physics; (3) plasma physics; (4) instrumentation and devices; (5) trace and ultra trace analysis; (6) environmental; vol.3: (1) radiation protection; (2) radiation health hazards; (3) nuclear safety; (4) biology; (5) agriculture.

  10. Expert system for the automatic analysis of the Eddy current signals from the monitoring of vapor generators of a PWR type reactor

    International Nuclear Information System (INIS)

    Benoist, P.; David, B.; Pigeon, M.

    1990-01-01

    An expert system for the automatic analysis of signals from Eddy currents is presented. The system was developed in order to detect and analyse the defects which may exist in vapor generators. The extraction of a signal from a high level background noise is possible. The organization of the work during the system's development, the results of the technique for the extraction of the signal from the background noise, and an example concerning the interpretation of the signal from a defect are presented [fr

  11. Vol 12, No 1 (2014)

    African Journals Online (AJOL)

    Egyptian Journal of Pediatric Allergy and Immunology (The) - Vol 12, No 1 (2014) ... The effect of serum angiotensin II and angiotensin II type 1 receptor gene ... with diabetic ketoacidosis

  12. Zede Journal - Vol 29 (2012)

    African Journals Online (AJOL)

    Zede Journal - Vol 29 (2012) ... Leakage aware hardware and stochastic power scheduling for smart mobile devices ... Energy efficient topology for Wireless Mesh Networks

  13. Design and Analysis of an Automatic Test Paper Generation System Based on UML

    Institute of Scientific and Technical Information of China (English)

    刘慧梅

    2012-01-01

    This paper applies the object-oriented modeling language UML to the analysis and design of an automatic test paper generation system. An analysis and design model of the system is established, laying a solid foundation for the system's implementation.

  14. A Noise-Assisted Data Analysis Method for Automatic EOG-Based Sleep Stage Classification Using Ensemble Learning

    DEFF Research Database (Denmark)

    Olesen, Alexander Neergaard; Christensen, Julie Anja Engelhard; Sørensen, Helge Bjarup Dissing

    2016-01-01

    Reducing the number of recording modalities for sleep staging research can benefit both researchers and patients, under the condition that they provide as accurate results as conventional systems. This paper investigates the possibility of exploiting the multisource nature of the electrooculography...... (EOG) signals by presenting a method for automatic sleep staging using the complete ensemble empirical mode decomposition with adaptive noise algorithm, and a random forest classifier. It achieves a high overall accuracy of 82% and a Cohen’s kappa of 0.74 indicating substantial agreement between...

  15. Automatic gallbladder segmentation using combined 2D and 3D shape features to perform volumetric analysis in native and secretin-enhanced MRCP sequences.

    Science.gov (United States)

    Gloger, Oliver; Bülow, Robin; Tönnies, Klaus; Völzke, Henry

    2017-11-24

    We aimed to develop the first fully automated 3D gallbladder segmentation approach to perform volumetric analysis in volume data of magnetic resonance (MR) cholangiopancreatography (MRCP) sequences. Volumetric gallbladder analysis is performed for non-contrast-enhanced and secretin-enhanced MRCP sequences. Native and secretin-enhanced MRCP volume data were produced with a 1.5-T MR system. Images of coronal maximum intensity projections (MIP) are used to automatically compute 2D characteristic shape features of the gallbladder in the MIP images. A gallbladder shape space is generated to derive 3D gallbladder shape features, which are then combined with 2D gallbladder shape features in a support vector machine approach to detect gallbladder regions in MRCP volume data. A region-based level set approach is used for fine segmentation. Volumetric analysis is performed for both sequences to calculate gallbladder volume differences between both sequences. The approach presented achieves segmentation results with mean Dice coefficients of 0.917 in non-contrast-enhanced sequences and 0.904 in secretin-enhanced sequences. This is the first approach developed to detect and segment gallbladders in MR-based volume data automatically in both sequences. It can be used to perform gallbladder volume determination in epidemiological studies and to detect abnormal gallbladder volumes or shapes. The positive volume differences between both sequences may indicate the quantity of the pancreatobiliary reflux.
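
    The Dice coefficient used to score the segmentations is standard; a sketch for two boolean masks on the same voxel grid:

```python
# Dice similarity coefficient between two binary segmentation masks.
import numpy as np

def dice(a, b):
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
```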

  16. Analysis of hydrogen and methane in seawater by "Headspace" method: Determination at trace level with an automatic headspace sampler.

    Science.gov (United States)

    Donval, J P; Guyader, V

    2017-01-01

    "Headspace" technique is one of the methods for the onboard measurement of hydrogen (H 2 ) and methane (CH 4 ) in deep seawater. Based on the principle of an automatic headspace commercial sampler, a specific device has been developed to automatically inject gas samples from 300ml syringes (gas phase in equilibrium with seawater). As valves, micro pump, oven and detector are independent, a gas chromatograph is not necessary allowing a reduction of the weight and dimensions of the analytical system. The different steps from seawater sampling to gas injection are described. Accuracy of the method is checked by a comparison with the "purge and trap" technique. The detection limit is estimated to 0.3nM for hydrogen and 0.1nM for methane which is close to the background value in deep seawater. It is also shown that this system can be used to analyze other gases such as Nitrogen (N 2 ), carbon monoxide (CO), carbon dioxide (CO 2 ) and light hydrocarbons. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Quantitative Analysis of Heavy Metals in Water Based on LIBS with an Automatic Device for Sample Preparation

    International Nuclear Information System (INIS)

    Hu Li; Zhao Nanjing; Liu Wenqing; Meng Deshuo; Fang Li; Wang Yin; Yu Yang; Ma Mingjun

    2015-01-01

    Heavy metals in water can be deposited on graphite flakes, which can be used as an enrichment method for laser-induced breakdown spectroscopy (LIBS) and is studied in this paper. The graphite samples were prepared with an automatic device, which was composed of a loading and unloading module, a quantitatively adding solution module, a rapid heating and drying module and a precise rotating module. The experimental results showed that the sample preparation methods had no significant effect on sample distribution and the LIBS signal accumulated in 20 pulses was stable and repeatable. With an increasing amount of the sample solution on the graphite flake, the peak intensity at Cu I 324.75 nm accorded with the exponential function with a correlation coefficient of 0.9963 and the background intensity remained unchanged. The limit of detection (LOD) was calculated through linear fitting of the peak intensity versus the concentration. The LOD decreased rapidly with an increasing amount of sample solution until the amount exceeded 20 mL and the correlation coefficient of exponential function fitting was 0.991. The LOD of Pb, Ni, Cd, Cr and Zn after evaporating different amounts of sample solution on the graphite flakes was measured and the variation tendency of their LOD with sample solution amounts was similar to the tendency for Cu. The experimental data and conclusions could provide a reference for automatic sample preparation and heavy metal in situ detection. (paper)
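
    The LOD computation described follows the usual calibration-curve route: fit peak intensity against concentration and take LOD = 3·σ_blank/slope (the conventional 3-sigma criterion; the abstract does not state the factor). A sketch with placeholder numbers, not the paper's data:

```python
# Limit of detection from a linear calibration of LIBS peak intensity.
import numpy as np

conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0])          # mg/L, placeholder
intensity = np.array([12.0, 55.0, 98.0, 190.0, 470.0])  # a.u., placeholder
slope, intercept = np.polyfit(conc, intensity, 1)
sigma_blank = 4.0                    # std dev of repeated blank readings
lod = 3.0 * sigma_blank / slope      # 3-sigma detection limit
print(f"slope={slope:.1f}, LOD={lod:.3f} mg/L")
```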

  18. Grid Generating Automatically Method of Finite Element Analysis for Rotator Structure%旋转体结构有限元网格自动划分法

    Institute of Scientific and Technical Information of China (English)

    许贤泽

    2000-01-01

    The finite element method is applied to the analysis of structures and multi-degree-of-freedom systems. The grid generation method is important for finite element modeling: here the grid is generated automatically by a sectional division method, yielding the finite element grid model and thus accomplishing the pre-processing stage of the finite element analysis.

  19. Finding weak points automatically

    International Nuclear Information System (INIS)

    Archinger, P.; Wassenberg, M.

    1999-01-01

    Operators of nuclear power stations have to carry out material tests on selected components at regular intervals. A fully automated test, which achieves clearly higher reproducibility than partially automated variants, would therefore provide a solution. In addition, the fully automated test reduces the radiation dose to the test personnel. (orig.) [de

  20. Development of automatic extraction of the corpus callosum from magnetic resonance imaging of the head and examination of the early dementia objective diagnostic technique in feature analysis

    International Nuclear Information System (INIS)

    Kodama, Naoki; Kaneko, Tomoyuki

    2005-01-01

    We examined the objective diagnosis of dementia based on changes in the corpus callosum. We examined midsagittal head MR images of 17 early dementia patients (2 men and 15 women; mean age, 77.2±3.3 years) and 18 healthy elderly controls (2 men and 16 women; mean age, 73.8±6.5 years), 35 subjects altogether. First, the corpus callosum was automatically extracted from the MR images. Next, early dementia was compared with the healthy elderly individuals using 5 features of the straight-line methods, 5 features of the Run-Length Matrix, and 6 features of the Co-occurrence Matrix from the corpus callosum. Automatic extraction of the corpus callosum showed an accuracy rate of 84.1±3.7%. A statistically significant difference was found in 6 of the 16 features between early dementia patients and healthy elderly controls. Discriminant analysis using the 6 features demonstrated a sensitivity of 88.2% and specificity of 77.8%, with an overall accuracy of 82.9%. These results indicate that feature analysis based on changes in the corpus callosum can be used as an objective diagnostic technique for early dementia. (author)
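
    The discriminant-analysis step can be sketched with scikit-learn: linear discriminant classification of the extracted corpus callosum features under leave-one-out validation, reporting sensitivity and specificity. The feature matrix and labels (1 = dementia, 0 = control) are assumed given:

```python
# Leave-one-out linear discriminant analysis with sensitivity/specificity.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut

def loo_sens_spec(X, y):
    X, y = np.asarray(X, float), np.asarray(y)
    pred = np.empty_like(y)
    for train, test in LeaveOneOut().split(X):
        clf = LinearDiscriminantAnalysis().fit(X[train], y[train])
        pred[test] = clf.predict(X[test])
    sens = np.mean(pred[y == 1] == 1)    # true positive rate
    spec = np.mean(pred[y == 0] == 0)    # true negative rate
    return sens, spec
```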

  1. Automatic extraction of corpus callosum from midsagittal head MR image and examination of Alzheimer-type dementia objective diagnostic system in feature analysis

    International Nuclear Information System (INIS)

    Kaneko, Tomoyuki; Kodama, Naoki; Kaeriyama, Tomoharu; Fukumoto, Ichiro

    2004-01-01

    We studied the objective diagnosis of Alzheimer-type dementia based on changes in the corpus callosum. We examined midsagittal head MR images of 40 Alzheimer-type dementia patients (15 men and 25 women; mean age, 75.4±5.5 years) and 31 healthy elderly persons (10 men and 21 women; mean age, 73.4±7.5 years), 71 subjects altogether. First, the corpus callosum was automatically extracted from midsagittal head MR images. Next, Alzheimer-type dementia was compared with the healthy elderly individuals using the features of shape factor and six features of Co-occurrence Matrix from the corpus callosum. Automatic extraction of the corpus callosum succeeded in 64 of 71 individuals, for an extraction rate of 90.1%. A statistically significant difference was found in 7 of the 9 features between Alzheimer-type dementia patients and the healthy elderly adults. Discriminant analysis using the 7 features demonstrated a sensitivity rate of 82.4%, specificity of 89.3%, and overall accuracy of 85.5%. These results indicated the possibility of an objective diagnostic system for Alzheimer-type dementia using feature analysis based on change in the corpus callosum. (author)

  2. Semi-automatic fluoroscope

    International Nuclear Information System (INIS)

    Tarpley, M.W.

    1976-10-01

    Extruded aluminum-clad uranium-aluminum alloy fuel tubes must pass many quality control tests before irradiation in Savannah River Plant nuclear reactors. Nondestructive test equipment has been built to automatically detect high and low density areas in the fuel tubes using x-ray absorption techniques with a video analysis system. The equipment detects areas as small as 0.060-in. dia with 2 percent penetrameter sensitivity. These areas are graded as to size and density by an operator using electronic gages. Video image enhancement techniques permit inspection of ribbed cylindrical tubes and make possible the testing of areas under the ribs. Operation of the testing machine, the special low light level television camera, and analysis and enhancement techniques are discussed

  3. The proportionator: unbiased stereological estimation using biased automatic image analysis and non-uniform probability proportional to size sampling

    DEFF Research Database (Denmark)

    Gardi, Jonathan Eyal; Nyengaard, Jens Randel; Gundersen, Hans Jørgen Gottlieb

    2008-01-01

    ... the desired number of fields are sampled automatically with probability proportional to the weight and presented to the expert observer. Using any known stereological probe and estimator, the correct count in these fields leads to a simple, unbiased estimate of the total amount of structure in the sections examined, which in turn leads to any of the known stereological estimates, including size distributions and spatial distributions. The unbiasedness is not a function of the assumed relation between the weight and the structure, which is in practice always a biased relation from a stereological (integral geometric) point of view. The efficiency of the proportionator depends, however, directly on this relation to be positive. The sampling and estimation procedure is simulated in sections with characteristics and various kinds of noises in possibly realistic ranges. In all cases examined, the proportionator...
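
    The sampling idea itself is compact: draw fields with probability proportional to their weight, then correct each observed count by its inverse selection probability. A sketch using the with-replacement (Hansen-Hurwitz) form of the estimator, with hypothetical field weights and counts:

```python
# Probability-proportional-to-size sampling with inverse-probability
# correction, giving an unbiased estimate of the total count.
import numpy as np

def pps_estimate(weights, counts_by_field, n_sample, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    weights = np.asarray(weights, float)
    p = weights / weights.sum()                  # selection probabilities
    chosen = rng.choice(len(p), size=n_sample, replace=True, p=p)
    # each sampled field contributes count / (n * p_i) to the total
    return sum(counts_by_field[i] / (n_sample * p[i]) for i in chosen)
```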

  4. Automatic extraction analysis of the anatomical functional area for normal brain 18F-FDG PET imaging

    International Nuclear Information System (INIS)

    Guo Wanhua; Jiang Xufeng; Zhang Liying; Lu Zhongwei; Li Peiyong; Zhu Chengmo; Zhang Jiange; Pan Jiapu

    2003-01-01

    Using self-designed automatic extraction software for brain functional areas, the grey-scale distribution of 18F-FDG imaging and the relationship between the 18F-FDG accumulation of brain anatomic functional areas and the injected 18F-FDG dose, the glucose level, the age, etc., were studied. According to the Talairach coordinate system, after rotation, drift and plastic deformation, the 18F-FDG PET images were registered to the Talairach coordinate atlas, and the ratios of the average gray value of each brain anatomic functional area to that of the whole brain were calculated. Furthermore, the relationship between the 18F-FDG accumulation of every brain anatomic functional area and the injected dose, the glucose level and the age was tested using a multiple stepwise regression model. After image registration, smoothing and extraction, the main cerebral cortex regions of the 18F-FDG PET brain images could be successfully localized and extracted, such as the frontal lobe, parietal lobe, occipital lobe, temporal lobe, cerebellum, brain ventricles, thalamus and hippocampus. The average ratios to the internal reference for every brain anatomic functional area were 1.01 ± 0.15. By multiple stepwise regression, with the exception of the thalamus and hippocampus, the grey scale of all brain functional areas was negatively correlated with age, but showed no correlation with blood sugar or dose in any area. For 18F-FDG PET imaging, the brain functional area extraction program could automatically delineate most of the cerebral cortical areas and successfully support studies of brain blood flow and metabolism, but extraction of more detailed areas needs further investigation

  5. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  6. An introduction to automatic radioactive sample counters

    International Nuclear Information System (INIS)

    1980-01-01

    The subject is covered in chapters entitled: the detection of radiation in sample counters; nucleonic equipment; liquid scintillation counting; basic features of automatic sample counters; statistics of counting; data analysis; and purchase, installation, calibration and maintenance of automatic sample counters. (U.K.)

  7. Contribution to the automatic measurement of bubble chamber pictures and the multidimensional analysis of K-p interactions at 14.3 GeV/c

    International Nuclear Information System (INIS)

    Lutz, P.

    1977-01-01

    The first part of this study is a detailed examination of the different sources of distortion of the CRT Paris flying-spot digitizers, which measure bubble chamber pictures automatically, followed by the calibration methods that correct these distortions. An original study of the channel K⁻p → K⁻pπ⁺π⁻, relying on a direct search for the structures existing in the probability density over phase space, is developed in the second part. A first approach consists of a 'projection pursuit' algorithm, and a second is a more direct analysis of the density function (cluster finding). The results of a partial wave analysis of the K⁻ diffraction are presented [fr

  8. Analysis of an Advanced Test Reactor Small-Break Loss-of-Coolant Accident with an Engineered Safety Feature to Automatically Trip the Primary Coolant Pumps

    International Nuclear Information System (INIS)

    Polkinghorne, Steven T.; Davis, Cliff B.; McCracken, Richard T.

    2000-01-01

    A new engineered safety feature that automatically trips the primary coolant pumps following a low-pressure reactor scram was recently installed in the Advanced Test Reactor (ATR). The purpose of this engineered safety feature is to prevent the ATR's surge tank, which contains compressed air, from emptying during a small-break loss-of-coolant accident (SBLOCA). If the surge tank were to empty, the air introduced into the primary coolant loop could potentially cause the performance of the primary and/or emergency coolant pumps to degrade, thereby reducing core thermal margins. Safety analysis performed with the RELAP5 thermal-hydraulic code and the SINDA thermal analyzer shows that adequate thermal margins are maintained during an SBLOCA with the new engineered safety feature installed. The analysis also shows that the surge tank will not empty during an SBLOCA even if one of the primary coolant pumps fails to trip

  9. TU-H-CAMPUS-JeP2-05: Can Automatic Delineation of Cardiac Substructures On Noncontrast CT Be Used for Cardiac Toxicity Analysis?

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Y; Liao, Z; Jiang, W; Gomez, D; Williamson, R; Court, L; Yang, J [MD Anderson Cancer Center, Houston, TX (United States)

    2016-06-15

    Purpose: To evaluate the feasibility of using an automatic segmentation tool to delineate cardiac substructures from computed tomography (CT) images for cardiac toxicity analysis for non-small cell lung cancer (NSCLC) patients after radiotherapy. Methods: A multi-atlas segmentation tool developed in-house was used to delineate eleven cardiac substructures including the whole heart, four heart chambers, and six greater vessels automatically from the averaged 4DCT planning images for 49 NSCLC patients. The automatic segmented contours were edited appropriately by two experienced radiation oncologists. The modified contours were compared with the auto-segmented contours using Dice similarity coefficient (DSC) and mean surface distance (MSD) to evaluate how much modification was needed. In addition, the dose volume histogram (DVH) of the modified contours were compared with that of the auto-segmented contours to evaluate the dosimetric difference between modified and auto-segmented contours. Results: Of the eleven structures, the averaged DSC values ranged from 0.73 ± 0.08 to 0.95 ± 0.04 and the averaged MSD values ranged from 1.3 ± 0.6 mm to 2.9 ± 5.1mm for the 49 patients. Overall, the modification is small. The pulmonary vein (PV) and the inferior vena cava required the most modifications. The V30 (volume receiving 30 Gy or above) for the whole heart and the mean dose to the whole heart and four heart chambers did not show statistically significant difference between modified and auto-segmented contours. The maximum dose to the greater vessels did not show statistically significant difference except for the PV. Conclusion: The automatic segmentation of the cardiac substructures did not require substantial modification. The dosimetric evaluation showed no statistically significant difference between auto-segmented and modified contours except for the PV, which suggests that auto-segmented contours for the cardiac dose response study are feasible in the clinical
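
    The two evaluation metrics used above are standard: DSC is the usual Dice overlap, and the mean surface distance can be sketched with distance transforms, assuming co-registered boolean masks and isotropic voxels (pass a `sampling` argument to the distance transform otherwise):

```python
# Mean surface distance (MSD) between two binary masks via EDT.
import numpy as np
from scipy import ndimage

def surface(mask):
    eroded = ndimage.binary_erosion(mask)
    return mask & ~eroded                      # one-voxel-thick boundary

def mean_surface_distance(a, b):
    sa = surface(np.asarray(a, bool))
    sb = surface(np.asarray(b, bool))
    da = ndimage.distance_transform_edt(~sa)   # distance to surface of a
    db = ndimage.distance_transform_edt(~sb)   # distance to surface of b
    return 0.5 * (da[sb].mean() + db[sa].mean())
```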

  10. TU-H-CAMPUS-JeP2-05: Can Automatic Delineation of Cardiac Substructures On Noncontrast CT Be Used for Cardiac Toxicity Analysis?

    International Nuclear Information System (INIS)

    Luo, Y; Liao, Z; Jiang, W; Gomez, D; Williamson, R; Court, L; Yang, J

    2016-01-01

    Purpose: To evaluate the feasibility of using an automatic segmentation tool to delineate cardiac substructures from computed tomography (CT) images for cardiac toxicity analysis for non-small cell lung cancer (NSCLC) patients after radiotherapy. Methods: A multi-atlas segmentation tool developed in-house was used to delineate eleven cardiac substructures including the whole heart, four heart chambers, and six greater vessels automatically from the averaged 4DCT planning images for 49 NSCLC patients. The automatic segmented contours were edited appropriately by two experienced radiation oncologists. The modified contours were compared with the auto-segmented contours using Dice similarity coefficient (DSC) and mean surface distance (MSD) to evaluate how much modification was needed. In addition, the dose volume histogram (DVH) of the modified contours were compared with that of the auto-segmented contours to evaluate the dosimetric difference between modified and auto-segmented contours. Results: Of the eleven structures, the averaged DSC values ranged from 0.73 ± 0.08 to 0.95 ± 0.04 and the averaged MSD values ranged from 1.3 ± 0.6 mm to 2.9 ± 5.1mm for the 49 patients. Overall, the modification is small. The pulmonary vein (PV) and the inferior vena cava required the most modifications. The V30 (volume receiving 30 Gy or above) for the whole heart and the mean dose to the whole heart and four heart chambers did not show statistically significant difference between modified and auto-segmented contours. The maximum dose to the greater vessels did not show statistically significant difference except for the PV. Conclusion: The automatic segmentation of the cardiac substructures did not require substantial modification. The dosimetric evaluation showed no statistically significant difference between auto-segmented and modified contours except for the PV, which suggests that auto-segmented contours for the cardiac dose response study are feasible in the clinical

  11. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings...... differential equations, but in this thesis, we describe how to use the methods for enclosing iterates of discrete mappings, and then later use them for discretizing solutions of ordinary differential equations. The theory of automatic differentiation is introduced, and three methods for obtaining derivatives are described: the forward, the backward, and the Taylor expansion methods. The three methods have been implemented in the C++ program packages FADBAD/TADIFF. Some examples showing how to use the three methods are presented. A feature of FADBAD/TADIFF not present in other automatic differentiation packages

  12. Automatic landslide detection from LiDAR DTM derivatives by geographic-object-based image analysis based on open-source software

    Science.gov (United States)

    Knevels, Raphael; Leopold, Philip; Petschko, Helene

    2017-04-01

    With high-resolution airborne Light Detection and Ranging (LiDAR) data more commonly available, many studies have been performed to exploit the detailed information on the earth's surface and to analyse its limitations. Specifically in the field of natural hazards, digital terrain models (DTM) have been used to map hazardous processes such as landslides, mainly by visual interpretation of LiDAR DTM derivatives. However, new approaches are striving towards automatic detection of landslides to speed up the process of generating landslide inventories. These studies usually use a combination of optical imagery and terrain data, and are designed in commercial software packages such as ESRI ArcGIS, Definiens eCognition, or MathWorks MATLAB. The objective of this study was to investigate the potential of open-source software for automatic landslide detection based only on high-resolution LiDAR DTM derivatives in a study area within the federal state of Burgenland, Austria. The study area is very prone to landslides, which have been mapped with different methodologies in recent years. The free development environment R was used to integrate open-source geographic information system (GIS) software, such as SAGA (System for Automated Geoscientific Analyses), GRASS (Geographic Resources Analysis Support System), or TauDEM (Terrain Analysis Using Digital Elevation Models). The implemented geographic-object-based image analysis (GEOBIA) consisted of (1) derivation of land surface parameters, such as slope, surface roughness, curvature, or flow direction, (2) finding the optimal scale parameter by the use of an objective function, (3) multi-scale segmentation, (4) classification of landslide parts (main scarp, body, flanks) by k-means thresholding, (5) assessment of the classification performance using a pre-existing landslide inventory, and (6) post-processing analysis for further use in landslide inventories. The results of the developed open-source approach demonstrated good

  13. STECH VOL5 (1) FEBRUARY, 2016

    African Journals Online (AJOL)

    STECH Vol. 5 (1), S/No 11, February, 2016: 1-13

  14. How Automatic is the Musical Stroop Effect? Commentary on "The Musical Stroop Effect: Opening a New Avenue to Research on Automatisms" by L. Grégoire, P. Perruchet, and B. Poulin-Charronnat (Experimental Psychology, 2013, Vol. 60, pp. 269-278).

    Science.gov (United States)

    Moeller, Birte; Frings, Christian

    2014-01-01

    Grégoire, Perruchet, and Poulin-Charronnat (2013) investigated a musical variant of the reversed Stroop effect. According to the authors, one big advantage of this variant is that the automaticity of note naming can be better controlled than in other Stroop variants as musicians are very practiced in note reading whereas non-musicians are not. In this comment we argue that at present the exact impact of automaticity in this Stroop variant remains somewhat unclear for at least three reasons, namely due to the type of information that is automatically retrieved when notes are encountered, due to the possible influence of object-based attention, and finally due to the fact that the exact influence of expertise on interference cannot be pinpointed with an extreme group design.

  15. Development of an automatic scanning system for nuclear emulsion analysis in the OPERA experiment and study of neutrino interactions location

    International Nuclear Information System (INIS)

    Arrabito, L.

    2007-10-01

    Following the Super-Kamiokande and K2K experiments, Opera (Oscillation Project with Emulsion tracking Apparatus) aims to confirm neutrino oscillation in the atmospheric sector. Taking advantage of a technique already employed in Chorus and in Donut, the Emulsion Cloud Chamber (ECC), Opera will be able to observe the ν_μ → ν_τ oscillation through ν_τ appearance in a pure ν_μ beam. The Opera experiment, with its ~100,000 m² of nuclear emulsions, needs a very fast automatic scanning system. Optical and mechanical components have been customized in order to achieve a speed of about 20 cm²/hour per emulsion layer (44 μm thick), while keeping sub-micrometric resolution. The first part of this thesis was dedicated to the optimization of 4 scanning systems at the French scanning station, based in Lyon. An experimental study of a dry-objective scanning system has also been carried out. The obtained results show that the performance of dry scanning is similar to that of traditional oil scanning, so that it can be successfully used for Opera. The second part of this work was devoted to the study of the neutrino interaction location and reconstruction strategy actually used in Opera. A dedicated test beam was performed at CERN in order to simulate Opera conditions. The obtained results definitely confirm that the proposed strategy is well adapted for tau search. (author)

  16. Semi-automatic image analysis methodology for the segmentation of bubbles and drops in complex dispersions occurring in bioreactors

    Science.gov (United States)

    Taboada, B.; Vega-Alvarado, L.; Córdova-Aguilar, M. S.; Galindo, E.; Corkidi, G.

    2006-09-01

    Characterization of multiphase systems occurring in fermentation processes is a time-consuming and tedious process when manual methods are used. This work describes a new semi-automatic methodology for the on-line assessment of diameters of oil drops and air bubbles occurring in a complex simulated fermentation broth. High-quality digital images were obtained from the interior of a mechanically stirred tank. These images were pre-processed to find segments of edges belonging to the objects of interest. The contours of air bubbles and oil drops were then reconstructed using an improved Hough transform algorithm which was tested in two, three and four-phase simulated fermentation model systems. The results were compared against those obtained manually by a trained observer, showing no significant statistical differences. The method was able to reduce the total processing time for the measurements of bubbles and drops in different systems by 21-50% and the manual intervention time for the segmentation procedure by 80-100%.
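
    The contour-reconstruction step can be illustrated with the standard circular Hough transform in OpenCV; the paper's improved Hough algorithm is not reproduced, and the file name and all parameter values below are assumptions.

      # Baseline circular-Hough sketch for bubble/drop detection on a
      # hypothetical grayscale image; parameters are illustrative only.
      import cv2
      import numpy as np

      img = cv2.imread("dispersion.png", cv2.IMREAD_GRAYSCALE)  # assumed file
      if img is None:
          raise FileNotFoundError("dispersion.png is a hypothetical example")
      blurred = cv2.medianBlur(img, 5)  # suppress speckle before edge detection

      circles = cv2.HoughCircles(
          blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=15,
          param1=100,  # Canny high threshold
          param2=30,   # accumulator threshold; lower values find more circles
          minRadius=3, maxRadius=60,
      )

      if circles is not None:
          for x, y, r in np.round(circles[0]).astype(int):
              print(f"object at ({x}, {y}), diameter {2 * r} px")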

  17. Automatic inverse methods for the analysis of pulse tests: application to four pulse tests at the Leuggern borehole

    International Nuclear Information System (INIS)

    Carrera, J.; Samper, J.; Vives, L.; Kuhlmann, U.

    1989-07-01

    Four pulse tests performed for NAGRA at the Leuggern borehole were analyzed using automatic test sequence matching techniques. Severe identifiability problems were unveiled during the process. Because of these problems, the identifiability of aquifer parameters (hydraulic conductivity, storativity and skin conductivity) from pulse tests similar to those performed in the Leuggern borehole was studied in two synthetic examples. The first of these had a positive skin effect and the second had a negative skin effect. These synthetic examples showed that, for the test conditions at the Leuggern borehole, estimating formation hydraulic conductivity may be nearly impossible for the cases of positive and negative skin factors. In addition, identifiability appears to be quite sensitive to the values of the parameters and to other factors such as skin thickness. Nevertheless, largely because of the manner in which the tests were conducted (i.e. relatively long injection time and performance of both injection and withdrawal), identifiability of the actual tests was much better than suggested by the synthetic examples. Only one of the four tests was nearly nonidentifiable. In all, the match between measured and computed aquifer responses was excellent for all the tests, and formation hydraulic conductivities were estimated within a relatively narrow uncertainty interval. (author) 19 refs., 59 figs., 28 tabs

  18. Contribution to automatic image recognition. Application to analysis of plain scenes of overlapping parts in robot technology

    International Nuclear Information System (INIS)

    Tan, Shengbiao

    1987-01-01

    A method for object modeling and automatic recognition of overlapping objects is presented. Our work is composed of three essential parts: image processing, object modeling, and evaluation of the implementation of the stated concepts. In the first part, we present a method of edge encoding based on a re-sampling of the data encoded according to Freeman; this method generates an isotropic, homogeneous and very precise representation. The second part relates to object modeling. This important step makes the recognition work much easier. The proposed method characterizes a model with two groups of information: a description group containing the primitives, and a discrimination group containing data packs called 'transition vectors'. Based on this original method of information organization, a 'relative learning' scheme is able to select, ignore and update the information concerning the objects already learned, according to the new information to be included into the database. The recognition is a two-pass process: the first pass very efficiently hypothesizes the presence of objects by making use of each object's particularities, and this hypothesis is either confirmed or rejected by the following fine verification pass. The last part describes the experimental results in detail. We demonstrate the robustness of the algorithms on images with both poor lighting and overlapping objects. The system, named SOFIA, has been installed into an industrial vision system series and works in real time. (author) [fr

  19. A comparison between the conventional manual ROI method and an automatic algorithm for semiquantitative analysis of SPECT studies

    International Nuclear Information System (INIS)

    Pagan, L; Novi, B; Guidarelli, G; Tranfaglia, C; Galli, S; Lucchi, G; Fagioli, G

    2011-01-01

    In this study, the performance of a free software package for automatic segmentation of striatal SPECT brain studies (BasGanV2 - www.aimn.it) was compared with a standard manual Region Of Interest (ROI) method. The anthropomorphic Alderson RSD phantom, filled with solutions at different concentrations of 123I-FP-CIT with Caudate-Putamen to Background ratios between 1 and 8.7 and Caudate to Putamen ratios between 1 and 2, was imaged on a Philips-Irix triple-head gamma camera. Images were reconstructed using filtered back-projection and processed with both BasGanV2, which provides normalized striatal uptake values on volumetric anatomical ROIs, and a manual method based on average counts per voxel in ROIs drawn in a three-slice section. Caudate-Putamen/Background and Caudate/Putamen ratios obtained with the two methods were compared with the true experimental ratios. Good correlation was found for each method; BasGanV2, however, showed the higher correlation (mean R = 0.95, versus 0.89 for the manual method), making it well suited to the analysis of 123I-FP-CIT SPECT data, with the additional advantage of an available control-subject database.

  20. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through a description of the Micro-Videomat automatic image analysis system, applied to the volumetric percentage of pearlite in nodular cast iron, porosity and average grain size in high-density sintered UO2 pellets, and grain size of ferritic steel. The techniques adopted are described and the results obtained are compared with those from the direct counting process: counting of systematic points (grid) to measure volume, and the intercept method, using a circumference of known radius, for the average grain size. The technique adopted for nodular cast iron was dictated by the small difference in optical reflectivity between graphite and pearlite. Porosity evaluation of sintered UO2 pellets is also analyzed [pt
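
    The grid point-count comparison admits a small worked stand-in: by the Delesse principle, the volume fraction of a phase is estimated by the fraction of systematic grid points that fall on it. The phase mask below is synthetic, so the numbers are illustrative only.

      # Volume-fraction estimate by systematic point counting on a
      # synthetic phase mask (True where the phase, e.g. pearlite, lies).
      import numpy as np

      rng = np.random.default_rng(7)
      micrograph = rng.random((512, 512)) < 0.35   # synthetic phase mask

      grid = micrograph[::16, ::16]                # every 16th pixel: the grid
      print(f"grid points on phase: {grid.sum()} / {grid.size}")
      print(f"estimated volume fraction: {grid.mean():.1%} "
            f"(true areal fraction {micrograph.mean():.1%})")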

  1. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.

    1976-01-01

    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system, which employed an "Image Processing System", was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs

  2. Automatic Conflict Detection on Contracts

    Science.gov (United States)

    Fenech, Stephen; Pace, Gordon J.; Schneider, Gerardo

    Many software applications are based on collaborating, yet competing, agents or virtual organisations exchanging services. Contracts, expressing obligations, permissions and prohibitions of the different actors, can be used to protect the interests of the organisations engaged in such service exchange. However, the potentially dynamic composition of services with different contracts, and the combination of service contracts with local contracts, can give rise to unexpected conflicts, exposing the need for automatic techniques for contract analysis. In this paper we look at automatic analysis techniques for contracts written in the contract language CL. We present a trace semantics of CL suitable for conflict analysis, and a decision procedure for detecting conflicts (together with its proof of soundness, completeness and termination). We also discuss its implementation and look into the applications of the contract analysis approach we present. These techniques are applied to a small case study of an airline check-in desk.

  3. Generalized synthetic aperture radar automatic target recognition by convolutional neural network with joint use of two-dimensional principal component analysis and support vector machine

    Science.gov (United States)

    Zheng, Ce; Jiang, Xue; Liu, Xingzhao

    2017-10-01

    Convolutional neural network (CNN), as a vital part of the deep learning research field, has shown powerful potential for automatic target recognition (ATR) of synthetic aperture radar (SAR). However, the high complexity caused by the deep structure of a CNN makes it difficult to generalize. An improved form of CNN with higher generalization capability and a lower probability of overfitting, which further improves the efficiency and robustness of the SAR ATR system, is proposed. The convolution layers of the CNN are combined with a two-dimensional principal component analysis (2DPCA) algorithm. Correspondingly, a kernel support vector machine is utilized as the classifier layer instead of the multilayer perceptron. The verification experiments are implemented using the moving and stationary target acquisition and recognition (MSTAR) database, and the results validate the efficiency of the proposed method.
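
    A minimal sketch of the two modified stages, 2DPCA feature extraction followed by a kernel SVM, is given below on synthetic image chips; the CNN convolution layers are omitted, and all shapes, class structure and hyperparameters are assumptions.

      # 2DPCA + kernel SVM on synthetic chips (stand-ins for SAR data).
      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      images = rng.random((200, 32, 32))
      labels = rng.integers(0, 3, size=200)
      images += 0.1 * labels[:, None, None]   # class-dependent brightness,
                                              # so there is something to learn

      # 2DPCA: eigenvectors of the 32 x 32 image covariance matrix G.
      centered = images - images.mean(axis=0)
      G = np.einsum("nij,nik->jk", centered, centered) / len(images)
      _, eigvecs = np.linalg.eigh(G)          # ascending eigenvalues
      proj = eigvecs[:, -5:]                  # top-5 projection axes

      # Project each chip: (32 x 32) @ (32 x 5) -> (32 x 5), then flatten.
      features = (images @ proj).reshape(len(images), -1)

      clf = SVC(kernel="rbf", gamma="scale").fit(features[:150], labels[:150])
      print("held-out accuracy:", clf.score(features[150:], labels[150:]))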

  4. Large Scale Automatic Analysis and Classification of Roof Surfaces for the Installation of Solar Panels Using a Multi-Sensor Aerial Platform

    Directory of Open Access Journals (Sweden)

    Luis López-Fernández

    2015-09-01

    A low-cost multi-sensor aerial platform, an aerial trike equipped with visible and thermographic sensors, is used to acquire all the data needed for the automatic analysis and classification of roof surfaces regarding their suitability to harbor solar panels. The geometry of a georeferenced 3D point cloud generated from visible images using photogrammetric and computer vision algorithms, together with the temperatures measured on thermographic images, is decisive for evaluating the areas, tilts, orientations and the existence of obstacles, so as to locate the optimal zones inside each roof surface for the installation of solar panels. This information is complemented with an estimate of the solar irradiation received by each surface. In this way, large areas may be efficiently analyzed, yielding as the final result the optimal locations for the placement of solar panels as well as the information (location, orientation, tilt, area and solar irradiation) necessary to estimate the productivity of a solar panel from its technical characteristics.

  5. Automatic Photoelectric Telescope Service

    International Nuclear Information System (INIS)

    Genet, R.M.; Boyd, L.J.; Kissell, K.E.; Crawford, D.L.; Hall, D.S.; BDM Corp., McLean, VA; Kitt Peak National Observatory, Tucson, AZ; Dyer Observatory, Nashville, TN)

    1987-01-01

    Automatic observatories have the potential of gathering sizable amounts of high-quality astronomical data at low cost. The Automatic Photoelectric Telescope Service (APT Service) has realized this potential and is routinely making photometric observations of a large number of variable stars. However, without observers to provide on-site monitoring, it was necessary to incorporate special quality checks into the operation of the APT Service at its multiple automatic telescope installation on Mount Hopkins. 18 references

  6. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  7. Analysis of image factors of x-ray films : study for the intelligent replenishment system of automatic film processor

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sung Tae; Yoon, Chong Hyun; Park, Kwang BO; Auh, Yong Ho; Lee, Hyoung Jin; In, Kyung Hwan; Kim, Keon Chung [Asan Medical Center, Ulsan Univ. College of Medicine, Ulsan (Korea, Republic of)

    1998-06-01

    We analyzed image factors to determine the characteristic factors needed for an intelligent replenishment system for automatic film processors. We processed a series of 300 sheets of radiographic film of a chest phantom without replenishing the developer and fixer. We acquired digital data using a film digitizer, which scanned the films and automatically summed the pixel values. We analyzed the characteristic curves, average gradients and relative speeds of the individual films using a densitometer and step densitometry. We also evaluated the pH of the developer, fixer and washer fluid with a digital pH meter. The fixer residual rate and washing effect were measured by densitometer using reagent methods. There was no significant reduction in the digital density numbers over the serial films processed without replenishment of developer and fixer. The average gradient gradually decreased by 0.02, and the relative speed gradually decreased by 6.96% relative to the initial standard step-densitometric measurement. The pHs of the developer and fixer reflected the inactivation of each fluid. The fixer residual rates and washing effects, checked after every 25 sheets of film, were in the normal range. We suggest that the digital data are not reliable due to limitations of the hardware and software of the film digitizer. We conclude that the average gradient and relative speed, which represent the film's contrast and sensitivity respectively, are reliable factors for determining the need for replenishment in an automatic film processor. Further study on simpler equations and programming is needed for a more intelligent replenishment system.

  8. South African Journal of Education - Vol 31, No 1 (2011)

    African Journals Online (AJOL)

    South African Journal of Education - Vol 31, No 1 (2011) ... How an analysis of reviewers' reports can enhance the quality of submissions to a journal of education ... Doctoral learning: a case for a cohort model of supervision and support ...

  9. Automatic determination of brain perfusion index for measurement of cerebral blood flow using spectral analysis and {sup 99m}Tc-HMPAO

    Energy Technology Data Exchange (ETDEWEB)

    Takasawa, Masashi [Division of Strokology, Department of Internal Medicine and Therapeutics, Osaka University Graduate School of Medicine, 2-2, Yamadaoka, Suita City, Osaka, 565-0871 (Japan); Department of Nuclear Medicine, Osaka University Graduate School of Medicine, Osaka (Japan); Murase, Kenya; Kawamata, Minoru [Department of Allied Health Sciences, Osaka University Graduate School of Medicine, Osaka (Japan); Oku, Naohiko; Imaizumi, Masao; Osaki, Yasuhiro; Paul, Asit K. [Department of Nuclear Medicine, Osaka University Graduate School of Medicine, Osaka (Japan); Yoshikawa, Takuya; Kitagawa, Kazuo; Matsumoto, Masayasu; Hori, Masatsugu [Division of Strokology, Department of Internal Medicine and Therapeutics, Osaka University Graduate School of Medicine, 2-2, Yamadaoka, Suita City, Osaka, 565-0871 (Japan)

    2002-11-01

    Cerebral blood flow (CBF) can be non-invasively quantified using the brain perfusion index (BPI), determined from radionuclide angiographic data generated by technetium-99m hexamethylpropylene amine oxime ({sup 99m}Tc-HMPAO). We previously reported the use of a spectral analysis (SA) method using {sup 99m}Tc-HMPAO to calculate the BPI. In this report, we demonstrate an automatic method for determining the optimal BPI value and compare the optimal BPI values with the absolute CBF values measured using H{sub 2}{sup 15}O positron emission tomography (PET). Bilateral cerebral hemispheres of 11 patients with various brain diseases were examined using {sup 99m}Tc-HMPAO. In the automatic SA procedure, the radioactivity curve for the aortic arch (C{sub a}) was shifted by 0-10 s. The radioactivity curve for the brain (C{sub b}) was estimated using the shifted C{sub a}, and the error value between the actually measured and the estimated C{sub b} (Err) was calculated. When the Err was at a minimum, the BPI value was defined as optimal BPI. The difference in BPI from the optimal BPI was calculated as |BPI - optimal BPI| / optimal BPI x 100 (%). In all participants, an H{sub 2}{sup 15}O PET examination was also performed, and the BPI values were compared with the absolute CBF values measured using H{sub 2}{sup 15}O PET (mCBF{sup PET}). The difference between BPI and the optimal BPI increased significantly from 4.87%{+-}1.69% to 18.38%{+-}3.93% (mean{+-}SD) when the Err value increased. The optimal BPI value (y) was well correlated with the mCBF{sup PET} value (x) (y=0.21x-0.0075, r=0.800). Our results suggest that this automatic SA method provides an accurate estimate of BPI that can be used for the quantification of CBF using {sup 99m}Tc-HMPAO SA. (orig.)
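
    The shift-and-minimize loop can be sketched as follows on synthetic curves. Spectral analysis proper fits a sum of exponentials; a single uptake-rate model stands in here, so only the automatic search over 0-10 s shifts for the minimum-Err solution is shown.

      # Automatic shift search: shift the aortic curve, fit the brain curve,
      # keep the shift with minimum error (single-rate stand-in for SA).
      import numpy as np

      rng = np.random.default_rng(1)
      t = np.arange(0.0, 60.0, 1.0)                  # 1-s frames (assumed)
      c_a = np.exp(-0.5 * (t - 12.0) ** 2 / 9.0)     # synthetic aortic bolus
      c_b = 0.3 * np.cumsum(np.roll(c_a, 3))         # synthetic brain curve
      c_b += rng.normal(0.0, 0.01, size=t.size)      # measurement noise

      best = None
      for shift in range(0, 11):                     # 0-10 s, as in the paper
          x = np.cumsum(np.roll(c_a, shift))         # uptake integral
          k = np.dot(x, c_b) / np.dot(x, x)          # least-squares rate
          err = np.sum((c_b - k * x) ** 2)           # Err for this shift
          if best is None or err < best[0]:
              best = (err, shift, k)

      err, shift, k = best
      print(f"optimal shift {shift} s, BPI-like rate constant {k:.3f}")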

  10. Automatic and integrated micro-enzyme assay (AIμEA) platform for highly sensitive thrombin analysis via an engineered fluorescence protein-functionalized monolithic capillary column.

    Science.gov (United States)

    Lin, Lihua; Liu, Shengquan; Nie, Zhou; Chen, Yingzhuang; Lei, Chunyang; Wang, Zhen; Yin, Chao; Hu, Huiping; Huang, Yan; Yao, Shouzhuo

    2015-04-21

    Nowadays, large-scale screening for enzyme discovery, engineering, and drug discovery processes requires simple, fast, and sensitive enzyme activity assay platforms with high integration and potential for high-throughput detection. Herein, a novel automatic and integrated micro-enzyme assay (AIμEA) platform is proposed based on a unique microreaction system fabricated from an engineered green fluorescence protein (GFP)-functionalized monolithic capillary column, with thrombin as an example. The recombinant GFP probe was rationally engineered to possess a His-tag and a substrate sequence of thrombin, which enable it to be immobilized on the monolith via metal affinity binding, and to be released after thrombin digestion. Combined with capillary electrophoresis-laser-induced fluorescence (CE-LIF), all the procedures, including thrombin injection, online enzymatic digestion in the microreaction system, and label-free detection of the released GFP, were integrated in a single electrophoretic process. By taking advantage of the ultrahigh loading capacity of the AIμEA platform and the automatic CE programming setup, one microreaction column was sufficient for many rounds of digestion without replacement. The novel microreaction system showed significantly enhanced catalytic efficiency, about 30-fold higher than that of the equivalent bulk reaction. Accordingly, the AIμEA platform was highly sensitive, with a limit of detection down to 1 pM of thrombin. Moreover, the AIμEA platform was robust and reliable in detecting thrombin in human serum samples and its inhibition by hirudin. Hence, this AIμEA platform exhibits great potential for high-throughput analysis in future biological applications, disease diagnostics, and drug screening.

  11. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  12. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e., automatically controlling the virtual...

  13. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are automatically found. The method is illustrated by simple examples. Source code in FORTRAN is provided
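
    A minimal forward-mode implementation with dual numbers shows why such derivatives carry no truncation error: each elementary operation propagates an exact derivative alongside the value. The Python sketch below is illustrative and is not the report's code.

      # Forward-mode automatic differentiation with dual numbers.
      import math

      class Dual:
          def __init__(self, val, dot=0.0):
              self.val, self.dot = val, dot           # value and derivative
          def __add__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val + o.val, self.dot + o.dot)
          __radd__ = __add__
          def __mul__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val * o.val,
                          self.val * o.dot + self.dot * o.val)  # product rule
          __rmul__ = __mul__

      def sin(x):
          return Dual(math.sin(x.val), math.cos(x.val) * x.dot)  # chain rule

      # d/dx [x sin x + 2x] at x = 1.5; exact answer: sin x + x cos x + 2.
      x = Dual(1.5, 1.0)
      y = x * sin(x) + 2 * x
      print(f"f(1.5) = {y.val:.6f}, f'(1.5) = {y.dot:.6f}")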

  14. Radioanalytical chemistry. Vol. 2

    International Nuclear Information System (INIS)

    Toelgyessy, J.; Kyrs, M.

    1989-01-01

    This volume of the monograph covers the following topics: activation analysis, non-activation interaction analysis (elastic scattering of charged particles, absorption and backscattering of beta radiation and photons, radionuclide X-ray fluorescence analysis, thermalization, scattering and absorption of neutrons, use of ionization caused by nuclear radiation, use of ionization by alpha or beta radiation for the measurement of pressure, density and flow rate of gases), and automation in radioanalytical chemistry. (P.A.)

  15. Operational Automatic Remote Sensing Image Understanding Systems: Beyond Geographic Object-Based and Object-Oriented Image Analysis (GEOBIA/GEOOIA. Part 1: Introduction

    Directory of Open Access Journals (Sweden)

    Andrea Baraldi

    2012-09-01

    According to the existing literature, and despite their commercial success, state-of-the-art two-stage non-iterative geographic object-based image analysis (GEOBIA) systems and three-stage iterative geographic object-oriented image analysis (GEOOIA) systems, where GEOOIA ⊃ GEOBIA, remain affected by a lack of productivity, general consensus and research. To outperform the degree of automation, accuracy, efficiency, robustness, scalability and timeliness of existing GEOBIA/GEOOIA systems in compliance with the Quality Assurance Framework for Earth Observation (QA4EO) guidelines, this methodological work is split into two parts. The present first paper provides a multi-disciplinary Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis of the GEOBIA/GEOOIA approaches that augments similar analyses proposed in recent years. In line with constraints stemming from human vision, this SWOT analysis promotes a shift of learning paradigm in the pre-attentive vision first stage of a remote sensing (RS) image understanding system (RS-IUS), from sub-symbolic statistical model-based (inductive) image segmentation to symbolic physical model-based (deductive) preliminary image classification. Hence, a symbolic deductive pre-attentive vision first stage accomplishes image sub-symbolic segmentation and image symbolic pre-classification simultaneously. In the second part of this work a novel hybrid (combined deductive and inductive) RS-IUS architecture featuring a symbolic deductive pre-attentive vision first stage is proposed and discussed in terms of: (a) computational theory (system design); (b) information/knowledge representation; (c) algorithm design; and (d) implementation. As proof of concept of a symbolic physical model-based pre-attentive vision first stage, the spectral knowledge-based, operational, near real-time Satellite Image Automatic Mapper™ (SIAM™) is selected from the existing literature. To the best of these authors' knowledge, this is the first time a

  16. Readability index as a design criterion for elicited imitation tasks in automatic oral proficiency assessment

    CSIR Research Space (South Africa)

    De Wet, Febe

    2011-08-01

    Full Text Available ) techniques to automatically assess oral proficiency and listen- ing comprehension is one way in which these logistical prob- lems can be obviated. Another appealing feature of automatic tests is that they provide a means to assess consistently and ob...?146, 2008. [2] F. De Wet, C. Van der Walt, and T. R. Niesler, ?Automatic assess- ment of oral language proficiency and listening comprehension,? Speech Communication, vol. 51, pp. 864?874, 2009. [3] C. R. Graham, D. Lonsdale, C. Kennington, A. Johnson...

  17. Development of an optimized procedure bridging design and structural analysis codes for the automatized design of the SMART

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Park, Keun Bae; Choi, Suhn; Kim, Kang Soo; Jeong, Kyeong Hoon; Lee, Gyu Mahn

    1998-09-01

    In this report, an optimized design and analysis procedure is established for the SMART (System-integrated Modular Advanced ReacTor) development. The aim of the optimized procedure is to minimize time consumption and engineering effort by streamlining the design and feedback interactions. To achieve this goal, the data and information generated during design development should be transferred directly to the analysis programs with minimal manual operation. Verifying the design concept requires considerable effort, since communication between design and analysis involves a time-consuming stage of converting input information. In this report, an optimized procedure bridging the design and analysis stages is established utilizing IDEAS, ABAQUS and ANSYS. (author). 3 refs., 2 tabs., 5 figs

  18. Sensitivity Range Analysis of Infrared (IR) Transmitter and Receiver Sensor to Detect Sample Position in Automatic Sample Changer

    International Nuclear Information System (INIS)

    Syirrazie Che Soh; Nolida Yussup; Nur Aira Abdul Rahman; Maslina Ibrahim

    2016-01-01

    The sensitivity range of an IR transmitter and receiver sensor influences how effectively the sensor detects the position of a sample. The purpose of this analysis is therefore to determine a suitable design and specification for the sensor's electronic driver, so as to obtain an appropriate sensitivity range for the required operation. The activities related to this analysis cover the electronic design concept and specification, calibration of the design specification, and evaluation of the design specification for the required application. (author)

  19. The Reliability of Technical and Tactical Tagging Analysis Conducted by a Semi-Automatic VTS in Soccer.

    Science.gov (United States)

    Beato, Marco; Jamil, Mikael; Devereux, Gavin

    2018-06-01

    The Video Tracking multiple-camera system (VTS) is a technology that records two-dimensional position data (x and y) at high sampling rates (over 25 Hz). The VTS is of great interest because it can record external load variables as well as collect technical and tactical parameters. Performance analysis has mainly focused on physical demands, and less attention has been afforded to technical and tactical factors. Digital.Stadium® VTS is a performance analysis device widely used at national and international levels (e.g. Italian Serie A, Euro 2016), and evaluating the reliability of its technical tagging analysis (e.g. shots, passes, assists, set pieces) could be paramount for its application in elite-level competitions, as well as in research studies. Two professional soccer teams, with 30 male players (age 23 ± 5 years, body mass 78.3 ± 6.9 kg, body height 1.81 ± 0.06 m), were monitored in the 2016 season during a friendly match, and data analysis was performed immediately after the game ended. This process was then replicated a week later (4 operators conducted the data analysis each week). This study reports a near-perfect relationship between the Match and its Replication: R2 coefficients (relationships between Match and Replication) were highly significant for each of the technical variables considered, indicating that this semi-automatic VTS records technical tagging data accurately.

  20. Egyptian Journal of Medical Human Genetics - Vol 14, No 3 (2013)

    African Journals Online (AJOL)

    Egyptian Journal of Medical Human Genetics - Vol 14, No 3 (2013) ... Comparative study: Parameters of gait in Down syndrome versus matched obese and ... episodes in a Japanese child: Clinical, radiological and molecular genetic analysis ...

  1. About water. Vol. 81

    International Nuclear Information System (INIS)

    1993-01-01

    The conference papers discuss the following subjects: pollution of soil and groundwater from landfills or agricultural fertilizers, waste water analysis and treatment, water treatment technology, analytical procedures. (EF) [de

  2. Analysis of Chi-square Automatic Interaction Detection (CHAID) and Classification and Regression Tree (CRT) for Classification of Corn Production

    Science.gov (United States)

    Susanti, Yuliana; Zukhronah, Etik; Pratiwi, Hasih; Respatiwulan; Sri Sulistijowati, H.

    2017-11-01

    To achieve food resilience in Indonesia, food diversification by exploring the potential of local foods is required. Corn is one alternative staple food of Javanese society. For that reason, corn production needs to be improved by considering the influencing factors. CHAID and CRT are data-mining methods which can be used to classify the influencing variables. The present study seeks to extract information on the potential availability of corn as a local food in the regencies and cities of Java Island. CHAID analysis yields four classifications with an accuracy of 78.8%, while CRT analysis yields seven classifications with an accuracy of 79.6%.
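
    CRT corresponds to the CART algorithm, of which scikit-learn's DecisionTreeClassifier is a standard implementation; CHAID's chi-square splitting has no counterpart there. The sketch below therefore illustrates only the CRT side, on synthetic data with hypothetical feature names.

      # CART (the "CRT" method) on synthetic corn-production data.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      n = 300
      X = np.column_stack([
          rng.random(n) * 5000,   # hypothetical: harvested area (ha)
          rng.random(n) * 2000,   # hypothetical: rainfall (mm)
          rng.random(n) * 100,    # hypothetical: fertilizer index
      ])
      y = (4 * X[:, 0] + X[:, 1] > 11000).astype(int)  # synthetic "high yield"

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
      print(f"classification accuracy: {tree.score(X_te, y_te):.1%}")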

  3. A universal electronical adaptation of automats for biochemical analysis to a central processing computer by applying CAMAC-signals

    International Nuclear Information System (INIS)

    Schaefer, R.

    1975-01-01

    A universal expansion of a CAMAC-subsystem - BORER 3000 - for adapting analysis instruments in biochemistry to a processing computer is described. The possibility of standardizing input interfaces for lab instruments with such circuits is discussed and the advantages achieved by applying the CAMAC-specifications are described

  4. Semi-automatic motion compensation of contrast-enhanced ultrasound images from abdominal organs for perfusion analysis

    Czech Academy of Sciences Publication Activity Database

    Schafer, S.; Nylund, K.; Saevik, F.; Engjom, T.; Mézl, M.; Jiřík, Radovan; Dimcevski, G.; Gilja, O.H.; Tönnies, K.

    2015-01-01

    Roč. 63, AUG 1 (2015), s. 229-237 ISSN 0010-4825 R&D Projects: GA ČR GAP102/12/2380 Institutional support: RVO:68081731 Keywords : ultrasonography * motion analysis * motion compensation * registration * CEUS * contrast-enhanced ultrasound * perfusion * perfusion modeling Subject RIV: FS - Medical Facilities ; Equipment Impact factor: 1.521, year: 2015

  5. Development of automatic reactor vessel inspection systems: development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Park, C. H.; Lim, H. T.; Um, B. G. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine reactor vessel welds. In order to examine nuclear vessel welds, including the reactor pressure vessel (RPV), a huge amount of ultrasonic data from 6 channels must be processed on-line. In addition, the ultrasonic transducer scanning device must be remotely controlled, because the working place is a high-radiation area. This kind of automated ultrasonic testing equipment has not yet been developed domestically. In order to develop an automated ultrasonic testing system, RPV ultrasonic testing equipment developed in foreign countries was investigated and the capability of high-speed ultrasonic signal processing hardware was analyzed. In this study, an ultrasonic signal processing system was designed, and ultrasonic data acquisition and analysis software was developed. 11 refs., 6 figs., 9 tabs. (Author)

  6. CSReport: A New Computational Tool Designed for Automatic Analysis of Class Switch Recombination Junctions Sequenced by High-Throughput Sequencing.

    Science.gov (United States)

    Boyer, François; Boutouil, Hend; Dalloul, Iman; Dalloul, Zeinab; Cook-Moreau, Jeanne; Aldigier, Jean-Claude; Carrion, Claire; Herve, Bastien; Scaon, Erwan; Cogné, Michel; Péron, Sophie

    2017-05-15

    B cells ensure humoral immune responses due to the production of Ag-specific memory B cells and Ab-secreting plasma cells. In secondary lymphoid organs, Ag-driven B cell activation induces terminal maturation and Ig isotype class switch (class switch recombination [CSR]). CSR creates a virtually unique IgH locus in every B cell clone by intrachromosomal recombination between two switch (S) regions upstream of each C region gene. Amount and structural features of CSR junctions reveal valuable information about the CSR mechanism, and analysis of CSR junctions is useful in basic and clinical research studies of B cell functions. To provide an automated tool able to analyze large data sets of CSR junction sequences produced by high-throughput sequencing (HTS), we designed CSReport, a software program dedicated to support analysis of CSR recombination junctions sequenced with a HTS-based protocol (Ion Torrent technology). CSReport was assessed using simulated data sets of CSR junctions and then used for analysis of Sμ-Sα and Sμ-Sγ1 junctions from CH12F3 cells and primary murine B cells, respectively. CSReport identifies junction segment breakpoints on reference sequences and junction structure (blunt-ended junctions or junctions with insertions or microhomology). Besides the ability to analyze unprecedentedly large libraries of junction sequences, CSReport will provide a unified framework for CSR junction studies. Our results show that CSReport is an accurate tool for analysis of sequences from our HTS-based protocol for CSR junctions, thereby facilitating and accelerating their study. Copyright © 2017 by The American Association of Immunologists, Inc.
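
    Junction typing (blunt, microhomology-mediated, or insertion-bearing) can be sketched with a toy breakpoint comparison: measure how far the junction read matches the donor reference from the left and the acceptor reference from the right, and read the class off the overlap. CSReport itself maps reads against full S-region sequences; the sequences below are made up.

      # Toy junction classifier; real CSReport aligns reads to S regions.
      def classify_junction(junction, donor, acceptor):
          d = 0   # longest prefix of the junction matching the donor
          while d < len(junction) and d < len(donor) and junction[d] == donor[d]:
              d += 1
          a = 0   # longest suffix of the junction matching the acceptor
          while (a < len(junction) and a < len(acceptor)
                 and junction[-1 - a] == acceptor[-1 - a]):
              a += 1
          overlap = d + a - len(junction)
          if overlap > 0:
              return f"microhomology of {overlap} nt"
          if overlap == 0:
              return "blunt junction"
          return f"insertion of {-overlap} nt"

      # Made-up sequences: 3 nt ("CTT") match both references at the break.
      print(classify_junction("GGGACTTACCC", "GGGACTTGGGG", "TTTTCTTACCC"))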

  7. Design, Construction and Effectiveness Analysis of Hybrid Automatic Solar Tracking System for Amorphous and Crystalline Solar Cells

    OpenAIRE

    Bhupendra Gupta

    2013-01-01

    This paper concerns the design and construction of a hybrid solar tracking system. The constructed device was implemented by integrating an amorphous and a crystalline solar panel, a three-dimensional freedom mechanism and a microcontroller. The amount of power available from a photovoltaic panel is determined by three parameters: the type of solar tracker, the materials of the solar panel and the intensity of the sunlight. The objective of this paper is to present analysis on the use of two differ...

  8. Automatic measurement for solid state track detectors

    International Nuclear Information System (INIS)

    Ogura, Koichi

    1982-01-01

    Since tracks in solid state track detectors are measured with a microscope, observers are forced into hard work that consumes time and labour, which leads to poor statistical accuracy or personal error. Therefore, much research has aimed at simplifying and automating track measurement. Automated measurement falls into two categories: simple counting of the number of tracks, and measurements that also require geometrical elements such as the size of tracks or their coordinates in addition to their number. The former is called automatic counting and the latter automatic analysis. In automatic counting, the number of tracks is generally evaluated by estimating the total number of tracks over the whole detector area or in a microscope field of view; this is suitable when the track density is high. Methods that count tracks one by one include spark counting and the scanning microdensitometer. Automatic analysis includes video image analysis, in which high-quality images obtained with a high-resolution video camera are processed with a micro-computer, and the tracks are automatically recognized and measured by feature extraction. This method is described in detail. Among the many kinds of automatic measurement reported so far, the most frequently used are 'spark counting' and 'video image analysis'. (Wakatsuki, Y.)
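
    The 'video image analysis' route can be sketched with off-the-shelf connected-component labelling: binarize a frame and extract the count, area and centroid of each candidate track. The image and threshold below are synthetic assumptions.

      # Count and measure candidate tracks by connected-component labelling.
      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(2)
      image = rng.normal(0.2, 0.05, size=(256, 256))  # synthetic background
      image[60:66, 80:90] = 0.9                       # synthetic etched track
      image[150:158, 30:37] = 0.85                    # another synthetic track

      binary = image > 0.5                            # global threshold
      labels, n_tracks = ndimage.label(binary)
      idx = range(1, n_tracks + 1)
      areas = ndimage.sum(binary, labels, index=idx)          # px per track
      centroids = ndimage.center_of_mass(binary, labels, idx)

      print(f"{n_tracks} tracks; areas (px): {areas}; centroids: {centroids}")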

  9. Quantitative analysis of selected minor and trace elements through use of a computerized automatic x-ray spectrograph

    International Nuclear Information System (INIS)

    Fabbi, B.P.; Elsheimer, H.N.; Espos, L.F.

    1976-01-01

    Upgrading a manual X-ray spectrograph, interfacing with an 8K computer, and employment of interelement correction programs have resulted in a several-fold increase in productivity for routine quantitative analysis and an accompanying decrease in operator bias both in measurement procedures and in calculations. Factors such as dead time and self-absorption also are now computer corrected, resulting in improved accuracy. All conditions of analysis except for the X-ray tube voltage are controlled by the computer, which enhances precision of analysis. Elemental intensities are corrected for matrix effects, and from these the percent concentrations are calculated and printed via teletype. Interelement correction programs utilizing multiple linear regression are employed for the determination of the following minor and trace elements: K, S, Rb, Sr, Y, and Zr in silicate rocks, and Ba, As, Sb, and Zn in both silicate and carbonate rock samples. The last named elements use the same regression curves for both rock types. All these elements are determined in concentrations generally ranging from 0.0025 percent to 4.00 percent. The sensitivities obtainable range from 0.0001 percent for barium to 0.001 percent for antimony. The accuracy, as measured by the percent relative error for a variety of silicate and carbonate rocks, is on the order of 1-7 percent. The exception is yttrium
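
    The interelement correction idea, regressing a trace element's concentration on its own line intensity plus the intensities of interfering lines over a set of standards, can be sketched as follows; all intensities and coefficients are synthetic stand-ins, not the report's calibration data.

      # Matrix-effect correction by multiple linear regression on standards.
      import numpy as np

      rng = np.random.default_rng(3)
      n_std = 20
      i_ba = rng.random(n_std)   # Ba line intensity (arbitrary units)
      i_fe = rng.random(n_std)   # interfering matrix line (stand-in)

      # Synthetic standards: Ba concentration depressed by the matrix line.
      c_ba = 2.0 * i_ba - 0.6 * i_fe + 0.05 + rng.normal(0, 0.01, n_std)

      # Least-squares fit of C = b0 + b1*I_Ba + b2*I_Fe.
      A = np.column_stack([np.ones(n_std), i_ba, i_fe])
      coef, *_ = np.linalg.lstsq(A, c_ba, rcond=None)
      print("correction coefficients (b0, b1, b2):", np.round(coef, 3))

      # Apply the calibration to an unknown sample's measured intensities.
      print("corrected Ba concentration:", coef @ [1.0, 0.42, 0.31])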

  10. Agrosearch 1 Vol. 1

    African Journals Online (AJOL)

    This paper examines the socio-economic status of women in group ... The results of the multiple regression analysis showed that the variables (age, ... Oluwasola and Alimi, (2007) found that one of the major problems of an average .... business or financial management and business record keeping skills resulting from their ...

  11. IJAAAR 2011 VOL 7

    African Journals Online (AJOL)

    Faculty of Agricultural Sciences Lautech Ogbomoso

    In improving indigenous sesame seed yield, there is reliability in selecting for number of capsule and seed per capsule as these traits recorded highest selection index using heritability and genetic advance parameters. Key words: Variation, selection index, indigenous sesame, year effect, path analysis. Introduction.

  12. Automatic screening of obstructive sleep apnea from the ECG based on empirical mode decomposition and wavelet analysis

    International Nuclear Information System (INIS)

    Mendez, M O; Cerutti, S; Bianchi, A M; Corthout, J; Van Huffel, S; Matteucci, M; Penzel, T

    2010-01-01

    This study analyses two different methods of detecting obstructive sleep apnea (OSA) during sleep, based only on the ECG signal. OSA is a common sleep disorder caused by repetitive occlusions of the upper airways, which produces a characteristic pattern on the ECG. ECG features, such as heart rate variability (HRV) and the QRS peak area, contain information suitable for a fast, non-invasive and simple screening of sleep apnea. Fifty recordings freely available on Physionet were included in this analysis, subdivided into a training and a testing set. We investigated the possibility of using the recently proposed method of empirical mode decomposition (EMD) for this application, comparing the results with those obtained through the well-established wavelet analysis (WA). With these decomposition techniques, several features were extracted from the ECG signal and complemented with a series of standard HRV time-domain measures. The best-performing feature subset, selected through a sequential feature selection (SFS) method, was used as the input of linear and quadratic discriminant classifiers. In this way we were able to classify the signals on a minute-by-minute basis as apneic or non-apneic with different best-subset sizes, obtaining an accuracy of up to 89% with WA and 85% with EMD. Furthermore, 100% correct discrimination of apneic patients from normal subjects was achieved independently of the feature extractor. Finally, the same procedure was repeated by pooling features from standard HRV time-domain measures, EMD and WA together, in order to investigate whether the two decomposition techniques could provide complementary features. The obtained accuracy was 89%, similar to that achieved using only wavelet analysis as the feature extractor; however, some complementary features in EMD and WA are evident
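
    A minimal sketch of the per-minute pipeline, wavelet-band energies of the RR series feeding a linear discriminant classifier, is given below on synthetic data; the study's EMD features, QRS areas and sequential feature selection are not reproduced.

      # Wavelet-band energies per minute -> linear discriminant classifier.
      import numpy as np
      import pywt
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(4)

      def minute_features(rr):
          """Energy in each wavelet band of a one-minute RR series."""
          return [np.sum(c ** 2) for c in pywt.wavedec(rr, "db4", level=3)]

      X, y = [], []
      for _ in range(200):                        # synthetic minutes
          apneic = rng.random() < 0.5
          rr = 0.9 + 0.02 * rng.standard_normal(64)
          if apneic:                              # ~45-s apnea-induced cycle
              rr += 0.08 * np.sin(2 * np.pi * np.arange(64) / 40)
          X.append(minute_features(rr))
          y.append(int(apneic))

      X, y = np.array(X), np.array(y)
      clf = LinearDiscriminantAnalysis().fit(X[:150], y[:150])
      print(f"per-minute accuracy: {clf.score(X[150:], y[150:]):.1%}")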

  13. Automatic Classification of Users' Health Information Need Context: Logistic Regression Analysis of Mouse-Click and Eye-Tracker Data.

    Science.gov (United States)

    Pian, Wenjing; Khoo, Christopher Sg; Chi, Jianxing

    2017-12-21

    Users searching for health information on the Internet may be searching for their own health issue, searching for someone else's health issue, or browsing with no particular health issue in mind. Previous research has found that these three categories of users focus on different types of health information. However, most health information websites provide static content for all users. If the three types of user health information need contexts can be identified by the Web application, the search results or information offered to the user can be customized to increase its relevance or usefulness to the user. The aim of this study was to investigate the possibility of identifying the three user health information contexts (searching for self, searching for others, or browsing with no particular health issue in mind) using just hyperlink clicking behavior; using eye-tracking information; and using a combination of eye-tracking, demographic, and urgency information. Predictive models are developed using multinomial logistic regression. A total of 74 participants (39 females and 35 males), who were mainly staff and students of a university, were asked to browse a health discussion forum, Healthboards.com. An eye tracker recorded their examining (eye fixation) and skimming (quick eye movement) behaviors on 2 types of screens: a summary result screen displaying a list of post headers, and a detailed post screen. The following three types of predictive models were developed using logistic regression analysis: model 1 used only the time spent in scanning the summary result screen and reading the detailed post screen, which can be determined from the user's mouse clicks; model 2 used the examining and skimming durations on each screen, recorded by an eye tracker; and model 3 added user demographic and urgency information to model 2. An analysis of variance (ANOVA) found that users' browsing durations were significantly different for the three health information contexts
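
    Model 1 reduces to a multinomial logistic regression on the two durations recoverable from mouse clicks alone. The sketch below illustrates that setup on synthetic data; the feature names and class boundaries are assumptions, not the paper's fitted model.

      # Multinomial logistic regression on two click-derived durations.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(5)
      n = 300
      scan_time = rng.gamma(2.0, 10.0, n)   # s on summary result screen
      read_time = rng.gamma(2.0, 20.0, n)   # s on detailed post screen

      # Synthetic contexts: 0 = self, 1 = others, 2 = browsing.
      context = np.where(read_time > 50, 0, np.where(scan_time > 25, 1, 2))

      X = np.column_stack([scan_time, read_time])
      model = LogisticRegression(max_iter=1000).fit(X, context)
      print("context for 30 s scan / 70 s read:",
            model.predict([[30.0, 70.0]])[0])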

  14. Automatic EEG spike detection.

    Science.gov (United States)

    Harner, Richard

    2009-10-01

    Since the 1970s advances in science and technology during each succeeding decade have renewed the expectation of efficient, reliable automatic epileptiform spike detection (AESD). But even when reinforced with better, faster tools, clinically reliable unsupervised spike detection remains beyond our reach. Expert-selected spike parameters were the first and still most widely used for AESD. Thresholds for amplitude, duration, sharpness, rise-time, fall-time, after-coming slow waves, background frequency, and more have been used. It is still unclear which of these wave parameters are essential, beyond peak-peak amplitude and duration. Wavelet parameters are very appropriate to AESD but need to be combined with other parameters to achieve desired levels of spike detection efficiency. Artificial Neural Network (ANN) and expert-system methods may have reached peak efficiency. Support Vector Machine (SVM) technology focuses on outliers rather than centroids of spike and nonspike data clusters and should improve AESD efficiency. An exemplary spike/nonspike database is suggested as a tool for assessing parameters and methods for AESD and is available in CSV or Matlab formats from the author at brainvue@gmail.com. Exploratory Data Analysis (EDA) is presented as a graphic method for finding better spike parameters and for the step-wise evaluation of the spike detection process.

  15. Thai Automatic Speech Recognition

    National Research Council Canada - National Science Library

    Suebvisai, Sinaporn; Charoenpornsawat, Paisarn; Black, Alan; Woszczyna, Monika; Schultz, Tanja

    2005-01-01

    .... We focus on the discussion of the rapid deployment of ASR for Thai under limited time and data resources, including rapid data collection issues, acoustic model bootstrap, and automatic generation of pronunciations...

  16. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  17. Automatic Test Systems Acquisition

    National Research Council Canada - National Science Library

    1994-01-01

    We are providing this final memorandum report for your information and use. This report discusses the efforts to achieve commonality in standards among the Military Departments as part of the DoD policy for automatic test systems (ATS...

  18. Brand and automaticity

    OpenAIRE

    Liu, J.

    2008-01-01

    A presumption of most consumer research is that consumers endeavor to maximize the utility of their choices and are in complete control of their purchasing and consumption behavior. However, everyday life experience suggests that many of our choices are not all that reasoned or conscious. Indeed, automaticity, one facet of behavior, is indispensable to complete the portrait of consumers. Despite its importance, little attention is paid to how the automatic side of behavior can be captured and...

  19. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book describes methods of position determination and their characteristics, control methods for position determination and design considerations, sensor selection for position detectors, position determination in digital control systems, the application of clutches and brakes in high-frequency position determination, automation techniques for position determination, position determination by electromagnetic clutch and brake, air cylinder, cam and solenoid, stop-position control of automatic guided vehicles, stacker cranes and automatic transfer control.

  20. Automatic intelligent cruise control

    OpenAIRE

    Stanton, NA; Young, MS

    2006-01-01

    This paper reports a study on the evaluation of automatic intelligent cruise control (AICC) from a psychological perspective. It was anticipated that AICC would have an effect upon the psychology of driving — namely, make the driver feel they have less control, reduce the level of trust in the vehicle, and make drivers less situationally aware, but might also reduce the workload and make driving less stressful. Drivers were asked to drive in a driving simulator under manual and automatic inte...

  1. Artificial intelligence applied to the automatic analysis of absorption spectra. Objective measurement of the fine structure constant

    Science.gov (United States)

    Bainbridge, Matthew B.; Webb, John K.

    2017-06-01

    A new and automated method is presented for the analysis of high-resolution absorption spectra. Three established numerical methods are unified into one 'artificial intelligence' process: a genetic algorithm (Genetic Voigt Profile FIT, gvpfit); non-linear least-squares with parameter constraints (vpfit); and Bayesian model averaging (BMA). The method has broad application, but here we apply it specifically to the problem of measuring the fine structure constant at high redshift, for which we need objectivity and reproducibility. gvpfit is also motivated by the importance of obtaining a large statistical sample of measurements of Δα/α. Interactive analyses are both time consuming and complex, and automation makes obtaining a large sample feasible. In contrast to previous methodologies, we use BMA to derive results using a large set of models, and show that this procedure is more robust than a human picking a single preferred model, since BMA avoids the systematic uncertainties associated with model choice. Numerical simulations provide stringent tests of the whole process, and we show using both real and simulated spectra that the unified automated fitting procedure outperforms a human interactive analysis. The method should be invaluable in the context of future instrumentation like ESPRESSO on the VLT and indeed future ELTs. We apply the method to the z_abs = 1.8389 absorber towards the z_em = 2.145 quasar J110325-264515. The derived constraint of Δα/α = 3.3 ± 2.9 × 10^-6 is consistent with no variation and also consistent with the tentative spatial variation reported in Webb et al. and King et al.
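
    The BMA step can be sketched with information-criterion weights: each candidate model contributes its Δα/α estimate in proportion to its weight, and the between-model spread enters the total variance. The numbers below are illustrative, and AIC stands in for whatever criterion gvpfit actually uses.

      # Model-averaged estimate of da/a with Akaike-style weights.
      import numpy as np

      # Columns: chi-squared, free parameters, da/a estimate, its variance.
      models = np.array([
          (410.2, 12, 2.1e-6, 4.0e-12),
          (405.7, 15, 3.5e-6, 3.2e-12),
          (404.9, 18, 3.0e-6, 3.6e-12),
      ])
      chi2, k, est, var = models.T

      aic = chi2 + 2 * k
      w = np.exp(-0.5 * (aic - aic.min()))
      w /= w.sum()                              # normalized model weights

      mean = np.sum(w * est)                    # model-averaged estimate
      total_var = np.sum(w * (var + (est - mean) ** 2))  # within + between
      print(f"da/a = {mean:.2e} +/- {np.sqrt(total_var):.2e}")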

  2. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    Science.gov (United States)

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application, SpheroidSizer, which measures the major and minor axial lengths of the imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and then outputs the results in two different forms in spreadsheets for easy manipulation in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and a noisy background that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools provide flexibility to SpheroidSizer in dealing with various types of spheroids and diverse-quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model

  3. Development of automatic reactor vessel inspection systems; development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Po; Park, C. H.; Kim, H. T.; Noh, H. C.; Lee, J. M.; Kim, C. K.; Um, B. G. [Research Institute of KAITEC, Seoul (Korea)

    2002-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine heavy vessel welds. In order to examine nuclear vessel welds, including the reactor pressure vessel (RPV), a huge amount of ultrasonic data from 6 channels must be processed on-line. In addition, the ultrasonic transducer scanning device must be remotely controlled, because the working place is a high-radiation area. This kind of automated ultrasonic testing equipment has not yet been developed domestically. In order to develop an automated ultrasonic testing system, RPV ultrasonic testing equipment developed in foreign countries was investigated and the capability of high-speed ultrasonic signal processing hardware was analyzed. In this study, an ultrasonic signal processing system was designed and ultrasonic data acquisition software was developed. The new systems were tested on the RPV welds of Ulchin Unit 6 to confirm their functions and capabilities. They worked very well as designed and the tests were successfully completed. 13 refs., 34 figs., 11 tabs. (Author)

  4. Automatic analysis of dividing cells in live cell movies to detect mitotic delays and correlate phenotypes in time.

    Science.gov (United States)

    Harder, Nathalie; Mora-Bermúdez, Felipe; Godinez, William J; Wünsche, Annelie; Eils, Roland; Ellenberg, Jan; Rohr, Karl

    2009-11-01

    Live-cell imaging allows detailed dynamic cellular phenotyping for cell biology and, in combination with small molecule or drug libraries, for high-content screening. Fully automated analysis of live cell movies has been hampered by the lack of computational approaches that allow tracking and recognition of individual cell fates over time in a precise manner. Here, we present a fully automated approach to analyze time-lapse movies of dividing cells. Our method dynamically categorizes cells into seven phases of the cell cycle and five aberrant morphological phenotypes over time. It reliably tracks cells and their progeny and can thus measure the length of mitotic phases and detect cause and effect if mitosis goes awry. We applied our computational scheme to annotate mitotic phenotypes induced by RNAi gene knockdown of CKAP5 (also known as ch-TOG) or by treatment with the drug nocodazole. Our approach can be readily applied to comparable assays aiming at uncovering the dynamic cause of cell division phenotypes.

  5. Automatically produced FRP beams with embedded FOS in complex geometry: process, material compatibility, micromechanical analysis, and performance tests

    Science.gov (United States)

    Gabler, Markus; Tkachenko, Viktoriya; Küppers, Simon; Kuka, Georg G.; Habel, Wolfgang R.; Milwich, Markus; Knippers, Jan

    2012-04-01

    The main goal of the presented work was to develop a multifunctional beam composed of fiber reinforced plastics (FRP) with an embedded optical fiber carrying various fiber Bragg grating sensors (FBG). These beams are developed for use as structural members in bridges or industrial applications. It is now possible to realize large-scale cross sections, the embedding is part of a fully automated process, and jumpers can be omitted so as not to adversely affect the laminate. The development includes the smart placement and layout of the optical fibers in the cross section, reliable strain transfer, and finally the coupling of the embedded fibers after production. Micromechanical tests and analyses were carried out to evaluate the performance of the sensor. The work was funded by the German ministry of economics and technology (funding scheme ZIM). Next to the authors of this contribution, Melanie Book with Röchling Engineering Plastics KG (Haren/Germany) and Katharina Frey with SAERTEX GmbH & Co. KG (Saerbeck/Germany) were part of the research group.

  6. Development of automatic surveillance of animal behaviour and welfare using image analysis and machine learned segmentation technique.

    Science.gov (United States)

    Nilsson, M; Herlin, A H; Ardö, H; Guzhva, O; Åström, K; Bergsten, C

    2015-11-01

    In this paper, the feasibility of extracting the proportion of pigs located in different areas of a pig pen by advanced image analysis is explored, and possible applications are discussed. For example, pigs generally locate themselves in the wet dunging area at high ambient temperatures in order to avoid heat stress, as wetting the body surface is the major path to dissipating heat by evaporation. Thus, the proportions of pigs in the dunging area and the resting area, respectively, could be used as an indicator of failure to control the climate in the pig environment, as pigs are not supposed to rest in the dunging area. The computer vision methodology utilizes a learning-based segmentation approach using several features extracted from the image. The approach builds on extended state-of-the-art features in combination with a structured prediction framework based on a logistic regression solver using elastic net regularization. In addition, the method is able to produce a probability per pixel rather than a hard decision. This overcomes some of the limitations found in a setup using grey-scale information only. The pig pen is a difficult imaging environment because of challenging lighting conditions like shadows, poor lighting and poor contrast between pig and background. In order to test practical conditions, a pen containing nine young pigs was filmed from a top-view perspective by an Axis M3006 camera with a resolution of 640 × 480 in three 10-min sessions under different lighting conditions. The results indicate that a learning-based method improves, in comparison with grey-scale methods, the possibility of reliably identifying the proportions of pigs in different areas of the pen. Pigs with a changed behaviour (location) in the pen may indicate changed climate conditions. Changed individual behaviour may also indicate inferior health or acute illness.
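
    The per-pixel classifier described above - logistic regression with elastic net regularization producing a probability per pixel - can be sketched as follows; the feature matrix and labels are synthetic stand-ins, not the authors' data or features.

```python
# Sketch: per-pixel segmentation with elastic-net logistic regression.
# Assumes a feature matrix X (n_pixels x n_features) of per-pixel image
# features and binary labels y (1 = pig, 0 = background); both hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(10000, 8))          # stand-in for extracted image features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=10000) > 0).astype(int)

clf = LogisticRegression(
    penalty="elasticnet", solver="saga",  # saga is the solver supporting elastic net
    l1_ratio=0.5, C=1.0, max_iter=5000,
)
clf.fit(X, y)

# Probability per pixel rather than a hard decision, as in the paper.
p_pig = clf.predict_proba(X)[:, 1]
print("mean P(pig):", p_pig.mean())
```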

  7. Hard processes. Vol. 1

    International Nuclear Information System (INIS)

    Ioffe, B.L.; Khoze, V.A.; Lipatov, L.N.

    1984-01-01

    Deep inelastic (hard) processes are now at the epicenter of modern high-energy physics. These processes are governed by short-distance dynamics, which reveals the intrinsic structure of elementary particles. The theory of deep inelastic processes is now sufficiently well settled. The authors' aim was to give an effective tool to theoreticians and experimentalists who are engaged in high-energy physics. This book is intended primarily for physicists who are only beginning to study the field. To read the book, one should be acquainted with the Feynman diagram technique and with some particular topics from elementary particle theory (symmetries, dispersion relations, Regge pole theory, etc.). Theoretical consideration of deep inelastic processes is now based on quantum chromodynamics (QCD). At the same time, analysis of relevant physical phenomena demands a synthesis of QCD notions (quarks, gluons) with certain empirical characteristics. Therefore, the phenomenological approaches presented are a necessary stage in a study of this range of phenomena which should undoubtedly be followed by a detailed description based on QCD and electroweak theory. The authors were naturally unable to dwell on experimental data accumulated during the past decade of intensive investigations. Priority was given to results which allow a direct comparison with theoretical predictions. (Auth.)

  8. Automatic extraction of the cingulum bundle in diffusion tensor tract-specific analysis. Feasibility study in Parkinson's disease with and without dementia

    International Nuclear Information System (INIS)

    Ito, Kenji; Masutani, Yoshitaka; Suzuki, Yuichi; Ino, Kenji; Kunimatsu, Akira; Ohtomo, Kuni; Kamagata, Koji; Yasmin, Hasina; Aoki, Shigeki

    2013-01-01

    Tract-specific analysis (TSA) measures diffusion parameters along a specific fiber that has been extracted by fiber tracking using manual regions of interest (ROIs), but TSA is limited by its requirement for manual operation, poor reproducibility, and high time consumption. We aimed to develop a fully automated extraction method for the cingulum bundle (CB) and to apply the method to TSA in neurobehavioral disorders such as Parkinson's disease (PD). We introduce the voxel classification (VC) and auto diffusion tensor fiber-tracking (AFT) methods of extraction. The VC method directly extracts the CB, skipping the fiber-tracking step, whereas the AFT method uses fiber tracking from automatically selected ROIs. We compared the results of VC and AFT to those obtained by manual diffusion tensor fiber tracking (MFT) performed by 3 operators. We quantified the Jaccard similarity index among the 3 methods in data from 20 subjects (10 normal controls [NC] and 10 patients with Parkinson's disease dementia [PDD]). We used all 3 extraction methods (VC, AFT, and MFT) to calculate the fractional anisotropy (FA) values of the anterior and posterior CB for 15 NC subjects, 15 with PD, and 15 with PDD. The Jaccard index between results of AFT and MFT, 0.72, was similar to the inter-operator Jaccard index of MFT. However, the Jaccard indices between VC and MFT and between VC and AFT were lower. Consequently, the VC method classified among 3 different groups (NC, PD, and PDD), whereas the others classified only 2 different groups (NC, PD or PDD). For TSA in Parkinson's disease, the VC method can be more useful than the AFT and MFT methods for extracting the CB. In addition, the results of patient data analysis suggest that a reduction of FA in the posterior CB may represent a useful biological index for monitoring PD and PDD. (author)
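
    The Jaccard similarity index used above to compare extraction results is simple to compute from two binary masks; a minimal sketch with hypothetical voxel masks:

```python
# Sketch: Jaccard similarity index between two binary voxel masks
# (e.g., CB extractions from two methods); the masks here are hypothetical.
import numpy as np

def jaccard(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return np.logical_and(a, b).sum() / union

rng = np.random.default_rng(1)
vc  = rng.random((64, 64, 40)) > 0.7   # stand-in for a VC extraction
aft = rng.random((64, 64, 40)) > 0.7   # stand-in for an AFT extraction
print(f"Jaccard(VC, AFT) = {jaccard(vc, aft):.2f}")
```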

  9. Recommended number of strides for automatic assessment of gait symmetry and regularity in above-knee amputees by means of accelerometry and autocorrelation analysis

    Directory of Open Access Journals (Sweden)

    Tura Andrea

    2012-02-01

    Full Text Available Abstract Background: Symmetry and regularity of gait are essential outcomes of gait retraining programs, especially in lower-limb amputees. This study aims to present an algorithm to automatically compute symmetry and regularity indices, and to assess the minimum number of strides needed for appropriate evaluation of gait symmetry and regularity through autocorrelation of acceleration signals. Methods: Ten transfemoral amputees (AMP) and ten control subjects (CTRL) were studied. Subjects wore an accelerometer and were asked to walk for 70 m at their natural speed (twice). Reference values of the step and stride regularity indices (Ad1 and Ad2) were obtained by autocorrelation analysis of the vertical and antero-posterior acceleration signals, excluding initial and final strides. The Ad1 and Ad2 coefficients were then computed at different stages by analyzing increasing portions of the signals (considering both the signals cleaned of initial and final strides, and the whole signals). At each stage, the difference between the Ad1 and Ad2 values and the corresponding reference values was compared with the minimum detectable difference, MDD, of the index. If that difference was less than the MDD, it was assumed that the portion of signal used in the analysis was of sufficient length to allow reliable estimation of the autocorrelation coefficient. Results: All Ad1 and Ad2 indices were lower in AMP than in CTRL (P < …). Conclusions: Without the need to identify and eliminate the phases of gait initiation and termination, twenty strides can provide a reasonable amount of information to reliably estimate gait regularity in transfemoral amputees.
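
    Step (Ad1) and stride (Ad2) regularity coefficients of this kind are commonly read off the unbiased, normalized autocorrelation of the acceleration signal at lags of roughly one step and one stride; a sketch under those assumptions, with a synthetic signal and peak windows that are illustrative choices rather than the paper's parameters:

```python
# Sketch: step (Ad1) and stride (Ad2) regularity from the unbiased
# autocorrelation of a vertical acceleration signal. Signal, expected
# step time and peak-picking windows are all assumptions.
import numpy as np

def regularity_indices(acc, fs, step_time=0.55):
    acc = acc - acc.mean()
    n = len(acc)
    ac = np.correlate(acc, acc, mode="full")[n - 1:]
    ac /= np.arange(n, 0, -1)            # unbiased estimate (divide by n - lag)
    ac /= ac[0]                          # normalize so lag 0 == 1
    step = int(round(step_time * fs))    # expected step duration in samples
    w = step // 2
    ad1 = ac[step - w: step + w].max()          # first dominant peak: step
    ad2 = ac[2 * step - w: 2 * step + w].max()  # second dominant peak: stride
    return ad1, ad2

fs = 100.0
t = np.arange(0, 30, 1 / fs)
acc = np.sin(2 * np.pi * 1.8 * t) + 0.1 * np.random.default_rng(2).normal(size=t.size)
print(regularity_indices(acc, fs))
```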

  10. Rhodium. Suppl. Vol. B1

    International Nuclear Information System (INIS)

    Griffith, W.P.; Jehn, H.; McCleverty, J.A.; Raub, C.J.; Robinson, S.D.

    1982-01-01

    The present rhodium volume B1 is concerned largely with binary compounds and coordination complexes of this important metal, which is used either alone or in alloy form for the fabrication of other materials or for heterogeneous catalysis. The first two chapters are devoted to hydrides, oxides, and ternary and quaternary oxorhodates. The third chapter covers different types of complexes with nitrogen. Chapters four to seven deal with halogen complexes of the metal. The next chapters treat sulphides, sulphoxide and sulphito complexes, sulphates and sulphato complexes, selenides and tellurides, borides, borane complexes, carbides, and carbonato, cyano, fulminato and thiocyanato complexes. Finally, silicides, phosphides, phosphito complexes and arsenides are treated. (AB)

  11. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his...... honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers...... by members of the IFIP Working Group 2.1 of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers...

  12. Joyo progress report, vol. 8

    International Nuclear Information System (INIS)

    1983-01-01

    Following the Joyo Reactor Technology Progress Reports (Vol. 1 to Vol. 7), the name was changed to Joyo Progress Report from this volume, and the activities concerning the fast breeder experimental reactor Joyo as a whole are now reported quarterly. In the fast breeder experimental reactor Joyo, the change from the breeding core (MK-1) to the irradiation core (MK-2) was carried out beginning in January 1982, in order to utilize the reactor as an irradiation facility for the development of fuel and materials. The main work was the construction of the irradiation core by exchanging 290 fuel elements; in addition, the exchange of upper and lower guide pipes for control rods, the reconstruction of the driving mechanism, the installation of a standby neutron detector system, the acceptance and inspection of new fuel, and the transfer of spent fuel between pools were carried out. As scheduled, the irradiation core attained initial criticality on November 22, and the work of constructing the core was completed on December 23, 1982. Thereafter, the 100 MW performance test was begun. Much experience and valuable data were obtained in the regular inspection and the maintenance and repair works carried out at the same time, regarding the operation and maintenance of the Joyo facilities. (Kako, I.)

  13. Automatic analysis with thermometric detection.

    Science.gov (United States)

    McLean, W R; Penketh, G E

    1968-11-01

    The construction of a cell and associated Wheatstone bridge detector circuitry is described for a thermometric detector suitable for attachment to a Technicon Autoanalyzer. The detector produces a d.c. mV signal linearly proportional to the concentration (0.005-0.1M) of the thermally reactive component in the sample stream when it is mixed in the cell with the reagent stream. The influence of various pertinent parameters such as ambient temperature, thermistor voltage, heats of reaction and sensitivity are discussed together with interference effects arising through chemistry, ionic strength effects and heat of dilution.

  14. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    of the analyses carried out by hand is still well beyond what can be done ... meeting communication, it follows that multiple, asynchronous streams of .... enabled us to develop system-level evaluations (discussed further in section 4), at the cost of less ..... Stasser G, Taylor L 1991 Speaking turns in face-to-face discussions, ...

  15. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS). We present a recent state of the art. The book shows the main problems of ADS, the difficulties and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been developed

  16. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings...... on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...

  17. A prediction rule for the development of delirium among patients in medical wards: Chi-Square Automatic Interaction Detector (CHAID) decision tree analysis model.

    Science.gov (United States)

    Kobayashi, Daiki; Takahashi, Osamu; Arioka, Hiroko; Koga, Shinichiro; Fukui, Tsuguya

    2013-10-01

    To predict development of delirium among patients in medical wards by a Chi-Square Automatic Interaction Detector (CHAID) decision tree model. This was a retrospective cohort study of all adult patients admitted to medical wards at a large community hospital. The subject patients were randomly assigned to either a derivation or validation group (2:1) by computed random number generation. Baseline data and clinically relevant factors were collected from the electronic chart. Primary outcome was the development of delirium during hospitalization. All potential predictors were included in a forward stepwise logistic regression model. CHAID decision tree analysis was also performed to make another prediction model with the same group of patients. Receiver operating characteristic curves were drawn, and the area under the curves (AUCs) were calculated for both models. In the validation group, these receiver operating characteristic curves and AUCs were calculated based on the rules from derivation. A total of 3,570 patients were admitted: 2,400 patients assigned to the derivation group and 1,170 to the validation group. A total of 91 and 51 patients, respectively, developed delirium. Statistically significant predictors were delirium history, age, underlying malignancy, and activities of daily living impairment in CHAID decision tree model, resulting in six distinctive groups by the level of risk. AUC was 0.82 in derivation and 0.82 in validation with CHAID model and 0.78 in derivation and 0.79 in validation with logistic model. We propose a validated CHAID decision tree prediction model to predict the development of delirium among medical patients. Copyright © 2013 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
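
    CHAID itself is not part of mainstream Python libraries, so the sketch below uses an ordinary CART-style tree as a stand-in to illustrate the overall workflow - derive a tree on a 2:1 split of the cohort, then validate by AUC; the predictors mirror the abstract but the data are synthetic:

```python
# Sketch: tree-based delirium risk model with derivation/validation split
# and AUC evaluation. CART (sklearn) stands in for CHAID here; predictors
# mirror those in the abstract but the data are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 3570
X = np.column_stack([
    rng.integers(0, 2, n),       # delirium history (hypothetical coding)
    rng.normal(70, 12, n),       # age
    rng.integers(0, 2, n),       # underlying malignancy
    rng.integers(0, 2, n),       # ADL impairment
])
logit = -4 + 1.5 * X[:, 0] + 0.03 * (X[:, 1] - 70) + 0.8 * X[:, 2] + 0.9 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

# 2:1 derivation/validation split, as in the study design.
X_d, X_v, y_d, y_v = train_test_split(X, y, test_size=1 / 3, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X_d, y_d)
print("validation AUC:", roc_auc_score(y_v, tree.predict_proba(X_v)[:, 1]))
```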

  18. Madden-Julian Oscillation (MJO) Signal over Kototabang, West Sumatera Based on the Mini Automatic Weather Station (MAWS) Data Analysis Using the Wavelet Technique

    Science.gov (United States)

    Hermawan, E.

    2018-04-01

    This study mainly concerns an application of the Mini Automatic Weather Station (MAWS) at Kototabang, West Sumatera, near the Equatorial Atmosphere Radar (EAR) site. We are interested in using these data to investigate the propagation of the Madden-Julian Oscillation (MJO). We examined daily MAWS data for three years of observations, from January 2001 to May 2004. By applying wavelet analysis, we found that the MJO at Kototabang has a 32-day oscillation (Fig. 1). In this study, we concentrate on the local mechanism only. We show in this paper that at the phase of the MJO with a dipole structure in the convection anomalies, there is enhanced tropical convection over the eastern Indian Ocean and reduced convection over the western Pacific. Over the equatorial western Indian Ocean, the equatorial Rossby wave response to the west of the enhanced convection includes a region of anomalous surface divergence associated with the anomalous surface westerlies and pressure ridge. This tends to suppress ascent in the boundary layer and shuts off the deep convection, eventually leading to a convective anomaly of the opposite sign. Over the Indonesian sector, the equatorial Kelvin wave response to the east of the enhanced convection includes a region of anomalous surface convergence into the anomalous equatorial surface easterlies and pressure trough, which will tend to favour convection in this region. The Indonesian sector is also influenced by an equatorial Rossby wave response (of opposite sign) to the west of the reduced convection over the western Pacific, which also has a region of anomalous surface convergence associated with its anomalous equatorial surface easterlies and pressure trough. Hence, convective anomalies of either sign tend to erode themselves from the west and initiate a convective anomaly of opposite sign via their equatorial Rossby wave response, and expand to the east via their equatorial Kelvin wave response.
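
    A wavelet analysis of this kind - locating an intraseasonal (~32-day) oscillation in a daily surface record - can be sketched with PyWavelets' continuous wavelet transform; the series below is a synthetic stand-in for the MAWS record, and the Morlet wavelet is an assumption, not necessarily the authors' choice:

```python
# Sketch: continuous wavelet transform of a daily time series to locate
# an intraseasonal (~32-day) MJO-like oscillation. The input series is
# synthetic; the Morlet wavelet is an illustrative assumption.
import numpy as np
import pywt

days = np.arange(3 * 365)
series = np.sin(2 * np.pi * days / 32) + 0.5 * np.random.default_rng(4).normal(size=days.size)

scales = np.arange(2, 128)
coefs, freqs = pywt.cwt(series, scales, "morl", sampling_period=1.0)  # 1 day
periods = 1.0 / freqs                                 # periods in days
power = (np.abs(coefs) ** 2).mean(axis=1)             # time-averaged power

print(f"dominant period ≈ {periods[power.argmax()]:.1f} days")
```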

  19. Expert system for the automatic analysis of the Eddy current signals from the monitoring of vapor generators of a PWR-type reactor

    International Nuclear Information System (INIS)

    Lefevre, F.; Baumaire, A.; Comby, R.; Benas, J.C.

    1990-01-01

    Automatization of the monitoring of steam generator tubes required some developments in the field of data processing. The monitoring is performed by means of eddy current tests. Improvements in signal processing and in pattern recognition associated with artificial intelligence techniques led EDF (the French electricity company) to develop an automatic signal processing system. The system, named EXTRACSION (French acronym for Expert System for the Processing and Classification of Signals of Nuclear Nature), ensures coherence between the different fields of knowledge (metallurgy, measurement, signals) during data processing by applying an object-oriented representation [fr]

  20. Cliff : the automatized zipper

    NARCIS (Netherlands)

    Baharom, M.Z.; Toeters, M.J.; Delbressine, F.L.M.; Bangaru, C.; Feijs, L.M.G.

    2016-01-01

    It is our strong belief that fashion - more specifically apparel - can support us much more in our daily life than it currently does. The Cliff project takes the opportunity to create a generic automatized zipper. It is a response to the struggle of the elderly, people with physical disability, and

  1. Automatic Oscillating Turret.

    Science.gov (United States)

    1981-03-01

    Final report, February 1978 - September 1980, on the Automatic Oscillating Turret System. The recoverable text comprises report front matter, an appendix heading (Oscillating Bumper Turret) and a fragment of the design criteria: 1. Turret controls inside cab. 2. Automatic oscillation with fixed elevation to range from 20° below the horizontal to

  2. Reactor component automatic grapple

    International Nuclear Information System (INIS)

    Greenaway, P.R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment. (author)

  3. Automatic sweep circuit

    International Nuclear Information System (INIS)

    Keefe, D.J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input is described. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found, then a different time window is examined until the signal is found.

  4. Automatic sweep circuit

    Science.gov (United States)

    Keefe, Donald J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found.

  5. Recursive automatic classification algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bauman, E V; Dorofeyuk, A A

    1982-03-01

    A variational statement of the automatic classification problem is given. The dependence of the form of the optimal partition surface on the form of the classification objective functional is investigated. A recursive algorithm is proposed for maximising a functional of reasonably general form. The convergence problem is analysed in connection with the proposed algorithm. 8 references.

  6. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which has successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  7. Background levels of some trace elements in sandy soil of Abou-Zabal, and their variation with soil depth determined by neutron activation analysis. Vol. 4

    International Nuclear Information System (INIS)

    Abdel-Sabour, M.F.; Sanad, W.; Flex, H.; Abdel-Haleem, A.S.; Zohny, E.

    1996-01-01

    The variation in soil total heavy metal contents (horizontally and vertically) in a small land area (about one acre) was investigated using the neutron activation analysis technique. The background levels found in the sandy soil of Abou-Zabal are also discussed in relation to the findings of other workers. 5 tabs

  8. Background levels of some trace elements in sandy soil of Abou-Zabal, and their variation with soil depth determined by neutron activation analysis. Vol. 4.

    Energy Technology Data Exchange (ETDEWEB)

    Abdel-Sabour, M F [Soil Pollution Unit, Soil and Water Department. Nuclear Research Center, Atomic energy Authority, Cairo, (Egypt); Sanad, W; Flex, H; Abdel-Haleem, A S [Hot Lab. Center, Atomic Energy Authority, Cairo (Egypt); Zohny, E [Physics Department, Faculty of Science, Cairo Univ., Beni-Sweif Branch, Cairo, (Egypt)

    1996-03-01

    The variation in soil total heavy metal contents (horizontally and vertically) in a small land area (about one acre) was investigated using the neutron activation analysis technique. The background levels found in the sandy soil of Abou-Zabal are also discussed in relation to the findings of other workers. 5 tabs.

  9. Automatic Smoker Detection from Telephone Speech Signals

    DEFF Research Database (Denmark)

    Poorjam, Amir Hossein; Hesaraki, Soheila; Safavi, Saeid

    2017-01-01

    This paper proposes an automatic smoking habit detection from spontaneous telephone speech signals. In this method, each utterance is modeled using i-vector and non-negative factor analysis (NFA) frameworks, which yield low-dimensional representation of utterances by applying factor analysis...... method is evaluated on telephone speech signals of speakers whose smoking habits are known drawn from the National Institute of Standards and Technology (NIST) 2008 and 2010 Speaker Recognition Evaluation databases. Experimental results over 1194 utterances show the effectiveness of the proposed approach...... for the automatic smoking habit detection task....

  10. Upgrade of the automatic analysis system in the TJ-II Thomson Scattering diagnostic: New image recognition classifier and fault condition detection

    NARCIS (Netherlands)

    Makili, L.; Vega, J.; Dormido-Canto, S.; Pastor, I.; Pereira, A.; Farias, G.; Portas, A.; Perez-Risco, D.; Rodriguez-Fernandez, M. C.; Busch, P.

    2010-01-01

    An automatic image classification system based on support vector machines (SVM) has been in operation for years in the TJ-II Thomson Scattering diagnostic. It recognizes five different types of images: CCD camera background, measurement of stray light without plasma or in a collapsed discharge,

  11. ATWS: a reappraisal, part II, evaluation of societal risks due to reactor protection systems failure. Vol. 3. PWR risk analysis. Phase report

    International Nuclear Information System (INIS)

    Lellouche, G.S.

    1976-08-01

    This document is the third volume of part 2 in a series of studies which will examine the basis for the problem of Anticipated Transients Without Scram (ATWS). The purpose of part 2 is an evaluation of societal risks due to RPS failure based on more current data and methodology than used in WASH-1270. This volume examines and documents the potential contribution to societal risk due to ATWS in the PWR. Volumes 1 and 2 described a similar analysis for the BWR

  12. Interim report for documentation of computer programs for crude oil profile analysis. [Codes, CV, CRULE, CPR, CIMPACT, ENT ADJ, REF VOLS, CIMP PRT, C MODEL, C PRINT

    Energy Technology Data Exchange (ETDEWEB)

    This report summarizes the findings and documents the programs developed by the Economic Regulatory Administration (ERA) for the Crude Oil Profile Analysis. Based on a variety of estimates and assumptions from different sources, ERA developed a series of computer programs which generate a 24-month projection of crude oil production in each of the price categories. AVANTE International Systems Corporation was contracted to provide organized documentation for all of these data bases, assumptions, estimation rationales and computing procedures.

  13. Considerations on an automatic computed tomography tube current modulation system

    International Nuclear Information System (INIS)

    Moro, L.; Panizza, D.; D'Ambrosio, D.; Carne, I.

    2013-01-01

    The scope of this study was to evaluate the effects on radiation output and image noise of varying the acquisition parameters with an automatic tube current modulation (ATCM) system in computed tomography (CT). Chest CT examinations of an anthropomorphic phantom were acquired using a GE LightSpeed VCT 64-slice tomograph. Acquisitions were performed using different pitch, slice thickness and noise index (NI) values and varying the orientation of the scanned projection radiograph (SPR). The radiation output was determined by the CT dose index (CTDIvol). Image noise was evaluated by measuring the standard deviation of CT numbers in several regions of interest. The radiation output was lower if the SPR was acquired in the anterior-posterior projection. The radiation dose with the posterior-anterior SPR was higher, because the divergence of the X-ray beam magnifies the anatomical structures closest to the tube, especially the spinal column, and this leads the ATCM system to estimate higher patient attenuation values and, therefore, to select higher tube current values. The NI was inversely proportional to the square root of the CTDIvol and, with fixed NI, the CTDIvol increased as the slice thickness decreased. This study suggests some important issues for using the GE ATCM system efficiently. (authors)
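
    The quoted noise-dose relationship can be written compactly; the constant k below stands in for scanner- and protocol-specific factors and is an assumption:

```latex
% NI is inversely proportional to the square root of CTDIvol,
% so halving the noise index roughly quadruples the dose.
\mathrm{NI} = \frac{k}{\sqrt{\mathrm{CTDI_{vol}}}}
\qquad\Longrightarrow\qquad
\mathrm{CTDI_{vol}} = \frac{k^{2}}{\mathrm{NI}^{2}}
```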

  14. Nuclear Reactor RA Safety Report, Vol. 15, Analysis of significant accidents; Izvestaj o sigurnosti nuklearnog reaktora RA, Knjiga 15, Analiza znacajnih akcidenata

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1986-11-01

    To analyse power excursions of the RA reactor, a mathematical model of reactor kinetic behaviour was formulated, describing the power and temperature coefficients for both reactor fuel and moderator. The computer code TM-1 was written for the analysis of possible reactor accidents. Power excursions caused by uncontrolled control rod removal and by heavy water flowing into the central vertical experimental channel were analyzed. Accidents caused by fuel element handling were discussed, including possible fuel element damage. Although the probability of uncontrolled release of radioactive materials into the environment is very low, such accidents are analyzed as well, including their impact on the personnel and the environment. A separate chapter describes the analysis of the loss-of-flow accident. The safety analysis also covers possible damage to the outer steel RA reactor vessel and to the water screens which are part of the water biological shield. [Serbo-Croat]

  15. A Level 1+ Probabilistic Safety Assessment of the high flux Australian reactor. Vol. 2. Appendix C: System analysis models and results

    International Nuclear Information System (INIS)

    1998-01-01

    This section contains the results of the quantitative system/top event analysis. Section C.1 gives the basic event coding scheme. Section C.2 shows the master frequency file (MFF), which contains the split fraction names, the top events they belong to, the mean values of the uncertainty distribution that is generated by the Monte Carlo quantification in the System Analysis module of RISKMAN, and a brief description of each split fraction. The MFF is organized by the systems modeled, and within each system, the top events associated with the system. Section C.3 contains the fault trees developed for the system/top event models and the RISKMAN reports for each of the system/top event models. The reports are organized under the following system headings: Compressed/Service Air Supply (AIR); Containment Isolation System (CIS); Heavy Water Cooling System (D2O); Emergency Core Cooling System (ECCS); Electric Power System (EPS); Light Water Cooling System (H2O); Helium Gas System (HE); Mains Water System (MW); Miscellaneous Top Events (MISC); Operator Actions (OPER); Reactor Protection System (RPS); Space Conditioner System (SCS); Condition/Status Switch (SWITCH); RCB Ventilation System (VENT); No. 1 Storage Block Cooling System (SB)

  16. A Level 1+ Probabilistic Safety Assessment of the high flux Australian reactor. Vol. 2. Appendix C: System analysis models and results

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-01

    This section contains the results of the quantitative system/top event analysis. Section C.1 gives the basic event coding scheme. Section C.2 shows the master frequency file (MFF), which contains the split fraction names, the top events they belong to, the mean values of the uncertainty distribution that is generated by the Monte Carlo quantification in the System Analysis module of RISKMAN, and a brief description of each split fraction. The MFF is organized by the systems modeled, and within each system, the top events associated with the system. Section C.3 contains the fault trees developed for the system/top event models and the RISKMAN reports for each of the system/top event models. The reports are organized under the following system headings: Compressed/Service Air Supply (AIR); Containment Isolation System (CIS); Heavy Water Cooling System (D2O); Emergency Core Cooling System (ECCS); Electric Power System (EPS); Light Water Cooling System (H2O); Helium Gas System (HE); Mains Water System (MW); Miscellaneous Top Events (MISC); Operator Actions (OPER); Reactor Protection System (RPS); Space Conditioner System (SCS); Condition/Status Switch (SWITCH); RCB Ventilation System (VENT); No. 1 Storage Block Cooling System (SB)

  17. Automatic generation of 3D fine mesh geometries for the analysis of the VENUS-3 shielding benchmark experiment with the TORT code

    International Nuclear Information System (INIS)

    Pescarini, M.; Orsi, R.; Martinelli, T.

    2003-01-01

    In many practical radiation transport applications today, the cost of solving refined, large-size and complex multi-dimensional problems lies not so much in computing as in the cumbersome effort required of an expert to prepare a detailed geometrical model, and to verify and validate that it is correct and represents, to a specified tolerance, the real design or facility. This situation is particularly relevant and frequent in reactor core criticality and shielding calculations with three-dimensional (3D) general-purpose radiation transport codes, which require a very large number of meshes and high-performance computers. The need has clearly emerged for tools that ease the task of the physicist or engineer by reducing the time required, by facilitating the verification of correctness through effective graphical display and, finally, by helping the interpretation of the results obtained. The paper shows the results of efforts in this field through detailed simulations of a complex shielding benchmark experiment. In the context of the activities proposed by the OECD/NEA Nuclear Science Committee (NSC) Task Force on Computing Radiation Dose and Modelling of Radiation-Induced Degradation of Reactor Components (TFRDD), the ENEA-Bologna Nuclear Data Centre contributed an analysis of the VENUS-3 low-flux neutron shielding benchmark experiment (SCK/CEN-Mol, Belgium). One of the targets of the work was to test the BOT3P system, originally developed at the Nuclear Data Centre in ENEA-Bologna and released to the OECD/NEA Data Bank for free distribution. BOT3P, an ancillary system of the DORT (2D) and TORT (3D) SN codes, permits flexible automatic generation of spatial mesh grids in Cartesian or cylindrical geometry, through combinatorial geometry algorithms, following a simplified user-friendly approach. This system also demonstrated its validity in core criticality analyses, for example the Lewis MOX fuel benchmark, permitting to easily

  18. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre; Fluhr, Christian.

    1975-06-01

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet team from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of the automatic building of thesauri and automatic classification are examined [fr]

  19. Automatization of welding

    International Nuclear Information System (INIS)

    Iwabuchi, Masashi; Tomita, Jinji; Nishihara, Katsunori.

    1978-01-01

    Automatization of welding is one of the effective measures for securing a high degree of quality in nuclear power equipment, as well as for coping with the working environment at the plant site. As the latest of the automatic welders in practical use for welding nuclear power apparatus in the factories of Toshiba and IHI, those for pipes and lining tanks are described here. The pipe welder performs buttering welding on the inside of the pipe end as the so-called IGSCC countermeasure, and the succeeding butt welding, through the same controller. The lining tank welder is able to perform simultaneous welding of two parallel weld lines on a large thin-plate lining tank. Both types of welders are demonstrating excellent performance at the shops as well as at the plant site. (author)

  20. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains a clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.

  1. Automatic segmentation of the heart in radiotherapy for breast cancer

    DEFF Research Database (Denmark)

    Laugaard Lorenzen, Ebbe; Ewertz, Marianne; Brink, Carsten

    2014-01-01

    Background. The aim of this study was to evaluate two fully automatic segmentation methods in comparison with manual delineations for their use in delineating the heart on the planning computed tomography (CT) used in radiotherapy for breast cancer. Material and methods. Automatic delineation of the heart...... in 15 breast cancer patients was performed by two different automatic delineation systems. The accuracy and precision of the differences between manual and automatic delineations were evaluated on volume, mean dose, maximum dose and spatial distance differences. Two sets of manual delineations...

  2. Automatic Clustering Using FSDE-Forced Strategy Differential Evolution

    Science.gov (United States)

    Yasid, A.

    2018-01-01

    Clustering analysis is important in data mining for unsupervised data, because no adequate prior knowledge is available. One of the important tasks is defining the number of clusters without user involvement, which is known as automatic clustering. This study aims to acquire the cluster number automatically utilizing forced strategy differential evolution (AC-FSDE). Two mutation parameters, namely a constant parameter and a variable parameter, are employed to boost differential evolution performance. Four well-known benchmark datasets were used to evaluate the algorithm. Moreover, the result is compared with other state-of-the-art automatic clustering methods. The experimental results show that AC-FSDE is better than or competitive with other existing automatic clustering algorithms.
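
    Automatic clustering by differential evolution is commonly set up by letting the optimizer place a fixed maximum number of candidate centers, each with an activation flag, and scoring the induced partition with a validity index. The sketch below illustrates that general scheme with SciPy's stock differential evolution and the silhouette index; it is not the paper's forced-strategy (FSDE) variant:

```python
# Sketch: automatic clustering via differential evolution. Each candidate
# solution encodes K_MAX centers plus one activation flag per center; the
# silhouette index scores the partition. General DE scheme, not AC-FSDE.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=3, random_state=5)
K_MAX, DIM = 6, X.shape[1]

def cost(vec):
    flags = vec[:K_MAX] > 0.5
    centers = vec[K_MAX:].reshape(K_MAX, DIM)[flags]
    if len(centers) < 2:
        return 1.0                         # need at least two active clusters
    labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
    if len(np.unique(labels)) < 2:
        return 1.0                         # degenerate partition
    return -silhouette_score(X, labels)    # maximize silhouette

lo, hi = X.min(0), X.max(0)
bounds = [(0, 1)] * K_MAX + [(l, h) for _ in range(K_MAX) for l, h in zip(lo, hi)]
res = differential_evolution(cost, bounds, maxiter=30, seed=0, polish=False)
print("active clusters:", int((res.x[:K_MAX] > 0.5).sum()))
```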

  3. samaru-main-vol 11 2011

    African Journals Online (AJOL)

    Library _info_Sc_ 1

    2004-07-11

    Jul 11, 2004 ... Samaru Journal of Information Studies Vol. 11 (1 & 2)2011 ... It is therefore paramount that the library ... source of primary and up to date information, both students and ... entertainment, arts, fashion, law, economy, medicine,.

  4. AJESMS_ Vol 8 2010 August 15 2011

    African Journals Online (AJOL)

    Prof. Mereku

    2010-08-15

    Aug 15, 2010 ... African Journal of Educational Studies in Mathematics and Sciences Vol. 8, 2010 ... environs, a mining area in Wassa West District of Ghana .... This finding is based on autopsy data and on reports showing that blood levels in.

  5. 12-EEASA-Vol 22.indd

    African Journals Online (AJOL)

    jenny

    2006-02-22

    Feb 22, 2006 ... Southern African Journal of Environmental Education, Vol. .... To this end, a planning conference on environmental .... mathematics, physics, chemistry, biology, geography, agriculture, home economics and human and.

  6. NJB VOL.34 Original Lower.cdr

    African Journals Online (AJOL)

    important sources of renewable biological ... The plantain (Musa paradisiaca) and banana chemicals (Nowak et ... fermentable sugars available for bio-ethanol ..... peels. Afr. J. Biochem. Res. 9(9): 104 - 109. 70. Osho et al./ Nig. J. Biotech. Vol.

  7. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments...... and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices....

  8. Automatic LOD selection

    OpenAIRE

    Forsman, Isabelle

    2017-01-01

    In this paper, a method to automatically generate transition distances for LOD is presented, improving image stability and performance. Three different methods were tested, all measuring the change between two levels of detail using the spatial frequency. The methods were implemented as an optional pre-processing step in order to determine the transition distances from multiple view directions. During run-time, both view-direction-based selection and the furthest distance for each direction was ...

  9. Automatic gamma spectrometry analytical apparatus

    International Nuclear Information System (INIS)

    Lamargot, J.-P.; Wanin, Maurice.

    1980-01-01

    This invention falls within the area of quantitative or semi-quantitative analysis by gamma spectrometry and particularly refers to a device for bringing the samples into the counting position. The purpose of this invention is precisely to provide an automatic apparatus specifically adapted to the analysis of hard gamma radiations. To this effect, the invention relates to a gamma spectrometry analytical device comprising a lead containment, a detector of which the sensitive part is located inside the containment and additionally comprising a transfer system for bringing the analyzed samples in succession to a counting position inside the containment above the detector. A feed compartment enables the samples to be brought in turn one by one on to the transfer system through a duct connecting the compartment to the transfer system. Sequential systems for the coordinated forward feed of the samples in the compartment and the transfer system complete this device [fr

  10. Analysis of the High-Frequency Content in Human QRS Complexes by the Continuous Wavelet Transform: An Automatized Analysis for the Prediction of Sudden Cardiac Death.

    Science.gov (United States)

    García Iglesias, Daniel; Roqueñi Gutiérrez, Nieves; De Cos, Francisco Javier; Calvo, David

    2018-02-12

    Fragmentation and delayed potentials in the QRS signal of patients have been postulated as risk markers for Sudden Cardiac Death (SCD). The analysis of the high-frequency spectral content may be useful for quantification. Forty-two consecutive patients with a prior history of SCD or malignant arrhythmias (patients) were compared with 120 healthy individuals (controls). The QRS complexes were extracted with a modified Pan-Tompkins algorithm and processed with the Continuous Wavelet Transform to analyze the high-frequency content (85-130 Hz). Overall, the power of the high-frequency content was higher in patients compared with controls (170.9 vs. 47.3 10³ nV²·Hz⁻¹; p = 0.007), with a prolonged time to reach the maximal power (68.9 vs. 64.8 ms; p = 0.002). An analysis of the signal intensity (instantaneous average of cumulative power) revealed a distinct function between patients and controls. The total intensity was higher in patients compared with controls (137.1 vs. 39 10³ nV²·Hz⁻¹·s⁻¹; p = 0.001) and the time to reach the maximal intensity was also prolonged (88.7 vs. 82.1 ms; p < …). The high-frequency content of the QRS complexes was distinct between patients at risk of SCD and healthy controls. The wavelet transform is an efficient tool for spectral analysis of the QRS complexes that may contribute to stratification of risk.
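
    The band-limited (85-130 Hz) power measurements described above can be approximated from a QRS segment with a continuous wavelet transform; a minimal sketch, where the QRS is a synthetic pulse and the Morlet wavelet and sampling rate are illustrative assumptions:

```python
# Sketch: high-frequency (85-130 Hz) content of a QRS segment via the
# continuous wavelet transform. The "QRS" here is a synthetic pulse; the
# Morlet wavelet and sampling rate are illustrative assumptions.
import numpy as np
import pywt

fs = 1000.0                                  # Hz, assumed sampling rate
t = np.arange(-0.06, 0.06, 1 / fs)
qrs = np.exp(-(t / 0.01) ** 2) + 0.05 * np.sin(2 * np.pi * 100 * t)  # synthetic QRS

scales = np.arange(4, 64)
coefs, freqs = pywt.cwt(qrs, scales, "morl", sampling_period=1 / fs)
band = (freqs >= 85) & (freqs <= 130)        # keep only 85-130 Hz rows

power_t = (np.abs(coefs[band]) ** 2).sum(axis=0)                # HF power over time
intensity = np.cumsum(power_t) / (np.arange(power_t.size) + 1)  # running average
print("time of max HF power: %.1f ms" % (t[power_t.argmax()] * 1e3))
print("total HF intensity:", intensity[-1])
```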

  11. Upgrade of the automatic analysis system in the TJ-II Thomson Scattering diagnostic: New image recognition classifier and fault condition detection

    International Nuclear Information System (INIS)

    Makili, L.; Vega, J.; Dormido-Canto, S.; Pastor, I.; Pereira, A.; Farias, G.; Portas, A.; Perez-Risco, D.; Rodriguez-Fernandez, M.C.; Busch, P.

    2010-01-01

    An automatic image classification system based on support vector machines (SVM) has been in operation for years in the TJ-II Thomson Scattering diagnostic. It recognizes five different types of images: CCD camera background, measurement of stray light without plasma or in a collapsed discharge, image during ECH phase, image during NBI phase and image after reaching the cut off density during ECH heating. Each kind of image implies the execution of different application software. Due to the fact that the recognition system is based on a learning system and major modifications have been carried out in both the diagnostic (optics) and TJ-II plasmas (injected power), the classifier model is no longer valid. A new SVM model has been developed with the current conditions. Also, specific error conditions in the data acquisition process can automatically be detected and managed now. The recovering process has been automated, thereby avoiding the loss of data in ensuing discharges.
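
    In outline, a multi-class SVM image recognizer of this kind reduces to fitting an SVC on feature vectors extracted from each image, one class per image type; the sketch below uses synthetic features, with only the five class labels taken from the abstract:

```python
# Sketch: five-class SVM recognizer for Thomson Scattering images.
# Feature vectors are synthetic stand-ins for whatever image features
# the diagnostic pipeline extracts.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

CLASSES = ["background", "stray_light", "ECH", "NBI", "cutoff"]
rng = np.random.default_rng(6)
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(80, 16)) for i in range(5)])
y = np.repeat(np.arange(5), 80)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
# Each predicted class would dispatch its own application software:
print(CLASSES[int(clf.predict(X_te[:1])[0])])
```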

  12. Upgrade of the automatic analysis system in the TJ-II Thomson Scattering diagnostic: New image recognition classifier and fault condition detection

    Energy Technology Data Exchange (ETDEWEB)

    Makili, L. [Dpto. Informatica y Automatica - UNED, Madrid (Spain); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain); Dormido-Canto, S., E-mail: sebas@dia.uned.e [Dpto. Informatica y Automatica - UNED, Madrid (Spain); Pastor, I.; Pereira, A. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain); Farias, G. [Dpto. Informatica y Automatica - UNED, Madrid (Spain); Portas, A.; Perez-Risco, D.; Rodriguez-Fernandez, M.C. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain); Busch, P. [FOM Institut voor PlasmaFysica Rijnhuizen, Nieuwegein (Netherlands)

    2010-07-15

    An automatic image classification system based on support vector machines (SVM) has been in operation for years in the TJ-II Thomson Scattering diagnostic. It recognizes five different types of images: CCD camera background, measurement of stray light without plasma or in a collapsed discharge, image during ECH phase, image during NBI phase and image after reaching the cut off density during ECH heating. Each kind of image implies the execution of different application software. Due to the fact that the recognition system is based on a learning system and major modifications have been carried out in both the diagnostic (optics) and TJ-II plasmas (injected power), the classifier model is no longer valid. A new SVM model has been developed with the current conditions. Also, specific error conditions in the data acquisition process can automatically be detected and managed now. The recovering process has been automated, thereby avoiding the loss of data in ensuing discharges.

  13. TU-FG-201-03: Automatic Pre-Delivery Verification Using Statistical Analysis of Consistencies in Treatment Plan Parameters by the Treatment Site and Modality

    International Nuclear Information System (INIS)

    Liu, S; Wu, Y; Chang, X; Li, H; Yang, D

    2016-01-01

    Purpose: A novel computer software system, namely APDV (Automatic Pre-Delivery Verification), has been developed for verifying patient treatment plan parameters immediately prior to treatment delivery in order to automatically detect and prevent catastrophic errors. Methods: APDV is designed to continuously monitor new DICOM plan files on the TMS computer at the treatment console. When new plans to be delivered are detected, APDV checks the consistency of plan parameters and high-level plan statistics using underlying rules and statistical properties based on the given treatment site, technique and modality. These rules were quantitatively derived by retrospectively analyzing all the EBRT treatment plans of the past 8 years at the authors' institution. Therapists and physicists will be notified with a warning message displayed on the TMS computer if any critical errors are detected, and check results, confirmation, together with dismissal actions will be saved into a database for further review. Results: APDV was implemented as a stand-alone program using C# to ensure the required real-time performance. Mean values and standard deviations were quantitatively derived for various plan parameters including MLC usage, MU/cGy ratio, beam SSD, beam weighting, and the beam gantry angles (only for lateral targets) per treatment site, technique and modality. 2D-based rules of combined MU/cGy ratio and averaged SSD values were also derived using joint probabilities of confidence error ellipses. The statistics of these major treatment plan parameters quantitatively evaluate the consistency of any treatment plan, which facilitates automatic APDV checking procedures. Conclusion: APDV could be useful in detecting and preventing catastrophic errors immediately before treatment deliveries. Future plans include automatic patient identity and patient setup checks after daily patient images are acquired by the machine and become available on the TMS computer. This project is supported by the
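
    A consistency check of the kind described - flagging a plan whose combined (MU/cGy ratio, averaged SSD) pair falls outside a confidence error ellipse derived from historical plans of the same site and modality - can be sketched via the Mahalanobis distance against a chi-square cutoff; the historical statistics below are synthetic placeholders:

```python
# Sketch: flag a treatment plan whose (MU/cGy ratio, mean SSD) pair falls
# outside the 95% confidence error ellipse of historical plans for the same
# site/modality. Historical statistics here are synthetic placeholders.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(7)
history = rng.multivariate_normal([1.2, 92.0], [[0.01, 0.02], [0.02, 4.0]], 800)

mean = history.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(history, rowvar=False))
cutoff = chi2.ppf(0.95, df=2)                 # 95% ellipse for 2 parameters

def plan_ok(mu_per_cgy, mean_ssd):
    d = np.array([mu_per_cgy, mean_ssd]) - mean
    return float(d @ cov_inv @ d) <= cutoff   # squared Mahalanobis distance

print(plan_ok(1.25, 93.0))   # consistent plan -> likely True
print(plan_ok(2.50, 80.0))   # outlier plan    -> False
```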

  14. TU-FG-201-03: Automatic Pre-Delivery Verification Using Statistical Analysis of Consistencies in Treatment Plan Parameters by the Treatment Site and Modality

    Energy Technology Data Exchange (ETDEWEB)

    Liu, S; Wu, Y; Chang, X; Li, H; Yang, D [Washington University School of Medicine, St. Louis, MO (United States)

    2016-06-15

    Purpose: A novel computer software system, namely APDV (Automatic Pre-Delivery Verification), has been developed for verifying patient treatment plan parameters immediately prior to treatment delivery in order to automatically detect and prevent catastrophic errors. Methods: APDV is designed to continuously monitor new DICOM plan files on the TMS computer at the treatment console. When new plans to be delivered are detected, APDV checks the consistency of plan parameters and high-level plan statistics using underlying rules and statistical properties based on the given treatment site, technique and modality. These rules were quantitatively derived by retrospectively analyzing all the EBRT treatment plans of the past 8 years at the authors' institution. Therapists and physicists will be notified with a warning message displayed on the TMS computer if any critical errors are detected, and check results, confirmation, together with dismissal actions will be saved into a database for further review. Results: APDV was implemented as a stand-alone program using C# to ensure the required real-time performance. Mean values and standard deviations were quantitatively derived for various plan parameters including MLC usage, MU/cGy ratio, beam SSD, beam weighting, and the beam gantry angles (only for lateral targets) per treatment site, technique and modality. 2D-based rules of combined MU/cGy ratio and averaged SSD values were also derived using joint probabilities of confidence error ellipses. The statistics of these major treatment plan parameters quantitatively evaluate the consistency of any treatment plan, which facilitates automatic APDV checking procedures. Conclusion: APDV could be useful in detecting and preventing catastrophic errors immediately before treatment deliveries. Future plans include automatic patient identity and patient setup checks after daily patient images are acquired by the machine and become available on the TMS computer. This project is supported by the

  15. Automatic Radiometric Normalization of Multitemporal Satellite Imagery

    DEFF Research Database (Denmark)

    Canty, Morton J.; Nielsen, Allan Aasbjerg; Schmidt, Michael

    2004-01-01

    with normalization using orthogonal regression. The procedure is applied to Landsat TM images over Nevada, Landsat ETM+ images over Morocco, and SPOT HRV images over Kenya. Results from this new automatic, combined MAD/orthogonal regression method, based on statistical analysis of test pixels not used in the actual...
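
    Orthogonal (total least squares) regression normalization of this general kind can be sketched by fitting the principal axis of the joint distribution of target and reference pixel values; the pixel samples below are synthetic stand-ins, and this illustrates only the regression step, not the MAD selection of invariant pixels:

```python
# Sketch: radiometric normalization of a target band to a reference band
# by orthogonal (total least squares) regression. Pixel samples here are
# synthetic stand-ins for no-change pixels selected by the MAD procedure.
import numpy as np

rng = np.random.default_rng(9)
ref = rng.uniform(10, 200, 5000)                 # reference-image band values
tgt = 0.8 * ref + 12 + rng.normal(0, 3, 5000)    # target-image band values

x = tgt - tgt.mean()
y = ref - ref.mean()
cov = np.cov(np.vstack([x, y]))
_, evecs = np.linalg.eigh(cov)
vx, vy = evecs[:, -1]                 # principal axis = TLS line direction
slope = vy / vx
intercept = ref.mean() - slope * tgt.mean()

tgt_norm = slope * tgt + intercept    # target band mapped to reference scale
print(f"gain {slope:.3f}, offset {intercept:.2f}")
```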

  16. Automatic Estimation of Movement Statistics of People

    DEFF Research Database (Denmark)

    Ægidiussen Jensen, Thomas; Rasmussen, Henrik Anker; Moeslund, Thomas B.

    2012-01-01

    Automatic analysis of how people move about in a particular environment has a number of potential applications. However, no system has so far been able to do detection and tracking robustly. Instead, trajectories are often broken into tracklets. The key idea behind this paper is based around...

  17. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J. S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J. P. K.; Geertzen, J. H. B.

    2004-01-01

    This paper describes a new automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients, fitted

  18. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J.S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J.P.K.; Geertzen, J.H.B.

    This paper describes a new, automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients,

  19. A new uranium automatic analyzer

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyuan; Zhang Lan

    1993-01-01

    A new uranium automatic analyzer based on the flow injection analysis (FIA) principle has been developed. It consists of a multichannel peristaltic pump, an injection valve, a photometric detector, a single-chip microprocessor system and electronic circuitry. The newly designed multifunctional auto-injection valve can automatically change the injection volume of the sample and the channels, so that the determination ranges and items can easily be changed. It can also vary the FIA operation modes, giving the instrument the functions of a universal one. A chromatographic column with extractant-containing resin was installed in the manifold of the analyzer for the concentration and separation of trace uranium. 2-(5-bromo-2-pyridylazo)-5-diethylaminophenol (Br-PADAP) was used as the colour reagent. Uranium was determined in the aqueous solution by adding cetylpyridinium bromide (CPB). Uranium in solution in the range 0.02-500 mg·L⁻¹ can be directly determined without any pretreatment. A sample throughput rate of 30-90 h⁻¹ and a reproducibility of 1-2% were obtained. The analyzer has been satisfactorily applied in the laboratory and the plant.

  20. An automatic system for elaboration of chip breaking diagrams

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; De Chiffre, Leonardo

    1998-01-01

    A laboratory system for fully automatic elaboration of chip breaking diagrams has been developed and tested. The system is based on automatic chip breaking detection by frequency analysis of cutting forces in connection with programming of a CNC-lathe to scan different feeds, speeds and cutting...
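
    Frequency-based chip-breaking detection of this general kind looks for a strong periodic component in the cutting-force signal; a minimal sketch with a synthetic force signal, where the sampling rate, chip frequency and prominence threshold are all illustrative assumptions:

```python
# Sketch: chip-breaking detection by frequency analysis of a cutting-force
# signal. Broken chips show up as a strong periodic component; the signal,
# chip frequency and threshold below are illustrative assumptions.
import numpy as np

fs = 2000.0                                   # Hz, assumed sampling rate
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(8)
force = 200 + 15 * np.sin(2 * np.pi * 120 * t) + 2 * rng.normal(size=t.size)

spec = np.abs(np.fft.rfft(force - force.mean())) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

peak = freqs[spec.argmax()]
breaking = spec.max() > 5 * np.median(spec)   # assumed prominence threshold
print(f"dominant component: {peak:.0f} Hz, chip breaking: {breaking}")
```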